
ModelScope

ModelScope is a large repository of models and datasets.

This page covers how to use the ModelScope ecosystem within LangChain. It is broken into two parts: installation and setup, followed by references to the specific ModelScope wrappers.

Installation

pip install -U langchain-modelscope-integration

Head to ModelScope to sign up and generate an SDK token. Once you've done this, set the MODELSCOPE_SDK_TOKEN environment variable:

export MODELSCOPE_SDK_TOKEN=<your_sdk_token>
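Alternatively, you can set the token from within Python before instantiating any of the wrappers below. A minimal sketch (the token value is a placeholder, not a real credential):

```python
import os

# Equivalent to the shell `export` above; must run before any
# ModelScope wrapper is created so the client can pick it up.
os.environ["MODELSCOPE_SDK_TOKEN"] = "your-sdk-token"
```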

Chat Models

The ModelScopeChatEndpoint class exposes chat models from ModelScope. See the available models here.

from langchain_modelscope import ModelScopeChatEndpoint

llm = ModelScopeChatEndpoint(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("Sing a ballad of LangChain.")

Embeddings

The ModelScopeEmbeddings class exposes embedding models from ModelScope.

from langchain_modelscope import ModelScopeEmbeddings

embeddings = ModelScopeEmbeddings(model_id="damo/nlp_corom_sentence-embedding_english-base")
embeddings.embed_query("What is the meaning of life?")

LLMs

The ModelScopeLLM class exposes completion-style LLMs from ModelScope.

from langchain_modelscope import ModelScopeLLM

llm = ModelScopeLLM(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("The meaning of life is")
