Using a local embedding model from Ollama with langchain.js

2025-02-20 09:37:43

https://js.langchain.com/docs/integrations/text_embedding/ollama/

https://ollama.com/library/mxbai-embed-large

import { OllamaEmbeddings } from "@langchain/ollama";

const embeddings = new OllamaEmbeddings({
    model: "mxbai-embed-large", // Default model
    baseUrl: "http://139.9.222.224:11434", // Your Ollama server; defaults to http://localhost:11434
});

const res = await embeddings.embedQuery("Hello, world!");
console.log(res); // An array of numbers: the embedding vector
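Embedding vectors are usually compared with cosine similarity. A minimal helper in plain JavaScript (no server required; the toy vectors below stand in for real mxbai-embed-large outputs, which have many more dimensions):

```javascript
// Cosine similarity: dot(a, b) / (|a| * |b|), in [-1, 1].
// 1 means the vectors point in the same direction (most similar).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy vectors for illustration only:
console.log(cosineSimilarity([1, 0, 1], [1, 0, 1])); // 1 (identical)
console.log(cosineSimilarity([1, 0], [0, 1]));       // 0 (orthogonal)
```

To rank documents against a query, embed both (e.g. with `embedQuery` and `embedDocuments`) and sort the documents by their cosine similarity to the query vector.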

Deployment

Docker registry mirror (add to /etc/docker/daemon.json, then restart the Docker daemon)

https://www.coderjia.cn/archives/dba3f94c-a021-468a-8ac6-e840f85867ea

{
  "registry-mirrors": [
    "https://docker-0.unsee.tech"
  ]
}
Pull the image
docker pull ollama/ollama
Create the container (port 11434 is Ollama's API port; /opt/ollama persists the downloaded models)
docker run -d -v /opt/ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
Pull the local embedding model inside the container
docker exec -it ollama ollama pull mxbai-embed-large