Ollama Embeddings
Chroma provides a convenient wrapper around Ollama's embeddings API. You can use the `OllamaEmbeddingFunction` embedding function to generate embeddings for your documents with a model of your choice.
```python
import chromadb.utils.embedding_functions as embedding_functions

ollama_ef = embedding_functions.OllamaEmbeddingFunction(
    url="http://localhost:11434/api/embeddings",
    model_name="llama2",
)

embeddings = ollama_ef(["This is my first text to embed",
                        "This is my second document"])
```
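Under the hood, the wrapper talks to Ollama's `/api/embeddings` endpoint, which takes the model name and the text to embed (under the `prompt` key) in a JSON body. A minimal sketch of what one request body looks like; actually sending it requires a running Ollama server with the model already pulled:

```python
import json


def build_embedding_request(model_name: str, text: str) -> dict:
    # Ollama's /api/embeddings endpoint expects a JSON body with the
    # model to use and the text to embed under the "prompt" key.
    return {"model": model_name, "prompt": text}


# One body per document; POST it to http://localhost:11434/api/embeddings
# (e.g. with the requests library) to get back an embedding vector.
payload = build_embedding_request("llama2", "This is my first text to embed")
print(json.dumps(payload))
```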
```javascript
// const { OllamaEmbeddingFunction } = require("chromadb"); // CJS import
import { OllamaEmbeddingFunction } from "chromadb"; // ESM import

const embedder = new OllamaEmbeddingFunction({
    url: "http://127.0.0.1:11434/api/embeddings",
    model: "llama2",
});

// use directly
const embeddings = await embedder.generate(["document1", "document2"]);

// or pass the embedder to .createCollection and .getCollection, so that
// .add and .query embed documents for you
const collection = await client.createCollection({
    name: "name",
    embeddingFunction: embedder,
});
const existingCollection = await client.getCollection({
    name: "name",
    embeddingFunction: embedder,
});
```
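The embedding function returns one vector of floats per input document, so you can compare documents directly, for example with cosine similarity. A self-contained sketch using short stand-in vectors (real Ollama embeddings have hundreds of dimensions):

```python
import math


def cosine_similarity(a, b):
    # cosine similarity = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Stand-in vectors in place of ollama_ef(...) output.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.25]
print(cosine_similarity(v1, v2))
```

Note that Chroma collections already compute distances for you at query time; this is only useful when working with the raw vectors yourself.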
You can pass in an optional `model_name` argument, which lets you choose which Ollama model to use. The model must already be pulled locally (e.g. with `ollama pull llama2`). You can see a list of all available models here.