The SDK automatically detects the provider based on the model name. Below is the complete list of supported embedding models.
Anthropic does not offer native embedding models. Use OpenAI, Google, or Amazon Bedrock for embeddings.
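Conceptually, the auto-detection works like a lookup from model name to provider. The sketch below is purely illustrative (the function name, mapping, and provider strings are hypothetical, not the SDK's actual implementation):

```python
# Hypothetical sketch of model-name-based provider detection.
# The real SDK's table and fallback behavior may differ.
KNOWN_EMBEDDING_MODELS = {
    "text-embedding-3-small": "openai",
    "text-embedding-3-large": "openai",
    "text-embedding-ada-002": "openai",
    "text-embedding-004": "google",
    "amazon.titan-embed-text-v2:0": "amazon_bedrock",
    "cohere.embed-english-v3": "amazon_bedrock",
    "voyage-3": "voyage",
}

def detect_provider(model: str) -> str:
    """Return the provider for a known model name, or raise for unknown models."""
    try:
        return KNOWN_EMBEDDING_MODELS[model]
    except KeyError:
        raise ValueError(
            f"Unknown model {model!r}; pass provider= explicitly"
        )
```

Unknown models fail fast, which is why custom or newly released models need an explicit provider (see "Using Custom Models" below).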

OpenAI

| Model | Dimensions | Description |
|---|---|---|
| text-embedding-3-small | 1536 | Recommended for most use cases. Best balance of cost and performance. |
| text-embedding-3-large | 3072 | Higher quality embeddings with more dimensions. Supports the dimensions parameter for reduced output. |
| text-embedding-ada-002 | 1536 | Legacy model. Use text-embedding-3-small for new projects. |

Dimension Override

OpenAI’s text-embedding-3-small and text-embedding-3-large, as well as Google’s text-embedding-004, support the `dimensions` parameter to reduce the output size:
```python
# Reduce text-embedding-3-large from 3072 to 256 dimensions
result = client.run_embedding(
    model="text-embedding-3-large",
    input="Hello world",
    dimensions=256,
)

print(len(result.embeddings[0].values))  # 256
```
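Reductions of this kind are typically implemented by truncating the vector and re-normalizing it to unit length. The helper below is a general sketch of that idea (the `shorten` function is illustrative, not any provider's exact method):

```python
import math

def shorten(vec: list[float], dims: int) -> list[float]:
    """Truncate an embedding to `dims` dimensions and re-normalize to unit length."""
    cut = vec[:dims]
    norm = math.sqrt(sum(x * x for x in cut))
    return [x / norm for x in cut]

# A truncated-and-renormalized vector keeps unit length,
# so cosine similarity still behaves as expected.
reduced = shorten([3.0, 4.0, 1.0, 2.0], 2)
```

Passing `dimensions` to the API is preferable to truncating client-side, since the provider returns a smaller payload.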

Google

| Model | Dimensions | Description |
|---|---|---|
| text-embedding-004 | 768 | Google’s latest text embedding model. |

```python
result = client.run_embedding(
    model="text-embedding-004",
    input="Hello world",
)

print(len(result.embeddings[0].values))  # 768
```

Amazon Bedrock

| Model | Dimensions | Description |
|---|---|---|
| amazon.titan-embed-text-v2:0 | 1024 | Amazon Titan Text Embeddings v2. |
| cohere.embed-english-v3 | 1024 | Cohere Embed English v3 via Bedrock. |

```python
result = client.run_embedding(
    model="amazon.titan-embed-text-v2:0",
    input="Hello world",
)

print(result.provider)  # LLMProvider.AMAZON_BEDROCK
```
Amazon Bedrock models require AWS credentials (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION`) to be set in your environment.
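For example, in a shell session before running your script (all values below are placeholders):

```shell
# Placeholder Bedrock credentials -- substitute your own values.
export AWS_ACCESS_KEY_ID="AKIA_EXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_REGION="us-east-1"
```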

Voyage AI

| Model | Dimensions | Description |
|---|---|---|
| voyage-3 | 1024 | General-purpose embedding model. |
| voyage-3-large | 1024 | Higher quality, larger model. |
| voyage-3-lite | 512 | Lightweight, lower cost. |
| voyage-code-3 | 1024 | Optimized for code retrieval and understanding. |
| voyage-finance-2 | 1024 | Fine-tuned for financial documents. |
| voyage-law-2 | 1024 | Fine-tuned for legal documents. |

```python
result = client.run_embedding(
    model="voyage-3",
    input="Hello world",
)

print(result.provider)  # LLMProvider.VOYAGE
```
Voyage AI requires a `VOYAGE_API_KEY` environment variable. Voyage is an embedding-only provider and cannot be used with `run_local()`.

Using Custom Models

For embedding models not in the built-in list (fine-tuned or new releases), specify the provider explicitly:
```python
from tracia import LLMProvider

result = client.run_embedding(
    model="custom-embedding-model",
    provider=LLMProvider.OPENAI,
    input="Hello world",
)
```
When using custom models, always specify the `provider` parameter so the request is routed through the correct provider SDK.