The SDK automatically detects the provider based on the model name. Below is the complete list of supported embedding models.
Anthropic does not offer native embedding models. Use OpenAI, Google, or Amazon Bedrock for embeddings.
## OpenAI
| Model | Dimensions | Description |
|---|---|---|
| text-embedding-3-small | 1536 | Recommended for most use cases; the best balance of cost and performance. |
| text-embedding-3-large | 3072 | Higher-quality embeddings with more dimensions. Supports the `dimensions` parameter for reduced output. |
| text-embedding-ada-002 | 1536 | Legacy model; use text-embedding-3-small for new projects. |
### Dimension Override
OpenAI’s text-embedding-3-small and text-embedding-3-large, as well as Google’s text-embedding-004, support a `dimensions` parameter that reduces the output size:
```python
# Reduce text-embedding-3-large from 3072 to 256 dimensions
result = client.run_embedding(
    model="text-embedding-3-large",
    input="Hello world",
    dimensions=256,
)
print(len(result.embeddings[0].values))  # 256
```
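If you instead receive a full-size embedding and want to shorten it yourself, the usual approach for models trained with shortening in mind is to truncate the vector and re-normalize it to unit length. A minimal sketch, independent of the SDK (the function name is ours, not part of the API):

```python
import math

def truncate_and_normalize(vec, dims):
    """Keep the first `dims` components and rescale to unit length."""
    head = vec[:dims]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

shortened = truncate_and_normalize([3.0, 4.0, 5.0], 2)
print(shortened)  # [0.6, 0.8]
```

Re-normalizing matters because downstream similarity metrics (dot product, cosine) assume unit-length vectors after truncation.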
## Google
| Model | Dimensions | Description |
|---|---|---|
| text-embedding-004 | 768 | Google’s latest text embedding model. |
```python
result = client.run_embedding(
    model="text-embedding-004",
    input="Hello world",
)
print(len(result.embeddings[0].values))  # 768
```
## Amazon Bedrock
| Model | Dimensions | Description |
|---|---|---|
| amazon.titan-embed-text-v2:0 | 1024 | Amazon Titan Text Embeddings v2. |
| cohere.embed-english-v3 | 1024 | Cohere Embed English v3 via Bedrock. |
```python
result = client.run_embedding(
    model="amazon.titan-embed-text-v2:0",
    input="Hello world",
)
print(result.provider)  # LLMProvider.AMAZON_BEDROCK
```
Amazon Bedrock models require AWS credentials (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION`) to be set in your environment.
## Voyage AI
| Model | Dimensions | Description |
|---|---|---|
| voyage-3 | 1024 | General-purpose embedding model. |
| voyage-3-large | 1024 | Higher quality, larger model. |
| voyage-3-lite | 512 | Lightweight, lower cost. |
| voyage-code-3 | 1024 | Optimized for code retrieval and understanding. |
| voyage-finance-2 | 1024 | Fine-tuned for financial documents. |
| voyage-law-2 | 1024 | Fine-tuned for legal documents. |
```python
result = client.run_embedding(
    model="voyage-3",
    input="Hello world",
)
print(result.provider)  # LLMProvider.VOYAGE
```
Voyage AI requires a `VOYAGE_API_KEY` environment variable. Voyage is an embedding-only provider and cannot be used with `run_local()`.
## Using Custom Models
For embedding models not in the built-in list (fine-tuned or new releases), specify the provider explicitly:
```python
from tracia import LLMProvider

result = client.run_embedding(
    model="custom-embedding-model",
    provider=LLMProvider.OPENAI,
    input="Hello world",
)
```
When using custom models, always specify the `provider` parameter so the request is routed to the correct provider backend; model names outside the built-in list cannot be auto-detected.
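Whichever provider produces the vectors, embeddings are typically compared with cosine similarity. A minimal, SDK-independent sketch (it assumes only that each embedding is a list of floats, as the `len(result.embeddings[0].values)` examples above suggest):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

Note that similarity scores are only meaningful between vectors from the same model: embeddings from different models (or different `dimensions` settings) live in different spaces and cannot be compared directly.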