Tracia supports Amazon Bedrock through the Vercel AI SDK's Amazon Bedrock provider, @ai-sdk/amazon-bedrock.

Installation

npm install ai @ai-sdk/amazon-bedrock

Environment Variables

Set your AWS credentials:
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=us-east-1
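
If the credentials are missing, the provider will fail at request time. To fail fast with a clearer message, a small check like the following can help (a hypothetical helper, not part of Tracia or the AI SDK):

```typescript
// Hypothetical helper: list the required AWS variables that are not set.
function missingAwsEnv(env: Record<string, string | undefined>): string[] {
  const required = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_REGION'];
  return required.filter((name) => !env[name]);
}
```

Call it with process.env at startup and throw if the returned array is non-empty.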

Usage

Bedrock hosts models from multiple vendors. Use the full Bedrock model ID:

Amazon Nova

const result = await tracia.runLocal({
  model: 'amazon.nova-lite-v1:0',
  messages: [
    { role: 'user', content: 'What are the benefits of cloud computing?' }
  ],
  temperature: 0.7,
  maxOutputTokens: 500
});

Anthropic Claude (via Bedrock)

const result = await tracia.runLocal({
  model: 'anthropic.claude-sonnet-4-5-20250929-v1:0',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain quantum computing.' }
  ],
  temperature: 0.7,
  maxOutputTokens: 1000
});

Meta Llama (via Bedrock)

const result = await tracia.runLocal({
  model: 'meta.llama4-scout-17b-instruct-v1:0',
  messages: [
    { role: 'user', content: 'Write a haiku about programming.' }
  ],
});

Streaming

const stream = tracia.runLocal({
  model: 'amazon.nova-lite-v1:0',
  messages: [{ role: 'user', content: 'Write a poem about the cloud.' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
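
If you need the complete response rather than incremental output, the chunks can be accumulated into one string. This is a sketch that assumes, as in the loop above, that each chunk is plain text:

```typescript
// Accumulate streamed text chunks into a single string.
// Works with any async iterable of strings, including the stream above.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}
```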

Available Models

Bedrock provides access to models from multiple vendors:
| Vendor | Example Model ID | Description |
| --- | --- | --- |
| Amazon | amazon.nova-lite-v1:0 | Amazon’s Nova family |
| Anthropic | anthropic.claude-sonnet-4-5-20250929-v1:0 | Claude models via Bedrock |
| Meta | meta.llama4-scout-17b-instruct-v1:0 | Llama models via Bedrock |
| Mistral | mistral.mistral-large-3-675b-instruct | Mistral models via Bedrock |
| DeepSeek | deepseek.v3-v1:0 | DeepSeek models via Bedrock |
| Cohere | cohere.command-r-plus-v1:0 | Cohere models via Bedrock |
See the AWS Bedrock documentation for the complete list of available models.