@@ -6,15 +6,10 @@ import { IAIProvider } from './ai-provider.interface.js';
 export class AmazonBedrockAiProvider implements IAIProvider {
   private readonly bedrockRuntimeClient: BedrockRuntimeClient;
   private readonly modelId: string = 'global.anthropic.claude-sonnet-4-5-20250929-v1:0';
-  private readonly temperature: number = 0.7;
   private readonly maxTokens: number = 1024;
-  private readonly region: string = 'us-west-2';
-  private readonly topP: number = 0.9;

   constructor() {
-    this.bedrockRuntimeClient = new BedrockRuntimeClient({
-      region: this.region,
-    });
+    this.bedrockRuntimeClient = new BedrockRuntimeClient();
Copilot AI (Dec 23, 2025):

Removing the explicit region configuration means the BedrockRuntimeClient will now rely on the AWS SDK's default region resolution mechanism (environment variables, AWS config files, or instance metadata). This could lead to inconsistent behavior across different deployment environments if the region is not explicitly configured elsewhere. Consider whether the region should remain explicit to ensure consistent behavior, or document the requirement for proper AWS configuration in deployment environments.

Suggested change:
-    this.bedrockRuntimeClient = new BedrockRuntimeClient();
+    const region = process.env.AWS_REGION ?? process.env.AWS_DEFAULT_REGION;
+    if (!region) {
+      throw new Error(
+        'AWS region is not configured. Please set AWS_REGION or AWS_DEFAULT_REGION environment variable.',
+      );
+    }
+    this.bedrockRuntimeClient = new BedrockRuntimeClient({ region });
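A quick way to make the resolution outcome visible is to log the region the client actually resolves at startup. The following is a minimal sketch, not part of this PR, assuming the same AWS SDK for JavaScript v3 client and an ESM entry point that allows top-level await:

import { BedrockRuntimeClient } from '@aws-sdk/client-bedrock-runtime';

// Sketch: surface which region the SDK's default resolution chain
// (environment variables, shared config file, instance metadata) picked,
// so a misconfigured deployment fails at startup rather than on the first request.
const client = new BedrockRuntimeClient();
try {
  const resolvedRegion = await client.config.region();
  console.log(`Bedrock client will use region: ${resolvedRegion}`);
} catch {
  throw new Error('No AWS region could be resolved; set AWS_REGION or configure ~/.aws/config.');
}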

   }
   public async generateResponse(prompt: string): Promise<string> {
     const conversation = [
@@ -27,7 +22,7 @@ export class AmazonBedrockAiProvider implements IAIProvider {
     const command = new ConverseCommand({
       modelId: this.modelId,
       messages: conversation,
-      inferenceConfig: { maxTokens: this.maxTokens, temperature: this.temperature, topP: this.topP },
+      inferenceConfig: { maxTokens: this.maxTokens },
Copilot AI (Dec 23, 2025):

Removing the temperature and topP parameters from the inferenceConfig changes the behavior of the AI model. Without explicit values, AWS Bedrock will use its default values, which may differ from the previously configured values (temperature: 0.7, topP: 0.9). This could lead to inconsistent AI responses compared to the previous implementation. Consider whether the default AWS values align with your requirements for response consistency and creativity, or if these parameters should be retained with explicit values.

Suggested change:
-      inferenceConfig: { maxTokens: this.maxTokens },
+      inferenceConfig: { maxTokens: this.maxTokens, temperature: 0.7, topP: 0.9 },
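If the previous sampling behavior should be preserved, one option is to keep the parameters explicit rather than relying on model defaults. The sketch below is illustrative only: it reuses the values from the removed fields (temperature 0.7, topP 0.9) and the Bedrock Converse API types, and the helper name converseOnce is hypothetical:

import {
  BedrockRuntimeClient,
  ConverseCommand,
  type InferenceConfiguration,
} from '@aws-sdk/client-bedrock-runtime';

// Sketch: pin the sampling parameters alongside maxTokens so responses stay
// comparable even if the model's server-side defaults change between versions.
const inferenceConfig: InferenceConfiguration = {
  maxTokens: 1024,
  temperature: 0.7, // value from the removed this.temperature field
  topP: 0.9,        // value from the removed this.topP field
};

async function converseOnce(client: BedrockRuntimeClient, prompt: string): Promise<string | undefined> {
  const command = new ConverseCommand({
    modelId: 'global.anthropic.claude-sonnet-4-5-20250929-v1:0',
    messages: [{ role: 'user', content: [{ text: prompt }] }],
    inferenceConfig,
  });
  const response = await client.send(command);
  return response.output?.message?.content?.[0]?.text;
}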

     });
     try {
       const response = await this.bedrockRuntimeClient.send(command);