Amazon Bedrock examples using the SDK for Swift - AWS SDK Code Examples

There are more AWS SDK examples available in the AWS Doc SDK Examples GitHub repository.


Amazon Bedrock examples using the SDK for Swift

The following code examples show you how to perform actions and implement common scenarios by using the AWS SDK for Swift with Amazon Bedrock Runtime.

Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.

Amazon Nova

The following code example shows how to send a text message to Amazon Nova using Bedrock's Converse API.

SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Send a text message to Amazon Nova using Bedrock's Converse API.

// An example demonstrating how to use the Conversation API to send
// a text message to Amazon Nova.

import AWSBedrockRuntime

func converse(_ textPrompt: String) async throws -> String {
    // Create a Bedrock Runtime client in the AWS Region you want to use.
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    // Set the model ID.
    let modelId = "amazon.nova-micro-v1:0"

    // Start a conversation with the user message.
    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )

    // Optionally use inference parameters
    let inferenceConfig = BedrockRuntimeClientTypes.InferenceConfiguration(
        maxTokens: 512,
        stopSequences: ["END"],
        temperature: 0.5,
        topp: 0.9
    )

    // Create the ConverseInput to send to the model
    let input = ConverseInput(
        inferenceConfig: inferenceConfig,
        messages: [message],
        modelId: modelId)

    // Send the ConverseInput to the model
    let response = try await client.converse(input: input)

    // Extract and return the response text.
    if case let .message(msg) = response.output {
        if case let .text(textResponse) = msg.content![0] {
            return textResponse
        } else {
            return "No text response found in message content"
        }
    } else {
        return "No message found in converse output"
    }
}
  • For API details, see Converse in the AWS SDK for Swift API Reference.
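
To try the function above from a command-line target, you can call it from an async entry point. The following is a minimal sketch, assuming the converse(_:) function is in the same executable target; the struct name, prompt text, and error handling are illustrative and not part of the published example.

// A hypothetical entry point that calls the converse(_:) function defined above.
@main
struct ConverseDemo {
    static func main() async {
        do {
            // Send a simple prompt and print the model's reply.
            let reply = try await converse("Briefly describe Amazon Bedrock.")
            print(reply)
        } catch {
            // Surface configuration, credential, or service errors from the call.
            print("Error calling Converse: \(error)")
        }
    }
}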

The following code example shows how to send a text message to Amazon Nova using Bedrock's Converse API and process the response stream in real-time.

SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Send a text message to Amazon Nova using Bedrock's Converse API and process the response stream in real-time.

// An example demonstrating how to use the Conversation API to send a text message
// to Amazon Nova and print the response stream

import AWSBedrockRuntime

func printConverseStream(_ textPrompt: String) async throws {
    // Create a Bedrock Runtime client in the AWS Region you want to use.
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    // Set the model ID.
    let modelId = "amazon.nova-lite-v1:0"

    // Start a conversation with the user message.
    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )

    // Optionally use inference parameters.
    let inferenceConfig = BedrockRuntimeClientTypes.InferenceConfiguration(
        maxTokens: 512,
        stopSequences: ["END"],
        temperature: 0.5,
        topp: 0.9
    )

    // Create the ConverseStreamInput to send to the model.
    let input = ConverseStreamInput(
        inferenceConfig: inferenceConfig,
        messages: [message],
        modelId: modelId)

    // Send the ConverseStreamInput to the model.
    let response = try await client.converseStream(input: input)

    // Extract the streaming response.
    guard let stream = response.stream else {
        print("No stream available")
        return
    }

    // Extract and print the streamed response text in real-time.
    for try await event in stream {
        switch event {
        case .messagestart(_):
            print("\nNova Lite:")
        case .contentblockdelta(let deltaEvent):
            if case .text(let text) = deltaEvent.delta {
                print(text, terminator: "")
            }
        default:
            break
        }
    }
}
  • For API details, see ConverseStream in the AWS SDK for Swift API Reference.
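
As a variation on the example above, you can collect the streamed text into a single String instead of printing it. The following is a minimal sketch that reuses the same SDK calls; the helper name, the choice to omit the optional inference configuration, and the empty-string fallback are assumptions for illustration.

import AWSBedrockRuntime

// A hypothetical helper that accumulates the streamed text deltas into one String.
func collectConverseStream(_ textPrompt: String) async throws -> String {
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )

    // Inference parameters are omitted here, so the service defaults apply.
    let input = ConverseStreamInput(
        messages: [message],
        modelId: "amazon.nova-lite-v1:0")

    let response = try await client.converseStream(input: input)
    guard let stream = response.stream else { return "" }

    // Append each text delta to the result as it arrives.
    var result = ""
    for try await event in stream {
        if case .contentblockdelta(let deltaEvent) = event,
           case .text(let text) = deltaEvent.delta {
            result += text
        }
    }
    return result
}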

Anthropic Claude

The following code example shows how to send a text message to Anthropic Claude using Bedrock's Converse API.

SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Send a text message to Anthropic Claude using Bedrock's Converse API.

// An example demonstrating how to use the Conversation API to send
// a text message to Anthropic Claude.

import AWSBedrockRuntime

func converse(_ textPrompt: String) async throws -> String {
    // Create a Bedrock Runtime client in the AWS Region you want to use.
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    // Set the model ID.
    let modelId = "anthropic.claude-3-haiku-20240307-v1:0"

    // Start a conversation with the user message.
    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )

    // Optionally use inference parameters
    let inferenceConfig = BedrockRuntimeClientTypes.InferenceConfiguration(
        maxTokens: 512,
        stopSequences: ["END"],
        temperature: 0.5,
        topp: 0.9
    )

    // Create the ConverseInput to send to the model
    let input = ConverseInput(
        inferenceConfig: inferenceConfig,
        messages: [message],
        modelId: modelId)

    // Send the ConverseInput to the model
    let response = try await client.converse(input: input)

    // Extract and return the response text.
    if case let .message(msg) = response.output {
        if case let .text(textResponse) = msg.content![0] {
            return textResponse
        } else {
            return "No text response found in message content"
        }
    } else {
        return "No message found in converse output"
    }
}
  • For API details, see Converse in the AWS SDK for Swift API Reference.
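
The Converse API is stateless, so a multi-turn conversation works by resending the full message history on each call. The following is a minimal sketch built on the same types as the example above; the helper name, the prompts, and the omission of the optional inference configuration are assumptions for illustration.

import AWSBedrockRuntime

// A hypothetical two-turn conversation that keeps the running message history.
func converseMultiTurn() async throws {
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)
    let modelId = "anthropic.claude-3-haiku-20240307-v1:0"

    // Keep the full conversation history across turns.
    var messages: [BedrockRuntimeClientTypes.Message] = [
        .init(content: [.text("Name one use case for Amazon Bedrock.")], role: .user)
    ]

    // First turn.
    let first = try await client.converse(
        input: ConverseInput(messages: messages, modelId: modelId)
    )
    if case let .message(reply) = first.output {
        // Append the assistant's reply so the model sees it on the next turn.
        messages.append(reply)
    }

    // Second turn: ask a follow-up question that depends on the first answer.
    messages.append(
        .init(content: [.text("Give a second, different use case.")], role: .user)
    )
    let second = try await client.converse(
        input: ConverseInput(messages: messages, modelId: modelId)
    )
    if case let .message(msg) = second.output,
       case let .text(text) = msg.content![0] {
        print(text)
    }
}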

The following code example shows how to send a text message to Anthropic Claude using Bedrock's Converse API and process the response stream in real-time.

SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Send a text message to Anthropic Claude using Bedrock's Converse API and process the response stream in real-time.

// An example demonstrating how to use the Conversation API to send a text message
// to Anthropic Claude and print the response stream

import AWSBedrockRuntime

func printConverseStream(_ textPrompt: String) async throws {
    // Create a Bedrock Runtime client in the AWS Region you want to use.
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    // Set the model ID.
    let modelId = "anthropic.claude-3-haiku-20240307-v1:0"

    // Start a conversation with the user message.
    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )

    // Optionally use inference parameters.
    let inferenceConfig = BedrockRuntimeClientTypes.InferenceConfiguration(
        maxTokens: 512,
        stopSequences: ["END"],
        temperature: 0.5,
        topp: 0.9
    )

    // Create the ConverseStreamInput to send to the model.
    let input = ConverseStreamInput(
        inferenceConfig: inferenceConfig,
        messages: [message],
        modelId: modelId)

    // Send the ConverseStreamInput to the model.
    let response = try await client.converseStream(input: input)

    // Extract the streaming response.
    guard let stream = response.stream else {
        print("No stream available")
        return
    }

    // Extract and print the streamed response text in real-time.
    for try await event in stream {
        switch event {
        case .messagestart(_):
            print("\nAnthropic Claude:")
        case .contentblockdelta(let deltaEvent):
            if case .text(let text) = deltaEvent.delta {
                print(text, terminator: "")
            }
        default:
            break
        }
    }
}
  • For API details, see ConverseStream in the AWS SDK for Swift API Reference.
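
If you also want to know why the stream ended, you can handle the message-stop event in the same switch. The following is a minimal sketch; the .messagestop case and its stopReason property are assumptions about the SDK's generated names for the stream's MessageStopEvent, so verify them against the ConverseStream API reference linked above.

import AWSBedrockRuntime

// A hypothetical variation of printConverseStream(_:) that also reports why
// the model stopped generating.
func printConverseStreamWithStopReason(_ textPrompt: String) async throws {
    let config = try await BedrockRuntimeClient.BedrockRuntimeClientConfiguration(
        region: "us-east-1"
    )
    let client = BedrockRuntimeClient(config: config)

    let message = BedrockRuntimeClientTypes.Message(
        content: [.text(textPrompt)],
        role: .user
    )
    let input = ConverseStreamInput(
        messages: [message],
        modelId: "anthropic.claude-3-haiku-20240307-v1:0")

    let response = try await client.converseStream(input: input)
    guard let stream = response.stream else {
        print("No stream available")
        return
    }

    for try await event in stream {
        switch event {
        case .messagestart(_):
            print("\nAnthropic Claude:")
        case .contentblockdelta(let deltaEvent):
            if case .text(let text) = deltaEvent.delta {
                print(text, terminator: "")
            }
        case .messagestop(let stopEvent):
            // Report why the stream ended (for example, end of turn or the
            // maximum token count was reached). Assumed property name: stopReason.
            print("\n[Stop reason: \(String(describing: stopEvent.stopReason))]")
        default:
            break
        }
    }
}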