# Bedrock

## Selecting the Connection Type

Once you have [selected your connection type](https://docs.probe.splx.ai/ai-red-teaming/probe/add-target/integration-setup#selecting-a-connection-type), the next step will display a configuration tab prompting you to enter the required connection details.

## Integration Setup

<figure><img src="https://1029475228-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fi12bk7lo75SODuwcRCQp%2Fuploads%2FJfmj45KNz83hSBsT8E7j%2F2_bedrock.png?alt=media&#x26;token=6b35073f-0dbe-48d8-8d36-5762c903f1d1" alt=""><figcaption><p>Figure 1: Bedrock Integration Example</p></figcaption></figure>

* **System Prompt** - Your application’s system prompt. It sets the initial instructions or context for the AI model and defines the behavior, tone, and specific guidelines the AI should follow while interacting. For best practices, refer to the [OpenAI documentation on prompt engineering](https://platform.openai.com/docs/guides/prompt-engineering).
* **AWS Access Key ID & AWS Secret Access Key**
  * These can be created and accessed via the AWS Management Console.
  * Navigate to the [IAM section](https://console.aws.amazon.com/iam/) and, under the Users tab, select the desired user.
  * In the Security credentials tab, you can create a new key or view existing Access Key IDs.
  * Note that AWS Secret Access Keys are only shown at creation time, so store them securely.
  * For a step-by-step guide, see the [Updating IAM user access keys (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/id-credentials-access-keys-update.html#rotating_access_keys_console) section of the official AWS documentation.
* **AWS Region** - The AWS Region where your resource is located. It can be found in the top-right corner of the [AWS Management Console](https://console.aws.amazon.com/).
* **Model** - Specify one of the models supported by Amazon Bedrock by entering its Model ID, which can be found on the [supported models page](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html) within the Amazon Bedrock documentation or console.
* **Extra LLM Config** - Any request-body LLM configuration parameters the UI does not expose explicitly, such as temperature, top\_p, or max\_tokens. Leave the field blank to accept provider defaults. The optional fields are listed in the [OpenAI cookbook](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models).

{% hint style="info" %}
For the Extra LLM Config, press the **Add +** button after filling in the Key and Value textboxes.
{% endhint %}
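Taken together, the fields above map onto a request to Amazon Bedrock's Converse API. The sketch below assembles such a request body in plain Python; the model ID, region, prompt text, and parameter values are illustrative assumptions, not values required by Probe.

```python
# Sketch: how the integration fields map onto a Bedrock Converse API request.
# Model ID, region, and parameter values below are placeholder assumptions.

def build_converse_request(model_id, system_prompt, user_message, extra_config=None):
    """Assemble the keyword arguments for a Bedrock Converse API call."""
    request = {
        "modelId": model_id,                      # the Model field
        "system": [{"text": system_prompt}],      # the System Prompt field
        "messages": [
            {"role": "user", "content": [{"text": user_message}]}
        ],
    }
    if extra_config:
        # Extra LLM Config keys such as temperature, topP, or maxTokens
        # belong in the Converse API's inferenceConfig object.
        request["inferenceConfig"] = dict(extra_config)
    return request

request = build_converse_request(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",  # example Model ID
    system_prompt="You are a helpful assistant.",
    user_message="Hello!",
    extra_config={"temperature": 0.7, "maxTokens": 512},
)

# With boto3 installed and the access key, secret key, and region configured,
# the request could be sent like this (not executed here):
#
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.converse(**request)
#   print(response["output"]["message"]["content"][0]["text"])
```

Probe performs the equivalent call on your behalf, which is why the access key pair, region, and model ID must all refer to the same AWS account and a model you have been granted access to.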

For more information, you can explore the official [Amazon Bedrock](https://docs.aws.amazon.com/bedrock/) documentation.
