Integration Setup
The first step in adding your target to the Probe is setting up the integration between them. Start by entering your target's name in the Probe and selecting the integration that matches your use case. With the Probe, you can observe how your application performs across various layers, from the LLM itself to the platform level, where test runs simulate real user interactions.
The following integration methods are currently supported:
API: REST API integration between your GenAI application and the Probe (see the sketch after this list).
Platform: Probe’s test runs are executed on chatbots that are accessible through external platforms (e.g., Slack, WhatsApp). Probe uses the platform’s APIs to interact with the chatbots.
LLM: Tests are executed directly on the Large Language Model.
LLM Development Platform: The Probe integrates with the APIs provided by LLM development platforms.
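For the API method, your application needs to expose an HTTP endpoint the Probe can call. Below is a minimal sketch of such an endpoint; the route name, payload shape, and authorization header are assumptions for illustration, not the Probe's actual contract, so adapt them to your own application.

```python
# Hypothetical sketch of a REST endpoint a testing tool like the Probe could call.
# The /chat route, JSON fields, and bearer token are placeholders (assumptions).
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/chat", methods=["POST"])
def chat():
    # Optionally reject requests that lack the API key you configure in the Probe.
    if request.headers.get("Authorization") != "Bearer <your-api-key>":
        return jsonify({"error": "unauthorized"}), 401

    user_message = request.get_json(force=True).get("message", "")
    # Replace this stub with a call into your GenAI application.
    reply = f"Echo: {user_message}"
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run(port=8000)
```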
Once you’ve selected the appropriate integration, a configuration tab will appear in the next step, prompting you to enter the required integration details. These inputs are specific to the type of integration you’ve chosen, such as API keys, phone numbers, or endpoint URLs.
For details on the integration methods and descriptions of the input fields, find your preferred integration in the navigation bar under Integrations, or on the Integrations page.
Once all the required information is entered, click the “Continue” button. A connection test between the Probe and your target will run automatically in the background. The result of this test will be displayed in a dialog. You can proceed with the remaining configuration steps once the connection test is successful.
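If the connection test fails, it can help to verify your target manually with the same details you entered in the Probe. The snippet below is a minimal sketch of such a pre-flight check; the URL, header, and JSON body are placeholders for your own application's contract.

```python
# Hypothetical pre-flight check: send one request to your endpoint using the
# same URL and API key you plan to enter in the Probe (all values are placeholders).
import requests

resp = requests.post(
    "https://your-app.example.com/chat",
    headers={"Authorization": "Bearer <your-api-key>"},
    json={"message": "connection test"},
    timeout=10,
)
print(resp.status_code, resp.text)
resp.raise_for_status()  # a non-2xx status here usually explains a failed connection test
```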