Integrations Overview
Integrations connect your OmniBots workspace to external services that power bot capabilities. Without integrations, your bots cannot generate AI responses, access external documents, hand off to live agents, or send push notifications.
Integration Types
OmniBots supports four categories of integrations.
| Type | Purpose | Providers |
|---|---|---|
| AI Models | Power LLM responses, RAG queries, and intent classification | Anthropic (Claude), OpenAI (GPT), Google Vertex AI (Gemini) |
| Storage | Connect external document sources to Knowledge Bases | SharePoint, Google Drive, S3, Dropbox, OneDrive, Box, Azure Blob |
| CCaaS | Enable live agent handoff from bot conversations | Genesys Cloud, Amazon Connect, 8x8, Google CCAI |
| Push Notifications | Send proactive messages to users | FCM, APNs, Web Push |
How Integrations Work
OmniBots uses a two-level integration model.
Platform-Level Integrations
Platform integrations are pre-configured by the OmniBots platform team and made available to all tenants. These define which providers and connection types are supported. You do not create platform integrations -- they are provided for you.
Tenant-Level Integrations
Tenant integrations are the connections you configure in your workspace. You supply credentials (API keys, service accounts, OAuth tokens) and settings specific to your accounts with each provider. Each tenant integration is linked to a platform integration type.
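The two-level model can be sketched in a few lines. This is purely illustrative: the class and field names below are assumptions for explanation, not the OmniBots API.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the two-level model; names are assumptions,
# not part of any real OmniBots API.

@dataclass(frozen=True)
class PlatformIntegration:
    """Provider and connection type defined by the platform team."""
    type: str       # e.g. "ai_model", "storage", "ccaas", "push"
    provider: str   # e.g. "anthropic", "sharepoint", "genesys"

@dataclass
class TenantIntegration:
    """A tenant's configured connection, linked to a platform type."""
    name: str
    platform: PlatformIntegration
    credentials: dict = field(default_factory=dict)  # API keys, tokens, etc.
    settings: dict = field(default_factory=dict)

# The tenant supplies its own credentials for a platform-defined provider.
claude = TenantIntegration(
    name="claude-prod",
    platform=PlatformIntegration(type="ai_model", provider="anthropic"),
    credentials={"api_key": "sk-..."},
)
```

The platform level fixes *what* can be connected; the tenant level holds *your* credentials and settings for it.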
Integration Assignments
When a platform integration is assigned to your tenant, it appears in your integrations list and becomes available for use in bot flows. You supply credentials to activate it.
Integrations overview page showing provider cards organized by type (AI Models, Storage, CCaaS, Push Notifications) with status indicators and connection test results
TIP
You can create multiple integrations of the same type. For example, you might have two Anthropic integrations -- one for development and one for production -- each with different API keys and rate limits.
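One way to picture the dev/prod split from the tip above is a lookup keyed by provider and environment. This is a hypothetical sketch, not how OmniBots stores integrations internally.

```python
# Hypothetical sketch: two Anthropic integrations, one per environment,
# each with its own API key and rate limit. Field names are assumptions.

integrations = {
    ("anthropic", "dev"):  {"api_key": "sk-dev-...",  "rate_limit_rpm": 60},
    ("anthropic", "prod"): {"api_key": "sk-prod-...", "rate_limit_rpm": 600},
}

def pick_integration(provider: str, env: str) -> dict:
    """Return the integration config for a provider in a given environment."""
    try:
        return integrations[(provider, env)]
    except KeyError:
        raise LookupError(f"No {provider} integration configured for {env}")
```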
Adding an Integration
1. Go to Settings > Integrations.
2. Click Add Integration.
3. Select the integration type (AI Model, Storage, CCaaS, or Push).
4. Choose a provider from the list.
5. Fill in the required credentials and configuration fields.
6. Click Test Connection to verify the credentials work.
7. Click Save.
Testing Connections
Every integration supports a Test Connection action. This sends a lightweight request to the external service to verify that your credentials are valid and the service is reachable.
| Test Result | Meaning |
|---|---|
| Success | Credentials are valid and the service responded correctly |
| Auth Failed | Credentials are invalid or expired -- check API key or service account |
| Timeout | The external service did not respond -- check network or service status |
| Error | An unexpected error occurred -- review the error details shown |
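Conceptually, a connection test runs a lightweight probe and maps its outcome to one of these statuses. The sketch below assumes an HTTP-style probe; the function name and the use of a probe callable are illustrative, not OmniBots internals.

```python
import socket
import urllib.error

# Hedged sketch: classify the outcome of a lightweight probe request
# into the Test Connection statuses from the table above. "probe" is
# any zero-argument callable that raises on failure.

def classify_test_result(probe) -> str:
    try:
        probe()
        return "Success"
    except urllib.error.HTTPError as e:
        # 401/403 mean the service rejected the credentials.
        return "Auth Failed" if e.code in (401, 403) else "Error"
    except (socket.timeout, TimeoutError):
        return "Timeout"
    except Exception:
        return "Error"
```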
WARNING
A successful connection test does not guarantee the integration will work in all scenarios. For example, an AI model integration may pass the test but fail at runtime if rate limits are exceeded or the selected model is not available in your account.
Using Integrations in Flows
Once configured, integrations become available in the flow builder. Each node that requires an integration (LLM Response, RAG Query, Handoff, etc.) includes an Integration selector where you choose which connection to use.
Different nodes in the same flow can use different integrations. For example, one LLM Response node might use Anthropic Claude while another uses OpenAI GPT -- each referencing a separate integration you configured.
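The node-to-integration relationship is a reference resolved at runtime. As a rough sketch (the data shapes and names here are assumptions for illustration only):

```python
# Illustrative sketch: each node in a flow references a configured
# integration by name, and the runtime resolves that reference when
# the node executes. Structure is an assumption, not OmniBots code.

configured = {
    "claude-prod":  {"provider": "anthropic"},
    "gpt-prod":     {"provider": "openai"},
    "genesys-main": {"provider": "genesys"},
}

flow = [
    {"node": "llm_response", "integration": "claude-prod"},
    {"node": "llm_response", "integration": "gpt-prod"},
    {"node": "handoff",      "integration": "genesys-main"},
]

def resolve(node: dict) -> dict:
    """Look up the integration a node references, failing loudly if missing."""
    name = node["integration"]
    if name not in configured:
        raise LookupError(
            f"Node {node['node']!r} references missing integration {name!r}"
        )
    return configured[name]
```

Because each node holds its own reference, two LLM Response nodes in the same flow can point at entirely different providers.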
Managing Integrations
From the Settings > Integrations list, you can:
- Edit an integration to update credentials or settings
- Test the connection at any time to verify it still works
- Disable an integration to temporarily stop it from being used
- Delete an integration that is no longer needed
WARNING
Deleting an integration that is actively used by bot flows will cause those nodes to fail at runtime. Disable the integration first and update affected flows before deleting.
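The warning above amounts to a reference check before deletion: find every flow that still points at the integration, update those flows, then delete. A minimal sketch of that check, with all names assumed for illustration:

```python
# Hypothetical sketch of the safe-deletion check described in the
# warning: list flows that still reference an integration before
# allowing a delete. Flow and integration names are made up.

flows = {
    "support-bot": ["claude-prod", "genesys-main"],
    "faq-bot":     ["claude-prod"],
}

def flows_using(integration: str) -> list[str]:
    """Return the flows that reference the given integration."""
    return [name for name, refs in flows.items() if integration in refs]

def safe_to_delete(integration: str) -> bool:
    """An integration is safe to delete only when no flow references it."""
    return not flows_using(integration)
```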
Next Steps
- Configure AI Models to power bot responses
- Connect Storage Providers for Knowledge Base document sources
- Set up CCaaS Platforms for live agent handoff
- Enable Push Notifications for proactive messaging