Configure a warehouse-hosted model as AI provider
When you configure AI features for your organization, you can use the AI models available in your connected data platform as your AI provider to power some Sigma AI features.
This feature isn't supported by all data platform connections. To check if your connection supports it, see Supported data platforms and feature compatibility.
The use of AI features is subject to the following disclaimer.
User requirements
You must be assigned the Admin account type.
BigQuery requirements
If you use BigQuery as your AI provider, Sigma uses Vertex AI to access Google-provided models.
Supported models: Sigma currently supports Gemini 2.5 Flash. You cannot choose which model to use.
To use BigQuery as your AI provider, the following requirements apply:
- You must set up a connection to BigQuery in Sigma. See Connect to BigQuery.
- Billing must be enabled for your Google Cloud project. See Enable, disable, or change billing for a project in the Google Cloud documentation.
- Your BigQuery service account must be assigned the following IAM roles:
  - Vertex AI User, to grant access to use Vertex AI
  - BigQuery Data Viewer
  - BigQuery Job User
  - (Optional) BigQuery Data Editor, to enable write access to your connection. Required if you configure a usage dashboard for Sigma Assistant.
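The role assignments above can be granted from the command line with the `gcloud` CLI. This is a minimal sketch: the project ID and service account email are hypothetical placeholders, and you should substitute the values for your own Google Cloud project and the service account used by your Sigma connection.

```shell
# Hypothetical placeholders; replace with your project ID and the
# email of the BigQuery service account used by your Sigma connection.
PROJECT_ID="my-project"
SA_EMAIL="sigma-connection@my-project.iam.gserviceaccount.com"

# Required roles: Vertex AI User, BigQuery Data Viewer, BigQuery Job User.
for ROLE in roles/aiplatform.user roles/bigquery.dataViewer roles/bigquery.jobUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="$ROLE"
done

# Optional: write access, required if you configure a usage dashboard
# for Sigma Assistant.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.dataEditor"
```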
- (Optional) Identify a region to access the Vertex AI API from. Sigma uses the Vertex AI API to access the model. Google provides several different endpoints to access models located in different regions. For a full list of supported endpoints, see the Service endpoint section in the Vertex AI API page of the Google Cloud documentation. To reduce latency, Sigma recommends identifying an endpoint that is located near the region where your Sigma organization is hosted. By default, Sigma uses the us-central1 endpoint.
Performance of the Vertex AI service used by Sigma depends on the current traffic and usage of the service. You might experience higher latency during busy periods. For more details, see Dynamic shared quota (DSQ) in the Google Cloud documentation.
Databricks requirements
If you use Databricks as your AI provider, Sigma uses models made available as Foundation Models through Mosaic AI Model Serving. For more details, see Deploy models using Mosaic AI Model Serving in the Databricks documentation.
To use Databricks as your AI provider, the following requirements apply:
- You must set up a connection to Databricks in Sigma. See Connect to Databricks.
- The region of your Databricks workspace must support the Foundation Model API. See Model serving features availability on the Features with limited regional availability page in the Databricks documentation.
- Supported models: The region of your Databricks workspace must support all of the following models:
  - The GTE v1.5 model family
  - databricks-claude-sonnet-4-5

  You cannot choose which model to use. To determine if your region includes the required models, see Supported foundation models on Mosaic AI Model Serving.
- Your Databricks account must have access to the ai_query function and meet the relevant requirements, such as a running Databricks Serverless or Pro SQL warehouse. For more details, see the Requirements section of the ai_query function page in the Databricks documentation.
- Users must have access to a workspace with the supported models.
Databricks models might have a rate limit set. If you see the error PERMISSION DENIED: The endpoint is temporarily disabled due to a Databricks-set rate limit of 0., contact your Databricks admin about the rate limits set for your workspace and the databricks-claude-sonnet-4-5 model. See Foundation Model APIs limits and quotas in the Databricks documentation.
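To confirm that your warehouse can reach the required model endpoint before configuring Sigma, you can issue a quick ai_query call from a Serverless or Pro SQL warehouse. This is a minimal sketch; the prompt text is arbitrary and only serves to exercise the endpoint.

```sql
-- Run from a Databricks Serverless or Pro SQL warehouse.
-- Confirms that ai_query can reach the databricks-claude-sonnet-4-5 endpoint.
SELECT ai_query(
  'databricks-claude-sonnet-4-5',
  'Reply with the single word OK.'
) AS model_check;
```

If this query returns a rate-limit or permission error, resolve it with your Databricks admin before selecting the connection in Sigma.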
Snowflake requirements
If you use Snowflake as your AI provider, Sigma uses models made available through the Cortex REST API in Snowflake. For more details, see Cortex REST API in the Snowflake documentation.
To use Snowflake as your AI provider, the following requirements apply:
- You must set up a connection to Snowflake in Sigma. The connection must be authenticated with OAuth or key pair authentication. See Connect to Snowflake.
- The default role associated with the connection must be granted the SNOWFLAKE.CORTEX_USER database role. By default, this role is granted to the PUBLIC role. If you need to grant the database role, see Setting up authorization in the Snowflake documentation. More specifically:
  - For key pair authentication, the default role of the service account user must be granted the SNOWFLAKE.CORTEX_USER database role.
  - For key pair authentication with dynamically assigned roles, all roles must be granted the SNOWFLAKE.CORTEX_USER database role.
  - For OAuth authentication, the default role of all users with access to the connection must be granted the SNOWFLAKE.CORTEX_USER database role. If you configure a service account, the role of the service account user must also be granted the SNOWFLAKE.CORTEX_USER database role.
The Cortex REST API is not available in Snowflake trial accounts. If you set up an AI provider that uses a Snowflake trial account, you might see a "400 status code (no body)" error. If you encounter this error, contact Snowflake Support.
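The grants described above can be issued in Snowflake as follows. This is a minimal sketch: sigma_role and sigma_service_user are hypothetical names for your connection's default role and service account user, so substitute your own.

```sql
-- Grant the Cortex database role to the default role used by the
-- Sigma connection. sigma_role is a hypothetical role name.
GRANT DATABASE ROLE SNOWFLAKE.CORTEX_USER TO ROLE sigma_role;

-- Verify the default role of a user (run with sufficient privileges).
-- sigma_service_user is a hypothetical user name.
DESCRIBE USER sigma_service_user;
```

Repeat the GRANT for every role that needs access, per the authentication-specific requirements listed above.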
Supported models
The supported models used by Sigma depend on where your Snowflake account is hosted. You cannot choose which model to use. The region where your Snowflake account is hosted must support the following models:
- The snowflake-arctic-embed-l-v2.0 EMBED function model:
  - If your region supports the embed model through the Cortex REST API, the API is used. See the section Model availability in the Vector embedding API page in the Snowflake documentation.
  - If your region does not support accessing the embed model using the Cortex REST API, the ai_embed() function is used to call the model through a SQL query. For details about regions that support this function, see Region availability in the AI_EMBED page in the Snowflake documentation. If your Snowflake account is hosted in a different region, enable the CORTEX_ENABLED_CROSS_REGION account parameter. For more details, see CORTEX_ENABLED_CROSS_REGION in the Account parameters page in the Snowflake documentation.
- A large language model (LLM) used for reasoning, called through the Cortex REST API. Specifically, Sigma requires the following:
  - If your Snowflake account is hosted in AWS, your AWS region must also support claude-sonnet-4-5.
  - If your Snowflake account is hosted in Azure, your Azure region must also support openai-gpt-5.1.
  - If your Snowflake account is hosted in a region that does not support either of those models, you must enable the CORTEX_ENABLED_CROSS_REGION account parameter to allow access to a region that supports the relevant model. If you enable this parameter, the openai-gpt-5.1 model is used. For more details about cross-region inference, see Cross-region inference and CORTEX_ENABLED_CROSS_REGION in the Account parameters page in the Snowflake documentation.

For details, see the section Model availability on the Cortex REST API page in the Snowflake documentation.
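If your region lacks one of the required models, enabling cross-region inference might look like the following sketch. It assumes you run as ACCOUNTADMIN; 'ANY_REGION' is one documented value for the parameter, and the Snowflake Account parameters page lists region-list alternatives if you want to restrict where inference can run.

```sql
-- Run as ACCOUNTADMIN. Allows Cortex to route inference requests
-- to a region that supports the required model.
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';

-- Check the current setting.
SHOW PARAMETERS LIKE 'CORTEX_ENABLED_CROSS_REGION' IN ACCOUNT;
```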
Configure a warehouse-hosted model as your AI provider
After setting up your connection and confirming the availability of supported models, set up your AI provider to use a warehouse-hosted model:
- Go to Administration > AI settings:
  - In the Sigma header, click your user avatar to open the user menu.
  - Select Administration to open the Administration portal.
  - In the side panel, select AI settings.
- In the AI provider section, for Provider hosting, select the Data warehouse hosted model option.
  If you previously set up an external model, click Edit, then click Remove to remove the selected AI provider and credentials. Any highlighted data sources are also removed.
- In the Connection dropdown menu, select the connection.
- (Optional, BigQuery connections only) In the Region dropdown menu, select the region to access the Vertex AI API from. By default, Sigma uses us-central1.
- Click Save.
After adding the AI provider, highlight data sources to make available to Sigma Assistant by default.
Manage a warehouse-hosted AI provider
You can edit the AI provider integration at any time. For example, to update your credentials:
- Go to Administration > AI settings.
- In the AI provider section, click Edit.
- Make the desired changes. To change the AI provider, for example, first click Remove.
- Click Save.
Remove an AI provider
To disable AI functionality within Sigma, you can remove the AI provider:
- Go to Administration > AI settings.
- In the AI provider section, click Edit.
- Click Remove.
- When prompted to confirm, click Remove.
- Click Save.
The selected AI provider, credentials, and any highlighted data sources are removed.
After the integration is successfully removed, the AI provider section displays a Save button, and AI functionality is unavailable for your organization.
