Configure a warehouse-hosted model as AI provider

When you configure AI features for your organization, you can use AI models hosted in your connected data platform as your AI provider to power select Sigma AI features.

📘

This feature isn't supported by all data platform connections. To check if your connection supports it, see Supported data platforms and feature compatibility.

User requirements

You must be assigned the Admin account type.

BigQuery requirements

If you use BigQuery as your AI provider, Sigma uses Vertex AI to access Google-provided models.

Supported models: Sigma currently supports Gemini 2.5 Flash.

To use BigQuery as your AI provider, the following requirements apply:

📘

Performance of the Vertex AI service used by Sigma depends on the current traffic and usage of the service. You might experience higher latency during busy periods. For more details, see Dynamic shared quota (DSQ) in the Google Cloud documentation.

Databricks requirements

If you use Databricks as your AI provider, Sigma uses models made available as Foundation Models through Mosaic AI Model Serving. For more details, see Deploy models using Mosaic AI Model Serving in the Databricks documentation.

To use Databricks as your AI provider, the following requirements apply:

  • You must set up a connection to Databricks in Sigma. See Connect to Databricks.

  • The region of your Databricks workspace must support the Foundation Model API. See Model serving features availability on the Features with limited regional availability page in the Databricks documentation.

  • Supported models: The region of your Databricks workspace must support all of the following models:

    • The BGE v1.5 model family.
    • databricks-claude-sonnet-4-5

    To determine if your region includes the required models, see Supported foundation models on Mosaic AI Model Serving.

  • Your Databricks account must have access to the ai_query function and meet the relevant requirements, such as a running Databricks Serverless or Pro SQL warehouse. For more details, see the Requirements section of the ai_query function page in the Databricks documentation.
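To confirm that your workspace meets these requirements, you can run a quick test query against the ai_query function from a running Serverless or Pro SQL warehouse. This is a minimal sketch; the model name is one of the required models listed above, and the prompt text is arbitrary:

```sql
-- Run from a Databricks Serverless or Pro SQL warehouse.
-- Verifies that this workspace can call ai_query against the
-- databricks-claude-sonnet-4-5 Foundation Model endpoint.
SELECT ai_query(
  'databricks-claude-sonnet-4-5',
  'Reply with the single word OK.'
) AS model_check;
```

If the query returns a response instead of a permissions or endpoint error, the account has access to the ai_query function and the required model.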

Snowflake requirements

If you use Snowflake as your AI provider, Sigma uses models made available through the Cortex REST API in Snowflake. For more details, see Cortex REST API in the Snowflake documentation.

To use Snowflake as your AI provider, the following requirements apply:

  • You must set up a connection to Snowflake in Sigma. The connection must be authenticated with OAuth or key pair authentication. See Connect to Snowflake.

  • The default role associated with the connection must be granted the SNOWFLAKE.CORTEX_USER database role. By default, this role is granted to the PUBLIC role. If you need to grant the database role, see Setting up authorization in the Snowflake documentation. More specifically:

    • For key pair authentication, the default role of the service account user must be granted the SNOWFLAKE.CORTEX_USER database role.
    • For key pair authentication with dynamically assigned user roles, every role that can be assigned must be granted the SNOWFLAKE.CORTEX_USER database role.
    • For OAuth authentication, the default role of all users with access to the connection must be granted the SNOWFLAKE.CORTEX_USER database role. If you configure a service account, the role of the service account user must also be granted the SNOWFLAKE.CORTEX_USER database role.
  • Supported models: The models Sigma uses depend on where your Snowflake account is hosted. The region hosting your account must support the following models:

    • The snowflake-arctic-embed-l-v2.0 EMBED function model. For details, see Model availability under the EMBED function section of the Cortex REST API page in the Snowflake documentation.

    • Relevant models used for the COMPLETE function. For details, see Model availability under the COMPLETE function section of the Cortex REST API page in the Snowflake documentation. Specifically, Sigma requires the following:

      • If your Snowflake account is hosted in AWS, your AWS region must also support: claude-4-sonnet.
      • If your Snowflake account is hosted in Azure, your Azure region must also support: openai-gpt-4.1.
      • If the region where your Snowflake account is hosted does not support the COMPLETE function, you must enable the CORTEX_ENABLED_CROSS_REGION account parameter to allow access from the region where your account is hosted to another region that supports the relevant models. For more details, see Cross-region inference in the Snowflake documentation.
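The Snowflake-side setup described above can be sketched in a few SQL statements. This is a hedged example: SIGMA_ROLE is a placeholder for the default role your Sigma connection uses, the CORTEX_ENABLED_CROSS_REGION value should match your compliance requirements, and the COMPLETE call is only a smoke test:

```sql
-- Grant the Cortex user database role to the connection's default role
-- (replace SIGMA_ROLE with the role your Sigma connection actually uses).
GRANT DATABASE ROLE SNOWFLAKE.CORTEX_USER TO ROLE SIGMA_ROLE;

-- Only if your region does not support the required models: allow
-- cross-region inference (run as ACCOUNTADMIN; 'ANY_REGION' is the most
-- permissive setting -- restrict it if your data residency rules require).
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';

-- Smoke test: verify the COMPLETE function and model are reachable.
SELECT SNOWFLAKE.CORTEX.COMPLETE('claude-4-sonnet', 'Reply with OK.');
```

Run the smoke test as a user holding the connection's default role to confirm the grant took effect before saving the provider configuration in Sigma.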

Configure a warehouse-hosted model as your AI provider

After setting up your connection and confirming the availability of supported models, set up your AI provider to use a warehouse-hosted model:

  1. Go to Administration > AI Settings:

    1. In the Sigma header, click your user avatar to open the user menu.
    2. Select Administration to open the Administration portal.
    3. In the side panel, select AI Settings.
  2. If you have External models set up as your AI provider, click Remove to set up a warehouse-hosted model instead.

    🚧

    When you remove an external AI provider, the OpenAI or Azure OpenAI key is cleared and all previously indexed data sources are removed.

  3. Select Data warehouse hosted model (recommended).

  4. In the Connection dropdown menu, select the connection.

  5. (Optional, BigQuery connections only) In the Region dropdown menu, select the region to access the Vertex AI API from. By default, Sigma uses us-central1.

  6. Click Save.

    If the connection successfully reaches the AI models in your data warehouse, the Ask Sigma data sources section appears.

After adding the AI provider, select the data sources to make available to Ask Sigma by default.