Intro to input tables
Input tables are dynamic workbook elements that support structured data entry. They allow you to integrate new data points into your analysis and augment existing data from Snowflake or Databricks to facilitate rapid prototyping, advanced modeling, forecasting, what-if analysis, and more—without overwriting source data.
You can use input tables as sources for data elements (tables, pivot tables, and visualizations) or incorporate the data using lookups and joins. And when you create warehouse views for input tables, you can reuse the manually entered data across your broader data ecosystem.
This document introduces empty and linked input tables, which support a variety of use cases for ad hoc data entry. For information about creating and interacting with input tables, see Create and manage input tables and Edit existing input table columns.
Input table overview
Input tables give you the ability to do the following:
- Add rows (empty input tables only)
- Add columns (text, number, date, and logical)
- Input values through keyboard entry
- Paste values into up to 12,500 cells at once (500 rows and 25 columns)
- Configure data entry permissions
- Configure data validation
- Protect columns to prevent edits
- Add row edit history
For information about using this functionality, see Create and manage input tables.
Empty input tables
Empty input tables support data entry in standalone tables independent of existing data. All cells in an empty input table are editable, and you can add rows and columns to construct your table as you see fit.
Linked input tables
Linked input tables support data entry alongside data from Snowflake, Databricks, or data elements in the same workbook.
As a child element, a linked input table includes one or more columns that reference source columns in the parent element. This includes a primary key column, which provides row identifiers that define the input table's granularity and link manually entered data to existing data in the parent element.
To maintain the data relationship between the input table and the parent element, the primary key column must reference static data in the parent element. All other linked columns can reference variable data, which is continually updated in the input table to reflect live data from the root source.
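The relationship described above can be sketched as a join. The following is a hypothetical illustration only: the table names (`analytics.orders`, `inputs.order_annotations`) and column names are assumptions for the example, not names Sigma generates.

```sql
-- Hypothetical sketch of how a linked input table relates to its parent element.
-- The primary key column (order_id) provides the row identifiers that define
-- the input table's granularity and link manually entered data to source data.
SELECT
    src.order_id,        -- primary key column: references static source data
    src.order_total,     -- linked column: reflects live data from the root source
    inp.reviewer_notes   -- manually entered input table column
FROM analytics.orders AS src
LEFT JOIN inputs.order_annotations AS inp
    ON src.order_id = inp.order_id;
```

Because the primary key must reference static data, each manually entered row stays attached to the same source row even as the linked columns refresh.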
Frequently asked questions
Can I use input tables with CDW or DBMS connections other than Snowflake and Databricks?
Input tables are currently compatible with Snowflake and Databricks connections only. Sigma plans to add support for BigQuery, Postgres, and Redshift connections.
Can I change the write-to location where input table data is stored?
No. Sigma currently writes input table data to Snowflake and Databricks in a predefined schema identified in the connection details (Admin > Connections page).
Can linked input tables overwrite data in existing database tables?
No. Sigma protects the integrity of your existing data and stores input table data in a separate schema and database table. As a result, input tables can never overwrite source data.
Why don’t published input tables appear in the connection panel (in the Sigma interface) that displays all databases and schemas associated with my Snowflake or Databricks account?
When Sigma writes input table data to Snowflake or Databricks, the corresponding database table is created in a write-to schema that isn’t directly accessible from the Sigma interface. To identify the destination schema for input table data in Snowflake or Databricks, reference the connection details on the Admin > Connections page.
Can I query the input table data from my CDW or DBMS?
While you cannot query the database table directly, you can create a warehouse view for the input table and query that to retrieve the data stored in Snowflake or Databricks. For more information, see Create and manage workbook warehouse views.
How long is input table data retained in a Snowflake or Databricks database table after I delete it in Sigma?
Sigma doesn’t delete input table data from a connected CDW or DBMS. To remove input table data from Snowflake or Databricks, you must delete it directly in the data platform.