Snowflake Enterprise MCP Server by MCP4E turns natural language questions into instant, secure insights. It automatically converts plain English into SQL, executes queries in Snowflake with role-based access control, and delivers answers in real time. Empower non-technical users, accelerate decisions, boost productivity, and maintain enterprise-grade governance—all while maximizing the value of your AI and data investments.
# Install Directly
Download the MCP4E Launcher, then run `LICENSE_KEY=KEY_GOES_HERE ./mcp4e-launcher`
# Or via Docker
`docker run mcp4e/snowflake-enterprise-mcp-server`
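When running the container, the environment variables from the table below can be passed with Docker's `-e` flag. A minimal sketch with placeholder values (your account, warehouse, and license details will differ):

```sh
# Assumption: the container reads the same LICENSE_KEY variable as the launcher.
docker run \
  -e LICENSE_KEY=KEY_GOES_HERE \
  -e SNOWFLAKE_JDBC_URL="jdbc:snowflake://FMXXXX-HPBXXXX.snowflakecomputing.com" \
  -e SNOWFLAKE_JDBC_ACCOUNT="FMXXXX-HPBXXXX" \
  -e SNOWFLAKE_JDBC_WAREHOUSE="SOME_WAREHOUSE" \
  mcp4e/snowflake-enterprise-mcp-server
```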
Interested in seeing this product in action? Schedule a personalized demo with our team.
Connect your AI projects to a Snowflake database. Large Language Models can use tools that extend their reach to resources outside of the LLM itself.
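For illustration only, an MCP client would invoke the server's default Text to SQL tool (`ask_snowflake_dw`, configurable in the table below) with a standard MCP `tools/call` request roughly like the following; the `question` argument name is an assumption, not taken from the product documentation:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_snowflake_dw",
    "arguments": {
      "question": "What were total sales by region last quarter?"
    }
  }
}
```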
The Snowflake Enterprise MCP server connects the following concepts into one simple service:
This service is an Enterprise offering and focuses on meeting the following enterprise needs:
Environment variables can be set in one of two ways: directly on the virtual machine, or in the LICENSE configuration.
Which approach should you pick? We recommend setting SECURE values directly on the virtual machine. Attributes that are not particularly secret are easy to manage in the LICENSE configuration. Configuration is re-fetched whenever a new version of the MCP Server is found.
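For example, SECURE values can be exported directly on the virtual machine before starting the launcher, while non-secret attributes live in the LICENSE configuration. A minimal sketch with placeholder values (which variables you treat as secret is up to you):

```sh
# Secrets set directly on the virtual machine (placeholder values)
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret"

# Non-secret attributes (e.g., AWS_REGION, SNOWFLAKE_JDBC_WAREHOUSE) can be
# managed in the LICENSE configuration instead of being exported here.

LICENSE_KEY=KEY_GOES_HERE ./mcp4e-launcher
```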
| Variable | Description | Default |
|---|---|---|
| AWS_ACCESS_KEY_ID | Used for S3 operations. Recommended to provide this directly when running the application. | (none) |
| AWS_BUCKET | S3 bucket where CSV reports and logs are stored. Example: some-s3-bucket | (none) |
| AWS_REGION | Region of the S3 bucket. Example: us-west-1 | (none) |
| AWS_SECRET_ACCESS_KEY | Secret for S3 operations. Recommended to provide this directly. | (none) |
| AWS_SIGNING_DURATION_IN_MINUTES | Duration, in minutes, during which a report can be downloaded (presigned URL expiration). | 1880 |
| CORS_ALLOWED_ORIGIN | Set this if JavaScript calls the MCP Server directly, to prevent CORS issues. | (none) |
| SNOWFLAKE_ACCOUNT_ID | Snowflake account ID. Used as part of the URL when configuring the connection to Snowflake. | YourAccountId |
| SNOWFLAKE_ANALYST_BASE_URL | URL for Snowflake Analyst. E.g., "https://FMXXXX-HPBXXXX.snowflakecomputing.com" | (none) |
| SNOWFLAKE_ANALYST_DEFAULT_ROLE | Role to run the queries as in Snowflake Analyst. | (none) |
| SNOWFLAKE_ANALYST_SEMANTIC_MODEL_FILE | The Semantic Model File that connects Analyst with the underlying data. This file needs to be carefully managed. Example: "@SOME_DATABASE.SOME_SCHEMA.SOME_STAGE/semantic_model_name.yaml" | (none) |
| SNOWFLAKE_JDBC_ACCOUNT | Your Snowflake account ID. E.g., "FMXXXX-HPBXXXX" | (none) |
| SNOWFLAKE_JDBC_DATABASE | Database name. | (none) |
| SNOWFLAKE_JDBC_URL | Database connection string. Example: "jdbc:snowflake://FMXXXX-HPBXXXX.snowflakecomputing.com" | (none) |
| SNOWFLAKE_JDBC_USERNAME_PREFIX | Prefix applied to user names when using RBAC. Example: "USER_ID_". The Cognito subject claim has its hyphens converted to underscores, producing a name like "USER_ID_XXX_XXX_XXX". | (none) |
| SNOWFLAKE_JDBC_WAREHOUSE | Snowflake warehouse to use. | (none) |
| TOOL_SNOWFLAKE_DESCRIPTION | Set this description to provide the LLM with more specific, data-aware information. | Perform data analysis against data stored in the Snowflake Data Warehouse. Accepts plain text, performs Text to SQL analysis using Snowflake Analyst, then executes the query on Snowflake via JDBC as the authenticated user. |
| TOOL_SNOWFLAKE_NAME | Overrides the name of the Analyst (Text to SQL) and query execution tool. | ask_snowflake_dw |
| TOOL_SQL_FETCH_PAGED_DESCRIPTION | Description shown to the LLM for the tool that fetches paginated results (JSON). | (none) |
| TOOL_SQL_FETCH_PAGED_NAME | Sets the name of the tool that fetches paginated results. | (none) |
| TOOL_SQL_QUERY_HISTORY_DESCRIPTION | Description shown to the LLM for the tool that fetches the user's query history. | (none) |
| TOOL_SQL_QUERY_HISTORY_NAME | Sets the name of the tool that fetches the user's query history. | (none) |
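As a reference, the Snowflake-related variables from the table might be set together as shown below. Every value is a placeholder drawn from the examples above (ANALYST_ROLE, SOME_DATABASE, and SOME_WAREHOUSE are illustrative names), not a working configuration:

```sh
export SNOWFLAKE_ACCOUNT_ID="YourAccountId"
export SNOWFLAKE_ANALYST_BASE_URL="https://FMXXXX-HPBXXXX.snowflakecomputing.com"
export SNOWFLAKE_ANALYST_DEFAULT_ROLE="ANALYST_ROLE"
export SNOWFLAKE_ANALYST_SEMANTIC_MODEL_FILE="@SOME_DATABASE.SOME_SCHEMA.SOME_STAGE/semantic_model_name.yaml"
export SNOWFLAKE_JDBC_ACCOUNT="FMXXXX-HPBXXXX"
export SNOWFLAKE_JDBC_URL="jdbc:snowflake://FMXXXX-HPBXXXX.snowflakecomputing.com"
export SNOWFLAKE_JDBC_DATABASE="SOME_DATABASE"
export SNOWFLAKE_JDBC_WAREHOUSE="SOME_WAREHOUSE"
# With this prefix and RBAC, a Cognito subject claim such as "abc-123-def"
# maps to a Snowflake user name like "USER_ID_abc_123_def" (hyphens become underscores).
export SNOWFLAKE_JDBC_USERNAME_PREFIX="USER_ID_"
```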