Utilize Cortex Agents with the Snowflake Agent¶
AI Refinery simplifies the process of integrating third-party agents into your agentic team and workflow. This guide provides an example of using the SnowflakeAgent, a database assistant that enables users to interact with data stored in Snowflake databases.
Objective¶
Create a database assistant that enables you to explore and analyze your data by leveraging the SnowflakeAgent, one of the third-party agents supported by AI Refinery.
Tutorial Description¶
This tutorial provides a step-by-step guide on building a Cortex agent on the Snowflake platform, connecting it with data tables and Cortex services (Search and Analyst), and using the SnowflakeAgent from the AI Refinery SDK to interact with your data seamlessly.
Tutorial Workflow¶
The tutorial provides a detailed walkthrough of the following steps:
- Setting up a Snowflake account: Sign up for a free trial account on the Snowflake platform.
- Creating a Cortex agent:
    - Database and schema creation: Create a sample database along with a schema within it.
    - Table creation: Add data tables to populate the database.
    - Generating sample tables: Example of how to create data tables in a schema.
    - Configuring Cortex services: Set up Cortex Search and Cortex Analyst services, and connect them to your data tables.
    - Building a Cortex agent: Develop a Cortex agent and link it to the configured Cortex Search and Cortex Analyst services.
- Configuration and Execution example: Sample code demonstrating how to utilize the created Cortex agent and configure a Snowflake Agent within AI Refinery. This example showcases the advanced capabilities of Cortex agents integrated into the AI Refinery platform.
Setting up a Snowflake account ¶
To create a Snowflake Agent on the AI Refinery platform, you first need to set up a Cortex agent, which can be customized directly within the Snowflake platform. You can sign up for a free Snowflake account (30-day free trial, worth $400 of free usage). If you already have a corporate account, you can log in to access the data already stored in Snowflake. Once your account is set up, ensure that you have the ADMIN role to manage tables and create agents.
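If you are not sure which role your session is currently using, you can check and switch roles from a SQL worksheet. A minimal sketch, assuming your user has been granted ACCOUNTADMIN (the default on personal trial accounts):

SELECT CURRENT_ROLE();  -- show the role currently in use
USE ROLE ACCOUNTADMIN;  -- switch to the administrative role, if granted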
Creating a Cortex agent ¶
The steps below guide you through the process of creating a Cortex agent, assuming you have admin privileges on your personal account. If you are using a corporate account, you may need to request certain permissions from your account administrator.
1. Database and schema creation:¶
From the Catalog section, click on Database Explorer and proceed with creating a new database (or use an existing one). In this tutorial, we use SNOWFLAKE_INTELLIGENCE_DEMO as our sample database.
Then create a schema (or use an existing one) to manage data tables. In this tutorial, we stick with TAH as our sample schema.
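If you prefer a SQL worksheet over the UI, a minimal sketch that creates the same two objects used in this tutorial:

-- Create the sample database and schema used throughout this tutorial
CREATE DATABASE IF NOT EXISTS SNOWFLAKE_INTELLIGENCE_DEMO;
CREATE SCHEMA IF NOT EXISTS SNOWFLAKE_INTELLIGENCE_DEMO.TAH;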
2. Table creation:¶
After setting up the schema, you can add your data tables using either of the available methods, e.g., creating a table from a file or running a SQL command.
To cover both data types, in this tutorial we use three tables: SALES_METRICS, a table with numerical values (structured data), and SALES_CONVERSATIONS and CUSTOMER_FEEDBACK, two tables with textual information (unstructured data).
Below are the data previews of these tables:
3. Generating sample tables:¶
In this section, we provide the detailed SQL commands to create and populate the sample tables used in this tutorial:
- Sales metrics table:

USE DATABASE SNOWFLAKE_INTELLIGENCE_DEMO;
USE SCHEMA TAH;

-- For Cortex Analyst agent
CREATE TABLE sales_metrics (
    deal_id VARCHAR,
    customer_name VARCHAR,
    deal_value FLOAT,
    close_date DATE,
    sales_stage VARCHAR,
    win_status BOOLEAN,
    sales_rep VARCHAR,
    product_line VARCHAR
);

INSERT INTO sales_metrics (deal_id, customer_name, deal_value, close_date, sales_stage, win_status, sales_rep, product_line) VALUES
('DEAL001', 'TechCorp Inc', 75000, '2024-02-15', 'Closed', true, 'Sarah Johnson', 'Enterprise Suite'),
('DEAL002', 'SmallBiz Solutions', 25000, '2024-02-01', 'Lost', false, 'Mike Chen', 'Basic Package'),
('DEAL003', 'SecureBank Ltd', 150000, '2024-01-30', 'Closed', true, 'Rachel Torres', 'Premium Security'),
('DEAL004', 'GrowthStart Up', 100000, '2024-02-10', 'Pending', false, 'Sarah Johnson', 'Enterprise Suite'),
('DEAL005', 'DataDriven Co', 85000, '2024-02-05', 'Closed', true, 'James Wilson', 'Analytics Pro'),
('DEAL006', 'HealthTech Solutions', 120000, '2024-02-20', 'Pending', false, 'Rachel Torres', 'Premium Security'),
('DEAL007', 'LegalEase Corp', 95000, '2024-01-25', 'Closed', true, 'Mike Chen', 'Enterprise Suite'),
('DEAL008', 'GlobalTrade Inc', 45000, '2024-02-08', 'Closed', true, 'James Wilson', 'Basic Package'),
('DEAL009', 'FastTrack Ltd', 180000, '2024-02-12', 'Closed', true, 'Sarah Johnson', 'Premium Security'),
('DEAL010', 'UpgradeNow Corp', 65000, '2024-02-18', 'Pending', false, 'Rachel Torres', 'Analytics Pro');

CREATE OR REPLACE STAGE models DIRECTORY = (ENABLE = TRUE);
- Sales conversations table:

USE DATABASE SNOWFLAKE_INTELLIGENCE_DEMO;
USE SCHEMA TAH;

-- For Cortex Search service
CREATE TABLE sales_conversations (
    conversation_id VARCHAR,
    transcript_text TEXT,
    customer_name VARCHAR,
    deal_stage VARCHAR,
    sales_rep VARCHAR,
    conversation_date TIMESTAMP,
    deal_value FLOAT,
    product_line VARCHAR
);

INSERT INTO sales_conversations (conversation_id, transcript_text, customer_name, deal_stage, sales_rep, conversation_date, deal_value, product_line) VALUES
('CONV001', 'Initial discovery call with TechCorp Inc''s IT Director and Solutions Architect. Client showed strong interest in our enterprise solution features, particularly the automated workflow capabilities. The main discussion centered around integration timeline and complexity. They currently use Legacy System X for their core operations and expressed concerns about potential disruption during migration. The team asked detailed questions about API compatibility and data migration tools. Action items include providing a detailed integration timeline document, scheduling a technical deep-dive with their infrastructure team, and sharing case studies of similar Legacy System X migrations. The client mentioned a Q2 budget allocation for digital transformation initiatives. Overall, it was a positive engagement with clear next steps.', 'TechCorp Inc', 'Discovery', 'Sarah Johnson', '2024-01-15 10:30:00', 75000, 'Enterprise Suite'),
('CONV002', 'Follow-up call with SmallBiz Solutions'' Operations Manager and Finance Director. The primary focus was on pricing structure and ROI timeline. They compared our Basic Package pricing with Competitor Y''s small business offering. Key discussion points included monthly vs. annual billing options, user license limitations, and potential cost savings from process automation. The client requested a detailed ROI analysis focusing on time saved in daily operations, resource allocation improvements, and projected efficiency gains. Budget constraints were clearly communicated, with a maximum budget of $30K for this year. They showed interest in starting with the basic package with room for a potential upgrade in Q4. Next steps include providing a competitive analysis and a customized ROI calculator by next week.', 'SmallBiz Solutions', 'Negotiation', 'Mike Chen', '2024-01-16 14:45:00', 25000, 'Basic Package'),
('CONV003', 'Strategy session with SecureBank Ltd''s CISO and Security Operations team. Extremely positive 90-minute deep dive into our Premium Security package. Customer emphasized immediate need for implementation due to recent industry compliance updates. Our advanced security features, especially multi-factor authentication and encryption protocols, were identified as perfect fits for their requirements. Technical team was particularly impressed with our zero-trust architecture approach and real-time threat monitoring capabilities. They''ve already secured budget approval and have executive buy-in. Compliance documentation is ready for review. Action items include: finalizing implementation timeline, scheduling security audit, and preparing necessary documentation for their risk assessment team. Client ready to move forward with contract discussions.', 'SecureBank Ltd', 'Closing', 'Rachel Torres', '2024-01-17 11:20:00', 150000, 'Premium Security'),
('CONV004', 'Comprehensive discovery call with GrowthStart Up''s CTO and Department Heads. Team of 500+ employees across 3 continents discussed current challenges with their existing solution. Major pain points identified: system crashes during peak usage, limited cross-department reporting capabilities, and poor scalability for remote teams. Deep dive into their current workflow revealed bottlenecks in data sharing and collaboration. Technical requirements gathered for each department. Platform demo focused on scalability features and global team management capabilities. Client particularly interested in our API ecosystem and custom reporting engine. Next steps: schedule department-specific workflow analysis and prepare detailed platform migration plan.', 'GrowthStart Up', 'Discovery', 'Sarah Johnson', '2024-01-18 09:15:00', 100000, 'Enterprise Suite'),
('CONV005', 'In-depth demo session with DataDriven Co''s Analytics team and Business Intelligence managers. Showcase focused on advanced analytics capabilities, custom dashboard creation, and real-time data processing features. Team was particularly impressed with our machine learning integration and predictive analytics models. Competitor comparison requested specifically against Market Leader Z and Innovative Start-up X. Price point falls within their allocated budget range, but team expressed interest in multi-year commitment with corresponding discount structure. Technical questions centered around data warehouse integration and custom visualization capabilities. Action items: prepare detailed competitor feature comparison matrix and draft multi-year pricing proposals with various discount scenarios.', 'DataDriven Co', 'Demo', 'James Wilson', '2024-01-19 13:30:00', 85000, 'Analytics Pro'),
('CONV006', 'Extended technical deep dive with HealthTech Solutions'' IT Security team, Compliance Officer, and System Architects. Four-hour session focused on API infrastructure, data security protocols, and compliance requirements. Team raised specific concerns about HIPAA compliance, data encryption standards, and API rate limiting. Detailed discussion of our security architecture, including: end-to-end encryption, audit logging, and disaster recovery protocols. Client requires extensive documentation on compliance certifications, particularly SOC 2 and HITRUST. Security team performed initial architecture review and requested additional information about: database segregation, backup procedures, and incident response protocols. Follow-up session scheduled with their compliance team next week.', 'HealthTech Solutions', 'Technical Review', 'Rachel Torres', '2024-01-20 15:45:00', 120000, 'Premium Security'),
('CONV007', 'Contract review meeting with LegalEase Corp''s General Counsel, Procurement Director, and IT Manager. Detailed analysis of SLA terms, focusing on uptime guarantees and support response times. Legal team requested specific modifications to liability clauses and data handling agreements. Procurement raised questions about payment terms and service credit structure. Key discussion points included: disaster recovery commitments, data retention policies, and exit clause specifications. IT Manager confirmed technical requirements are met pending final security assessment. Agreement reached on most terms, with only SLA modifications remaining for discussion. Legal team to provide revised contract language by end of week. Overall positive session with clear path to closing.', 'LegalEase Corp', 'Negotiation', 'Mike Chen', '2024-01-21 10:00:00', 95000, 'Enterprise Suite'),
('CONV008', 'Quarterly business review with GlobalTrade Inc''s current implementation team and potential expansion stakeholders. Current implementation in Finance department showcasing strong adoption rates and 40% improvement in processing times. Discussion focused on expanding solution to Operations and HR departments. Users highlighted positive experiences with customer support and platform stability. Challenges identified in current usage: need for additional custom reports and increased automation in workflow processes. Expansion requirements gathered from Operations Director: inventory management integration, supplier portal access, and enhanced tracking capabilities. HR team interested in recruitment and onboarding workflow automation. Next steps: prepare department-specific implementation plans and ROI analysis for expansion.', 'GlobalTrade Inc', 'Expansion', 'James Wilson', '2024-01-22 14:20:00', 45000, 'Basic Package'),
('CONV009', 'Emergency planning session with FastTrack Ltd''s Executive team and Project Managers. Critical need for rapid implementation due to current system failure. Team willing to pay premium for expedited deployment and dedicated support team. Detailed discussion of accelerated implementation timeline and resource requirements. Key requirements: minimal disruption to operations, phased data migration, and emergency support protocols. Technical team confident in meeting aggressive timeline with additional resources. Executive sponsor emphasized importance of going live within 30 days. Immediate next steps: finalize expedited implementation plan, assign dedicated support team, and begin emergency onboarding procedures. Team to reconvene daily for progress updates.', 'FastTrack Ltd', 'Closing', 'Sarah Johnson', '2024-01-23 16:30:00', 180000, 'Premium Security'),
('CONV010', 'Quarterly strategic review with UpgradeNow Corp''s Department Heads and Analytics team. Current implementation meeting basic needs but team requiring more sophisticated analytics capabilities. Deep dive into current usage patterns revealed opportunities for workflow optimization and advanced reporting needs. Users expressed strong satisfaction with platform stability and basic features, but requiring enhanced data visualization and predictive analytics capabilities. Analytics team presented specific requirements: custom dashboard creation, advanced data modeling tools, and integrated BI features. Discussion about upgrade path from current package to Analytics Pro tier. ROI analysis presented showing potential 60% improvement in reporting efficiency. Team to present upgrade proposal to executive committee next month.', 'UpgradeNow Corp', 'Expansion', 'Rachel Torres', '2024-01-24 11:45:00', 65000, 'Analytics Pro');

-- Enable change tracking
ALTER TABLE sales_conversations SET CHANGE_TRACKING = TRUE;
- Customer feedback table:

USE DATABASE SNOWFLAKE_INTELLIGENCE_DEMO;
USE SCHEMA TAH;

CREATE TABLE customer_feedback (
    feedback_id STRING PRIMARY KEY,
    customer_name STRING,
    related_deal_id STRING,
    feedback_text TEXT,
    feedback_date TIMESTAMP,
    submitted_by STRING,
    sentiment STRING -- e.g., 'Positive', 'Neutral', 'Negative'
);

INSERT INTO customer_feedback (feedback_id, customer_name, related_deal_id, feedback_text, feedback_date, submitted_by, sentiment) VALUES
-- Existing 5 entries
('FB001', 'TechCorp Inc', 'DEAL001', 'We are pleased with the integration progress so far, though we encountered delays due to API throttling. Would appreciate more flexible support hours.', '2024-03-01 09:00:00', 'Customer Success', 'Positive'),
('FB002', 'SmallBiz Solutions', 'DEAL002', 'Decided not to proceed as pricing was too high for our budget. Loved the features, but ROI was not compelling for us at this stage.', '2024-02-02 13:30:00', 'Sales Team', 'Negative'),
('FB003', 'SecureBank Ltd', 'DEAL003', 'Excellent onboarding and security posture. Requesting additional documentation for our internal audit next quarter.', '2024-02-10 15:15:00', 'Customer Success', 'Positive'),
('FB004', 'FastTrack Ltd', 'DEAL009', 'Appreciate the rapid deployment support. Some minor bugs surfaced but were quickly resolved. Looking forward to QBR.', '2024-02-20 10:00:00', 'Implementation Manager', 'Positive'),
('FB005', 'GrowthStart Up', 'DEAL004', 'Still evaluating options. Impressed with scalability, but we need more clarity on cost projections for global rollout.', '2024-02-12 11:45:00', 'Account Executive', 'Neutral'),
-- New feedback entries
('FB006', 'DataDriven Co', 'DEAL005', 'The analytics capabilities exceeded our expectations. Machine learning features are promising, but onboarding took longer than expected.', '2024-02-15 14:20:00', 'BI Lead', 'Positive'),
('FB007', 'HealthTech Solutions', 'DEAL006', 'Security features align well with our compliance needs. However, we''re waiting on final documentation for HIPAA audits before moving forward.', '2024-02-22 16:10:00', 'Compliance Officer', 'Neutral'),
('FB008', 'LegalEase Corp', 'DEAL007', 'Legal review of contract terms took longer than expected. Product meets our technical needs, but we still need clarity on support SLAs.', '2024-01-30 10:45:00', 'Legal Counsel', 'Neutral'),
('FB009', 'GlobalTrade Inc', 'DEAL008', 'Finance team happy with current implementation. Considering expanding to other departments but need stronger integration with supply chain systems.', '2024-02-18 09:30:00', 'Operations Lead', 'Positive'),
('FB010', 'UpgradeNow Corp', 'DEAL010', 'Basic package worked well, but we''re hitting limitations on reporting. Analytics Pro looks promising; team is preparing an upgrade proposal.', '2024-02-25 13:00:00', 'Analytics Manager', 'Positive');
4. Configuring Cortex services:¶
Cortex Search and Cortex Analyst are key tools that enable a Cortex agent to efficiently interact with data tables and retrieve information. Cortex Search specializes in handling unstructured data (e.g., textual), while Cortex Analyst focuses on structured data (e.g., numerical) and supports SQL-based operations. Depending on the number and types of data tables, you will need to configure the Cortex Search and Analyst services accordingly. Both of these services can be configured from the AI & ML section.
4.1. Creating a Cortex Search Service: Let's proceed with Cortex Search first: click on Create to create a new service; choose the appropriate database and schema (in our example, SNOWFLAKE_INTELLIGENCE_DEMO.TAH), and choose a name for your service (in our example, SALES_SEARCH_SERVICE).
Choose the data table you want this Search service to operate on. For demonstration, we configure this service on top of the SALES_CONVERSATIONS table.
Choose the searchable column from that table, then choose the proper attributes and the columns you want to include in the search index.
Finally, configure the indexing of your service and other optional parameters, and create the service.
Similarly, create the other Cortex Search services needed for the rest of your unstructured (textual) data tables. In our case, we created another service for the CUSTOMER_FEEDBACK table, named FEEDBACK_SEARCH_SERVICE.
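Cortex Search services can also be created with SQL instead of the UI. Below is a minimal sketch for SALES_SEARCH_SERVICE; the searchable column, attribute columns, warehouse name (MY_WH), and target lag are our illustrative choices, so adjust them to your setup:

CREATE OR REPLACE CORTEX SEARCH SERVICE SNOWFLAKE_INTELLIGENCE_DEMO.TAH.SALES_SEARCH_SERVICE
    ON transcript_text                               -- searchable column
    ATTRIBUTES customer_name, deal_stage, sales_rep  -- filterable attributes
    WAREHOUSE = MY_WH                                -- placeholder warehouse
    TARGET_LAG = '1 hour'                            -- refresh frequency of the index
    AS (
        SELECT transcript_text, customer_name, deal_stage, sales_rep, conversation_date
        FROM SNOWFLAKE_INTELLIGENCE_DEMO.TAH.SALES_CONVERSATIONS
    );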
4.2. Creating a Cortex Analyst Service: Cortex Analyst services can be created either as Semantic views or Semantic models. In this tutorial, we demonstrate how to create one as a semantic model.
In the creation window, configure the database, schema, and stage of the model; provide a proper description for its role, and choose a name for the model. In our case, we are using the stage MODELS under SNOWFLAKE_INTELLIGENCE_DEMO.TAH.
Choose the data table you want this Analyst service to operate on. For demonstration, we configure this service on top of the SALES_METRICS table.
Select the appropriate columns from the table, and create the service.
Here is a snapshot of our created Analyst service, named sales_analyst_service.yaml.
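For reference, the semantic model is a YAML file stored on the stage. A heavily abridged sketch of what a model over SALES_METRICS could contain (the field names follow Snowflake's semantic model specification; the file the wizard generates for sales_analyst_service.yaml will be more detailed):

name: sales_analyst_service
description: Semantic model over the SALES_METRICS table.
tables:
  - name: sales_metrics
    base_table:
      database: SNOWFLAKE_INTELLIGENCE_DEMO
      schema: TAH
      table: SALES_METRICS
    dimensions:
      - name: customer_name
        expr: customer_name
        data_type: varchar
    time_dimensions:
      - name: close_date
        expr: close_date
        data_type: date
    measures:
      - name: deal_value
        expr: deal_value
        data_type: float
        default_aggregation: sum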
5. Building a Cortex Agent:¶
After configuring the Cortex services, you can proceed with the final stage, which is developing the Cortex agent itself. From the AI & ML section, go to the Agents tab and click on Create agent.
The agent creation window might select SNOWFLAKE_INTELLIGENCE.AGENTS as the database and schema by default and show a notice that you don't have permissions to create agents there. Make sure to uncheck the box at the top of the page and choose the appropriate database and schema based on your needs.
In our case, we choose SNOWFLAKE_INTELLIGENCE_DEMO.TAH and name the agent TAH_AGENT_DEMO.
After creating the agent, you can edit it to add instructions, tools, and configure other parameters.
Make sure to edit the tools section of the agent and add the required Cortex Search and Cortex Analyst services there.
After editing all the necessary fields of the agent, you should be able to use the chat box of the portal to test the agent.
Note: If you have fully configured your Cortex agent and, when you prompt it on the portal, it shows the following error:
"Error: None of the preferred models are authorized or available in your region: claude-4-sonnet, claude-sonnet-4-5, claude-3-7-sonnet, claude-3-5-sonnet, openai-gpt-5, openai-gpt-oss-20b, openai-gpt-4.1, openai-o4-mini. Please contact your administrator."
it might be a cross-region inference issue. Generally available models on the Snowflake platform include claude-3-7-sonnet and claude-3-5-sonnet, and you can resolve the error above by enabling cross-region inference. As an ACCOUNTADMIN, run the corresponding commands in a SQL worksheet (see the sketch below); this will allow your account to handle inference requests in regions where the necessary models are accessible. Furthermore, ensure there are no restrictions on model access by reviewing the CORTEX_MODELS_ALLOWLIST parameter; if this parameter is set to 'None' or a limited list, you may need to update it to allow the required models. After enabling cross-region inference, you should be able to use the generally available models (claude-3-7-sonnet and claude-3-5-sonnet) with your Cortex agent. The preview models (claude-4-sonnet, claude-sonnet-4-5, openai-gpt-5, openai-gpt-oss-20b, openai-gpt-4.1, openai-o4-mini) require special access permissions that are not generally available.
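The original page's commands are not reproduced above; a minimal sketch using the standard Snowflake syntax for these parameters (verify the exact region value and allowlist for your account against the Snowflake documentation):

-- Run as ACCOUNTADMIN: enable cross-region inference.
-- 'ANY_REGION' is the most permissive setting; region groups such as 'AWS_US' also work.
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';

-- Review the current model allowlist.
SHOW PARAMETERS LIKE 'CORTEX_MODELS_ALLOWLIST' IN ACCOUNT;

-- If the allowlist is 'None' or a limited list, allow all models (or list only the ones you need).
ALTER ACCOUNT SET CORTEX_MODELS_ALLOWLIST = 'All';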
Configuration and Execution Example ¶
1. Configuration file¶
To use the SnowflakeAgent, define its configuration in a YAML file and include the agent in the orchestrator's agent_list to permit invocation. See the YAML template below for the SnowflakeAgent configuration.
orchestrator:
  agent_list:
    - agent_name: "Snowflake Agent" # The name you choose for your Snowflake agent.

utility_agents:
  - agent_class: SnowflakeAgent
    agent_name: "Snowflake Agent" # The name you choose for your Snowflake agent
    agent_description: "The Snowflake Agent can answer questions regarding orders recorded in the Snowflake databases."
    config:
      snowflake_password: "SNOWFLAKE_PASSWORD" # Required: Name of the environment variable holding your Snowflake Programmatic Access Token (PAT)
      snowflake_services: # Required: Dictionary containing the information of the Cortex services configured for this Cortex agent.
        search: # Required: List of the required Cortex Search services.
          - name: <name of the service> # Required: Arbitrary name you choose for this Cortex Search service.
            database: <database name> # Required: Name of the database of this Cortex Search service.
            db_schema: <schema name> # Required: Name of the schema of this Cortex Search service.
            service_name: <name of the Cortex Search> # Required: Name of the Cortex Search service as recorded on the Snowflake platform.
        analyst: # Required: List of the required Cortex Analyst services.
          - name: <name of the service> # Required: Arbitrary name you choose for this Cortex Analyst service.
            database: <database name> # Required: Name of the database of this Cortex Analyst service.
            db_schema: <schema name> # Required: Name of the schema of this Cortex Analyst service.
            stage: <stage name> # Required: Name of the stage of this Cortex Analyst service.
            file_name: <file name> # Required: Name of the semantic model file of this Cortex Analyst service.
            warehouse: <warehouse name> # Required: Name of the warehouse of this Cortex Analyst service.
      user_role: <user role> # Required: User role that has access to this Cortex agent.
      snowflake_model: <LLM model> # Required: Name of the LLM model to use with this Snowflake Cortex agent.
      snowflake_base_url: <account's base url> # Required: Base URL of your Snowflake account.
      sql_timeout: <timeout in seconds> # Optional: Timeout in seconds for the execution of any SQL statement (default: 10).
      system_prompt: <response instructions> # Optional: The instructions that the Cortex agent follows when generating the response.
      snowflake_experimental: <experimental flags> # Optional: Experimental flags passed to the Cortex agent (default: {}).
      snowflake_tool_choice: <tool choice> # Optional: Configuration used to select the tools for the Cortex agent (default: "auto").
      contexts: # Optional: Additional agent contexts.
        - "date"
        - "chat_history"
2. Authentication credentials¶
Before running the demo code, make sure that you have the following environment variables in your local .env file:
# AIR variables
export API_KEY=<YOUR-API-KEY>
# Snowflake variables
export SNOWFLAKE_PASSWORD=<YOUR-SNOWFLAKE-PAT>
You can obtain your Snowflake Programmatic Access Token (PAT) from your profile, under the Settings tab:
Go to the Authentication section, where you can generate a new token as annotated below.
3. Python file¶
The code below demonstrates how AI Refinery communicates with your Snowflake agent to interact with the tables you configured in the previous steps.
import os
import asyncio

from dotenv import load_dotenv

from air import DistillerClient

load_dotenv()  # loads your API_KEY from your local '.env' file
api_key = str(os.getenv("API_KEY"))


async def snowflake_agent_demo():
    """Sample queries for the Snowflake agent."""
    distiller_client = DistillerClient(api_key=api_key)

    # Create a project with the corresponding configuration
    distiller_client.create_project(config_path="example.yaml", project="example-snowflake")

    queries = [
        "What are our top 3 client deals by deal value?",
        "For the top three clients by deal value, "
        "summarize key concerns from conversations with their sales reps.",
        "For the top three clients by deal value, "
        "what feedback did they share post-sale or during onboarding?",
    ]

    async with distiller_client(  # Initialize the distiller client instance
        project="example-snowflake",
        uuid="test_user",
    ) as dc:
        for query in queries:  # Send in the above queries one by one
            responses = await dc.query(query=query)
            print(f"----\nQuery: {query}")
            async for response in responses:  # Collect the responses to the queries
                print(f"Response: {response['content']}")  # Format and print them in the chat


if __name__ == "__main__":
    print("\nSnowflake Agent Demo")
    asyncio.run(snowflake_agent_demo())
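Assuming the configuration above is saved as example.yaml next to this script and your .env file, running the script (e.g., python snowflake_agent_demo.py; the file name is arbitrary) creates the example-snowflake project and prints the agent's response to each of the three queries in turn.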