Custom Agent¶
The CustomAgent lets you define your own agent logic using Python functions, offering flexibility for anything from simple query-response tasks to advanced workflows involving APIs, analytics, or multi-step processing.
Unlike the base UtilityAgent, which runs on the AI Refinery service, a CustomAgent executes locally on the SDK side and is not pre-configured with LLM interaction or prompt logic. Instead, you define its behavior in Python and register it in an executor_dict for orchestration within the platform.
Workflow Overview¶
Here is the workflow for a CustomAgent:
- Function Definition: You define an async Python function that accepts a string query and returns a string result.
- Executor Registration: You add this function to an executor_dict under a unique name. This name is then referenced in your orchestration YAML under agent_name.
- Integration: The AI Refinery platform invokes your custom function when routing queries through the orchestrator.
This design allows you to extend the platform with any logic not supported by built-in agents.
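The sketch below illustrates these three steps with a toy agent. The names weather_agent and "Weather Agent" are purely illustrative; the actual invocation is handled by the platform at runtime.

# 1. Function definition: an async function that maps a query string to a result string.
async def weather_agent(query: str) -> str:
    return f"Stub forecast for: {query}"

# 2. Executor registration: the key is the agent_name referenced in your orchestration YAML.
executor_dict = {"Weather Agent": weather_agent}

# 3. Integration: the AI Refinery orchestrator looks up "Weather Agent" in executor_dict
#    and awaits the function whenever it routes a query to this agent.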
Usage¶
To register a CustomAgent, implement a Python async function like this:
from typing import Any, Optional


async def your_custom_agent(
    query: str,
    env_variable: Optional[dict] = None,
    chat_history: Optional[str] = None,
    relevant_chat_history: Optional[str] = None,
    # <any_arbitrary_config>: Optional[Any] = None
) -> str:
    """
    Processes the given query and generates a response using various optional parameters.

    Args:
        query (str): The input query to be processed.
        env_variable (Optional[dict]): Dictionary containing key-value pairs sourced from the
            environment variable memory module.
        chat_history (Optional[str]): String encapsulating the conversation log maintained by
            the chat_history memory module.
        relevant_chat_history (Optional[str]): Subset of the chat history identified as
            pertinent to the current query, sourced from the relevant_chat_history memory
            module for enhanced contextual relevance.
        <any_arbitrary_config> (Optional[Any]): Any other arbitrary configuration under your
            custom agent's config.

    Returns:
        str: The generated response from the agent.
    """
    # Example logic; replace with your own
    response = f"This is a custom response to: {query}"
    return response
Then register it in the executor_dict:
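For example, map the agent name you will use in your YAML configuration to the function (a minimal sketch; "CustomAgentName" must match the agent_name in your orchestration YAML):

executor_dict = {
    "CustomAgentName": your_custom_agent,
}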
QuickStart¶
Here is an example of a custom agent that generates synthetic data:
import os

from dotenv import load_dotenv

from air import AsyncAIRefinery

load_dotenv()  # loads your API_KEY from your local '.env' file
api_key = str(os.getenv("API_KEY"))


async def simple_agent(query: str) -> str:
    client = AsyncAIRefinery(api_key=api_key)
    prompt = f"""
    Your task is to generate synthetic data that can help answer the user question below.
    Do not mention that this is synthetic data.
    {query}
    """
    response = await client.chat.completions.create(
        messages=[{"role": "user", "content": prompt}],
        model="meta-llama/Llama-3.1-70B-Instruct",
    )
    return response.choices[0].message.content
The simple_agent function uses the AI Refinery SDK to generate a synthetic response. You can replace this logic with your own API call, tool invocation, or data processing.
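To try the agent on its own before wiring it into an orchestration, you can call it directly as a plain async function (a quick local test; the query string is purely illustrative and an API_KEY must be available in your environment):

import asyncio

# Run the async agent function directly, outside the orchestrator.
print(asyncio.run(simple_agent("What were the monthly sales figures last quarter?")))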
Template YAML Configuration of CustomAgent¶
The CustomAgent also supports additional settings. See the template YAML below for all available options:
utility_agents:
  - agent_class: CustomAgent  # Required: must be 'CustomAgent'
    agent_name: CustomAgentName  # Required: must match the name in executor_dict
    agent_description: Generate synthetic data from query  # Optional
    config: {}  # Optional: any arbitrary config here can be passed to your custom function

orchestrator:
  agent_list:
    - agent_name: CustomAgentName
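As a sketch of how values under config reach your function, suppose you added a hypothetical key such as config: {tone: "formal"} to the YAML above; that key would then be available to your custom function as an optional keyword argument of the same name, per the <any_arbitrary_config> parameter shown earlier:

from typing import Optional

# Hypothetical example: with `config: {tone: "formal"}` in the YAML above,
# the orchestrator can pass tone="formal" into the custom function below.
async def your_custom_agent(query: str, tone: Optional[str] = None) -> str:
    style = tone or "neutral"
    return f"({style}) This is a custom response to: {query}"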