Integrating Custom Agents with Built-in Agents
With AI Refinery, you can seamlessly access LLM, VLM, and embedding models through the standard AIRefinery API. The Authenticator object handles authentication for you, ensuring a smooth integration process. In this tutorial, we demonstrate how to use the standard AIRefinery API within a custom agent, and how to combine that custom agent with a built-in utility agent.
Objective
Combine custom and built-in agents using the AI Refinery SDK to create and run a simple AI system that helps users plan parties.
Steps
1. Configuration file
As a first step, create a YAML file containing the required configuration:
- The **Recommender Agent** is the custom agent you will design; it uses the AIRefinery API to provide general recommendations.
- The **Party Planner Agent** uses the `PlanningAgent` from the AIRefinery™ library, which can produce concrete planning schemes for the user. It draws on the chat history (`contexts: - "chat_history"`) stored in the AIRefinery™ memory dedicated to your project to build a plan tailored to the user query.
```yaml
utility_agents:
  - agent_class: CustomAgent
    agent_name: "Recommender Agent"
    agent_description: |
      The Recommender Agent is a specialist in item recommendations. For instance,
      it can provide users with costume recommendations, items to purchase, food,
      decorations, and so on.
    config: {}

  - agent_class: PlanningAgent
    agent_name: "Party Planner"
    agent_description: |
      The Party Planner agent is specialized in helping users plan their parties.
      For example, how to organize a Halloween party, Christmas party, and so on.
      Don't call this agent for item recommendations.
    config:
      output_style: "markdown"
      contexts:
        - "chat_history"

super_agents: []

orchestrator:
  agent_list:
    - agent_name: "Party Planner"
    - agent_name: "Recommender Agent"
```
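The Python script in the next step reads your credentials from environment variables via a local `.env` file. A minimal sketch with placeholder values (the variable names `ACCOUNT` and `API_KEY` match the ones the script loads; the values shown are hypothetical):

```shell
# .env — never commit this file to version control
ACCOUNT=your-account-name
API_KEY=your-api-key
```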
2. Python file
Now you can start developing your assistant using:

- the AIRefinery API, to enable the LLM capabilities of your custom agent;
- the `DistillerClient`, to take advantage of the other features of AIRefinery™.
```python
import os

from air import AsyncAIRefinery, DistillerClient, login
from dotenv import load_dotenv

load_dotenv()  # Loads ACCOUNT and API_KEY from your local '.env' file

auth = login(
    account=str(os.getenv("ACCOUNT")),
    api_key=str(os.getenv("API_KEY")),
)

distiller_client = DistillerClient()

# Register the project defined in your configuration file
project = "party_project"
distiller_client.create_project(config_path="config.yaml", project=project)


async def recommender_agent(query: str) -> str:
    """Custom agent that uses the AIRefinery API to generate recommendations."""
    prompt = (
        "Given the query below, your task is to provide the user with useful and cool "
        "recommendations followed by a one-sentence justification.\n\nQUERY: {query}"
    )
    prompt = prompt.format(query=query)
    airefinery_client = AsyncAIRefinery(**auth.openai())
    response = await airefinery_client.chat.completions.create(
        messages=[{"role": "user", "content": prompt}],
        model="meta-llama/Llama-3.1-70B-Instruct",
    )
    return response.choices[0].message.content


# Map the custom agent's name (as declared in config.yaml) to its executor
executor_dict = {"Recommender Agent": recommender_agent}

# Start an interactive session with the orchestrator
response = distiller_client.interactive(
    project=project, uuid="test_user", executor_dict=executor_dict
)
```