# Party Planner
With the current version of AI Refinery, you can seamlessly access LLM, VLM, and Embedding models through the standard OpenAI API. The Authenticator object in AI Refinery takes care of authentication for OpenAI, ensuring a smooth integration process. In this tutorial, we demonstrate how to use the standard OpenAI API within your Custom Agent, and how to combine that agent with a built-in utility agent.
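The Authenticator pattern can be sketched in plain Python: the object returned by `login` bundles your credentials and exposes them as keyword arguments that unpack straight into an OpenAI-style client constructor. The classes below are illustrative stand-ins (the `base_url` and `DummyClient` are hypothetical, not the SDK's real internals):

```python
# Hypothetical sketch of the Authenticator pattern described above.
# 'DummyClient' stands in for openai.AsyncOpenAI; the base_url is made up.

class Authenticator:
    def __init__(self, account: str, api_key: str):
        self._account = account
        self._api_key = api_key

    def openai(self) -> dict:
        # Keyword arguments accepted by an OpenAI-style client constructor.
        return {"api_key": self._api_key, "base_url": "https://example.invalid/v1"}

class DummyClient:
    def __init__(self, api_key: str, base_url: str):
        self.api_key = api_key
        self.base_url = base_url

auth = Authenticator(account="acme", api_key="secret")
client = DummyClient(**auth.openai())  # same shape as AsyncOpenAI(**auth.openai())
```

This is why the real code later in this tutorial can simply write `AsyncOpenAI(**auth.openai())`.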
## Objective
Use the AI Refinery SDK to create and run a simple AI system to help users plan parties.
## Steps
### 1. Configuration file
As a first step, create a YAML file with the required configuration.
- The **Recommender Agent** is the agent that you will design to use the OpenAI API to provide general recommendations.
- The **Party Planner Agent** uses the `PlanningAgent` from the AIRefinery™ Marketplace, which can provide concrete planning schemes for the user. The Party Planner Agent uses the chat history (`contexts: - "chat_history"`) stored in the AIRefinery™ memory dedicated to your project to produce a concrete plan tailored to the user query.
```yaml
utility_agents:
  - agent_class: CustomAgent
    agent_name: "Recommender Agent"
    agent_description: |
      The Recommender Agent is a specialist in item recommendations. For instance,
      it can provide users with costume recommendations, items to purchase, food,
      decorations, and so on.
    config: {}

  - agent_class: PlanningAgent
    agent_name: "Party Planner"
    agent_description: |
      The Party Planner agent is specialized in helping users plan their parties.
      For example, how to organize a Halloween party, a Christmas party, and so on.
      Don't call this agent for item recommendations.
    config:
      output_style: "markdown"
      contexts:
        - "chat_history"

super_agents: []

orchestrator:
  agent_list:
    - agent_name: "Party Planner"
    - agent_name: "Recommender Agent"
```
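One invariant worth checking in a config like this: every agent the orchestrator routes to must be declared under `utility_agents` (or `super_agents`). The sketch below mirrors the YAML as a plain Python dict to make that check concrete; it is an illustration of the config's shape, not part of the SDK:

```python
# The YAML config mirrored as a Python dict (illustrative only), so we can
# verify that the orchestrator routes only to declared agents.

config = {
    "utility_agents": [
        {"agent_class": "CustomAgent", "agent_name": "Recommender Agent"},
        {"agent_class": "PlanningAgent", "agent_name": "Party Planner"},
    ],
    "super_agents": [],
    "orchestrator": {
        "agent_list": [
            {"agent_name": "Party Planner"},
            {"agent_name": "Recommender Agent"},
        ]
    },
}

declared = {a["agent_name"] for a in config["utility_agents"]}
routed = {a["agent_name"] for a in config["orchestrator"]["agent_list"]}
missing = routed - declared  # empty set means every routed agent is declared
```

A typo in an `agent_name` would surface here as a non-empty `missing` set.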
### 2. Python file
Now, you can start developing your assistant using:

- The OpenAI API (authentication is handled by AI Refinery) to enable the LLM capabilities of your Custom Agent.
- `DistillerClient` to take advantage of the other features of AIRefinery™.
```python
import asyncio
import os

from dotenv import load_dotenv
from openai import AsyncOpenAI

from air import DistillerClient, login

load_dotenv()  # This loads your ACCOUNT and API_KEY from your local '.env' file

auth = login(
    account=str(os.getenv("ACCOUNT")),
    api_key=str(os.getenv("API_KEY")),
)

distiller_client = DistillerClient()

project = "party_project"
distiller_client.create_project(
    config_path="config.yaml",
    project=project,
)


async def recommender_agent(query: str) -> str:
    """Custom agent that answers through the OpenAI-compatible API."""
    prompt = """Given the query below, your task is to provide the user with useful and cool
recommendations followed by a one-sentence justification.\n\nQUERY: {query}"""
    openai_client = AsyncOpenAI(**auth.openai())
    response = await openai_client.chat.completions.create(
        # Fill the {query} placeholder before sending the prompt
        messages=[{"role": "user", "content": prompt.format(query=query)}],
        model="meta-llama/Llama-3.1-70B-Instruct",
    )
    return response.choices[0].message.content


custom_agent_gallery = {"Recommender Agent": recommender_agent}

response = distiller_client.interactive(
    project=project,
    uuid="<your user id>",
    custom_agent_gallery=custom_agent_gallery,
)
```