# Model Catalog
Our model catalog provides a diverse array of models to choose from. To configure your agents to use any of these models, refer to our project configuration guidelines. Below is the list of currently supported models. We are continuously enhancing and expanding the catalog, so check this page regularly for the latest updates.
## Large Language Models (LLMs)
The LLMs we support are:
- `meta-llama/Llama-3.1-70B-Instruct`
To use any of these LLMs, update the `llm_config` within the `base_config` or within the `config` section of any utility agent in your project's YAML configuration file. Ensure that the `model` parameter of the `llm_config` is set to one of the names listed above.
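For example, a minimal configuration might look like the following sketch. Only `llm_config`, `base_config`, `config`, and `model` are keys documented here; the surrounding structure (such as the `utility_agents` list and the agent fields) is illustrative and should match your own project file.

```yaml
# Illustrative sketch -- adapt the surrounding structure to your project.
base_config:
  llm_config:
    model: meta-llama/Llama-3.1-70B-Instruct       # project-wide default LLM

utility_agents:
  - agent_class: ExampleUtilityAgent               # hypothetical agent name
    config:
      llm_config:
        model: meta-llama/Llama-3.1-70B-Instruct   # per-agent override
```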
## Vision Language Models (VLMs)
The VLMs we support are:
- `meta-llama/Llama-3.2-90B-Vision-Instruct`
To use any of these VLMs, update the `vlm_config` within the `base_config` or within the `config` section of any utility agent that supports images (e.g., `ImageUnderstandingAgent`) in your project's YAML configuration file. Ensure that the `model` parameter of the `vlm_config` is set to one of the names listed above.
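A minimal sketch, assuming the same layout as the LLM example above (`vlm_config`, `base_config`, `config`, `model`, and `ImageUnderstandingAgent` are documented names; everything else is illustrative):

```yaml
base_config:
  vlm_config:
    model: meta-llama/Llama-3.2-90B-Vision-Instruct       # project-wide default VLM

utility_agents:
  - agent_class: ImageUnderstandingAgent
    config:
      vlm_config:
        model: meta-llama/Llama-3.2-90B-Vision-Instruct   # per-agent override
```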
## Embedding Models
The models we support for embedding your data are:
- `intfloat/e5-mistral-7b-instruct`
To use any of these embedding models in your project, update the `embedding_config` within the `base_config` or within the `aisearch_config` section of the `ResearchAgent`. Ensure that the `model_name` parameter of the `embedding_config` is set to one of the names listed above.
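A minimal sketch, assuming the `aisearch_config` sits under the `ResearchAgent`'s `config` section as in the examples above (the nesting outside the documented keys is an assumption):

```yaml
base_config:
  embedding_config:
    model_name: intfloat/e5-mistral-7b-instruct   # note: model_name, not model

utility_agents:
  - agent_class: ResearchAgent
    config:
      aisearch_config:
        embedding_config:
          model_name: intfloat/e5-mistral-7b-instruct
```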
## Compression Models
The prompt compression models that we support are:
- `llmlingua/bert`
To use any of these prompt compression models in your project, update the `compression_config` within the `base_config` of your project. To learn more about prompt compression, see this tutorial. Ensure that the `model` parameter of the `compression_config` is set to one of the names listed above.
## Reranker Models
The reranker models that we support are:
- `BAAI/bge-reranker-large`
To use any of these reranker models in your project, update the `reranker_config` within the `base_config` of your project. To learn more about reranking, see this tutorial. Ensure that the `model` parameter of the `reranker_config` is set to one of the names listed above.
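A minimal sketch (only `reranker_config`, `base_config`, and `model` are documented keys):

```yaml
base_config:
  reranker_config:
    model: BAAI/bge-reranker-large   # reranker model
```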