# Get Started

Add Rig to your project with `cargo add rig-core`.
Rig is a Rust library for building portable, modular, and lightweight Fullstack AI Agents. You can find API documentation on docs.rs.
## Why Rig?
## Why Rust?
// TODO: Add content and embed https://benjdd.com/languages/
## High-level features
- Full support for LLM completion and embedding workflows
- Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory)
- Integrate LLMs in your app with minimal boilerplate
Simple example:
```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and agent.
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();
    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```
Note: using `#[tokio::main]` requires enabling tokio's `macros` and `rt-multi-thread` features (or just `full` to enable all features): `cargo add tokio --features macros,rt-multi-thread`
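Put together, a minimal `Cargo.toml` dependency section for the example above might look like this (the version numbers are illustrative; use whichever current releases `cargo add` resolves for you):

```toml
[dependencies]
# Rig core library
rig-core = "0.1"
# Async runtime: `macros` enables #[tokio::main], `rt-multi-thread` the multi-threaded runtime
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```

Remember to export `OPENAI_API_KEY` in your environment before running the example with `cargo run`.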
## Integrations
### Model Providers
Rig natively supports the following completion and embedding model provider integrations:
You can also implement your own model provider integration by defining types that implement the `CompletionModel` and `EmbeddingModel` traits.
### Vector Stores
Rig currently supports the following vector store integrations via companion crates:
- `rig-mongodb`: Vector store implementation for MongoDB
- `rig-lancedb`: Vector store implementation for LanceDB
- `rig-neo4j`: Vector store implementation for Neo4j
- `rig-qdrant`: Vector store implementation for Qdrant
You can also implement your own vector store integration by defining types that implement the `VectorStoreIndex` trait.