Build AI Assistants with memory, knowledge & tools
Turn any LLM into an AI Assistant with memory, knowledge and tools
LLMs have limited context.
We give them tools to access data in real time.
from phi.assistant import Assistant
from phi.tools.duckduckgo import DuckDuckGo

assistant = Assistant(
    tools=[DuckDuckGo()],
    markdown=True,
)
assistant.print_response("What's happening in France?")
import json

from phi.assistant import Assistant

def get_user_order_details() -> str:
    # Add your API call here
    return json.dumps({"status": "on the way"})

assistant = Assistant(
    tools=[get_user_order_details]
)
assistant.print_response("Where's my order?")
from phi.assistant import Assistant
from phi.knowledge.pdf import PDFKnowledgeBase

assistant = Assistant(
    # Add a Knowledge Base
    knowledge_base=PDFKnowledgeBase(
        data=...,  # Your PDFs
        vector_db=...,  # Your VectorDb
    ),
    # Search the knowledge base for information
    search_knowledge=True,
)
assistant.print_response("Ask a question from the knowledge base")
from phi.assistant import Assistant
from phi.storage.assistant.postgres import PgAssistantStorage

assistant = Assistant(
    # Add memory by storing context in a database
    storage=PgAssistantStorage(...),
    # Read chat history to answer questions
    read_chat_history=True,
)
assistant.print_response("Summarize this conversation?")
from phi.assistant import Assistant
from phi.tools.sql import SQLTools

assistant = Assistant(
    tools=[SQLTools(db_url=...)],
)
assistant.print_response("What's the revenue for this quarter?")
from phi.assistant import Assistant
from phi.tools.shell import ShellTools
from phi.tools.file import FileTools

assistant = Assistant(
    # Add tools to run shell commands and read files
    tools=[ShellTools(), FileTools()]
)
assistant.print_response("Read the most recent file in the current directory")
Turn any LLM into an AI Assistant
Phidata brings together everything you need to build reliable AI products
Memory
Enable long-term conversations by storing chat history and context in a database.
Knowledge
Provide LLMs with business context (RAG) by storing knowledge in Vector Dbs.
Tools
Call APIs, run queries, send emails. Integrate LLMs with your application.
Monitoring
Monitor runs, tokens, feedback, quality and cost - all in one place. Automatic logging & monitoring makes building AI products a breeze.
Evaluations
Automatically evaluate responses and set alerts on quality issues. Compare models, get recommendations and ship with confidence.
Dedicated Support
Dedicated support and training to help you build the best AI product in the market.
Human Review
Human review measures true performance. The only way to evaluate the evaluations.
Prompt Management
Manage and version prompts, run experiments and test before you invest.
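To make the Tools idea above concrete: frameworks in this space typically describe a plain Python function to the LLM as a JSON-Schema "tool" definition, which the model can then ask to call. The sketch below is a hypothetical, framework-free illustration of that pattern using only the standard library; the helper name and schema layout are assumptions, not Phidata's actual implementation.

```python
import inspect
import json
from typing import Callable

# Map Python annotations to JSON Schema types (illustrative subset).
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_to_tool_schema(func: Callable) -> dict:
    """Describe a Python function as an OpenAI-style tool definition."""
    sig = inspect.signature(func)
    properties = {}
    required = []
    for name, param in sig.parameters.items():
        # Fall back to "string" for unannotated parameters.
        properties[name] = {"type": _TYPE_MAP.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            "description": (func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def get_user_order_details(order_id: str) -> str:
    """Look up the delivery status of an order."""
    # Hypothetical stand-in for a real API call.
    return json.dumps({"order_id": order_id, "status": "on the way"})

schema = function_to_tool_schema(get_user_order_details)
print(json.dumps(schema, indent=2))
```

The generated schema is what gets sent to the model alongside the prompt; when the model responds with a tool call, the framework dispatches to the matching Python function and feeds the return value back into the conversation.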
Loved by AI Engineers
Built for AI Engineers
Free
Build AI Assistants using our open-source framework
$0/month
1 workspace
Basic monitoring
Community support
Pro
Add monitoring, evaluations, human review and pro support.
$20/month
Monitoring & Evaluations
Human Reviews
Pro support via Discord
Pro features like prompt registry, unlimited workspaces, teams and auto recommendations
Enterprise
We help you build the best AI product in the market.
Custom
1:1 Consultations
Dedicated Slack channel
Custom AI development
FAQs
Phidata works with every LLM. From closed models like OpenAI, Anthropic, Cohere to open source models via Ollama, Together, Anyscale. See our docs for the full list.
If there are any we’re missing, let us know and we’ll add them within a week.
Our team prefers using Postgres + PgVector but Phidata works with every database & vector store on the market. Pinecone, LanceDb, Singlestore - we support them all. See our docs for more info.
If there are any we’re missing, let us know and we’ll add them within a week.
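Whichever vector store you pick, the core operation it performs is the same: given a query embedding, return the stored chunks whose embeddings are most similar. The sketch below illustrates that retrieval step with toy 3-dimensional embeddings and hypothetical documents, using only the standard library; real stores like PgVector or Pinecone do this at scale with indexed approximate search.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, documents, k=2):
    """Return the k document texts whose embeddings are closest to the query."""
    scored = [(cosine_similarity(query_vec, emb), text) for text, emb in documents]
    scored.sort(reverse=True)
    return [text for _, text in scored[:k]]

# Toy corpus: (text, embedding) pairs. Real embeddings have hundreds of dims.
docs = [
    ("Refund policy: 30 days", [0.9, 0.1, 0.0]),
    ("Shipping times: 3-5 days", [0.1, 0.9, 0.1]),
    ("Office hours: 9-5", [0.0, 0.2, 0.9]),
]

print(top_k([0.8, 0.2, 0.1], docs, k=1))  # → ['Refund policy: 30 days']
```

In a RAG setup, the retrieved texts are then stuffed into the LLM's prompt as context, which is why the choice of embedding model and similarity metric matters more than the specific store.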
Yes and no. Assistants are autonomous and can take actions like Agents, but can also have long-term conversations and use RAG. They’re the perfect middle-ground for building AI Applications.
Read more about Assistants and how to build them.
You can serve Assistants using Streamlit, FastAPI or Django. We provide ready-to-use templates to run Assistants in production and many code examples to help you build for your use case.
If you want dedicated support, book a call and we’re happy to help.
The phidata cookbook contains in-depth examples and code.
From basic assistants, function calling, structured output to advanced fine-tuning and evaluations. If there’s something specific, let us know and we’re happy to add a new example.
Phidata Pro is free of charge for students, educators and startups with < $5m in funding.
Reach out to ashpreet@phidata.com for your discount.
Need dedicated support?
Our mission is to make LLMs smarter using memory, knowledge and tools. Stay ahead of the curve and build better AI products.