CrewAI
Use Tensoras.ai as the LLM provider for CrewAI multi-agent workflows.
Installation
pip install crewai langchain-tensoras

Authentication
export TENSORAS_API_KEY="tns_your_key_here"

Setup
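Before constructing any agents, you can sanity-check that the key is visible to your Python process. This is a minimal stdlib sketch (not part of the Tensoras SDK); the `tns_` prefix matches the example key above:

```python
import os

def tensoras_key_present() -> bool:
    """True if TENSORAS_API_KEY is set and uses the tns_ prefix shown above."""
    return os.environ.get("TENSORAS_API_KEY", "").startswith("tns_")
```

If this returns False, re-export the variable in the shell that launches your script.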
CrewAI uses LangChain chat models under the hood. Pass a ChatTensoras instance as the LLM for your agents:
from langchain_tensoras import ChatTensoras
llm = ChatTensoras(model="llama-3.3-70b")

Basic Crew
Create a crew of agents that collaborate to complete a task:
from crewai import Agent, Task, Crew
from langchain_tensoras import ChatTensoras
llm = ChatTensoras(model="llama-3.3-70b")
# Define agents
researcher = Agent(
    role="Research Analyst",
    goal="Find accurate, up-to-date information on the given topic.",
    backstory="You are an experienced research analyst who excels at finding and synthesizing information.",
    llm=llm,
    verbose=True,
)
writer = Agent(
    role="Technical Writer",
    goal="Write clear, concise technical content based on research findings.",
    backstory="You are a skilled technical writer who turns complex topics into accessible content.",
    llm=llm,
    verbose=True,
)
# Define tasks
research_task = Task(
    description="Research the current state of retrieval-augmented generation (RAG) technology. Focus on recent advances, common architectures, and best practices.",
    expected_output="A structured research summary with key findings.",
    agent=researcher,
)
writing_task = Task(
    description="Write a blog post about RAG technology based on the research findings. The post should be 500 words, technically accurate, and accessible to developers.",
    expected_output="A polished blog post in markdown format.",
    agent=writer,
)
# Create and run the crew
crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    verbose=True,
)
result = crew.kickoff()
print(result)

Using Different Models Per Agent
Assign different Tensoras models to different agents based on their needs:
from crewai import Agent
from langchain_tensoras import ChatTensoras
# Use a larger model for complex reasoning
senior_analyst = Agent(
    role="Senior Analyst",
    goal="Perform deep analysis of complex data.",
    backstory="You are a senior data analyst with 15 years of experience.",
    llm=ChatTensoras(model="deepseek-r1"),
)
# Use a faster model for simpler tasks
formatter = Agent(
    role="Report Formatter",
    goal="Format analysis results into clean reports.",
    backstory="You are a report formatting specialist.",
    llm=ChatTensoras(model="llama-3.3-70b"),
)

Agents with Tools
Give your agents custom tools to interact with external systems:
from crewai import Agent, Task, Crew
from crewai.tools import tool
from langchain_tensoras import ChatTensoras
llm = ChatTensoras(model="llama-3.3-70b")
@tool("Search Knowledge Base")
def search_kb(query: str) -> str:
    """Search the company knowledge base for information."""
    from tensoras import Tensoras

    client = Tensoras()
    response = client.chat.completions.create(
        model="llama-3.3-70b",
        messages=[{"role": "user", "content": query}],
        knowledge_bases=["kb_a1b2c3d4"],
    )
    return response.choices[0].message.content
support_agent = Agent(
    role="Customer Support Agent",
    goal="Answer customer questions accurately using the knowledge base.",
    backstory="You are a helpful support agent with access to company documentation.",
    llm=llm,
    tools=[search_kb],
)
support_task = Task(
    description="Answer the following customer question: 'How do I configure SSO for my organization?'",
    expected_output="A clear, step-by-step answer based on the knowledge base.",
    agent=support_agent,
)
crew = Crew(agents=[support_agent], tasks=[support_task])
result = crew.kickoff()
print(result)

Sequential and Hierarchical Processes
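Before the configuration snippets, here is a loose mental model of the two processes in plain Python. This is illustrative only, not CrewAI's internals: `run_task` stands in for an agent executing its task, and `manager_pick` stands in for the manager LLM's routing decision.

```python
def run_sequential(tasks):
    """Sequential: run tasks in listed order; each sees the prior output."""
    context, results = "", []
    for run_task in tasks:
        context = run_task(context)
        results.append(context)
    return results

def run_hierarchical(tasks, manager_pick):
    """Hierarchical: a manager decides which task runs next."""
    results, pending = [], list(tasks)
    while pending:
        run_task = manager_pick(pending)  # manager LLM chooses the next task
        pending.remove(run_task)
        results.append(run_task(""))
    return results

# Toy usage: research output feeds into the writing step.
notes = run_sequential([
    lambda ctx: "research notes",
    lambda ctx: f"post built on: {ctx}",
])
print(notes[-1])  # post built on: research notes
```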
Sequential (default)
Tasks run in order, each agent completing its task before the next begins:
from crewai import Process

crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    process=Process.sequential,  # default
)

Hierarchical
A manager agent coordinates the other agents:
from crewai import Process

crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    process=Process.hierarchical,
    manager_llm=ChatTensoras(model="llama-3.3-70b"),
)

Next Steps
- LangChain Integration — LangChain components for Tensoras
- LangGraph Integration — stateful agent workflows
- Tool Calling — how tool calling works in Tensoras
- Python SDK — full SDK reference