- Overview
- Why UnisonAI?
- Installation
- Core Components
- Parameter Reference Tables
- Usage Examples
- FAQ
- Contributing And License
```bash
pip install unisonai
```
```python
from unisonai import Single_Agent
from unisonai.llms import Gemini
from unisonai import config

config.set_api_key("gemini", "your-api-key")

agent = Single_Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Assistant",
    description="A helpful AI assistant"
)

print(agent.unleash(task="Explain quantum computing"))
```
UnisonAI stands out with its unique Agent-to-Agent (A2A) communication architecture, enabling seamless coordination between AI agents as if they were human team members collaborating on complex tasks.
```
┌─────────────────────────────────────────────────────────────────────┐
│                 Agent-to-Agent (A2A) Communication                   │
├─────────────────────────────────────────────────────────────────────┤
│ ┌─────────────┐  Message  ┌─────────────┐  Message  ┌─────────────┐ │
│ │   Agent 1   │◄──────────┤   Agent 2   │◄──────────┤   Agent 3   │ │
│ │             │  Channel  │             │  Channel  │             │ │
│ │ • Research  │──────────►│ • Analysis  │──────────►│ • Reporting │ │
│ │ • Planning  │           │ • Synthesis │           │ • Delivery  │ │
│ └─────────────┘           └─────────────┘           └─────────────┘ │
│        │                         │                         │        │
│        ▼                         ▼                         ▼        │
│ ┌─────────────┐           ┌─────────────┐           ┌─────────────┐ │
│ │   Single    │           │    Clan     │           │    Tool     │ │
│ │   Agent     │           │ Management  │           │   System    │ │
│ └─────────────┘           └─────────────┘           └─────────────┘ │
└─────────────────────────────────────────────────────────────────────┘
```
- 🔒 Strong Type Validation: All tool parameters are validated against the `ToolParameterType` enum before execution
- 🛡️ Enhanced Error Handling: Comprehensive error catching with detailed metadata for debugging
- 📊 Standardized Results: All tools return `ToolResult` objects with success status and metadata (sketched below)
- 🔌 MCP Integration: Connect to external tools via Model Context Protocol servers
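To make the standardized result contract concrete, here is a minimal stand-in for illustration only; the field names `success`, `output`, and `metadata` are assumptions rather than the confirmed `ToolResult` API, so consult the Tool System Guide for the real shape.

```python
from dataclasses import dataclass, field
from typing import Any

# Illustrative stand-in only: field names (success, output, metadata)
# are assumed, not the confirmed unisonai ToolResult API.
@dataclass
class ToolResultSketch:
    success: bool
    output: Any = None
    metadata: dict = field(default_factory=dict)

# A failing tool call surfaces its error details through metadata,
# so agents (and humans) can debug without parsing free-form strings.
result = ToolResultSketch(success=False,
                          metadata={"error": "parameter 'a' must be FLOAT"})
if not result.success:
    print("Tool failed:", result.metadata["error"])
```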
| Framework | Single Agent | Multi-Agent | A2A Communication | Type Safety | MCP Support |
|-----------|--------------|-------------|-------------------|-------------|-------------|
| UnisonAI  | ✅ | ✅ | ✅ Native | ✅ | ✅ |
| AutoGen   | ✅ | ✅ | ❌ | ❌ | |
| LangChain | ✅ | ❌ | ❌ | | |
| CrewAI    | ❌ | ✅ | ❌ | ❌ | |

Comparison with popular agent frameworks: UnisonAI leads in A2A communication and type safety.
- Complex Research Tasks: Multiple agents gathering, analyzing, and synthesizing information
- Workflow Automation: Coordinated agents handling multi-step business processes
- Content Creation: Specialized agents for research, writing, editing, and publishing
- Data Analysis: Distributed agents processing large datasets with different expertise
| Component | Purpose | Key Features |
|-----------|---------|--------------|
| `Single_Agent` | Standalone agent for focused tasks | Own history, tool integration, configurable LLMs |
| `Agent` | Clan member for team coordination | Inter-agent messaging, role-based tasks, specialized tools |
| `Clan` | Multi-agent orchestration | Team management, shared goals, coordinated execution |
| Tool System | Extensible capability framework | Type validation, error handling, standardized results |
| MCP Integration | External tool connectivity | MCP server support, protocol translation, service integration |
A single agent using the built-in web search tool:

```python
from unisonai import Single_Agent
from unisonai.llms import Gemini
from unisonai.tools.websearch import WebSearchTool

agent = Single_Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Research Assistant",
    tools=[WebSearchTool]
)
agent.unleash(task="Latest AI trends")
```
A clan of agents coordinating on a shared goal:

```python
from unisonai import Agent, Clan
from unisonai.llms import Gemini

research_agent = Agent(llm=Gemini(), identity="Researcher", task="Gather information")
analysis_agent = Agent(llm=Gemini(), identity="Analyst", task="Analyze findings")

clan = Clan(
    clan_name="Research Team",
    manager=research_agent,
    members=[research_agent, analysis_agent],
    goal="Comprehensive market analysis"
)
clan.unleash()
```
A custom tool with typed, validated parameters:

```python
from unisonai.tools.tool import BaseTool, Field
from unisonai.tools.types import ToolParameterType

class CalculatorTool(BaseTool):
    def __init__(self):
        self.name = "calculator"
        self.description = "Mathematical operations"
        self.params = [
            Field(name="operation", field_type=ToolParameterType.STRING, required=True),
            Field(name="a", field_type=ToolParameterType.FLOAT, required=True),
            Field(name="b", field_type=ToolParameterType.FLOAT, required=True)
        ]
        super().__init__()

    def _run(self, operation: str, a: float, b: float) -> float:
        # Minimal example logic: "add" adds, anything else multiplies
        return a + b if operation == "add" else a * b
```
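Once defined, the custom tool is registered the same way as the built-in `WebSearchTool` above; the identity, description, and task strings below are just placeholders.

```python
from unisonai import Single_Agent
from unisonai.llms import Gemini

# Register the custom tool exactly like the built-in tools
agent = Single_Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Math Assistant",
    description="Performs basic arithmetic with the calculator tool",
    tools=[CalculatorTool]
)
agent.unleash(task="Use the calculator to add 2 and 3")
```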
API keys can be provided in three ways:

```python
from unisonai import config
from unisonai.llms import Gemini

# Method 1: Configuration system
config.set_api_key("gemini", "your-key")
config.set_api_key("openai", "your-key")

# Method 2: Environment variables (set in your shell)
#   export GEMINI_API_KEY="your-key"
#   export OPENAI_API_KEY="your-key"

# Method 3: Direct LLM initialization
llm = Gemini(api_key="your-key")
```
MCP servers are declared in a configuration dictionary:

```python
MCP_CONFIG = {
    "mcpServers": {
        "time": {"command": "uvx", "args": ["mcp-server-time"]},
        "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]}
    }
}
```
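How this dictionary is attached to an agent is covered in the MCP Integration guide; the `mcp_config` keyword in the sketch below is an assumption for illustration, not confirmed API.

```python
from unisonai import Single_Agent
from unisonai.llms import Gemini

# Hypothetical wiring: the mcp_config keyword is assumed for illustration.
# Check the MCP Integration guide for the supported way to attach servers.
agent = Single_Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Time Assistant",
    description="Answers questions using MCP-provided tools",
    mcp_config=MCP_CONFIG
)
agent.unleash(task="What time is it in UTC?")
```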
- Quick Start Guide - 5-minute setup guide
- Installation - Detailed installation options
- API Reference - Complete API documentation
- Architecture Guide - System design and patterns
- Usage Guidelines - Best practices and patterns
- Tool System Guide - Custom tool creation and validation
- MCP Integration - External tool integration
- Parameter Reference - Complete parameter documentation
- Basic Examples - Simple agent patterns
- Advanced Examples - Multi-agent coordination
- Tool Examples - Custom tool implementations
- MCP Examples - External tool integration
**What is UnisonAI?**
A Python framework for building and orchestrating AI agents with A2A communication.

**When should I use a Clan?**
For complex, multi-step tasks requiring specialized agents working together.

**Can I add custom LLMs?**
Yes! Extend the `BaseLLM` class to integrate any model provider.
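A minimal sketch of such a subclass; the import path and the `run` method name are assumptions here, so check the API Reference for the actual abstract interface.

```python
from unisonai.llms import BaseLLM  # import path assumed; see API Reference

class EchoLLM(BaseLLM):
    """Toy provider that simply echoes the prompt (illustration only)."""

    def run(self, prompt: str) -> str:  # method name assumed
        # Swap this for a call into your provider's SDK
        return f"echo: {prompt}"
```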
**What are tools?**
Reusable components that extend agent capabilities (web search, APIs, custom logic).

**How do I manage API keys?**
Use the config system, environment variables, or pass keys directly to LLMs.

PRs and issues welcome! See our Contributing Guide.
Open Issues • Submit PRs • Suggest Features