# 🔧 Feature Request: MCP Server Tools Integration for Enhanced Text Generation
## 📋 Problem Statement
Currently, the Obsidian TARS plugin generates text using AI services based on tag suggestions, but it operates in isolation from external data sources. This creates several limitations:
- Static Content Generation: Generated text cannot include real-time data from external APIs, databases, or services
- Manual Data Retrieval: Users must manually gather information from Jira, Confluence, web pages, etc., before generating content
- Limited Context Awareness: AI models cannot access current project status, recent documentation, or live system data
- Repetitive Research Tasks: Common information gathering tasks (checking ticket status, fetching documentation) must be done manually each time
These limitations keep TARS from becoming a true knowledge-synthesis tool that can create contextually rich, up-to-date content.
## 💡 Use Cases & Business Value
### Primary Use Cases
1. Project Management & Status Updates
- Generate project status reports that automatically include current Jira ticket statuses
- Create meeting notes with real-time sprint progress and blocker information
- Update documentation with latest deployment status from CI/CD systems
2. Research & Documentation
- Extract and summarize content from web pages while taking notes
- Pull code examples and documentation from GitHub/Azure DevOps repositories
- Generate research summaries with data from multiple online sources
3. Personal Productivity
- Create daily standup notes that include calendar events and task status
- Generate contextualized meeting notes with attendee information and previous decisions
- Synthesize information from emails, documents, and external systems
### Business Value
- Time Savings: Reduce manual data gathering from hours to seconds
- Accuracy: Always include the most current information in generated content
- Consistency: Standardize how external data is incorporated into notes
- Discoverability: Surface relevant information users might have missed
## 🎯 Proposed Solution: PoC with Docker-based MCP Servers
### Core Concept
Integrate Model Context Protocol (MCP) servers to provide AI models with access to external tools and data sources during text generation.
### PoC Implementation Approach
The simplest viable implementation for the proof of concept:
- MCP Server Deployment: Run MCP servers as Docker containers
- Connection Bridge: Use the `mcp-remote` package to connect the plugin to containers
- Tag-Triggered Tools: Map Obsidian note tags to relevant MCP tools
- Enhanced Generation: Include tool results in AI context for richer content
### PoC Architecture
```mermaid
graph TB
    A[User Types #project/alpha] --> B[TARS Plugin]
    B --> C[Tag Analysis]
    C --> D[MCP Manager]
    D --> E[mcp-remote Client]
    E --> F[Docker Container: Jira MCP]
    F --> G[Jira API]
    G --> F
    F --> E
    E --> D
    D --> H[AI Service with Tool Data]
    H --> I[Generated Content with Live Data]
```
### MVP Feature Set
- ✅ Connect to dockerized MCP servers
- ✅ Basic web scraping MCP server
- ✅ Jira integration MCP server
- ✅ Tag-based tool triggering
- ✅ Enhanced text generation with external data
## 🛠️ Implementation Plan
### Phase 1: Foundation Setup
- 1.1 Add the `mcp-remote` dependency to the plugin
- 1.2 Create an `MCPManager` class for connection management
- 1.3 Implement Docker container health checking
- 1.4 Add MCP servers configuration section to settings UI
- 1.5 Create encrypted credential storage system
- 1.6 Test connection to simple echo MCP server
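The encrypted credential storage in 1.5 could be sketched with Node's built-in `crypto` module. The AES-256-GCM scheme, scrypt parameters, and function names below are illustrative assumptions, not the plugin's actual design:

```typescript
// Illustrative credential encryption for step 1.5 (scheme is an assumption):
// AES-256-GCM with a key derived from a vault-local passphrase via scrypt.
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "crypto";

const ALGO = "aes-256-gcm";

function deriveKey(passphrase: string, salt: Buffer): Buffer {
  return scryptSync(passphrase, salt, 32); // 32 bytes = AES-256 key
}

function encryptCredential(plaintext: string, passphrase: string): string {
  const salt = randomBytes(16);
  const iv = randomBytes(12); // 96-bit IV, recommended for GCM
  const cipher = createCipheriv(ALGO, deriveKey(passphrase, salt), iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Store as salt:iv:tag:ciphertext, base64-joined, so it fits in plugin settings
  return [salt, iv, tag, ciphertext].map((b) => b.toString("base64")).join(":");
}

function decryptCredential(stored: string, passphrase: string): string {
  const [salt, iv, tag, ciphertext] = stored.split(":").map((s) => Buffer.from(s, "base64"));
  const decipher = createDecipheriv(ALGO, deriveKey(passphrase, salt), iv);
  decipher.setAuthTag(tag); // GCM authenticates; tampering or a wrong key throws
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```

A wrong passphrase fails the GCM authentication check rather than returning garbage, which gives the settings UI a clean error to surface.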
### Phase 2: Basic Tool Integration
- 2.1 Implement tool discovery from connected MCP servers
- 2.2 Create tool invocation interface with parameter validation
- 2.3 Add error handling and retry logic for tool calls
- 2.4 Implement basic caching for tool results
- 2.5 Test with web scraping MCP server
- 2.6 Verify tool data integration in text generation
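For the basic caching in 2.4, a minimal TTL cache might suffice; the class name and the injectable clock below are our assumptions, chosen to keep the sketch testable:

```typescript
// Minimal TTL cache sketch for tool results (step 2.4); not the plugin's actual API.
class ToolResultCache<T> {
  private store = new Map<string, { value: T; expiresAt: number }>();

  // The clock is injectable so tests can control expiry deterministically.
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() > entry.expiresAt) {
      this.store.delete(key); // lazily evict stale entries on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}
```

Keys could combine server id and tool arguments (e.g. `jira:PROJ-1`) so repeated tag triggers within the TTL skip the round trip.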
### Phase 3: Tag-Driven Automation
- 3.1 Create tag pattern recognition system
- 3.2 Implement tag-to-tool mapping configuration
- 3.3 Add automatic tool selection based on note tags
- 3.4 Create smart parameter inference from tag context
- 3.5 Test with multiple tag patterns and tools
- 3.6 Validate end-to-end workflow
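The tag-to-tool mapping in 3.2 could look roughly like this, with `#project/*`-style globs compiled to regexes; the class and config shape are hypothetical:

```typescript
// Hypothetical tag-pattern → tool mapping for step 3.2.
interface ToolConfig {
  serverId: string;
  toolName: string;
}

class SimpleTagToolMapper {
  private mappings: Array<{ pattern: RegExp; config: ToolConfig }> = [];

  // Register a mapping; "#project/*" globs become regexes matching one path segment.
  mapTagToTool(tagPattern: string, config: ToolConfig): void {
    const escaped = tagPattern
      .replace(/[.+?^${}()|[\]\\]/g, "\\$&") // escape regex specials except *
      .replace(/\*/g, "[^/]+"); // * matches a single tag segment
    this.mappings.push({ pattern: new RegExp(`^${escaped}$`), config });
  }

  // Return the tool configs whose pattern matches any of the note's tags.
  getToolsForTags(tags: string[]): ToolConfig[] {
    const results: ToolConfig[] = [];
    for (const { pattern, config } of this.mappings) {
      if (tags.some((tag) => pattern.test(tag))) results.push(config);
    }
    return results;
  }
}
```

The segment-scoped `*` means `#project/*` matches `#project/alpha` but not `#project/alpha/notes`, which keeps tool triggering predictable.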
### Phase 4: AI Service Integration
- 4.1 Implement tool calling for Claude/OpenAI (native support)
- 4.2 Create pre-fetch strategy for non-tool-calling models
- 4.3 Add tool result formatting for different AI providers
- 4.4 Implement graceful fallback when tools unavailable
- 4.5 Test with all supported AI services
- 4.6 Optimize for performance and user experience
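The pre-fetch strategy in 4.2 might serialize tool results directly into the prompt for providers without native tool calling; the exact format below is an assumption:

```typescript
// Sketch of the pre-fetch fallback (4.2): tool results become prompt context
// for models without native tool calling. The layout is illustrative only.
interface ToolResult {
  toolName: string;
  data: unknown;
}

function buildAugmentedPrompt(userPrompt: string, results: ToolResult[]): string {
  // Graceful fallback (4.4): with no tool data, pass the prompt through unchanged.
  if (results.length === 0) return userPrompt;
  const context = results
    .map((r) => `### ${r.toolName}\n${JSON.stringify(r.data, null, 2)}`)
    .join("\n\n");
  return `External data fetched for this note:\n\n${context}\n\n---\n\n${userPrompt}`;
}
```

For Claude/OpenAI with native tool calling (4.1), the same `ToolResult` objects would instead be passed through the provider's tool-message channel rather than inlined.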
### Phase 5: Production Readiness
- 5.1 Add comprehensive error handling and user feedback
- 5.2 Implement performance monitoring and logging
- 5.3 Create user documentation and examples
- 5.4 Add security audit and input validation
- 5.5 Perform load testing with multiple concurrent users
- 5.6 Package and prepare for release
## 📚 Technical Resources & References
### Core Dependencies
```json
{
  "mcp-remote": "^1.0.0",
  "@modelcontextprotocol/sdk": "^1.0.0",
  "dockerode": "^3.3.4"
}
```
Sample MCP Server Configuration
# docker-compose.yml for PoC MCP servers
version: '3.8'
services:
web-scraper:
image: mcp-servers/web-scraper:latest
ports: ["3001:3001"]
environment:
- MAX_REQUESTS=10
jira-connector:
image: mcp-servers/jira:latest
ports: ["3002:3002"]
environment:
- JIRA_URL=${JIRA_URL}
- JIRA_TOKEN=${JIRA_TOKEN}
### Key Implementation Interfaces
```typescript
interface MCPManager {
  connectToServer(config: MCPServerConfig): Promise<MCPClient>;
  discoverTools(): Promise<Tool[]>;
  invokeToolsForTags(tags: string[]): Promise<ToolResult[]>;
}

interface TagToolMapper {
  getToolsForTags(tags: string[]): Promise<Tool[]>;
  mapTagToTool(tagPattern: string, toolConfig: ToolConfig): void;
}
```
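As a smoke test of the `MCPManager` shape (per 1.6's echo-server check), an in-memory stub with a single echo tool might look like this; everything besides the two interface methods is illustrative:

```typescript
// In-memory stand-in for an MCP manager, useful before real servers are wired in.
// All names here are illustrative; only the method shapes mirror the interface above.
interface Tool {
  name: string;
  serverId: string;
}

interface ToolResult {
  tool: string;
  output: string;
}

class InMemoryMCPManager {
  private tools: Tool[] = [{ name: "echo", serverId: "local" }];
  private tagMap = new Map<string, string>([["#test", "echo"]]);

  async discoverTools(): Promise<Tool[]> {
    return this.tools;
  }

  // Invoke the mapped tool for each recognized tag; unmapped tags are skipped.
  async invokeToolsForTags(tags: string[]): Promise<ToolResult[]> {
    const results: ToolResult[] = [];
    for (const tag of tags) {
      const toolName = this.tagMap.get(tag);
      if (toolName) results.push({ tool: toolName, output: `echo:${tag}` });
    }
    return results;
  }
}
```

Unit tests can then exercise tag-triggered generation end to end without Docker or network access.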
### Data Flow Documentation
```mermaid
sequenceDiagram
    participant U as User
    participant T as TARS
    participant M as MCPManager
    participant D as Docker MCP
    participant A as AI Service
    U->>T: Write note with #project/alpha
    T->>M: Get tools for tags
    M->>D: Invoke project tools
    D->>M: Return project data
    M->>T: Formatted tool results
    T->>A: Generate with context + tools
    A->>T: Enhanced content
    T->>U: Note with live project data
```
## 🧪 Testing Strategy
### Unit Tests
- MCP connection establishment and health checks
- Tool discovery and parameter validation
- Tag pattern matching and tool selection
- Error handling and retry mechanisms
- Credential encryption/decryption
### Integration Tests
- End-to-end workflow with Docker-based MCP servers
- Multi-tool orchestration scenarios
- AI service compatibility across providers
- Performance under concurrent usage
- Graceful degradation when services unavailable
### User Acceptance Tests
- Complete user workflow from tag entry to enhanced content
- Settings configuration and server management
- Error scenarios and recovery procedures
- Performance meets user expectations (< 5s tool execution)
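The 5-second expectation could be enforced with a generic timeout guard wrapped around every tool invocation; this helper is a sketch of ours, not existing plugin code:

```typescript
// Timeout guard sketch: races a tool call against a deadline so a slow MCP
// server cannot stall text generation past the budget (e.g. 5000 ms).
async function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  try {
    return await Promise.race([promise, deadline]);
  } finally {
    clearTimeout(timer!); // avoid a dangling timer when the call wins the race
  }
}
```

On timeout, the caller can fall back to generating without tool data (graceful degradation) and surface a clear error to the user.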
## ✅ Acceptance Criteria
### Core Functionality
- MCP Connection: Successfully connect to dockerized MCP servers with health monitoring
- Tool Discovery: Automatically discover and list available tools from connected servers
- Tag Integration: Recognize note tags and trigger appropriate MCP tools automatically
- Enhanced Generation: Generate text that includes live data from external sources
- Error Handling: Gracefully handle server failures with clear user feedback
### Performance Requirements
- Response Time: Tool execution completes within 5 seconds for 95% of requests
- Reliability: 95%+ success rate for tool invocations under normal conditions
- Resource Usage: Plugin memory increase stays below 100MB during operation
- Startup Impact: Plugin load time increases by less than 2 seconds
### User Experience
- Configuration: Intuitive settings UI for adding and managing MCP servers
- Feedback: Clear loading indicators and error messages during tool execution
- Documentation: Complete setup guide and common use case examples
- Compatibility: Works with all existing AI service providers in TARS
### Security & Reliability
- Credential Security: All API keys and tokens encrypted at rest
- Input Validation: Proper sanitization of external data before AI processing
- Graceful Degradation: Plugin continues working when MCP servers unavailable
- Audit Trail: Logging of tool usage for debugging and monitoring
## 🔄 Success Metrics
### Short-term (30 days post-release)
- 70%+ of active users enable MCP integration
- Average tool usage: 5+ invocations per user per day
- User satisfaction score: 4.0+ out of 5.0
### Long-term (90 days post-release)
- 50%+ reduction in time spent on manual data gathering
- 30%+ increase in note creation frequency
- Community contribution of 3+ new MCP server types
Ready for Implementation: This ticket provides sufficient detail for a developer to implement the MCP integration feature following the phased approach outlined above.