AI Talks

📚 About the Project

AI Talks is an application that facilitates conversations between two language models. It relays prompts between the two models to keep the exchange continuous, while each model receives a specialized prompt guiding it to maintain the dialogue and intentionally drift into digressions, which makes conversations engaging and varied. The application also supports context management and the injection of external content into the conversation.


🚀 Getting Started

Follow these steps to get the project up and running locally:

1. Clone the Repository

git clone https://github.com/Mathsqrt2/AI_Talks.git
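After cloning, change into the project directory before continuing (the directory name below is the default created by git clone):

# Enter the project directory created by git clone
cd AI_Talks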

2. Environment Variable Configuration

The application ships with default settings that cover basic functionality. However, if you want to access conversations externally through Telegram, configure these optional variables in a .env file:

TOKEN1=bot1_telegram_token       # Optional Telegram token for the first bot
TOKEN2=bot2_telegram_token       # Optional Telegram token for the second bot

PUBLIC_TELEGRAM_CHAT_ID=chat_id  # Optional public Telegram chat ID
GROUP_CHAT_ID=group_chat_id      # Optional Telegram group chat ID

Note: These values are optional, but without them the application will not send any messages to Telegram. Telegram integration requires at least one bot and one text channel: you can configure either one bot with one channel or two bots with two channels. You can obtain these values from BotFather.
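For reference, a filled-in .env for the two-bot, two-channel setup could look like the example below; every value is a placeholder, and the real tokens and chat IDs come from BotFather and your Telegram chats:

# Example .env with placeholder values (replace with real credentials)
TOKEN1=123456001:AAExampleTokenForFirstBot
TOKEN2=123456002:AAExampleTokenForSecondBot

PUBLIC_TELEGRAM_CHAT_ID=-1001234567890
GROUP_CHAT_ID=-1009876543210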

3. Build Docker Container

Make sure Docker is installed, then run:

# Application build with proxy (default)
docker compose --profile prod up -d -V --build

# Application build if ports 80 and 443 are already in use
docker compose --profile prod-native up -d -V --build

⚠ Disclaimer: This application includes a container running a reverse proxy, which requires ports 80 and 443 to be available.
If you already have another service running on these ports, you can still access the containers directly:

  • API → port 13000
  • Adminer → port 18080
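To verify that the stack started correctly, you can list the project's containers and, if the proxy ports were unavailable, call the API on its directly exposed port. This assumes the API container serves the same routes on port 13000 as it does behind the proxy; the /init/1 route is the one used in step 4:

# List the running containers of this compose project
docker compose ps

# Call the API directly on its exposed port, bypassing the reverse proxy
curl -X POST http://localhost:13000/init/1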

4. Using the Application

Send an HTTP request using tools like Postman or Insomnia:

POST http://localhost/init/1
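If you prefer the command line over Postman or Insomnia, the same request can be sent with curl:

# Start conversation number 1 through the reverse proxy
curl -X POST http://localhost/init/1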

📄 Documentation

All available functionalities and system behavior are described in the Swagger documentation. It provides a user-friendly interface to explore and test API endpoints directly from the browser. You can check available routes, required parameters, and expected responses. The documentation is automatically generated and accessible at:

GET http://localhost/api
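A quick, browser-free way to check that the documentation is being served is to request only the response headers; a 200 status indicates the Swagger UI is reachable:

# Expect an HTTP 200 response if the Swagger docs are up
curl -I http://localhost/api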

Access to the application’s database is available at the address below, using the credentials specified in the .env file or in docker-compose.yml:

GET http://localhost/adminer

Access to the application interface, which allows you to manage the conversation through buttons and view it in real time, is available at the address below:

GET http://localhost/ui

📄 Features

  1. Managing Four Services
    The application runs four essential services in Docker containers: a MySQL database, an Ollama server (for handling language models), Adminer (for viewing and managing the database), and a NestJS application, which serves as the core backend system.

  2. Remote Conversation Control
    The application exposes a REST API controller that allows remote management of conversations. Users can start, pause, resume, or terminate conversations at any time by sending the appropriate HTTP requests (see the example sketch after this list).

  3. Settings Management
    The application provides a dedicated controller for managing various settings, including context length, bot names, message logging, Telegram message display, and other configurable parameters.

  4. Event-Driven Architecture
    The system operates based on events emitted within a single service. It broadcasts a "message" event, dynamically switches context, and ensures that the conversation remains coherent and uninterrupted.

  5. State Archiving and Recovery
    The application stores all its states in the database, allowing it to restore the last conversation state in case of a system failure or unexpected shutdown.

  6. Error Handling and Retry Mechanism
    If message generation or delivery fails, the application will attempt a predefined number of retries as specified in the settings. If all retries fail, it will save its state and gracefully terminate the conversation.

  7. Interactive User Interface
    The application offers an intuitive interface for managing the system and monitoring the current state of ongoing conversations. It leverages WebSockets to deliver real-time updates, ensuring that users can seamlessly interact with and observe the conversation as it unfolds.
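As a rough illustration of the remote conversation control described in point 2, the requests could look like the sketch below. Only the /init/1 route is confirmed in this README; the pause, resume, and break paths are hypothetical placeholders, so check the Swagger documentation at /api for the actual endpoints and parameters:

# Start a conversation (route shown earlier in this README)
curl -X POST http://localhost/init/1

# Hypothetical control routes - verify the real paths in the Swagger docs
curl -X POST http://localhost/pause
curl -X POST http://localhost/resume
curl -X POST http://localhost/break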


🛠️ Tech Stack

  • NestJS – Node.js framework for scalable backend applications.
  • TypeScript – Strongly-typed superset of JavaScript.
  • MySQL – Popular open-source relational database.
  • TelegramBot API – API for integrating Telegram bot functionality.
  • TypeORM – Powerful ORM for database interactions in Node.js applications.
  • Docker – Tool for containerizing applications and their dependencies.
  • WebSockets – Enables real-time, bidirectional communication between the client and the server.
  • Vue – Progressive JavaScript framework for building interactive user interfaces.
  • Tailwind CSS – Utility-first CSS framework for rapidly building modern and responsive user interfaces.

📌 License

This project is available under the MIT License. For more details, see the LICENSE file.
