This repository contains the code for the StatGPT backend, which implements the APIs and main logic of the StatGPT application.
More information about StatGPT and its architecture can be found in the documentation repository.
The application is written in Python 3.11 and uses the following main technologies:
| Technology | Purpose |
|---|---|
| AI DIAL SDK | SDK for building applications on top of AI DIAL platform |
| FastAPI | Web framework for API development |
| SQLAlchemy | ORM for database operations |
| LangChain | LLM application framework |
| Pydantic | Data validation and settings |
| sdmx1 | SDMX data handling and provider connections |
- `src/admin_portal` — backend of the administrator portal, which allows the user to add and update data.
- `src/common` — common code used in the `admin_portal` and `statgpt` applications.
- `src/statgpt` — main application that generates responses using LLMs, based on data prepared by `admin_portal`.
- `tests` — unit and integration tests.
- `docker` — Dockerfiles for building Docker images.
The applications are configured using environment variables, which are described in the following files:
- Common environment variables - used in both applications
- Admin Backend environment variables
- Chat Backend environment variables
1. Install Make
   - MacOS - should already be installed
   - Windows
   - Windows, using Chocolatey
   - Make sure that `make` is in the PATH (run `which make`).
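   For example, a quick way to confirm the installation (standard `make` commands, nothing project-specific):

   ```shell
   # Check that make is resolvable from the current shell
   which make

   # Print the installed make version
   make --version
   ```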
2. Install Python 3.11

   Direct installation:
   - MacOS, using Homebrew - `brew install python@3.11`
   - Windows or MacOS, using official repository
   - Windows, using Chocolatey
   - Make sure that `python3` or `python3.11` is in the PATH and works properly (run `python3.11 --version`).

   Alternative: use pyenv. `pyenv` allows you to manage different Python versions on the same machine.
   - Execute the following from the repository root folder:

     ```shell
     pyenv install 3.11
     pyenv local 3.11  # use Python 3.11 for the current project
     ```
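   If you went the pyenv route, a minimal sketch to confirm the pinned version is picked up (standard pyenv/python commands, nothing project-specific):

   ```shell
   # Show which Python version pyenv resolves in this directory
   # (should report 3.11.x, coming from the local .python-version file)
   pyenv version

   # Confirm the interpreter itself
   python --version
   ```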
3. Install Poetry
   Recommended way - system-wide, independent of any particular Python venv:
   - MacOS - the recommended way to install poetry is to use pipx
   - Windows - the recommended way to install poetry is to use the official installer
   - Make sure that `poetry` is in the PATH and works properly (run `poetry --version`).

   Alternative - venv-specific (using `pip`):
   - make sure the correct Python venv is activated
   - run `make install_poetry`
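   As an illustration of the system-wide route, a pipx-based install might look like the sketch below (it assumes pipx itself is already installed; `make install_poetry` remains the venv-specific alternative):

   ```shell
   # Install poetry into an isolated, pipx-managed environment
   pipx install poetry

   # Verify that poetry is on the PATH
   poetry --version
   ```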
4. Install Docker

   Since Docker Desktop requires a paid license for commercial use, you can use one of the following alternatives:
   - Docker Engine and Docker Compose on Linux
   - Rancher Desktop on Windows or MacOS
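   Whichever runtime you pick, it should expose the regular `docker` and `docker compose` CLIs; a quick sanity check:

   ```shell
   # Both commands should succeed regardless of the chosen runtime
   docker --version
   docker compose version

   # Optional: run a throwaway container to confirm the daemon works
   docker run --rm hello-world
   ```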
Create a Python virtual environment using poetry:

```shell
make init_venv
```

If you see the following error: `Skipping virtualenv creation, as specified in config file.`, it means the venv was not created because poetry is configured not to create a new virtual environment. You can fix this:
- either by updating the poetry config:
  - `poetry config --local virtualenvs.create true` (local config)
  - or `poetry config virtualenvs.create true` (global config)
- or by creating the venv manually: `python -m venv .venv`

Then activate the venv. For Mac / Linux:

```shell
source .venv/bin/activate
```

For Windows:

```shell
.venv/Scripts/Activate
```
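Before installing dependencies, it can help to confirm that the activated interpreter is the one from `.venv` (a generic check, not project-specific):

```shell
# The reported prefix should point inside the repository's .venv directory
python -c "import sys; print(sys.prefix)"

# The version should be 3.11.x
python --version
```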
The following will install basic and dev dependencies:
make install_dev
You can copy the template file and fill in the secret values manually:
cp .env.template .env
The Environment variables section provides links to pages with detailed information about environment variables.
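When the template changes over time, one purely illustrative way to spot variables that exist in `.env.template` but are missing from your local `.env` (requires bash or zsh for process substitution):

```shell
# Compare variable names only (everything before the first '='),
# ignoring comments and the secret values themselves
diff <(grep -v '^#' .env.template | cut -d= -f1 | sort) \
     <(grep -v '^#' .env | cut -d= -f1 | sort)
```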
make generate_dial_config
- Run the DIAL using docker compose: `docker compose up -d` (a quick status check is sketched after this list)
- Apply `alembic` migrations:
  - locally: `make db_migrate`
  - or using Docker:
    - Set `ADMIN_MODE=ALEMBIC_UPGRADE` in the `.env` file
    - Run `admin_portal` from `docker-compose.yml`
- Run Admin backend (if you want to initialize or update data): `make run_admin`
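Once the steps above are done, you can sanity-check the stack with standard docker compose commands; a sketch, assuming the `admin_portal` service name from `docker-compose.yml` referenced above:

```shell
# List services and their current state
docker compose ps

# Tail the logs of a single service, e.g. the admin backend
docker compose logs -f admin_portal
```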
make format
make lint
To automatically apply black and isort on each commit, enable pre-commit hooks:
make install_pre_commit_hooks
This command will set up the git hook scripts.
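If the hooks are managed with the pre-commit framework (the wording "git hook scripts" suggests this, but treat it as an assumption), you can also run them against the whole codebase on demand:

```shell
# Run all configured hooks once, without making a commit
pre-commit run --all-files
```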
(!) It is critical to note that autogenerate is not intended to be perfect. It is always necessary to manually review and correct the candidate migrations that autogenerate produces.
(!) After creating a new migration, it is necessary to update `ALEMBIC_TARGET_VERSION` in the `src/common/config/version.py` file to the new version.
To autogenerate a new migration, run:
make db_autogenerate MESSAGE="Your message"
or:
alembic -c src/alembic.ini revision --autogenerate -m "Your message"
To downgrade the database, run:
make db_downgrade
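The make targets wrap alembic, so for inspection or manual control you can also call the alembic CLI directly with the same config file used above (a sketch; the exact behaviour of the make targets is defined in the Makefile):

```shell
# Show the revision currently applied to the database
alembic -c src/alembic.ini current

# Show the migration history
alembic -c src/alembic.ini history

# Step one revision back
alembic -c src/alembic.ini downgrade -1
```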
To update localization files, run:
make extract_messages # Extract messages to be translated
make update_messages # Update existing translation files with new messages
Check the *.po files for new messages and provide translations.
Then compile translations:
make compile_messages
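To see which entries still need translation, you can search the catalog files for empty `msgstr` values (a generic gettext/.po check, not a project-specific tool):

```shell
# List potentially untranslated entries across all .po files in the repository;
# note: multi-line translations also start with msgstr "", so review matches manually
grep -rn --include='*.po' 'msgstr ""' .
```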
- Integration tests require a running test database and Elasticsearch. They are part of the `docker-compose.yml` file. The Docker containers with this database/Elasticsearch don't have volumes to store data, so they are always fresh after `docker compose down`.
- To run integration tests, uncomment the `vectordb-test` and `elasticsearch-test` containers in the `docker-compose.yml` file (see the sketch after this list). You might also need to comment out the `elasticsearch` container if your machine doesn't have enough resources.
- To run end-to-end tests, first run StatGPT locally. This step is not required for other tests.
- Run tests:
  - all tests except for end-to-end (unit and integration): `make test`
  - only unit tests: `make test_unit`
  - only integration tests: `make test_integration`
  - just end-to-end tests: `make test_e2e`
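Once the test containers are uncommented, they can be started explicitly by name before running the integration suite; a sketch using the service names mentioned above:

```shell
# Start only the services needed by the integration tests
docker compose up -d vectordb-test elasticsearch-test

# Then run the integration tests
make test_integration
```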