jbsoftware-io/gen-ai-starter

Generative AI Starter

Purpose

This repository is a starting point for building custom LLM applications with open-source tooling and models. It incorporates Ollama, Open WebUI, LangChain, Streamlit, Chroma, and PGVector, using Docker to containerize the application and Docker Compose to run the various service dependencies.

Prerequisites

  • Docker Engine installed
    • Run Option 1
      • Docker Engine configured with >= 12 GB of memory
    • Run Option 2
      • Docker Engine configured with >= 8 GB of memory
      • Ollama executable installed:

        brew install ollama

Optional Prerequisites (required for the "Web" example)

  • Register for a free Brave Search API key here.
    • The free tier allows 2,000 calls per month; paid tiers are affordable if you need to scale.
    • To learn more about the Brave Search API, click here.
  • Create a .env file in the root of the project and add your key as follows:

    BRAVE_SEARCH_API_KEY={yourKeyHere}
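As a rough sketch of what that file does: a .env file holds simple KEY=value lines that a loader (for example python-dotenv, or Docker Compose's env_file support -- which loader this project uses is an assumption here) turns into environment variables. The parsing below is illustrative only:

```python
# Minimal, illustrative sketch of how one KEY=value line from .env
# is interpreted; a real loader such as python-dotenv also handles
# comments, quoting, and export prefixes.
line = "BRAVE_SEARCH_API_KEY={yourKeyHere}"
key, _, value = line.partition("=")
env = {key.strip(): value.strip()}
```

The application code can then read the key with os.environ.get("BRAVE_SEARCH_API_KEY") once the variable is loaded.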

Running the Backing Services and LLM

Option 1 (easiest but slower; CPU only)

docker compose --profile=cpu up -d

Option 2 (fastest; uses the GPU)

docker compose up -d
./etc/ollama_entrypoint.sh
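The containers can take a while to come up, especially on first run while models are pulled. One way to check readiness programmatically is to poll the Ollama API; a hedged sketch (port 11434 is Ollama's documented default -- adjust the URL if your compose file maps it differently):

```python
import json
import urllib.error
import urllib.request

def ollama_ready(base_url: str = "http://localhost:11434") -> bool:
    """Return True once GET {base_url}/api/tags answers with valid JSON."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            json.load(resp)  # any valid JSON body counts as healthy here
        return True
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

Calling ollama_ready() in a short retry loop before launching the demo app avoids confusing connection errors while the services are still starting.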

Access the Demo App

http://localhost:8501/

Open WebUI and Ollama Links

To check out the Open WebUI interface (for manual chats and more), go here and sign up for an admin account.

Open WebUI:

http://localhost:3000/

List Ollama Models:

curl http://localhost:11434/api/tags

Ollama API Docs: https://github.com/ollama/ollama/blob/main/docs/api.md#api
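Per the linked API docs, /api/tags returns a JSON object with a models array. A small sketch of extracting model names from such a response (the sample payload below is abbreviated and illustrative, not a real server response):

```python
import json

# Abbreviated, illustrative /api/tags payload; real responses include
# more fields per model (digest, modified_at, details, ...).
sample = '{"models": [{"name": "llama3:latest", "size": 4661224676}]}'
models = json.loads(sample)["models"]
names = [m["name"] for m in models]
```

This is handy for verifying that the model your app expects has actually been pulled into the Ollama container.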

Tutorials

Tutorial Zero - The Prerequisites

Tutorial One - The Basics

Tutorial Two - Vectorization and Retrievers

Tutorial Three - Dynamic Web Content

Further Reading
