For further reference, please consider the following sections:
- Official Apache Maven documentation
- Spring Boot Maven Plugin Reference Guide
- Create an OCI image
- GraalVM Native Image Support
- Spring Boot DevTools
- Spring Configuration Processor
- Spring Web
- Ollama
- OpenAI
The following guides illustrate how to use some features concretely:
- Building a RESTful Web Service
- Serving Web Content with Spring MVC
- Building REST services with Spring
This project has been configured to let you generate either a lightweight container or a native executable. It is also possible to run your tests in a native image.
If you're already familiar with Spring Boot container image support, this is the easiest way to get started. Docker should be installed and configured on your machine before you create the image.
Make sure to set the required API keys as environment variables:
export OPENAI_API_KEY="your-openai-api-key"
export GEMINI_API_KEY="your-gemini-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
export MISTRAL_API_KEY="your-mistral-api-key"
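A small guard like the following (an illustrative helper, not part of the project) fails fast when any of the keys above is missing, instead of letting the application fail at the first LLM call:

```shell
#!/bin/sh
# Illustrative helper (not part of the project): verify that the
# required API keys are present before starting the application.
check_keys() {
  missing=""
  for key in OPENAI_API_KEY GEMINI_API_KEY ANTHROPIC_API_KEY MISTRAL_API_KEY; do
    # Indirect expansion via eval keeps this POSIX-sh compatible.
    eval "value=\${$key:-}"
    [ -n "$value" ] || missing="$missing $key"
  done
  if [ -n "$missing" ]; then
    echo "Missing environment variables:$missing" >&2
    return 1
  fi
  return 0
}
```

For example, `check_keys && ./mvnw spring-boot:run` refuses to start the application when a key is unset.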
To create the image, run the following goal:
$ ./mvnw spring-boot:build-image -Pnative
Then, you can run the app like any other container:
$ docker run --rm -p 8080:8080 multillm-integration:0.0.1-SNAPSHOT
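Note that variables exported on the host are not automatically visible inside the container; pass them through with `-e` (image name and port mapping as in the command above):

```shell
# -e VAR with no value forwards the variable from the host environment
docker run --rm -p 8080:8080 \
  -e OPENAI_API_KEY -e GEMINI_API_KEY \
  -e ANTHROPIC_API_KEY -e MISTRAL_API_KEY \
  multillm-integration:0.0.1-SNAPSHOT
```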
Choose this approach if you want to explore more options, such as running your tests in a native image.
The GraalVM native-image compiler should be installed and configured on your machine.
NOTE: GraalVM 22.3+ is required.
To create the executable, run the following goal:
$ ./mvnw native:compile -Pnative
Then, you can run the app as follows:
$ target/multillm-integration
You can also run your existing test suite in a native image. This is an efficient way to validate the compatibility of your application.
To run your existing tests in a native image, run the following goal:
$ ./mvnw test -PnativeTest
Due to Maven's design, elements are inherited from the parent POM to the project POM. While most of the inheritance is fine, it also inherits unwanted elements like <license> and <developers> from the parent.
To prevent this, the project POM contains empty overrides for these elements.
If you manually switch to a different parent and actually want the inheritance, you need to remove those overrides.
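The empty overrides mentioned above typically look like this in the generated pom.xml (shown for illustration; check your own POM for the exact set of elements):

```xml
<!-- Empty overrides to suppress unwanted inheritance from the parent POM -->
<licenses>
    <license/>
</licenses>
<developers>
    <developer/>
</developers>
```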
http GET localhost:8100/chat \
message=="What model are you? Please provide your name, version, and key capabilities." \
llm==openai
Expected Sample Response
{
"llm": "openai",
"originalMessage": "What model are you? Please provide your name, version, and key capabilities.",
"response": "I am ChatGPT, based on the GPT-4 architecture developed by OpenAI. My version includes improvements in understanding and generating human-like text, enabling me to assist with a wide range of tasks such as answering questions, providing explanations, composing creative writing, and more. I can understand context, handle complex prompts, and generate coherent and relevant responses across various topics.",
"timestamp": 1753268746222
}
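The examples use HTTPie's `param==value` syntax for query parameters. If HTTPie isn't installed, an equivalent curl request (same URL and parameters as the example above) would be:

```shell
# -G sends the URL-encoded data as query parameters on a GET request
curl -G 'http://localhost:8100/chat' \
  --data-urlencode 'message=What model are you? Please provide your name, version, and key capabilities.' \
  --data-urlencode 'llm=openai'
```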
http GET localhost:8100/chat \
message=="What model are you? Please tell me your name, version, and what you're good at." \
llm==ollama
Expected Sample Response
{
"llm": "ollama",
"originalMessage": "What model are you? Please tell me your name, version, and what you're good at.",
"response": " I am a model of the Chat Model developed by Mistral AI. My primary function is to assist with various tasks by providing information, answering questions, and engaging in conversation. I strive to provide precise, helpful, and courteous responses.\n\nWhile I don't have a personal name, you can think of me as your digital assistant designed to make your interactions more enjoyable and productive. My capabilities include but are not limited to: answering questions, providing explanations, discussing a wide range of topics, assisting with scheduling and organization, offering recommendations, and much more.\n\nIn terms of my version, I am part of the latest generation of models, continually learning and improving from the data it encounters during interactions like this one.",
"timestamp": 1753268772790
}
http GET localhost:8100/chat \
message=="Please identify yourself. What model are you, what version, and what are your strengths?" \
llm==gemini
Expected Sample Response
{
"llm": "gemini",
"originalMessage": "Please identify yourself. What model are you, what version, and what are your strengths?",
"response": "I am a large language model, trained by Google.\n\n**Model & Version:**\nUnlike traditional software with specific version numbers, large language models like me are continuously updated and refined. There isn't a single, publicly accessible \"version number\" in the way you might think of software like....",
"timestamp": 1753268800297
}