DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs (Ollama, LMStudio, GPT4All, Jan, and Llama.cpp) and cloud-based LLMs to help review, test, and explain your project code.
Updated Oct 1, 2025 - Java
GPU-accelerated Llama3.java inference in pure Java using TornadoVM.
Mistral-java-client is a client for the Mistral.ai API. It allows you to easily interact with Mistral AI models and currently supports all Mistral chat completion, OCR, and embedding models.
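For context, a chat completion call to the Mistral API boils down to an authenticated JSON POST. The following is a minimal sketch using only the JDK's built-in `java.net.http` client; the endpoint path, request fields, and model name `mistral-small-latest` reflect Mistral's public REST API but should be checked against the official docs, and no request is actually sent here.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class MistralChatSketch {

    // Builds the JSON body for a chat completion request.
    // (Real code should use a JSON library to escape the message safely.)
    static String chatBody(String model, String userMessage) {
        return """
                {"model": "%s", "messages": [{"role": "user", "content": "%s"}]}
                """.formatted(model, userMessage).strip();
    }

    public static void main(String[] args) {
        String body = chatBody("mistral-small-latest", "Hello!");

        // Construct (but do not send) the authenticated POST request.
        // MISTRAL_API_KEY is an assumed environment variable name.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.mistral.ai/v1/chat/completions"))
                .header("Authorization", "Bearer "
                        + System.getenv().getOrDefault("MISTRAL_API_KEY", "<key>"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        System.out.println(request.uri());
        System.out.println(body);
    }
}
```

Sending the request with `HttpClient.newHttpClient().send(request, ...)` would return the model's reply as JSON; a dedicated client such as Mistral-java-client wraps this plumbing behind typed methods.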
A collection of Spring AI examples
🧠 Offline Ollama Inference Engine built with Java & Spring Boot. Run LLMs like Llama3, Mistral, Phi-3 Mini locally for privacy, multi-model switching, and reliable AI inference. Perfect for customizable, offline AI applications. 🤖
This repo demonstrates AI capabilities with Spring Boot.
🗺️ Visualize and explore large embeddings interactively, enabling easy navigation, clustering, and density analysis of your data.
Spring AI Functions With Local Ollama and Mistral
Harmoniq is a full-stack web application designed to foster a supportive community for individuals facing mental health challenges. It provides a safe, anonymous space where users can connect, share their experiences, and find solace through meaningful interactions.
Wolf Brigade Team solution for the Round 3 AI Battleground assignment