🏡 Yes, it's another chat over documents implementation... but this one is entirely local!
🌐 The vector store (Voy) and embeddings (Transformers.js) are served via a Vercel Edge Function and run fully in the browser with no setup required (see the sketch after the setup commands below for how these pieces fit together).
⚙️ The default LLM is Mistral, run locally by Ollama. You'll need to install the Ollama desktop app and run the following commands to give this site access to the locally running model:
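The commands look something like the following. `OLLAMA_ORIGINS` tells Ollama which browser origins may call the local server; the origin below is a placeholder, so substitute the URL of the site you're actually visiting:

```bash
# Start Ollama, allowing this site's origin to call the local server.
# The origin value is a placeholder; use the deployed site's real URL.
OLLAMA_ORIGINS=https://your-deployed-site.vercel.app ollama serve
```

Then, in another terminal window:

```bash
# Download the Mistral model the app expects by default.
ollama pull mistral
```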
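For a sense of how the in-browser retrieval side fits together, here is a minimal sketch using the LangChain.js community integrations for Voy and Transformers.js. The embedding model name and document contents are illustrative assumptions, and the app's actual wiring may differ:

```typescript
import { Voy as VoyClient } from "voy-search";
import { VoyVectorStore } from "@langchain/community/vectorstores/voy";
import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";
import { Document } from "@langchain/core/documents";

// Embeddings are computed fully in the browser via Transformers.js.
// The model name is illustrative; any Transformers.js-compatible
// embedding model would work here.
const embeddings = new HuggingFaceTransformersEmbeddings({
  modelName: "Xenova/all-MiniLM-L6-v2",
});

// Voy is a WASM vector store, so indexing and search also stay in-browser.
const store = new VoyVectorStore(new VoyClient(), embeddings);

// Index some example chunks, then retrieve the closest match for a query.
await store.addDocuments([
  new Document({ pageContent: "Ollama serves local LLMs over HTTP." }),
  new Document({ pageContent: "Voy performs vector search in WASM." }),
]);
const results = await store.similaritySearch("How do local LLMs run?", 1);
console.log(results[0].pageContent);
```

Because both the vector store and the embeddings run client-side, the only network dependency at query time is the browser's call to the locally running Ollama server.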