🏠 Fully Client-Side Chat Over Documents 🏠

🦀 Voy + 🦙 Ollama + 🦜🔗 LangChain.js + 🤗 Transformers.js

  • 🏑Yes, it's another chat over documents implementation... but this one is entirely local!
  • βš™οΈThe default LLM is Mistral run locally by Ollama. You'll need to install the Ollama desktop app and run the following commands to give this site access to the locally running model:
    $ OLLAMA_ORIGINS=https://webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve

    Then, in another window:
    $ OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
  • πŸ—ΊοΈThe default embeddings are Nomic Embed v1. For more speed on some machines, switch to
    "Xenova/all-MiniLM-L6-v2"
    in
    app/worker.ts
    .
  • πŸ™This template is open source - you can see the source code and deploy your own version from the GitHub repo!
  • 👇Try embedding a PDF below, then asking questions about it! You can even turn off your WiFi (see the retrieval sketch after this list).
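
Connection sketch: once Ollama is serving on port 11435, LangChain.js can talk to it like any other chat model. Below is a minimal sketch assuming the ChatOllama integration from @langchain/community; the model name and base URL match the commands above, but the exact wiring in this repo's app/worker.ts may differ.

    import { ChatOllama } from "@langchain/community/chat_models/ollama";

    // Point LangChain.js at the Ollama server started with `ollama serve` above.
    const model = new ChatOllama({
      baseUrl: "http://127.0.0.1:11435", // matches OLLAMA_HOST from the serve command
      model: "mistral",                  // pulled via `ollama pull mistral`
    });

    const response = await model.invoke("Summarize this document in one sentence.");
    console.log(response.content);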
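
Embeddings sketch: swapping the embedding model is a small change in app/worker.ts. Here's a sketch of the idea, assuming the HuggingFaceTransformersEmbeddings class from @langchain/community, which runs models in the browser via Transformers.js; the Nomic model identifier shown is an assumption, so check app/worker.ts for the exact default.

    import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";

    // Default: Nomic Embed v1 (identifier assumed here; see app/worker.ts for the exact value).
    const embeddings = new HuggingFaceTransformersEmbeddings({
      modelName: "nomic-ai/nomic-embed-text-v1",
    });

    // Faster on some machines: swap in the smaller MiniLM model instead.
    // const embeddings = new HuggingFaceTransformersEmbeddings({
    //   modelName: "Xenova/all-MiniLM-L6-v2",
    // });

    const vector = await embeddings.embedQuery("hello world");
    console.log(vector.length); // embedding dimensionality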
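
Retrieval sketch: finally, a rough sketch of client-side retrieval with Voy. The VoyVectorStore and voy-search imports below are assumptions about how the pieces fit together; in the real app the documents come from the text chunks of your uploaded PDF, and the whole index lives in browser memory, which is why it keeps working with WiFi off.

    import { Voy as VoyClient } from "voy-search";
    import { VoyVectorStore } from "@langchain/community/vectorstores/voy";
    import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";
    import { Document } from "@langchain/core/documents";

    const embeddings = new HuggingFaceTransformersEmbeddings({
      modelName: "Xenova/all-MiniLM-L6-v2",
    });

    // Voy is a WASM vector index, so the store never leaves the browser.
    const store = new VoyVectorStore(new VoyClient(), embeddings);

    // In the real app these documents are chunks extracted from the uploaded PDF.
    await store.addDocuments([
      new Document({ pageContent: "Voy is a WASM-based vector similarity search engine." }),
      new Document({ pageContent: "LangChain.js is a framework for building LLM applications." }),
    ]);

    const results = await store.similaritySearch("What is Voy?", 1);
    console.log(results[0].pageContent);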