
In a previous post, I demonstrated how to set up a local LLM that you can run through either a command-line interface (Ollama) or a graphical user interface (Open WebUI and others), and briefly showed how to “chat with your documents” with a local model using LMStudio. In that post, I simply attached a few documents to a one-off chat.