Jumping in over your head is how you learn. Just be patient!
I think the photo gives the wrong impression. It's completely unrelated to the question.
Congratulations! I’m glad it worked well for you. Mint is a great choice as well.
https://appflowy.com/ is another possibility.
If you're on Windows, mRemoteNG is very comprehensive: https://mremoteng.org/
It's all local. Ollama is the application; DeepSeek, Llama, Qwen, and whatever else are just model weights. The models aren't executables, nor do they ping external services. The models are safe. Ollama itself is meant for hosting models locally, and I don't believe it even has the capability to do anything besides run local models.
Where it gets more complicated is with "agentic" assistants that can read files or execute things at the terminal. The most advanced code assistants do this. But that is NOT a function of Ollama or the model; it's a function of the chat UI or code editor plugin that glues the model output together with a web search, filesystem, terminal session, etc.
So in short, Ollama just runs models. It's all local and private, no worries.
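If you want to convince yourself, here's a minimal sketch that talks to a running Ollama server purely over loopback. It assumes the default port (11434) and that you've already pulled a model; I'm using "llama3" as a placeholder, swap in whatever you have:

```python
# Query a locally running Ollama server over loopback only.
# Assumes Ollama's default port (11434) and a model pulled
# beforehand with e.g. `ollama pull llama3`.
import json
import urllib.request

req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",  # loopback; nothing leaves the machine
    data=json.dumps({
        "model": "llama3",                   # placeholder: any locally pulled model
        "prompt": "Say hello in one sentence.",
        "stream": False,                     # single JSON object instead of a stream
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```

Run a network monitor while this executes and you'll see the only traffic is to 127.0.0.1.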
I've started writing in Typst. It's simple enough for uncomplicated documents, but an entire ecosystem is available the moment I want to do something complicated. It does not have a LOCAL graphical editor, though there is an online one you can use; I've never tried it.