• 0 Posts
  • 54 Comments
Joined 2 years ago
Cake day: June 10th, 2023

  • ddh@lemmy.sdf.org to Selfhosted@lemmy.world · Docker in LXC vs VM
    1 point · edited · 17 days ago
    1. I’m backing up LXCs the same way I’d back up a VM. I don’t back up the Docker containers themselves, just their config and volumes.
    2. I don’t think anyone is doing that. We’re talking about installing Docker inside an LXC. One of the Proxmox rules you can live by is not to install software on the host, and installing Docker in an LXC respects that rule.
    3. I’ll snapshot an LXC before running something like a dist-upgrade, or before testing anything that might break things. It’s very easy, so why not?
    4. I back up the LXC that has Docker installed because that makes it easy to restore everything, including local volumes, to various points in time.
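The snapshot-before-upgrade habit in point 3 looks roughly like this on a Proxmox host. This is a sketch, not a prescription: the container ID (105) and snapshot name are placeholders.

```shell
# Take a snapshot of the container before doing anything risky
pct snapshot 105 pre-upgrade --description "before dist-upgrade"

# Run the risky operation inside the container
pct exec 105 -- apt-get update
pct exec 105 -- apt-get dist-upgrade -y

# If something breaks, roll the whole container back:
# pct rollback 105 pre-upgrade
```

Snapshots are cheap on copy-on-write storage (ZFS, LVM-thin), which is why "it's very easy, so why not" holds.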

  • I’m still trying out combinations of hardware and models, but even my old Intel 8500T CPU will generate text at around reading speed with a stock version of Meta’s Llama 3.2 3B (maybe the one you tried), with mostly good output: fine for rewriting content, answering questions about uploaded document stores, etc.

    There are thousands of models tuned for various purposes, so one of the key questions is what you want to use it for. If you have a specific use case (e.g., coding SQL), you can find a much more efficient model.
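As a sketch of trying a general model against a task-tuned one with Ollama's CLI (the model tags here are examples from the public library; pick whatever fits your RAM and purpose):

```shell
# Pull a small general-purpose model and a coding-tuned one
ollama pull llama3.2:3b
ollama pull qwen2.5-coder:7b

# Compare them on the task you actually care about
ollama run llama3.2:3b "Rewrite this paragraph more concisely: ..."
ollama run qwen2.5-coder:7b "Write a SQL query returning the top 10 customers by revenue."
```

The task-tuned model will usually give better results per parameter on its specialty, which is what makes a modest CPU workable.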

  • I run Ollama with Open WebUI at home.

    A) The containers they run in can’t access the Internet by default, but they’re given access if we turn on web search or want to download new models. Ollama and Open WebUI are fairly popular projects and I haven’t seen any evidence of nefarious activity so far.

    B) By design, they build a profile of me and the family members who use them. We can add sensitive documents for the models to use.

    C) They’re limited to what we type and the documents we provide.
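A minimal Docker Compose sketch of the isolation described in A), assuming the stock ollama/ollama and ghcr.io/open-webui/open-webui images (ports and paths are the upstream defaults; adjust to taste). The internal network keeps Ollama off the Internet while Open WebUI stays reachable on the LAN:

```yaml
services:
  ollama:
    image: ollama/ollama
    networks: [llm-internal]        # internal only: no outbound Internet
    volumes:
      - ollama-data:/root/.ollama   # models and state live here

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    networks:
      - llm-internal                # talks to Ollama
      - default                     # reachable from the LAN
    ports:
      - "3000:8080"
    volumes:
      - open-webui-data:/app/backend/data

volumes:
  ollama-data:
  open-webui-data:

networks:
  llm-internal:
    internal: true                  # Docker blocks external traffic on this network
```

Detaching open-webui from the default network would cut its web-search and model-download paths too, which is exactly the trade-off described in A).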