• pezhore@infosec.pub
    8 days ago

    That’s why I’ve stopped using non-local LLMs. Ollama works just fine on my aging RTX 2060.