I installed Ollama, but I don’t have any ideas for what to do with it.

Do you have any fun/original use cases for it? I’m a programmer so it doesn’t have to exist already.

  • The Hobbyist@lemmy.zip
    17 days ago

    Ollama is very useful but also rather barebones. I recommend installing Open WebUI to manage models and conversations. It will also be useful if you want to tweak more advanced settings like system prompts, seed, and temperature.

    You can install Open WebUI using Docker or just pip; pip is enough if you only care about serving yourself.
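    For reference, here's roughly what both routes look like (the Docker image name and port mapping follow the Open WebUI docs at time of writing, so double-check them against the current README):

    ```shell
    # Option 1: pip (enough for a single-user local setup)
    pip install open-webui
    open-webui serve            # UI comes up on http://localhost:8080 by default

    # Option 2: Docker
    docker run -d -p 3000:8080 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main
    ```

    Either way, point it at your running Ollama instance and it will pick up whatever models you've pulled.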

    Edit: Open WebUI also renders Markdown, which makes formatting and reading much more appealing and useful.

    Edit2: you can also plug Ollama into continue.dev, a VS Code extension that brings LLM capabilities into your IDE.
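    The Continue side of that is just a config entry. A sketch of what mine looks like (the exact schema varies between Continue versions, and the model name is whatever you've pulled locally, so treat this as a starting point rather than gospel):

    ```json
    {
      "models": [
        {
          "title": "Local Ollama",
          "provider": "ollama",
          "model": "llama3"
        }
      ]
    }
    ```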

  • Scrubbles@poptalk.scrubbles.tech
    17 days ago

    Great job trying to learn! Ignore the naysayers here. As a fellow programmer: like it or not, you’re going to need to learn how to interact with these tools. That’s the real way we’d lose our jobs, because if you don’t keep up with this stuff you’re doomed to fall behind.

    I recommend first building a simple CLI tool that talks to the API, so you can ask it questions the way you would with ChatGPT. This will give you a good foundation in the API and how it works. Then you can move up to function calling for things like Home Assistant, and then maybe even later training LoRAs. Start small, just get basic stuff working first, and build from there.
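    To make that concrete, here's a minimal sketch of the "simple CLI" step in Python, using only the standard library. It assumes Ollama is running locally on its default port (11434) and that you've already pulled the model you name (here "llama3"):

    ```python
    import json
    import urllib.request

    # Assumption: a local Ollama server on its default port, with the model
    # already pulled via `ollama pull llama3`.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_payload(model: str, prompt: str) -> dict:
        """Assemble the JSON body for Ollama's /api/generate endpoint."""
        return {"model": model, "prompt": prompt, "stream": False}

    def ask(model: str, prompt: str) -> str:
        """Send a prompt to the local Ollama server and return the response text."""
        data = json.dumps(build_payload(model, prompt)).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    # Example (needs a running Ollama):
    #   print(ask("llama3", "Explain what a LoRA is in one sentence."))
    ```

    Once that works, wrapping it in an interactive loop or wiring the same payload-building idea into function calling is a natural next step.
    
    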