I’m running ollama with llama3.2:1b, smollm, all-minilm, moondream, and more. I’ve integrated it with coder/code-server, vscode, vscodium, page assist, and the cli, and I’ve also created a Discord AI user.

I’m an infrastructure and automation guy, not so much a developer, although my field is technically DevOps.

Now, I hear that some llms have “tools.” How do I use them? How do I find a list of tools for a model?

I don’t think I can simply prompt “Hi llama3.2, list your tools.” Is this part of prompt engineering?

What, do you take a model and retrain it or something?

Anybody able to point me in the right direction?

  • 0x01@lemmy.ml · 21 days ago

    Ah, for training a new model from scratch? Yes, there is a specific format; you can look at the ollama source code, or at any of the big models that support tool use (like llama4), to see the format both to and from the model. However, unless you’re secretly a billionaire, I doubt you could compete with these pretrained models at tool calling.

    Ollama’s model list on their website has a filter for tool-using models. To be honest, all open-source models suck at tool use compared to the big players: OpenAI, Anthropic, Google. To be fair, I don’t have any hardware capable of running DeepSeek’s newest models, so I haven’t tested those for tool use.
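    On the practical side: you don’t retrain anything. You pass the model a JSON schema describing your functions, and a tool-capable model replies with a structured “call this function with these arguments” message, which *your* code then executes. Here’s a minimal sketch using Ollama’s Python client (`pip install ollama`); the model name `llama3.1` and the `get_weather` tool are just placeholders for illustration:

```python
# Sketch only: assumes the `ollama` Python package and a tool-capable
# model (e.g. llama3.1) already pulled locally. `get_weather` is a
# made-up example tool, not part of any model.

def get_weather(city: str) -> str:
    """Stand-in for a real tool; normally this would hit a weather API."""
    return f"It is sunny in {city}"

# The schema you send to the model. The model never runs code itself;
# it just picks a tool and fills in arguments matching this schema.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

AVAILABLE = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Execute the function the model asked for, with its arguments."""
    fn = AVAILABLE[tool_call["function"]["name"]]
    return fn(**tool_call["function"]["arguments"])

if __name__ == "__main__":
    import ollama  # only needed when actually talking to the server
    resp = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
        tools=TOOLS,
    )
    # If the model decided to use a tool, run it; in a real loop you'd
    # send the result back to the model as a "tool" role message.
    for call in resp["message"].get("tool_calls", []):
        print(dispatch(call))
```

    So “list your tools” isn’t something the model knows on its own: the tool list is whatever schemas your client code sends with each request.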