Constructs a reference to an Ollama server, optionally starting one if none is running at the specified (local) URL. If this function starts the server, the server is killed when the returned object goes out of scope (for example, when the agents using it go out of scope and no independent reference remains).

This function requires Ollama to be installed on your system. You can install Ollama from https://ollama.ai, following the platform-specific installation instructions there.

ollama_server(url = ollama_url(), start = TRUE, ...)

Arguments

url

URL to the server; defaults to the value of the environment variable OLLAMA_HOST, or http://127.0.0.1:11434 if that is unset.

start

Whether to start a server if one is not already running at url.

...

Arguments passed to an underlying function.

Value

An OllamaServer object.

Author

Michael Lawrence

Examples

if (FALSE) { # \dontrun{
    server <- ollama_server()
    agent <- llama(server)
    agent |>
        instruct("Answer questions about this dataset:", mtcars) |>
        predict("What is the relationship between horsepower and mpg?")
} # }
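
When a server is already running (for example, one started manually with ollama serve), the function can attach to it without managing its lifetime. A minimal sketch, assuming start = FALSE simply connects to the server at url; setting OLLAMA_HOST here is illustrative:

```r
if (FALSE) { # \dontrun{
    # Point at an already-running server; with start = FALSE,
    # ollama_server() will not launch (or later kill) a server
    # process itself.
    Sys.setenv(OLLAMA_HOST = "http://127.0.0.1:11434")
    server <- ollama_server(start = FALSE)
} # }
```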