chat.Rd

Interact with an agent, either through a single exchange (predict) or by maintaining context over multiple exchanges (chat). Methods for both predict and chat are provided for Chat and Agent objects.
# Agent methods
chat(x, input = NULL, stream_callback = NULL,
     system_params = list(), env = parent.frame(), ...)
predict(object, input, env = parent.frame(), ...)

# Chat methods
chat(x, input = NULL, stream_callback = NULL, ...)
predict(object, input, ...)
x, object: An Agent or Chat object, which receives the input and generates a response.
input: An object sent to the model (appended to the context when x is a Chat object). Typically a string, but it can also be a raster object (for vision models) or any other R object supported by the extensible serialization mechanism. By default, data.frames are converted to CSV and other complex objects are serialized to JSON.
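For example, a minimal sketch of sending a data.frame as input (reusing the llama() and instruct() constructors from the examples below; the exact output depends on the installed serialization methods):

agent <- llama() |>
  instruct("You answer questions about the data the user sends you.")
# A data.frame input is converted to CSV by default before being sent
predict(agent, head(mtcars))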
stream_callback: A function that takes a single argument, a chunk of the model response, as a string. Typically used only in interactive chat settings, so that responses can be streamed to the user as they are generated.
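For instance, a minimal streaming sketch (assuming an interactive session and an agent built as in the examples below):

# Print each chunk to the console as soon as it is generated
chat(agent, "Summarize the mtcars dataset.",
     stream_callback = function(chunk) cat(chunk))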
system_params: When x is an Agent, a named list of strings used to instantiate the system prompt template according to glue semantics. See system_prompt_as. For advanced use only.
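A hedged sketch, assuming that instruct() installs the system prompt template (as in the examples below) and that the template contains a glue placeholder such as {tone}:

agent <- llama() |>
  instruct("Answer questions about the mtcars dataset in a {tone} tone:", mtcars)
# {tone} is filled in with glue semantics when the exchange runs
chat(agent, "Which cars are most fuel efficient?",
     system_params = list(tone = "casual"))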
env: When x or object is an Agent, an environment that is used when resolving R symbols passed to tools. For advanced use only.
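A sketch only (which symbols a tool actually looks up depends on how the agent's tools were defined):

ask <- function(agent, question) {
  cars <- mtcars  # a local object that a tool might reference by name
  # Resolve tool symbols in this function's environment rather than in the
  # caller's frame (the parent.frame() default)
  chat(agent, question, env = environment())
}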
...: For predict, arguments passed to chat. For chat, arguments passed to the underlying backend.
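For example, because predict forwards unrecognized arguments to chat, chat-level options such as stream_callback can also be supplied through predict (a sketch, reusing the agent from the examples below):

predict(agent, "Which car has the highest mpg?",
        stream_callback = function(chunk) cat(chunk))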
if (FALSE) { # \dontrun{
# Create an agent with a system prompt about the mtcars dataset
agent <- llama() |>
  instruct("Answer questions about the mtcars dataset:", mtcars)

# Single exchange: the response is returned without retaining context
predict(agent, "What is the relationship between hp and fuel efficiency?")

# Multi-turn exchange: each call returns a Chat object that carries the context
chat <- chat(agent,
  "What is the relationship between hp and fuel efficiency?")
chat <- chat(chat, "Can you suggest ways to visualize this relationship?")
last_output(chat)
} # }