Create a reusable chat object for interactive conversations with OpenAI models. The returned object provides `$chat()` for text responses, `$stream()` for streaming responses, and `$chat_structured()` for structured data extraction.
This function complements the low-level [openai_responses()] function. Use `openai_chat()` for interactive, multi-turn conversations with streaming support. Use [openai_responses()] for single-turn, structured output (e.g., pipeline tasks).
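For orientation, a minimal sketch of how the two functions divide the work (the `openai_responses()` call is left as a comment because its exact signature is documented on its own help page):

chat <- openai_chat(sys_prompt = "You are an art historian.")
chat$chat("What is Baroque?")              # multi-turn: the chat keeps history
chat$chat("Name three Baroque painters.")
# For a one-off pipeline step, reach for openai_responses() instead
# (see its help page for the exact arguments).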
Usage
openai_chat(
  sys_prompt = NULL,
  ml = NULL,
  temp = 1,
  max_tokens = NULL,
  tools = NULL,
  echo = "none"
)
Arguments
- sys_prompt
Character. System prompt to set model behavior. Default NULL.
- ml
Character. Model ID. Default from `ART_OPENAI_MODEL` env var (gpt-5.1).
- temp
Numeric. Temperature (0-2). Lower = deterministic, higher = creative. Default 1.
- max_tokens
Integer. Maximum output tokens. Default from `ART_OPENAI_MAX_OUTPUT_TOKENS` env var (4096).
- tools
List. Tools to register with the chat. Create with [ellmer::tool()]. Default NULL.
- echo
Character. Output mode: "none", "output", or "all". Default "none".
Value
ellmer Chat object with methods:
- `$chat(...)`: Send message(s), receive a text response
- `$chat_structured(prompt, type)`: Send a message, receive structured data
- `$stream(...)`: Stream response chunks (for real-time display)
- `$register_tool(tool)`: Add a tool after creation (see the tools example below)
See also
* [openai_responses()] for low-level single-turn requests
* [openai_continue()] for manual conversation continuation
* <https://platform.openai.com/docs/api-reference/chat> for the OpenAI Chat API docs
Examples
if (FALSE) { # \dontrun{
# -----------------------------------------------------------------
# BASIC CHAT
# -----------------------------------------------------------------
chat <- openai_chat()
chat$chat("Explain Impressionism in 2 sentences.")
# With system prompt
chat <- openai_chat(sys_prompt = "You are an art historian. Be concise.")
chat$chat("What is Baroque?")
# -----------------------------------------------------------------
# MULTI-TURN CONVERSATION
# -----------------------------------------------------------------
chat <- openai_chat(sys_prompt = "You are an art expert.")
chat$chat("What is Baroque art?")
chat$chat("How does it differ from Renaissance?")
chat$chat("Name 3 Baroque painters.")
# -----------------------------------------------------------------
# STREAMING (for real-time UI)
# -----------------------------------------------------------------
chat <- openai_chat()
stream <- chat$stream("Tell me a story about an artist.")
coro::loop(for (chunk in stream) cat(chunk))
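# To gather the streamed chunks into a single string, one option is
# coro::collect(), assuming the stream is a coro generator of text chunks:
chunks <- coro::collect(chat$stream("Describe Surrealism in one paragraph."))
cat(paste0(unlist(chunks), collapse = ""))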
# -----------------------------------------------------------------
# IMAGE INPUT
# -----------------------------------------------------------------
chat <- openai_chat()
chat$chat(
  ellmer::content_image_file("artwork.png"),
  "Describe this artwork."
)
# From URL
chat$chat(
  ellmer::content_image_url("https://example.com/painting.jpg"),
  "What style is this?"
)
# -----------------------------------------------------------------
# STRUCTURED OUTPUT
# -----------------------------------------------------------------
art_schema <- ellmer::type_object(
  style = ellmer::type_string("Art movement or style"),
  period = ellmer::type_string("Time period"),
  notable_artists = ellmer::type_array(ellmer::type_string(), "Key artists")
)
chat <- openai_chat(temp = 0)
result <- chat$chat_structured("Describe Impressionism", type = art_schema)
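# The returned value should be a named list shaped like art_schema, so the
# fields defined above can be used directly:
result$style
result$period
paste(result$notable_artists, collapse = ", ")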
# -----------------------------------------------------------------
# MODEL SELECTION
# -----------------------------------------------------------------
# Use GPT-5.1 (default)
chat <- openai_chat()
# Use specific model
chat <- openai_chat(ml = "gpt-4o")
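# Fully specified call using every documented argument (values are
# illustrative, not recommendations):
chat <- openai_chat(
  sys_prompt = "You are an art historian. Be concise.",
  ml = "gpt-5.1",
  temp = 0.3,
  max_tokens = 1024,
  echo = "output"
)
chat$chat("Summarize Romanticism in two sentences.")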
# -----------------------------------------------------------------
# WITH TOOLS
# -----------------------------------------------------------------
weather_tool <- ellmer::tool(
  function(city) paste0("Weather in ", city, ": Sunny, 72F"),
  name = "get_weather",
  description = "Get current weather for a city",
  city = ellmer::type_string("City name")
)
chat <- openai_chat(tools = list(weather_tool))
chat$chat("What's the weather in Paris?")
} # }
