
Build multi-turn conversations with history management
Source: R/gemini-continue.R
gemini_continue.Rd

Continue a conversation by appending to the message history. Use this after
gemini_chat() or gemini_with_tools() to maintain context across turns.
Unlike OpenAI's server-side storage, Gemini requires client-side history management.
Arguments
- prev_resp
Character. Previous response from gemini_chat(), gemini_continue(), or
gemini_with_tools(). Must have a contents_history attribute (attached
automatically by these functions). Pass the response object directly, not
extracted text.
- prompt
Character. New user message to send. Must be a single string. The full conversation history plus this prompt is sent to the API.
- ml
Character. Model ID to use. If NULL (default), uses the same model as the
previous response (from the modelVersion attribute) or falls back to the
ART_GEMINI_MODEL env var. Override to switch models mid-conversation.
- temp
Numeric. Temperature setting (0-2). Default 1. Typically kept consistent within a conversation for coherent responses.
- timeout
Numeric. Request timeout in seconds. Default 60.
- max_think
Logical. Enable extended reasoning (thinkingLevel = "high") for Gemini 3
models. Silently ignored for other models. Default FALSE.
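For illustration, the call below continues a conversation while overriding several of these defaults. It is a sketch only: the model ID is a placeholder (not a value shipped with the package), and resp1 is assumed to come from an earlier gemini_chat() call.
# Sketch: override the inherited model and enable extended reasoning.
# "gemini-3-pro-preview" is a hypothetical model ID used here for illustration.
resp_next <- gemini_continue(
  prev_resp = resp1,                   # response from an earlier gemini_chat()
  prompt    = "Summarise our discussion so far.",
  ml        = "gemini-3-pro-preview",  # switch models mid-conversation
  temp      = 0.7,                     # keep temperature consistent across turns
  timeout   = 120,                     # allow more time for long histories
  max_think = TRUE                     # thinkingLevel = "high" on Gemini 3 models
)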
Details
Gemini does not store conversation state server-side. This function maintains conversation history by:
1. Extracting previous contents from prev_resp attributes
2. Appending the model's previous response as a "model" role message
3. Appending the new user prompt
4. Calling the Gemini API with the full conversation history
5. Attaching the updated history to the new response for future continuation
The first message should be created with gemini_chat() (which automatically
stores conversation history). Subsequent messages use this function.
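The bookkeeping described above can be pictured with a short sketch. It assumes the contents_history attribute holds messages in the Gemini REST contents format (a list of role/parts entries); the helper extend_history() is illustrative only and is not exported by the package.
# Illustrative sketch of how the client-side history grows each turn.
extend_history <- function(prev_resp, prompt) {
  history <- attr(prev_resp, "contents_history")
  if (is.null(history)) {
    stop("prev_resp must carry a 'contents_history' attribute; ",
         "start the conversation with gemini_chat().")
  }
  c(
    history,
    # Step 2: the model's previous reply, recorded with the "model" role
    list(list(role = "model", parts = list(list(text = as.character(prev_resp))))),
    # Step 3: the new user prompt
    list(list(role = "user", parts = list(list(text = prompt))))
  )
}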
Examples
if (FALSE) { # \dontrun{
# Start conversation (gemini_chat stores history automatically)
resp1 <- gemini_chat("Hello, my name is Alice.", temp = 0.7)
# Continue conversation
resp2 <- gemini_continue(resp1, "What's my name?", temp = 0.7)
# Continue further
resp3 <- gemini_continue(resp2, "Tell me a joke.", temp = 0.7)
} # }
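To check what will be sent on the next turn, the attributes described in the Arguments section can be inspected on the last response. This assumes modelVersion is stored as an R attribute alongside contents_history.
# Inspect the client-side history carried by the last response
str(attr(resp3, "contents_history"))

# Model recorded for the previous turn; this drives the default for `ml`
attr(resp3, "modelVersion")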