Continue a conversation by appending to the message history. Use this after gemini_chat() or gemini_with_tools() to maintain context across turns. Unlike OpenAI's server-side storage, Gemini requires client-side history management.

Usage

gemini_continue(
  prev_resp,
  prompt,
  ml = NULL,
  temp = 1,
  timeout = 60,
  max_think = FALSE
)

Arguments

prev_resp

Character. Previous response from gemini_chat(), gemini_continue(), or gemini_with_tools(). Must have contents_history attribute (attached automatically by these functions). Pass the response object directly, not extracted text.

prompt

Character. New user message to send. Must be a single string. The full conversation history plus this prompt is sent to the API.

ml

Character. Model ID to use. If NULL (default), uses the same model as the previous response (from modelVersion attribute) or falls back to ART_GEMINI_MODEL env var. Override to switch models mid-conversation.

temp

Numeric. Temperature setting (0-2). Default 1. Typically keep consistent within a conversation for coherent responses.

timeout

Numeric. Request timeout in seconds. Default 60.

max_think

Logical. Enable extended reasoning (thinkingLevel = "high") for Gemini 3 models. Silently ignored for other models. Default FALSE.

Value

Character. The model's reply, with metadata attributes including an updated contents_history for use in the next gemini_continue() call.
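As an illustration (attribute names are those documented on this page; behavior against a live API is not verified here), the history and model metadata can be inspected on the returned object:

```r
resp <- gemini_chat("Hello!")

# The value itself is the reply text
cat(resp)

# Updated conversation history, consumed by gemini_continue()
history <- attr(resp, "contents_history")

# Model that produced the reply; used as the default `ml` for the next turn
attr(resp, "modelVersion")
```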

Details

Gemini does not store conversation state server-side. This function maintains conversation history by:

  1. Extracting previous contents from prev_resp attributes

  2. Appending the model's previous response as a "model" role message

  3. Appending the new user prompt

  4. Calling Gemini API with full conversation history

  5. Attaching updated history to the new response for future continuation
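The steps above can be sketched roughly as follows. This is a simplified illustration only; `call_gemini_api()` is a hypothetical helper standing in for the actual request logic, and the real internals may differ:

```r
# Step 1: recover prior turns from the previous response's attributes
contents <- attr(prev_resp, "contents_history")

# Steps 2-3: append the model's last reply, then the new user prompt,
# using the Gemini API's role/parts message structure
contents <- c(
  contents,
  list(
    list(role = "model",
         parts = list(list(text = as.character(prev_resp)))),
    list(role = "user",
         parts = list(list(text = prompt)))
  )
)

# Step 4: send the full history (hypothetical helper, shown for shape only)
resp <- call_gemini_api(contents, model = ml, temperature = temp)

# Step 5: attach the updated history so the next turn can continue
attr(resp, "contents_history") <- contents
```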

The first message should be created with gemini_chat() (which automatically stores conversation history). Subsequent messages use this function.

Examples

if (FALSE) { # \dontrun{
# Start conversation (gemini_chat stores history automatically)
resp1 <- gemini_chat("Hello, my name is Alice.", temp = 0.7)

# Continue conversation
resp2 <- gemini_continue(resp1, "What's my name?", temp = 0.7)

# Continue further
resp3 <- gemini_continue(resp2, "Tell me a joke.", temp = 0.7)
} # }
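A further hedged example of the `ml` and `max_think` arguments described above. The model ID below is a placeholder, not a verified identifier; substitute a real Gemini 3 model ID:

```r
if (FALSE) { # \dontrun{
# Switch models mid-conversation via `ml` and enable extended reasoning.
# "gemini-3-example" is a placeholder model ID.
resp4 <- gemini_continue(resp3, "Explain that joke.",
                         ml = "gemini-3-example",
                         max_think = TRUE)
} # }
```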