
Continue a stored OpenAI conversation using server-side state management. The previous response must have been created with `store = TRUE` in the original request.

Usage

openai_continue(
  prev_resp,
  prompt,
  ml = NULL,
  temp = 1,
  max_tokens = NULL,
  timeout = 60
)

Arguments

prev_resp

Previous response object carrying a `response_id` attribute

prompt

New user prompt

ml

Model to use (default `NULL`: reuse the model from the previous response)

temp

Sampling temperature (default 1)

max_tokens

Optional `max_output_tokens` cap (passed through to `openai_responses()`)

timeout

Request timeout in seconds (passed through to `openai_responses()`)

Value

Response content with metadata attributes, including the `response_id` that allows the result to be passed back to `openai_continue()` for further turns
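
Examples

A minimal sketch, not run here: it assumes an API key is configured in the environment and that this package's `openai_responses()` accepts `store = TRUE` and returns an object carrying a `response_id` attribute, as described above.

# Start a stored conversation; store = TRUE enables server-side state
first <- openai_responses("Name the three largest French cities.", store = TRUE)

# Continue it; prior context is resolved server-side via the response_id,
# and the model defaults to the one used in the previous response
followup <- openai_continue(first, "Which of them is on the Mediterranean?")

# The result carries metadata attributes, so it can be continued again
final <- openai_continue(followup, "Roughly what is its population?", temp = 0.2)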