Overview

artopenai provides R bindings for OpenAI’s Responses API. This quickstart covers:

  1. Configuration via environment variables
  2. Making requests with openai_with_tools()
  3. Continuing conversations with openai_continue()
  4. Using built-in tools (web search, code interpreter, etc.)

Configuration

artopenai uses environment variables for configuration. See the Environment Variables section of the README for complete documentation.

# Set your OpenAI API key (replace the placeholder with your actual key)
Sys.setenv(ART_OPENAI_KEY = "your-api-key-here")

# Optional: set default model
Sys.setenv(ART_OPENAI_MODEL = "gpt-5.1")
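
To confirm the configuration is visible to your R session, you can check the variables with base R before making any requests (a minimal sketch; it uses only Sys.getenv() and the variable names shown above):

# Verify the key and default model are set in this session
if (!nzchar(Sys.getenv("ART_OPENAI_KEY"))) {
  warning("ART_OPENAI_KEY is not set")
}
Sys.getenv("ART_OPENAI_MODEL", unset = "(package default)")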

Basic Usage

Making Requests

Use openai_with_tools() for requests:

openai_with_tools(
  prompt = "Explain the difference between impressionism and expressionism.",
  temp = 0.7
)

With System Prompts

Add context with system prompts:

openai_with_tools(
  prompt = "Analyze this artistic movement.",
  sys_prompt = "You are an art historian specializing in 19th century European art.",
  temp = 0.5
)

Temperature Strategies

Adjust temp to trade determinism for creativity:

# Deterministic for structured data
openai_with_tools("Provide a bullet list of 5 art movements.", temp = 0)

# Balanced for analysis
openai_with_tools("Compare impressionism and post-impressionism.", temp = 0.5)

# Creative for generation
openai_with_tools("Write a description of this artistic style.", temp = 0.9)

Model Selection

Specify models with the ml parameter:

# Default model
openai_with_tools("Quick question")

# Specific model
openai_with_tools("Complex analysis", ml = "gpt-5.1")

Tool Use

Web Search

Enable web search for current information:

openai_with_tools(
  prompt = "What are the latest trends in digital art?",
  tools = list(tool_web_search()),
  temp = 0.7
)

Web Search for Context

Increase the search context size for research-heavy prompts:

openai_with_tools(
  prompt = "Research contemporary art market trends and summarize key findings.",
  tools = list(tool_web_search(search_ctx_size = "high")),
  temp = 0.5
)

Code Execution

Execute Python code:

openai_with_tools(
  prompt = "Calculate 15 factorial",
  tools = list(tool_code_interpreter()),
  temp = 0
)

File Search

Search uploaded documents stored in a vector store (swap in your own vector store ID before running):

# openai_with_tools(
#   prompt = "Find references to color theory in the uploaded documents.",
#   tools = list(tool_file_search(vstore_ids = c("vs_abc123"))),
#   temp = 0
# )

Conversation Continuation

OpenAI provides server-side conversation storage. Enable with store = TRUE:

# First message - enable storage
msg1 <- openai_with_tools(
  "I'm interested in Renaissance art.",
  store = TRUE # Enable server-side storage
)

# Continue the conversation
msg2 <- openai_continue(msg1, "Who were the major Renaissance artists?")

# Continue further
msg3 <- openai_continue(msg2, "Tell me about their techniques.")

The server manages history automatically; you only send the new prompt with each call.

Long Conversations

Server-side storage also works well for extended, multi-turn discussions:

# Initialize with context
conv <- openai_with_tools(
  prompt = "I want to learn about art history chronologically.",
  sys_prompt = "You are a knowledgeable art history professor.",
  store = TRUE
)

# Multiple turns
q1 <- openai_continue(conv, "Start with prehistoric cave paintings.")
q2 <- openai_continue(q1, "What came next?")
q3 <- openai_continue(q2, "Continue to ancient civilizations.")
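
The same three turns can also be driven programmatically from a character vector of prompts, since openai_continue() both accepts and returns a message object (a sketch using only openai_continue() and base R's Reduce()):

# Feed a sequence of follow-up prompts into the same stored conversation
follow_ups <- c(
  "Start with prehistoric cave paintings.",
  "What came next?",
  "Continue to ancient civilizations."
)
final <- Reduce(openai_continue, follow_ups, init = conv)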

Next Steps