Overview

modUpload uses asynchronous processing to handle artwork uploads without blocking the Shiny session. This guide explains the pipeline architecture and patterns for developers working with or extending the module.

Architecture

The processing pipeline consists of three main components:

  1. Pipeline Task - Created by new_pipeline_task(), manages async execution
  2. Pipeline Runner - run_pipeline() orchestrates the processing stages
  3. Progress Monitoring - Waiter screens and notifications for user feedback

Pipeline Stages

When an artwork is submitted, it passes through these stages:

  1. Authentication - Verify canvas signature using {artpixeltrace}
  2. Deduplication - Check image hash against existing artworks
  3. File Processing - Extract metadata, generate thumbnails
  4. Analysis - Run image/video analysis via {artpipelines}
  5. Storage - Upload to CDN and create database records
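The deduplication stage (2) can be illustrated with a content hash. This is an illustrative sketch only — the actual hashing scheme is internal to {artpipelines}; `existing_hashes` is a hypothetical vector of hashes fetched from the database:

```r
library(digest)

# Sketch of the deduplication idea: hash the raw file bytes so renamed
# copies of the same image are still detected as duplicates
is_duplicate <- function(image_path, existing_hashes) {
  hash <- digest::digest(image_path, algo = "sha256", file = TRUE)
  hash %in% existing_hashes
}
```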

Creating Pipeline Tasks

The new_pipeline_task() function creates an ExtendedTask for async processing:

# In modUploadServer
pipeline <- new_pipeline_task()

# Submit files for processing
pipeline$invoke(
  artist = r$artist,
  artwork = artcore::gen_artwork_id(),
  collection = input$art_collection,
  art_title = input$art_name,
  artist_name = r$appdata$artist$info$artist_name,
  art_story = input$art_story,
  file_image = input$image_file$datapath,
  file_canvas = input$raw_file$datapath,
  file_video = input$video_file$datapath,
  file_variants = input$variants$datapath,
  zip_frames = input$frames_zip$datapath,
  file_stats = input$stats_file$datapath
)
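The body of new_pipeline_task() is not shown in this guide; a plausible minimal sketch, assuming it wraps shiny::ExtendedTask (Shiny >= 1.8.1) around run_pipeline():

```r
new_pipeline_task <- function() {
  # ExtendedTask requires the wrapped function to return a promise;
  # run_pipeline() satisfies this via promises::future_promise()
  shiny::ExtendedTask$new(function(...) {
    run_pipeline(...)
  })
}
```

Because invoke() hands the work to a background process, the Shiny session stays responsive while the pipeline runs.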

Monitoring Pipeline Status

The pipeline status can be monitored using reactive observers:

shiny::observeEvent(pipeline$status(), {
  status <- pipeline$status()

  if (status == "running") {
    # Show progress notification
    showNotification("Processing artwork...", type = "message")
  } else if (status == "success") {
    # Pipeline completed; run_pipeline() returns NULL on failure
    result <- pipeline$result()

    if (!is.null(result)) {
      showNotification("Artwork ready!", type = "message")
      # Refresh appdata with new artwork
      r$appdata <- artutils::get_appdata(r$artist, r$artwork)
    } else {
      showNotification("Processing failed", type = "error")
    }
  } else if (status == "error") {
    # result() would re-throw the error here, so report failure directly
    showNotification("Processing failed", type = "error")
  }
})
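The same status observer can drive the Waiter screens mentioned under Progress Monitoring. A sketch using the {waiter} package; the panel ID is illustrative:

```r
# Overlay only the upload panel while the pipeline is running
w <- waiter::Waiter$new(
  id = session$ns("upload_panel"),
  html = waiter::spin_fade_circle()
)

shiny::observeEvent(pipeline$status(), {
  if (pipeline$status() == "running") w$show() else w$hide()
})
```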

Future/Promises Pattern

The pipeline uses {future} for async execution and {promises} for result handling:

run_pipeline <- function(...) {
  params <- list(...)
  
  promises::future_promise(
    {
      # This runs in a background R process
      ll <- artpipelines::launch_artwork_pipeline(
        artist = params$artist,
        artwork = params$artwork,
        # ... other parameters
      )
      
      if (is.null(ll)) {
        warning("Pipeline Failed")
        NULL
      } else {
        message("Pipeline Completed")
        # Build result UI
        do.call("fun_buildAboutUI", list(ll))
      }
    },
    packages = "modUpload",
    globals = list("fun_buildAboutUI" = fun_buildAboutUI, "params" = params),
    seed = TRUE
  )
}
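Since run_pipeline() returns a promise, callers outside an ExtendedTask can attach handlers directly with {promises}. A sketch:

```r
run_pipeline(artist = r$artist, artwork = artcore::gen_artwork_id()) |>
  promises::then(
    onFulfilled = function(ui) message("About UI built"),
    onRejected  = function(err) warning("Pipeline error: ", conditionMessage(err))
  )
```

Note that the expression inside future_promise() runs in a separate R process, which is why fun_buildAboutUI and params must be passed explicitly via `globals`.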

Session Initialization

The pipeline requires a {future} plan to be set before any task is invoked:

modUploadServer <- function(id, r) {
  # Initialize async processing plan
  future::plan(future::multisession)
  
  shiny::moduleServer(id, function(input, output, session) {
    # ... server logic
  })
}
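If the plan is set per-session as above, it can be reset when the session ends. A hedged sketch — whether modUpload actually does this is an assumption here:

```r
shiny::moduleServer(id, function(input, output, session) {
  session$onSessionEnded(function() {
    future::plan(future::sequential)  # release multisession workers
  })
})
```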

Error Handling

Pipeline errors are captured and surfaced to the user:

# In the pipeline observer
if (is.null(result)) {
  shinyalert::shinyalert(
    "Oops", 
    "Something went wrong during processing.", 
    type = "error"
  )
}
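Because ExtendedTask$result() re-throws any error raised in the background process, wrapping the call in tryCatch keeps the observer alive and lets the error message reach the alert. A sketch:

```r
result <- tryCatch(
  pipeline$result(),
  error = function(e) {
    shinyalert::shinyalert("Oops", conditionMessage(e), type = "error")
    NULL  # fall through to the is.null(result) branch
  }
)
```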

Best Practices

  1. Always check demo mode before pipeline submission
  2. Validate inputs before submitting to pipeline
  3. Clean up observers on session end to prevent memory leaks
  4. Use unique IDs for pipeline tasks to support concurrent uploads
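Practices 1 and 2 can be sketched as a guard in the submit handler. `r$demo_mode` and the `submit` input are hypothetical names used for illustration:

```r
shiny::observeEvent(input$submit, {
  # 1. Check demo mode before pipeline submission
  if (isTRUE(r$demo_mode)) {
    shiny::showNotification("Uploads are disabled in demo mode", type = "warning")
    return(invisible(NULL))
  }

  # 2. Validate inputs; req() silently stops if anything is missing
  shiny::req(input$art_name, input$image_file, input$raw_file)

  pipeline$invoke(
    artist = r$artist,
    art_title = input$art_name,
    file_image = input$image_file$datapath
  )
})
```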