Overview
modUpload uses asynchronous processing to handle artwork uploads without blocking the Shiny session. This guide explains the pipeline architecture and patterns for developers working with or extending the module.
Architecture
The processing pipeline consists of three main components:
- Pipeline Task - Created by new_pipeline_task(), manages async execution
- Pipeline Runner - run_pipeline() orchestrates the processing stages
- Progress Monitoring - Waiter screens and notifications for user feedback
Pipeline Stages
When an artwork is submitted, it passes through these stages (sketched as code after the list):
- Authentication - Verify canvas signature using {artpixeltrace}
- Deduplication - Check image hash against existing artworks
- File Processing - Extract metadata, generate thumbnails
- Analysis - Run image/video analysis via {artpipelines}
- Storage - Upload to CDN and create database records
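To make the sequence concrete, the sketch below strings the stages together as plain function calls. It is illustrative only: in modUpload most of this work happens inside artpipelines::launch_artwork_pipeline(), and verify_canvas_signature(), extract_metadata(), and store_artwork() are hypothetical stand-ins; only image_hash_exists() appears elsewhere in this guide, and its signature is assumed here.
# Illustrative stage sequence; helpers marked "hypothetical" do not exist in modUpload.
process_stages <- function(params) {
  # 1. Authentication - verify the canvas signature (hypothetical helper)
  if (!verify_canvas_signature(params$file_canvas)) {
    stop("Canvas signature could not be verified")
  }
  # 2. Deduplication - check the image hash against existing artworks (signature assumed)
  if (image_hash_exists(params$file_image)) {
    stop("This image has already been uploaded")
  }
  # 3. File Processing - extract metadata, generate thumbnails (hypothetical helper)
  meta <- extract_metadata(params$file_image)
  # 4. Analysis - image/video analysis via {artpipelines}
  analysis <- artpipelines::launch_artwork_pipeline(
    artist = params$artist,
    artwork = params$artwork
  )
  # 5. Storage - upload to CDN and create database records (hypothetical helper)
  store_artwork(params, meta, analysis)
}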
Creating Pipeline Tasks
The new_pipeline_task() function creates an ExtendedTask for async processing:
# In modUploadServer
pipeline <- new_pipeline_task()
# Submit files for processing
pipeline$invoke(
artist = r$artist,
artwork = artcore::gen_artwork_id(),
collection = input$art_collection,
art_title = input$art_name,
artist_name = r$appdata$artist$info$artist_name,
art_story = input$art_story,
file_image = input$image_file$datapath,
file_canvas = input$raw_file$datapath,
file_video = input$video_file$datapath,
file_variants = input$variants$datapath,
zip_frames = input$frames_zip$datapath,
file_stats = input$stats_file$datapath
)
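For orientation, here is a minimal sketch of what new_pipeline_task() might look like internally, assuming it does nothing more than wrap run_pipeline() in a shiny::ExtendedTask; the real constructor may add extra bookkeeping:
# Sketch (assumption): expose run_pipeline() as an ExtendedTask so that
# invoke() queues the work and status()/result() become reactive reads.
new_pipeline_task <- function() {
  shiny::ExtendedTask$new(function(...) {
    # run_pipeline() returns a promise; ExtendedTask resolves it
    run_pipeline(...)
  })
}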
Monitoring Pipeline Status
The pipeline status can be monitored using reactive observers:
shiny::observeEvent(pipeline$status(), {
status <- pipeline$status()
if (status == "running") {
# Show progress notification
showNotification("Processing artwork...", type = "message")
} else if (status == "success") {
# Pipeline completed
result <- pipeline$result()
if (!is.null(result)) {
showNotification("Artwork ready!", type = "success")
# Refresh appdata with new artwork
r$appdata <- artutils::get_appdata(r$artist, r$artwork)
} else {
showNotification("Processing failed", type = "error")
}
}
})
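Notifications are not the only feedback channel: the Architecture section also lists Waiter screens. A sketch of that pattern, assuming the module uses the {waiter} package with default styling (the UI side would also need waiter::useWaiter()):
# Sketch (assumption): full-screen overlay while the pipeline is running.
w <- waiter::Waiter$new()

shiny::observeEvent(pipeline$status(), {
  if (pipeline$status() == "running") {
    w$show()  # cover the UI while the background task runs
  } else {
    w$hide()  # remove the overlay once the task settles
  }
})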
Future/Promises Pattern
The pipeline uses {future} for async execution and {promises} for result handling:
run_pipeline <- function(...) {
params <- list(...)
promises::future_promise(
{
# This runs in a background R process
ll <- artpipelines::launch_artwork_pipeline(
artist = params$artist,
artwork = params$artwork,
# ... other parameters
)
if (is.null(ll)) {
warning("Pipeline Failed")
NULL
} else {
message("Pipeline Completed")
# Build result UI
do.call("fun_buildAboutUI", list(ll))
}
},
packages = "modUpload",
globals = list("fun_buildAboutUI" = fun_buildAboutUI, "params" = params),
seed = TRUE
)
}
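Because run_pipeline() returns a promise, it can also be exercised outside the ExtendedTask wrapper, which is handy for testing the pipeline in isolation. A sketch using {promises} directly; the argument values are placeholders and a real call needs the file paths shown earlier:
# Sketch: consume the promise returned by run_pipeline() directly.
p <- run_pipeline(artist = "artist-1", artwork = "artwork-1")

promises::then(
  p,
  onFulfilled = function(ui) message("Pipeline finished"),
  onRejected = function(err) message("Pipeline error: ", conditionMessage(err))
)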
Session Initialization
The pipeline requires proper future plan initialization:
modUploadServer <- function(id, r) {
# Initialize async processing plan
future::plan(future::multisession)
shiny::moduleServer(id, function(input, output, session) {
# ... server logic
})
}
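A hedged variation on the plan setup: multisession workers are a finite resource, so it can help to cap the pool and restore the default plan when the app stops. The worker count below is arbitrary:
# Sketch: cap the background worker pool and reset the plan on shutdown.
future::plan(future::multisession, workers = 2)

shiny::onStop(function() {
  future::plan(future::sequential)
})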
Error Handling
Pipeline errors are captured and surfaced to the user:
# In the pipeline observer
if (is.null(result)) {
shinyalert::shinyalert(
"Oops",
"Something went wrong during processing.",
type = "error"
)
}
Best Practices
- Always check demo mode before pipeline submission
- Validate inputs before submitting to pipeline
- Clean up observers on session end to prevent memory leaks (see the sketch after this list)
- Use unique IDs for pipeline tasks to support concurrent uploads
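A minimal sketch of the observer clean-up practice, assuming the status observer is kept in a variable so it can be destroyed when the session ends:
# Keep a handle on the observer so it can be destroyed with the session.
status_obs <- shiny::observeEvent(pipeline$status(), {
  # ... handle status changes as shown above
})

session$onSessionEnded(function() {
  status_obs$destroy()
})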
Related Functions
- new_pipeline_task() - Create async pipeline task
- run_pipeline() - Execute the processing pipeline
- image_hash_exists() - Check for duplicate images
- buildAboutArtUI() - Generate result UI after processing
