One of the features of the Synapse.org platform is the ability to host a crowd-sourced challenge. Hosting a challenge is a great way to crowd-source new computational methods for fundamental questions in systems biology and translational medicine.

...

| Property | Description | Example |
| --- | --- | --- |
| SYNAPSE_USERNAME | Synapse credentials under which the orchestrator will run. The provided user must have access to the evaluation queue(s) being serviced. | dream_user |
| SYNAPSE_PASSWORD | Password for SYNAPSE_USERNAME. This can be found under My Dashboard > Settings. | "abcdefghi1234==" |
| WORKFLOW_OUTPUT_ROOT_ENTITY_ID | synID of the "Logs" folder. Use the synID from Step 4. | syn123 |
| EVALUATION_TEMPLATES | JSON map of evaluation IDs to workflow repo archives, where each key is an evaluation ID and each value is the link address to the archive. Use the evaluation IDs from Step 3 as the key(s) and the synIDs from Step 5 as the value(s). | {"9810678": "syn456", "9810679": "syn456"} |
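Note that EVALUATION_TEMPLATES must be a single JSON string, which is easy to mangle when editing the configuration by hand. Below is a minimal sketch that composes the value with json.dumps so the quoting comes out right; it assumes the orchestrator reads these properties from a .env file, and the helper script itself is illustrative, not part of the setup steps:

Code Block
languagepy
import json

# Evaluation IDs from Step 3 mapped to the workflow archive synIDs from Step 5.
evaluation_templates = {
    "9810678": "syn456",
    "9810679": "syn456",
}

properties = {
    "SYNAPSE_USERNAME": "dream_user",
    "SYNAPSE_PASSWORD": "abcdefghi1234==",
    "WORKFLOW_OUTPUT_ROOT_ENTITY_ID": "syn123",  # "Logs" folder from Step 4
    # json.dumps yields the single-line JSON string shown in the table above.
    "EVALUATION_TEMPLATES": json.dumps(evaluation_templates),
}

# Write the properties in KEY=value form (assumed .env layout).
with open(".env", "w") as env_file:
    for key, value in properties.items():
        env_file.write(f"{key}={value}\n")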

...

If using model-to-data template:

| Script | TODO | Required? |
| --- | --- | --- |
| workflow.cwl | Provide the admin user ID or admin team ID for principalid (2 steps: set_submitter_folder_permissions, set_admin_folder_permissions) | yes |
| workflow.cwl | Update synapseid to the synID of the Challenge's goldstandard | yes |
| workflow.cwl | Set errors_only to false if an email notification about a valid submission should also be sent (2 steps: email_docker_validation, email_validation) | no |
| workflow.cwl | Provide the absolute path to the data directory, denoted as input_dir, to be mounted during the container runs | yes |
| workflow.cwl | Set store to false if log files should be withheld from the Participants | no |
| workflow.cwl | Add metrics and scores to private_annotations if they are to be withheld from the Participants | no |
| validate.cwl | Update the base image if the validation code is not Python | no |
| validate.cwl | Remove the sample validation code and replace it with validation code for the Challenge | yes |
| score.cwl | Update the base image if the scoring code is not Python | no |
| score.cwl | Remove the sample scoring code and replace it with scoring code for the Challenge | yes |
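For orientation, here is a minimal sketch of what validation code replacing the sample in validate.cwl might look like. The CSV format, the id and prediction column names, and the command-line arguments are illustrative assumptions; only the submission_status and submission_errors names follow the annotation conventions referenced elsewhere on this page.

Code Block
languagepy
#!/usr/bin/env python3
"""Illustrative validation script; replaces the sample code in validate.cwl."""
import argparse
import csv
import json

parser = argparse.ArgumentParser()
parser.add_argument("--submission_file", required=True)
parser.add_argument("--results", required=True)
args = parser.parse_args()

errors = []
try:
    with open(args.submission_file, newline="") as sub:
        reader = csv.DictReader(sub)
        # Assumed format: exactly two columns, id and prediction.
        if set(reader.fieldnames or []) != {"id", "prediction"}:
            errors.append("Predictions file must have columns: id, prediction")
        else:
            for row in reader:
                try:
                    float(row["prediction"])
                except ValueError:
                    errors.append(f"Prediction for id {row['id']} is not numeric")
except Exception as err:
    errors.append(f"Could not read predictions file: {err}")

# Write the status and any errors for the downstream annotation/email steps.
with open(args.results, "w") as out:
    json.dump(
        {
            "submission_status": "VALIDATED" if not errors else "INVALID",
            "submission_errors": "\n".join(errors),
        },
        out,
    )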

...

Code Block
languagepy
import synapseclient
import synapseutils

syn = synapseclient.login()

synapseutils.copyWiki(
    syn, "syn1234",              # synID of the staging site
    destinationId="syn2345",     # synID of the live site
    destinationSubPageId=999999  # ID following .../wiki/ in the live project URL
)

When using copyWiki, it is important to specify the destinationSubPageId parameter. This ID can be found in the URL of the live project, where it is the number following .../wiki/.

Warning: once copyWiki has been used, DO NOT RUN IT AGAIN!


Once the wiki has been copied over, all changes to the live project should now be synced with challengeutils' mirror-wiki.

...

We recommend that challenge organizers create a Submission View to easily track and monitor submissions as they come in. This table will be especially useful when participants need help with their submissions.
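A Submission View can be created in the Synapse UI, or programmatically as in the minimal sketch below (the project synID and evaluation IDs are placeholders; SubmissionViewSchema assumes synapseclient 2.3 or later):

Code Block
languagepy
import synapseclient
from synapseclient import SubmissionViewSchema

syn = synapseclient.login()

# Scope the view to the challenge's evaluation queue(s) -- placeholder IDs.
view = syn.store(
    SubmissionViewSchema(
        name="Submission View",
        parent="syn1234",               # the challenge project
        scopes=["9810678", "9810679"],  # evaluation ID(s) from Step 3
    )
)
print(view.id)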

...

| Column Name | Description | Facet values? |
| --- | --- | --- |
| evaluationid | Evaluation ID (evaluation ID, but rendered as evaluation name) – recommended for SubmissionViews with multiple queues in scope | Recommended |
| id | Submission ID | |
| createdOn | Date and time of the submission (in Epoch, but rendered as MM/dd/yyyy, hh:mm:ss) | |
| submitterid | User or team who submitted (user or team ID, but rendered as username or team name) | Recommended |
| dockerrepositoryname | Docker image name – recommended for model-to-data challenges | Not recommended |
| dockerdigest | Docker SHA digest – recommended for model-to-data challenges | Not recommended |
| status | Workflow status of the submission (one of [RECEIVED, EVALUATION_IN_PROGRESS, ACCEPTED, INVALID]) | Recommended |
| submission_status | Evaluation status of the submission (one of [None, VALIDATED, SCORED, INVALID]) | Recommended |
| submission_errors | Validation errors for the predictions file (if any) | Not recommended |
| orgSagebionetworksSynapseWorkflowOrchestratorSubmissionFolder | synID of the submission's logs folder | Not recommended |
| prediction_fileid | synID of the predictions file (if any) | Not recommended |
| (any annotations related to scores) | Submission annotations; the names used depend on the annotations added in the scoring step of the workflow | |
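Because a Submission View behaves like any other Synapse table, it can also be queried from a script, which is handy when triaging failed submissions. A small sketch, with a placeholder synID for the view:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# List submissions that failed either the workflow or the validation step.
results = syn.tableQuery(
    "SELECT id, submitterid, submission_errors FROM syn789 "
    "WHERE status = 'INVALID' OR submission_status = 'INVALID'"
)
for row in results:
    print(row)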

...

For any changes to the CWL scripts or run_docker.py, simply make the edits as needed, then push the changes. We highly recommend conducting dry runs whenever the workflow changes, so that errors are caught and addressed in a timely manner.
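A dry run can be as simple as submitting a known-good file (or a deliberately invalid one) with an admin account and confirming that the updated workflow handles it as expected. A minimal sketch, with placeholder synID and evaluation ID:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# Submit a test file that already lives in Synapse to the evaluation queue.
submission = syn.submit(
    evaluation="9810678",            # evaluation ID from Step 3
    entity=syn.get("syn999"),        # placeholder test file
    name="dry run - workflow update",
)
print(submission.id)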

...