This guide helps organizers create a space within Synapse to host a crowd-sourced Challenge. Challenges are community competitions designed to crowd-source new computational methods for fundamental questions in systems biology and translational medicine. Learn more about Challenges and see examples of past and current projects by visiting Challenges and Benchmarking.

...

Code Block
Logs
 ├── submitteridA
 │  ├── submission01
 │  │  ├── submission01_log.txt
 │  │  └── submission01_logs.zip
 │  ├── submission02
 │  │  ├── submission02_log.txt
 │  │  └── submission02_logs.zip
 │ ...
 │
 ├── submitteridA_LOCKED
 │  ├── submission01
 │  │  └── predictions.csv
 │  ├── submission02
 │  │  └── predictions.csv
 │ ...
 │
...

...

Launch the Challenge

...

Requirements

Important TODOs:

  1. (warning) Before proceeding with the launch, contact Sage Governance to ensure that a clickwrap is in place for Challenge registration. (warning) You will need to provide Governance with the Synapse ID of the live Challenge site, as well as the team ID of the Participants team.

  2. Share all needed Evaluation queues with the Participants team with Can submit permissions. Once the Challenge is over, we recommend updating their permissions to Can view – this will help enhance their user experience with Evaluation queues.

  3. After the Challenge is launched, create a Folder named “Data” and update its Sharing Settings. (warning) Share the “Data” Folder with the Participants team only! (warning) DO NOT make the Folder public or accessible to all registered Synapse users. The Local Sharing Settings for the “Data” Folder should look something like this:

    [Screenshot: Local Sharing Settings for the “Data” Folder, shared only with the Participants team]

  4. Upload any Challenge Data that is to be provided to Participants into the “Data” Folder. Remember to only do this once you have ensured that the Folder is only accessible to the Participants team. (A scripted sketch of steps 2–4 follows this list.)
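
For reference, steps 2–4 above can also be scripted with the Synapse Python client (synapseclient). The sketch below is only illustrative: the project ID, evaluation ID, team ID, and file path are hypothetical placeholders, and the resulting Sharing Settings should still be verified in the web UI.

Code Block
import synapseclient
from synapseclient import Folder, File

syn = synapseclient.login()

# Hypothetical IDs -- replace with your live project, Evaluation queue, and team.
live_project_id = "syn00000000"
evaluation_id = "9610000"
participants_team_id = "3300000"

# Step 2: give the Participants team "Can submit" access to an Evaluation queue.
# (After the Challenge is over, downgrade to "Can view" with accessType=["READ"].)
evaluation = syn.getEvaluation(evaluation_id)
syn.setPermissions(evaluation, participants_team_id, accessType=["READ", "SUBMIT"])

# Step 3: create the "Data" Folder and grant download access to the Participants
# team only. Confirm in the web UI that the Folder is NOT shared with the public
# or with all registered Synapse users.
data_folder = syn.store(Folder("Data", parent=live_project_id))
syn.setPermissions(data_folder, participants_team_id, accessType=["READ", "DOWNLOAD"])

# Step 4: upload the Challenge Data only after the sharing settings are confirmed.
syn.store(File("challenge_data.csv", parent=data_folder))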

To launch the Challenge, that is, to copy the Wiki pages of the staging site over to the live site, use synapseutils' copyWiki in a Python script, e.g.

...

When using copyWiki, it is important to specify the destinationSubPageId parameter. This ID can be found in the URL of the live site: it is the integer that follows .../wiki/ (the <some number> in .../wiki/<some number>).
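
A minimal sketch of such a script is shown below, assuming hypothetical staging and live project IDs and a hypothetical wiki ID taken from the live site's URL.

Code Block
import synapseclient
import synapseutils

syn = synapseclient.login()

# Hypothetical IDs -- replace with your staging project, live project, and the
# integer following .../wiki/ in the live site's URL.
staging_project_id = "syn11111111"
live_project_id = "syn22222222"
live_wiki_id = 123456

# Copy the Wiki pages of the staging site over to the live site.
# Run this ONCE only; afterwards, keep the sites in sync with
# challengeutils' mirror-wiki instead.
synapseutils.copyWiki(
    syn,
    entity=staging_project_id,
    destinationId=live_project_id,
    destinationSubPageId=live_wiki_id,
)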

(warning) Once copyWiki has been used once, DO NOT USE IT AGAIN!! (warning)


Following this action, all further changes should be synced over to the live site with challengeutils' mirror-wiki. More on updating the Wikis can be found under the Update the Challenge section below. In addition to copying over the Wiki pages, share all needed Evaluation queues with the Participants team with Can submit permissions. Once the Challenge is over, we recommend updating their permissions to Can view (this helps keep their interface clean).

...

Monitor the Submissions

We recommend that Challenge Organizers create a Submission View to easily track and monitor submissions as they come in. This table will be especially useful when Participants need help with their submissions.
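
One way to set this up is with the Synapse Python client's SubmissionViewSchema, sketched below; the parent project ID and the Evaluation queue IDs in scope are hypothetical placeholders.

Code Block
import synapseclient
from synapseclient import SubmissionViewSchema

syn = synapseclient.login()

# Hypothetical IDs -- replace with your live project and Evaluation queue(s).
view = syn.store(SubmissionViewSchema(
    name="Challenge Submission View",
    parent="syn22222222",
    scopes=["9610000", "9610001"],  # every queue to monitor
))
print(view.id)  # open this Synapse ID in the browser to track submissions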

...

Column Name | Description | Facet values?
evaluationid | Evaluation ID (rendered as the evaluation name) – recommended for SubmissionViews with multiple queues in scope | (tick) Recommended
id | Submission ID |
createdOn | Date and time of the submission (in Epoch, but rendered as MM/dd/yyyy, hh:mm:ss) |
submitterid | User or team who submitted (user or team ID, but rendered as username or team name) | (tick) Recommended
dockerrepositoryname | Docker image name – recommended for model-to-data challenges | (error) Not recommended
dockerdigest | Docker SHA digest – recommended for model-to-data challenges | (error) Not recommended
status | Workflow status of the submission (one of [RECEIVED, EVALUATION_IN_PROGRESS, ACCEPTED, INVALID]) | (tick) Recommended
submission_status | Evaluation status of the submission (one of [None, VALIDATED, SCORED, INVALID]) | (tick) Recommended
submission_errors | Validation errors for the predictions file (if any) | (error) Not recommended
orgSagebionetworksSynapseWorkflowOrchestratorSubmissionFolder | Synapse ID of the submission’s logs folder | (error) Not recommended
prediction_fileid | Synapse ID of the predictions file (if any) | (error) Not recommended
(any annotations related to scores) | Submission annotations – the names depend on which annotations were added in the scoring step of the workflow |
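
Once the view exists, these columns can be queried like any other Synapse table. A small sketch, again using a hypothetical view ID, pulls all submissions still waiting to be evaluated:

Code Block
import synapseclient

syn = synapseclient.login()

# Hypothetical view ID -- replace with the Submission View created above.
results = syn.tableQuery(
    "SELECT id, submitterid, status, submission_status "
    "FROM syn33333333 WHERE status = 'RECEIVED'"
)
print(results.asDataFrame())  # requires pandas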

...