One of the features of the Synapse.org platform is the ability to host a crowd-sourced challenge. Hosting a challenge is a great way to crowd-source new computational methods for fundamental questions in systems biology and translational medicine.

...

Important TODOs:

(info) The following steps (1-5) should be performed in the live project.

  1. Before proceeding with the launch, we recommend contacting Sage Governance to add a clickwrap for challenge registration. With a clickwrap in place, interested participants can register only after agreeing to the terms and conditions of the challenge data usage.

    • If you are a Sage employee: submit a Jira ticket to the Governance board with the synID of the live project, as well as the team ID of the participants team.

  2. Share all needed evaluation queues with the participants team with Can submit permissions. Once the challenge is over, we recommend updating the permissions to Can view to prevent late submissions (a scripted sketch of steps 2-5 follows this list).

  3. We also recommend sharing the evaluation queues with the general public so that the leaderboards are openly accessible.

  4. After the challenge is launched, create a folder called “Data” and update its Sharing Settings. Share the “Data” folder with the participants team only. Do not make the folder public or accessible to all registered Synapse users; only the participants team should be listed in the folder's Sharing Settings.

  5. Upload any challenge data that is to be provided to the participants to the “Data” folder. DO NOT UPLOAD DATA until you have updated its sharing settings.
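As referenced above, here is a minimal sketch of steps 2-5 using the Synapse Python client. All IDs are hypothetical placeholders, and whether setPermissions accepts an evaluation ID (steps 2-3) may depend on your synapseclient version; if it does not, update the queue's ACL through the web UI instead.

    import synapseclient
    from synapseclient import File, Folder

    syn = synapseclient.login()

    PARTICIPANTS_TEAM_ID = "3456789"  # hypothetical participants team ID
    LIVE_PROJECT_ID = "syn22222222"   # hypothetical live project synID
    EVALUATION_ID = "9614543"         # hypothetical evaluation queue ID
    PUBLIC_PRINCIPAL_ID = 273949      # Synapse's built-in PUBLIC group (anyone on the web)

    # Step 2: grant the participants team "Can submit" on the queue. After the
    # challenge, re-run with accessType=["READ"] ("Can view") to stop late submissions.
    syn.setPermissions(EVALUATION_ID, principalId=PARTICIPANTS_TEAM_ID,
                       accessType=["READ", "SUBMIT"])

    # Step 3: make the queue (and thus the leaderboard) publicly readable.
    syn.setPermissions(EVALUATION_ID, principalId=PUBLIC_PRINCIPAL_ID,
                       accessType=["READ"])

    # Step 4: create the "Data" folder and share it with the participants team only.
    data_folder = syn.store(Folder("Data", parent=LIVE_PROJECT_ID))
    syn.setPermissions(data_folder, principalId=PARTICIPANTS_TEAM_ID,
                       accessType=["READ", "DOWNLOAD"])

    # Step 5: upload challenge data only AFTER the sharing settings above are in place.
    syn.store(File("training_data.csv", parent=data_folder))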

...

When using copyWiki, it is important to specify the destinationSubPageId parameter.  This ID can be found in the URL of the live project, where it is the number following .../wiki/.

(warning) Once copyWiki has been run, DO NOT RUN IT AGAIN! (warning)
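For reference, a minimal sketch using synapseutils.copyWiki from the Synapse Python client is shown below; both synIDs and the destinationSubPageId value are hypothetical placeholders.

    import synapseclient
    import synapseutils

    syn = synapseclient.login()

    # One-time copy of the staging project's wiki into the live project.
    synapseutils.copyWiki(
        syn,
        entity="syn11111111",         # staging project (source of the wiki)
        destinationId="syn22222222",  # live project
        destinationSubPageId=599988,  # the number after .../wiki/ in the live project's URL
    )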


Once the wiki has been copied over, all changes to the live project should now be synced with challengeutils' mirror-wiki.
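For example, a sync from the staging project to the live project might look like the following; the subcommand is spelled mirrorwiki in recent challengeutils releases (check challengeutils --help for your version), and both synIDs are hypothetical placeholders.

    # Push wiki edits from the staging project (source) to the live project (target)
    challengeutils mirrorwiki syn11111111 syn22222222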

...

| Column Name | Description | Facet values? |
| --- | --- | --- |
| evaluationid | Evaluation ID (rendered as the evaluation name); recommended for SubmissionViews with multiple queues in scope | (tick) Recommended |
| id | Submission ID | |
| createdOn | Date and time of the submission (Epoch, rendered as MM/dd/yyyy, hh:mm:ss) | |
| submitterid | User or team who submitted (user or team ID, rendered as the username or team name) | (tick) Recommended |
| dockerrepositoryname | Docker image name; recommended for model-to-data challenges | (error) Not recommended |
| dockerdigest | Docker SHA digest; recommended for model-to-data challenges | (error) Not recommended |
| status | Workflow status of the submission (one of RECEIVED, EVALUATION_IN_PROGRESS, ACCEPTED, INVALID) | (tick) Recommended |
| submission_status | Evaluation status of the submission (one of None, VALIDATED, SCORED, INVALID) | (tick) Recommended |
| submission_errors | Validation errors for the predictions file (if any) | (error) Not recommended |
| orgSagebionetworksSynapseWorkflowOrchestratorSubmissionFolder | synID of the submission's logs folder | (error) Not recommended |
| prediction_fileid | synID of the predictions file (if any) | (error) Not recommended |
| (any annotations related to scores) | Submission annotations; names depend on the annotations used in the scoring step of the workflow | |
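For reference, here is a hedged sketch of creating a SubmissionView with these columns via the Synapse Python client, assuming your synapseclient version provides SubmissionViewSchema; the project and evaluation IDs are hypothetical placeholders.

    import synapseclient
    from synapseclient import SubmissionViewSchema

    syn = synapseclient.login()

    # A view over one or more evaluation queues in the live project.
    view = SubmissionViewSchema(
        name="Submissions",
        parent="syn22222222",           # live project (hypothetical)
        scopes=["9614543", "9614544"],  # evaluation queue IDs (hypothetical)
        addDefaultViewColumns=True,     # id, createdOn, submitterid, etc.
        addAnnotationColumns=True,      # status annotations, scores, etc.
    )
    view = syn.store(view)

Facets for the recommended columns (evaluationid, submitterid, status, submission_status) can then be enabled in the view's schema editor on the web.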

...