At the end of every Challenge, we aim to collect writeups from all participating users/teams, in which they provide information about their submission(s). Currently, writeups are collected as Synapse Projects (sample template here), and we ask participants to include information such as their methodologies, external data sources (if any), and source code and/or Docker images. Typically, writeups are *required* in every Challenge in order to be eligible for “top performer” status and other incentives, including byline authorship.
...
Synapse account
(for local testing) CWL runner of choice, e.g. cwltool
Access to cloud compute services, e.g. AWS, GCP, etc.
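For the local-testing prerequisite, a typical cwltool invocation looks like this (the filenames are placeholders, not the workflow's actual file names):

```shell
# Check that the CWL document parses and is schema-valid
cwltool --validate workflow.cwl

# Run the workflow locally against a sample job file
cwltool workflow.cwl sample-job.yaml
```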
Outcome
Once set up, the workflow will continuously monitor the writeup queue for new submissions, perform a quick validation, and, if valid, create an archive of the submission. Archiving ensures that a copy of the writeup is always available to the Organizers team, in case the Participant/Team deletes the original version or revokes access to it.
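As a rough sketch of that monitor/validate/archive loop, assuming the synapseclient and synapseutils Python packages — the validation rule and all names here are illustrative, not the workflow's actual checks:

```python
import re


def is_valid_writeup(submitted_id: str, challenge_site_id: str) -> bool:
    """Illustrative validation: the submission must be a Synapse entity ID,
    and must not be the Challenge site itself."""
    if not re.fullmatch(r"syn\d+", submitted_id):
        return False
    return submitted_id != challenge_site_id


def monitor(evaluation_id: str, challenge_site_id: str) -> None:
    """Poll the writeup queue and archive valid submissions.
    Requires Synapse credentials; IDs are placeholders."""
    import synapseclient
    import synapseutils

    syn = synapseclient.login()
    for submission in syn.getSubmissions(evaluation_id, status="RECEIVED"):
        if is_valid_writeup(submission["entityId"], challenge_site_id):
            # Archive: copy the project so Organizers always retain a snapshot
            synapseutils.copy(syn, submission["entityId"])
```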
Steps
1. On the live version of the Challenge site, go to the Challenge tab and check whether there is already an Evaluation queue for collecting writeups (skip to Step 3 if so).
Note |
---|
By default, Evaluation entities are only accessible to the Evaluation creator; if you are not currently an admin for the Challenge, double-check with the other Organizers to ensure that you have the correct permissions on all available queues. |
...
3. Update the quota of the writeup Evaluation queue, such as the Duration (Round Start, Round End) and Submission Limits. Generally, there are no submission limits for writeups, so this field can be left blank.
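If you prefer to set the quota programmatically rather than through the web UI, a sketch assuming the Synapse SubmissionQuota fields (`firstRoundStart`, `roundDurationMillis`, `numberOfRounds`, `submissionLimit`) — the dates and the commented storing snippet are illustrative:

```python
from datetime import datetime
from typing import Optional


def build_quota(round_start: datetime, round_end: datetime,
                submission_limit: Optional[int] = None) -> dict:
    """Build an Evaluation quota payload; submissionLimit is omitted when
    None, since writeup queues generally have no submission limit."""
    quota = {
        "firstRoundStart": round_start.strftime("%Y-%m-%dT%H:%M:%S.000Z"),
        "roundDurationMillis": int(
            (round_end - round_start).total_seconds() * 1000),
        "numberOfRounds": 1,
    }
    if submission_limit is not None:
        quota["submissionLimit"] = submission_limit
    return quota


# Applying it (requires credentials; the Evaluation ID is a placeholder):
#   import synapseclient
#   syn = synapseclient.login()
#   evaluation = syn.getEvaluation("9610000")
#   evaluation.quota = build_quota(start, end)
#   syn.store(evaluation)
```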
Panel |
---|
Get the 7-digit Evaluation ID as it will later be needed in Step 9. |
4. Go to the Sage-Bionetworks-Challenges GitHub organization, and create a new repository.
For Repository template, select “Sage-Bionetworks-Challenges/writeup-workflow”. The Owner can be left as the default (“Sage-Bionetworks-Challenges”), and the Repository name can be anything you like.
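If you prefer the command line, the same repository can be created from the template with the GitHub CLI (the new repository name here is a placeholder):

```shell
# Create a new repo in the org from the writeup-workflow template
gh repo create Sage-Bionetworks-Challenges/my-challenge-writeups \
  --template Sage-Bionetworks-Challenges/writeup-workflow \
  --public
```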
...
Panel |
---|
This file is what links the Evaluation queue to the orchestrator. Make note of this File ID for use later in Step 9. |
8. Add an annotation called ROOT_TEMPLATE to the file by clicking on the annotations icon:
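If you prefer to script this step, a sketch assuming the synapseclient Python package — the file ID and template path are placeholders; only the annotation key ROOT_TEMPLATE comes from the step above:

```python
def set_root_template(syn, file_id: str, template_path: str):
    """Set the ROOT_TEMPLATE annotation on the workflow file.

    `syn` is a logged-in synapseclient.Synapse instance; entity
    annotations can be set via item assignment before storing.
    """
    workflow_file = syn.get(file_id, downloadFile=False)
    workflow_file["ROOT_TEMPLATE"] = template_path
    return syn.store(workflow_file)


# Usage (requires credentials; IDs are placeholders):
#   import synapseclient
#   syn = synapseclient.login()
#   set_root_template(syn, "syn1234567", "workflow.cwl")
```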
...
Test Case | Workflow Configurations | Expected Outcome
---|---|---
Submitting the Challenge site | |
Submitting a private Synapse project | Lines 49-50 are used | Writeup should be accessible to the Organizers team
Submitting a private Synapse project | Lines 47-48 are used | Writeup should be publicly accessible
Submitting a private Synapse project that is shared with the Organizers team | Lines 49-50 are used | Writeup should be accessible to the Organizers team
Submitting a private Synapse project that is shared with the Organizers team | Lines 47-48 are used | Writeup should be publicly accessible
Submitting a public Synapse project | Lines 47-48 and/or lines 49-50 are used |
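Each of the test cases above boils down to submitting a test project to the writeup queue. A minimal sketch, assuming the synapseclient Python package (the Evaluation and project IDs are placeholders):

```python
def submit_test_writeup(syn, evaluation_id: str, project_id: str):
    """Submit a test project to the writeup Evaluation queue.

    `syn` is a logged-in synapseclient.Synapse instance.
    """
    project = syn.get(project_id, downloadFile=False)
    return syn.submit(evaluation_id, project, name="test writeup")


# Usage (requires credentials; IDs are placeholders):
#   import synapseclient
#   syn = synapseclient.login()
#   submit_test_writeup(syn, "9610000", "syn7654321")
```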
Once you are satisfied that the writeup workflow meets your expectations, remember to open the queue to the Challenge participants! You can do so by updating its Sharing Settings so that the Participants team has “Can submit” permissions.
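This last step can also be done programmatically. A sketch assuming the synapseclient Python package, where the Evaluation and team IDs are placeholders and mapping “Can submit” to the READ + SUBMIT access types is an assumption:

```python
def open_queue_to_participants(syn, evaluation_id: str, team_id: str):
    """Grant the Participants team "Can submit" on the writeup queue.

    `syn` is a logged-in synapseclient.Synapse instance; "Can submit"
    is assumed to correspond to the READ + SUBMIT access types.
    """
    evaluation = syn.getEvaluation(evaluation_id)
    return syn.setPermissions(evaluation, principalId=team_id,
                              accessType=["READ", "SUBMIT"])


# Usage (requires credentials; IDs are placeholders):
#   import synapseclient
#   syn = synapseclient.login()
#   open_queue_to_participants(syn, "9610000", "3380000")
```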
...