At the end of every Challenge, we aim to collect writeups from all participating users/teams, in which they provide information about their submission(s). Currently, writeups are collected as Synapse Projects (sample template here), and we ask participants to include information such as their methodologies, external data sources (if any), and source code and/or Docker images. Writeups are typically *required* in every Challenge in order for a team to be considered for “top performer” eligibility and other incentives, including byline authorship.
This article outlines the steps to collect and validate writeups on Synapse, as well as how to display them in a leaderboard alongside the evaluation results.
...
Workflow Setup
Requirements
- Synapse account
- (For local testing) CWL runner of choice, e.g. `cwltool`
- Access to cloud compute services, e.g. AWS, GCP, etc.
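Before wiring the checks into a CWL step, the core writeup validation rules can be prototyped as plain Python. Everything below is an illustrative sketch, not the actual workflow code: the Challenge site ID, Organizers team ID, and function name are all hypothetical.

```python
# Hypothetical IDs for illustration only.
CHALLENGE_SITE_ID = "syn1234567"   # the Challenge's own Synapse Project
ORGANIZERS_TEAM_ID = "3400000"     # the Organizers team

def validate_writeup(entity_id, entity_type, shared_with, is_public):
    """Return a list of validation errors for a writeup submission.

    shared_with -- set of principal IDs the project is shared with.
    An empty list means the writeup is valid.
    """
    errors = []
    # Writeups must be Synapse Projects, per the Challenge template.
    if entity_type != "project":
        errors.append("Submission must be a Synapse Project.")
    # Submitting the Challenge site itself is not a valid writeup.
    if entity_id == CHALLENGE_SITE_ID:
        errors.append("Submission cannot be the Challenge site itself.")
    # Private writeups must at least be shared with the Organizers.
    if not is_public and ORGANIZERS_TEAM_ID not in shared_with:
        errors.append("Private writeups must be shared with the Organizers team.")
    return errors
```

For example, a private project shared with the Organizers team passes, while submitting the Challenge site itself fails.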
...
| Test Case | Workflow Configuration | Expected Outcome |
| --- | --- | --- |
| Submitting the Challenge site | | |
| Submitting a private Synapse project | Lines 49-50 are used | Writeup should be accessible to the Organizers team |
| Submitting a private Synapse project | Lines 47-48 are used | Writeup should be publicly accessible |
| Submitting a private Synapse project that is shared with the Organizers team | Lines 49-50 are used | Writeup should be accessible to the Organizers team |
| Submitting a private Synapse project that is shared with the Organizers team | Lines 47-48 are used | Writeup should be publicly accessible |
| Submitting a public Synapse project | Lines 47-48 and/or lines 49-50 are used | |
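To exercise the test cases above, you can submit a test project to the queue with the Synapse Python client. This is a minimal sketch, assuming `synapseclient` is installed and your credentials are cached; the evaluation ID, project ID, and function name are placeholders, not part of the workflow itself.

```python
def submit_test_writeup(evaluation_id, project_id):
    """Submit a Synapse Project to the writeup queue for testing.

    Hypothetical helper: requires the `synapseclient` package and a
    configured Synapse login; both IDs are placeholders for your own
    evaluation queue and test project.
    """
    import synapseclient
    syn = synapseclient.login()  # uses cached credentials
    # Fetch the entity reference only; no files need to be downloaded.
    project = syn.get(project_id, downloadFile=False)
    return syn.submit(evaluation=evaluation_id, entity=project,
                      name="writeup workflow test")
```

Run this once per test case (for each project visibility/sharing setup in the table), then confirm the workflow produces the expected outcome.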
Once the writeup workflow behaves as expected, remember to open the queue to the Challenge participants! You can do so by updating its Sharing Settings so that the Participants team has Can submit permissions.
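This last step can also be done programmatically with the Synapse Python client. A hedged sketch, assuming `synapseclient` is installed and you administer the queue; the evaluation and team IDs are placeholders, and the web UI's Sharing Settings works equally well.

```python
def open_queue(evaluation_id, participants_team_id):
    """Grant the Participants team submit access on the writeup queue.

    Hypothetical helper: requires `synapseclient` and admin rights on
    the evaluation queue; both IDs are placeholders.
    """
    import synapseclient
    syn = synapseclient.login()
    evaluation = syn.getEvaluation(evaluation_id)
    # "Can submit" in the web UI corresponds to READ + SUBMIT access types.
    syn.setPermissions(evaluation,
                       principalId=int(participants_team_id),
                       accessType=["READ", "SUBMIT"])
```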