At the end of every Challenge, we aim to collect writeups from all participating users/teams, in which they provide information about their submission(s). Currently, writeups are collected as Synapse Projects (sample template here), and we ask participants to include information such as their methodologies, external data sources (if any), and source code and/or Docker images. Writeups are typically *required* in order to be considered for “top performer” eligibility and other incentives, including byline authorship.
...
a) tack the evaluation and workflow onto an existing instance running the orchestrator, by adding another key-value pair to EVALUATION_TEMPLATES in the .env file
b) create a new instance (a t3.small would be sufficient) and set up the orchestrator on that machine. See Steps 7-9 and 11 of Creating and Managing a Challenge for more details on how to set this up.
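For option (a), the change is a one-line addition to the orchestrator's .env file. A minimal sketch is below; the evaluation ID and workflow archive URL are placeholders, not real values, and assume EVALUATION_TEMPLATES is a JSON map from evaluation queue IDs to workflow template locations:

```
# Hypothetical .env fragment: map the new writeup evaluation queue ID
# to its workflow template (both values below are placeholders).
EVALUATION_TEMPLATES={"9615432": "https://github.com/example-org/writeup-workflow/archive/main.zip"}
```

After editing the .env file, the orchestrator typically needs to be restarted to pick up the new mapping.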
...
Test Case | Workflow Configurations | Expected Outcome |
---|---|---|
Submitting the Challenge site | | |
Submitting a private Synapse project | Lines 49-50 are used | Writeup should be accessible to the Organizers team |
Submitting a private Synapse project | Lines 47-48 are used | Writeup should be publicly accessible |
Submitting a private Synapse project that is shared with the Organizers team | Lines 49-50 are used | Writeup should be accessible to the Organizers team |
Submitting a private Synapse project that is shared with the Organizers team | Lines 47-48 are used | Writeup should be publicly accessible |
Submitting a public Synapse project | Lines 47-48 and/or lines 49-50 are used | |
Once you are satisfied that the writeup workflow meets your expectations, remember to open the queue to the Challenge participants! You can do so by updating its Sharing Settings so that the Participants team has Can submit permissions.
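The same Sharing Settings change can also be scripted with the Synapse Python client, which may be convenient if you manage several queues. This is a sketch, not the required method: the evaluation ID and team ID are placeholders, and it assumes "Can submit" corresponds to the READ + SUBMIT access types:

```python
# "Can submit" on an evaluation queue corresponds to READ + SUBMIT access.
ACCESS_SUBMIT = ["READ", "SUBMIT"]

def open_queue(evaluation_id, participants_team_id):
    """Grant the Participants team 'Can submit' on an evaluation queue.

    Requires Synapse credentials; both IDs are placeholders.
    """
    # Imported here so the helper can be defined without the client installed.
    import synapseclient

    syn = synapseclient.login()
    evaluation = syn.getEvaluation(evaluation_id)
    syn.setPermissions(evaluation, participants_team_id, accessType=ACCESS_SUBMIT)

# Example (placeholder IDs):
# open_queue("9615432", "3412345")
```

Running this once per queue replaces clicking through the Sharing Settings dialog on the web UI.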