This guide is meant to help organizers create a space within Synapse to host a crowd-sourced Challenge. Challenges are community competitions designed to crowd-source new computational methods for fundamental questions in systems biology and translational medicine. Learn more about Challenges and see examples of past and current projects by visiting the Challenges and Benchmarking page.

A Challenge space provides participants with a Synapse project to learn about the Challenge, join the Challenge community, submit entries, track progress, and view results. This article will focus on:

...

Info

Note: At first, the live site will be just one page providing a general overview of the Challenge, plus a pre-register button that Synapse users can click if they are interested in the upcoming Challenge.

...

For the initial deployment of the staging site to live, use synapseutils' copyWiki command, NOT mirror-wiki (more on this under Launch the Challenge).
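As a rough sketch, that initial copy with the Synapse Python client might look like the following (the staging and live project IDs are placeholders):

  import synapseclient
  import synapseutils

  syn = synapseclient.login()

  # One-time copy of the staging project's wiki pages to the live project.
  # "syn1111111" (staging) and "syn2222222" (live) are placeholder IDs.
  synapseutils.copyWiki(
      syn,
      entity="syn1111111",
      destinationId="syn2222222",
  )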

...

3. On the live site, go to the CHALLENGE tab and create as many evaluation queues as needed (for example, one per sub-challenge) by clicking on Challenge Tools > Create Evaluation Queue.  By default, create-challenge will create an evaluation queue for writeups, which you will already see listed here.

...

A writeup is required of all participants in order for their submissions to be considered for final evaluation and ranking. It should list all contributing persons, thoroughly describe the methods and any use of data outside of the Challenge data, and include all scripts, code, and prediction file(s)/Docker image(s). We require all of this so that we can reproduce the code and final output of any top performer.

Note

Important: the 7-digit number in parentheses following each evaluation queue name is its evaluation ID. You will need these IDs later for Step 9, so make note of them.
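If you prefer to collect the evaluation IDs programmatically, a minimal sketch with the Synapse Python client (the project ID is a placeholder):

  import synapseclient

  syn = synapseclient.login()

  # List every evaluation queue attached to the live Challenge project,
  # printing each queue's evaluation ID alongside its name.
  for evaluation in syn.getEvaluationByContentSource("syn2222222"):
      print(evaluation.id, evaluation.name)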

...

Regarding writeups: when will these be accepted?
Should participants submit their writeups during submission evaluations or after the Challenge has closed?

A writeup is a summary of all contributors, methods, scripts, code, and prediction files/Docker images that a team used for their submission. Writeups are required so that top-performing entries can be reproduced.

Update the Challenge

Challenge Site and Wikis

...

Updating an evaluation queue’s quota can be done in one of two ways:

  1. On the web, via Edit Evaluation Queue in Synapse.

  2. In the terminal via challengeutils' set-evaluation-quota:

...

Info

Note: If there is no daily (or weekly) submission limit, then updating the Round Duration is appropriate.  For example, suppose the final round of a Challenge has a total Submission Limit of 2, that is, participants are allowed only two submissions during the entire phase.  A "round" in this case spans the entire phase, so updating Round Duration (or end_date when using set-evaluation-quota) is the appropriate way to update the deadline for the queue(s).
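If you need the new deadline in epoch milliseconds (for example, for end_date), a quick conversion in Python, assuming a 9 February 2021 deadline at midnight UTC:

  from datetime import datetime, timezone

  # New deadline (placeholder date; adjust to your Challenge).
  deadline = datetime(2021, 2, 9, tzinfo=timezone.utc)
  print(int(deadline.timestamp() * 1000))  # 1612828800000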

Web
To update the quota(s) on the web, go to the Synapse live site, then head to the CHALLENGE tab.  Edit the evaluation queues as needed; in this case, there are three queues and all of them will need to be updated.  There are 57 days between the start date (14 December 2020) and the new end date (9 February 2021), which translates to 57 "rounds".  Number of Rounds will therefore be increased to 57:

...

Terminal

Updating with challengeutils' set-evaluation-quota is more or less the same (except round_duration must be given in epoch milliseconds):

...
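As a sketch only (the evaluation ID is a placeholder, and the flag name follows the round_duration parameter mentioned above; confirm the exact options with challengeutils set-evaluation-quota --help), extending a queue to a 57-day round might look like:

  # 57 days in milliseconds: 57 * 24 * 60 * 60 * 1000 = 4924800000
  challengeutils set-evaluation-quota 9614543 --round_duration 4924800000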