One of the features of the Synapse.org platform is the ability to host a crowd-sourced challenge. Hosting a challenge is a great way to crowd-source new computational methods for fundamental questions in systems biology and translational medicine.

Learn more about challenges and see examples of past/current projects by visiting Challenges and Benchmarking.

Running a challenge on Synapse will require creating a challenge space for participants to learn about the challenge, join the challenge community, submit entries, and view results. This article is aimed at challenge organizers, and will focus on:

  • Setting up the infrastructure for submission evaluation

  • Launching and updating the challenge

  • Monitoring the submissions

Before You Begin

🖥️ Required Compute Power

At Sage Bionetworks, we generally provision an AWS EC2 Linux instance to run the infrastructure of a challenge, leveraging the SynapseWorkflowOrchestrator to run CWL workflows. These workflows are responsible for evaluating and scoring submissions.

  • If Sage is responsible for providing the cloud compute services: we ask that you give us a general estimate of the computing power needed to validate and score the submissions (for example: memory, volume, whether a GPU is required, …).

📋 Using Sensitive Data as Challenge Data

Challenge data can be hosted on Synapse. If the data is sensitive (for example, human data), Synapse can apply access restrictions so that legal requirements are met before participants can access them. Contact the Synapse Access and Compliance Team (act@sagebase.org) for support with the necessary data access procedures for sensitive data.

🛑 Restricted Data Access

If data cannot leave the external site or data provider, it will be the data contributor’s responsibility to set up the challenge infrastructure. Contact the Challenges and Benchmarking team (cnb@sagebase.org) for consultations if needed.

To set up the infrastructure, you may follow Sage’s approach of using the SynapseWorkflowOrchestrator. The following will be required to use the orchestrator:

  • Support for Docker

  • (ideally) Support for docker-compose

  • If Docker is not allowed, then support for Singularity and Java 8 is required

Note that the steps outlined in this article will assume the orchestrator will be used.

Challenge Infrastructure Setup

...

Requirements

Outcome

This infrastructure setup will continuously monitor the challenge’s evaluation queue(s) for new submissions. Once a submission is received, it will undergo evaluation, including validation and scoring. All submissions will be downloadable by the challenge organizers, including the Docker image (for model-to-data challenges) and/or prediction files. Participants may periodically receive email notifications about their submissions (such as status and scores), depending on the infrastructure configuration.

Steps

1. Create a GitHub repository for the challenge workflow infrastructure. For the orchestrator to work, this repo must be public.

Two templates are available in Sage-Bionetworks-Challenges that you may use as a starting point. Their READMEs outline what will need to be updated within the scripts (under Configurations), but we will return to this later in Step 12.

  • data-to-model-challenge-workflow - for challenges accepting flat files, like CSV files

  • model-to-data-challenge-workflow - for challenges accepting Docker images

2. Create the challenge space on Synapse with challengeutils' create-challenge:

Code Block
languagebash
challengeutils create-challenge "challenge_name"

This command will create two Synapse projects:

  • Staging - Organizers will use this project during challenge planning and development to share files and draft the wiki content. create-challenge will initialize the wiki with the DREAM Challenge Wiki Template.

  • Live - Organizers will use this project as the pre-registration page during challenge development. When the challenge is ready for launch, the project will then be replaced with the contents from staging.

We encourage you to use the staging project to make all edits and preview them before officially pushing the updates over to the live project.


See Update the Challenge below to learn more about syncing changes from staging to live.

create-challenge will also create four Synapse teams for the challenge:

  • Pre-registrants - This team is used when the challenge is under development. It allows interested Synapse users to join a mailing list to receive notification of challenge launch news.

  • Participants - Once the challenge is launched, Synapse users will join this team in order to download the challenge data and make submissions.

  • Organizers - Synapse users added to this team will have the ability to share files and edit wikis on the staging project. Add users as needed.

  • Admin - Synapse users added to this team will have administrator access to both the live and staging projects. Organizers do not need to be admins. Ideally, all admins should have a good understanding of Synapse. Add users as needed.

3. On the live project, go to the Challenge tab and create as many evaluation queues as needed (for example, one per question/task) by clicking on Challenge Tools > Create Evaluation Queue. Note that create-challenge creates one evaluation queue by default.


The 7-digit number in parentheses following each evaluation queue name is the evaluation ID. You will need these IDs later in Step 11.

...
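If you prefer to script this step, evaluation queues can also be created with synapseclient. Below is a minimal sketch, where "syn2345" is a placeholder for your live project's synID and the queue name is illustrative:

Code Block
languagepy
import synapseclient
from synapseclient import Evaluation

syn = synapseclient.login()

# "syn2345" is a placeholder for the live project's synID
queue = syn.store(Evaluation(
    name="My Challenge - Task 1",
    description="Submissions for Task 1",
    contentSource="syn2345",
))
print(queue.id)  # the evaluation ID to note for Step 11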

4. While still on the live project, go to the Files tab and create a new folder called “Logs” by clicking on the add-folder icon:

...


This folder will contain the participants' submission logs and prediction files (if any). Make note of its synID for use later in Step 11.
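The “Logs” folder can also be created programmatically. A short sketch with synapseclient, again assuming "syn2345" is a placeholder for the live project's synID:

Code Block
languagepy
import synapseclient
from synapseclient import Folder

syn = synapseclient.login()

# "syn2345" is a placeholder for the live project's synID
logs_folder = syn.store(Folder(name="Logs", parent="syn2345"))
print(logs_folder.id)  # the synID to note for Step 11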

5. On the staging project, go to the Files tab and click on the upload icon to Upload or Link to a File:

...

Click Save.


This file will be what links the evaluation queue to the orchestrator. Make note of its synID for use later in Step 11.

7. Add an annotation called ROOT_TEMPLATE to the file. The orchestrator uses this annotation to determine which file in the repo is the workflow script. Click on the annotations icon, followed by Edit:

...

8. For “Value”, enter the filepath to the workflow script as if you had downloaded the repo as a ZIP. For example, model-to-data-challenge-workflow would be downloaded and unzipped as model-to-data-challenge-workflow-main and the path to the workflow script is workflow.cwl:

...

In this example, “Value” will be model-to-data-challenge-workflow-main/workflow.cwl. For the most part, “Value” should look something like this:

{name of repo}-{branch}/workflow.cwl
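If you would rather set the annotation programmatically than through the web UI, here is a minimal sketch with synapseclient, where "syn456" is a placeholder for the file uploaded in Step 5:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# "syn456" is a placeholder for the file from Step 5
annotations = syn.get_annotations("syn456")
annotations["ROOT_TEMPLATE"] = "model-to-data-challenge-workflow-main/workflow.cwl"
syn.set_annotations(annotations)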

9. Create a cloud compute environment with the required memory and volume specifications, then log into the instance.

10. On the instance, clone the SynapseWorkflowOrchestrator repo if it’s not already available on the machine. Change directories to SynapseWorkflowOrchestrator/ and create a copy of the .envTemplate file as .env (or rename it to .env):

Code Block
cd SynapseWorkflowOrchestrator/
cp .envTemplate .env

11. Open .env and enter values for the following config variables:

  • SYNAPSE_USERNAME - Synapse credentials under which the orchestrator will run. The provided user must have access to the evaluation queue(s) being serviced. Example: dream_user

  • SYNAPSE_PASSWORD - Password for SYNAPSE_USERNAME. This can be found under My Dashboard > Settings. Example: "abcdefghi1234=="

  • WORKFLOW_OUTPUT_ROOT_ENTITY_ID - synID of the "Logs" folder from Step 4. Example: syn123

  • EVALUATION_TEMPLATES - JSON map of evaluation IDs to the workflow repo archive, where each key is an evaluation ID from Step 3 and each value is the synID of the archive file from Step 5. Example: {"9810678": "syn456", "9810679": "syn456"}
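Putting it together, a filled-in .env might look like the following sketch (all values are placeholders taken from the examples above):

Code Block
SYNAPSE_USERNAME=dream_user
SYNAPSE_PASSWORD=abcdefghi1234==
WORKFLOW_OUTPUT_ROOT_ENTITY_ID=syn123
EVALUATION_TEMPLATES={"9810678": "syn456", "9810679": "syn456"}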


Refer to the "Running the Orchestrator with Docker containers" README section for additional configuration options.

12. Clone the workflow repo. Using a text editor or IDE, make the following updates to the scripts:

If using the data-to-model template:

  • workflow.cwl

    • Update synapseid to the synID of the challenge's goldstandard/groundtruth file (required)

    • Set errors_only to false if an email notification about a valid submission should also be sent (optional)

    • Add metrics and scores to private_annotations if they are to be withheld from the participants (optional)

  • validate.cwl

    • Update the base image if the validation code is not Python (optional)

    • Remove the sample validation code and replace it with validation code for the challenge (required)

  • score.cwl

    • Update the base image if the scoring code is not Python (optional)

    • Remove the sample scoring code and replace it with scoring code for the challenge (required)

If using the model-to-data template:

  • workflow.cwl

    • Provide the admin user ID or admin team ID for principalid (2 steps: set_submitter_folder_permissions, set_admin_folder_permissions) (required)

    • Update synapseid to the synID of the challenge's goldstandard (required)

    • Set errors_only to false if an email notification about a valid submission should also be sent (2 steps: email_docker_validation, email_validation) (optional)

    • Provide the absolute path to the data directory, denoted as input_dir, to be mounted during the container runs (required)

    • Set store to false if log files should be withheld from the participants (optional)

    • Add metrics and scores to private_annotations if they are to be withheld from the participants (optional)

  • validate.cwl

    • Update the base image if the validation code is not Python (optional)

    • Remove the sample validation code and replace it with validation code for the challenge (required)

  • score.cwl

    • Update the base image if the scoring code is not Python (optional)

    • Remove the sample scoring code and replace it with scoring code for the challenge (required)

Push the changes up to GitHub when done.

13. On the instance, change directories to SynapseWorkflowOrchestrator/ and kick-start the orchestrator with:

Code Block
docker-compose up -d

where -d runs the orchestrator in the background, allowing you to exit the instance without terminating the orchestrator.

Note

If validate.cwl or score.cwl uses a Docker image instead of inline code, you must first pull that image onto the instance before starting the orchestrator. Otherwise, the orchestrator will fail, stating that the image cannot be found.

14. To make changes to the .env file (such as updating the number of concurrent submissions), stop the orchestrator with:

Code Block
docker-compose down

Once you are done making updates, save the file and restart the orchestrator to apply the changes.

Log Files

As it’s running, the orchestrator will upload logs and prediction files to the “Logs” folder. For each user or team that submits to the challenge, two folders will be created:

  • <submitterid>/

  • <submitterid>_LOCKED/

where Docker and TOIL logs are uploaded to <submitterid>/, and prediction files are uploaded to <submitterid>_LOCKED/. Note that the LOCKED folders will not be accessible to the participants, in order to prevent data leakage.

The directory structure of “Logs” will look something like this:

Code Block
Logs
 ├── submitteridA
 │  ├── submission01
 │  │  ├── submission01_log.txt
 │  │  └── submission01_logs.zip
 │  ├── submission02
 │  │  ├── submission02_log.txt
 │  │  └── submission02_logs.zip
 │ ...
 │
 ├── submitteridA_LOCKED
 │  ├── submission01
 │  │  └── predictions.csv
 │  ├── submission02
 │  │  └── predictions.csv
 │ ...
 │
...

Launch the Challenge

Requirements

Important TODOs:

  1. Before proceeding with the launch, we recommend contacting Sage Governance to add a clickwrap for challenge registration. With a clickwrap in place, interested participants can only register if they agree to the terms and conditions of the challenge data usage.

    • If you are a Sage employee: submit a Jira ticket to the Governance board with the synID of the live project, as well as the team ID of the participants team.

  2. Share all needed evaluation queues with the participants team with Can submit permissions. Once the challenge is over, we recommend updating the permissions to Can view to prevent late submissions.

  3. We also recommend sharing the evaluation queues with the general public so that the leaderboards are openly accessible.

  4. After the challenge is launched, create a folder called “Data” and update its Sharing Settings. Share the “Data” folder with the participants team only. Do not make the folder public or accessible to all registered Synapse users. The sharing settings of the “Data” folder should look something like this:

    [Image: example sharing settings for the “Data” folder]

  5. Upload any challenge data that is to be provided to the participants to the “Data” Folder. DO NOT UPLOAD DATA until you have updated its sharing settings.

To launch the Challenge, that is, to copy the wiki pages of the staging project over to the live project, use synapseutils' copyWiki() in a Python script.

For example:

Code Block
languagepy
import synapseclient
import synapseutils
syn = synapseclient.login()

synapseutils.copyWiki(
   syn, "syn1234",  # synID of staging site
   destinationId="syn2345",  # synID of live site
   destinationSubPageId=999999  # ID following ../wiki/ of live project URL
)

When using copyWiki, it is important to specify the destinationSubPageId parameter.  This ID can be found in the URL of the live project, where it is the number following .../wiki/.

⚠️ Once copyWiki has been used, DO NOT RUN IT AGAIN! ⚠️

...


Learn more about how to Update the Challenge below.

Monitor the Submissions

For challenge organizers, we recommend creating a Submission View to track and monitor submissions as they come in. This table will be especially useful when participants need help with their submissions.


Learn more about revealing scores and adding leaderboards in Evaluation Queues.

Steps

1. Go to the staging project and click on the Tables tab. Create a new Submission View by clicking on Add New… > Add Submission View

2. Under "Scope", add evaluation queue(s) you are interested in monitoring. More than one queue can be added. Click Next. On the following screen, select which information to display - this is known as the schema.

We recommend the following schema for monitoring challenge submissions:

  • evaluationid - Evaluation ID, rendered as the evaluation name; recommended for Submission Views with multiple queues in scope. Facet values: recommended

  • id - Submission ID

  • createdOn - Date and time of the submission (stored as Epoch time, rendered as MM/dd/yyyy, hh:mm:ss)

  • submitterid - User or team that submitted (user or team ID, rendered as username or team name). Facet values: recommended

  • dockerrepositoryname - Docker image name; recommended for model-to-data challenges. Facet values: not recommended

  • dockerdigest - Docker SHA digest; recommended for model-to-data challenges. Facet values: not recommended

  • status - Workflow status of the submission (one of RECEIVED, EVALUATION_IN_PROGRESS, ACCEPTED, INVALID). Facet values: recommended

  • submission_status - Evaluation status of the submission (one of None, VALIDATED, SCORED, INVALID). Facet values: recommended

  • submission_errors - Validation errors for the predictions file, if any. Facet values: not recommended

  • orgSagebionetworksSynapseWorkflowOrchestratorSubmissionFolder - synID of the submission's logs folder. Facet values: not recommended

  • prediction_fileid - synID of the predictions file, if any. Facet values: not recommended

  • (any annotations related to scores) - Submission annotations; the names depend on the annotations used in the scoring step of the workflow

Info

Columns that are not part of the default schema (such as the workflow annotations) need to be added manually by clicking the + Add Column button at the bottom of the Edit Columns window.

3. Click Save. A table of the submissions and their metadata will now be available for viewing and querying. The displayed information can be changed by clicking on the schema icon, followed by Edit Schema:

...
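Once the Submission View is saved, it can also be queried programmatically, which is handy when debugging failed submissions. A sketch with synapseclient, where "syn135" is a placeholder for the view's synID:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# "syn135" is a placeholder for the Submission View's synID
results = syn.tableQuery(
    "SELECT id, submitterid, submission_errors FROM syn135 WHERE status = 'INVALID'"
)
df = results.asDataFrame()  # requires pandas
print(df)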

Update the Challenge

Updating Existing Wikis

1. Go to the staging project and navigate to the page(s) you wish to edit. Click on the pencil icon to Edit Project Wiki:

...

Make edits as needed, then click Save.

...

Code Block
challengeutils mirror-wiki staging_synid live_synid [--dryrun]

Use --dryrun to optionally preview which pages will be updated prior to doing an official sync.

Adding a New Wiki Page

1. Go to the staging project, click on Wiki Tools > Add Wiki Subpage. Enter a page title, then click OK. A new page should now be available.

...

2. Add the page content, then click Save.

3. Go to the live project and create a new wiki page with the same title as the new page in the staging project; mirror-wiki requires the page titles to match in order to sync them.

4. Use challengeutils' mirror-wiki to push the changes to the live project:

Code Block
challengeutils mirror-wiki staging_synid live_synid [--dryrun]

Use --dryrun to optionally preview which pages will be updated prior to doing an official sync.

Extending the Deadline

Submission deadlines sometimes change. To extend the submission deadline, you can either:

  • edit the Round End of an existing round; or

  • add a new round that will immediately start after the current one (note that this approach will reset everyone’s submission limit)

Updating the Workflow

For any changes to the CWL scripts or run_docker.py, make edits as needed, then push the changes. We highly recommend conducting dry runs immediately after, so that errors are addressed in a timely manner.
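One way to conduct a dry run is to submit a test entity to the queue yourself with synapseclient. A sketch, where "9810678" (the evaluation ID) and "syn999" (a test file) are placeholders:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# "9810678" and "syn999" are placeholders for the evaluation ID and a test file
submission = syn.submit(
    evaluation="9810678",
    entity=syn.get("syn999"),
    name="dry run after workflow update",
)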

Evaluation Docker Image

If your workflow is using a Docker image in validate.cwl and/or score.cwl, and updates were made, pull the latest changes on the instance with:

Code Block
docker pull <image name>:<version>

🌟 For additional assistance or guidance, contact the Challenges and Benchmarking team at cnb@sagebase.org.

Synapse enables you to host challenges, providing an excellent avenue to crowd-source new computational methods for fundamental questions in systems biology and translational medicine.

Setting up and running your own Challenge on Synapse is free*! This tutorial will teach you the steps to hosting a Challenge on Synapse. For a visual walkthrough, you can also refer to this Storylane.

Need personalized support with data governance, infrastructure setup, or any other aspect of your Challenge? Our dedicated Challenges & Benchmarking (CNB) team is here to help! Get started by outlining your Challenge plan here and a CNB team member will be in touch.

*up to 100 GB data storage

...

Setting Up Your Challenge

A Synapse Challenge always starts with two key Synapse entities:

  • Participant Team: A Synapse Team that serves as the central hub for challenge registration. Once users register for your Challenge, they will be automatically added to this team+. This makes it simple to communicate with all registered participants through the Synapse team email or by tagging the team name in a Discussion thread for announcements.

+the Team can be configured to require manager approval before participants are added

  • Challenge Project: A Synapse Project acts as your challenge's official "website". It is the primary place where participants can:

    • find all the crucial information, such as details about the challenge tasks and the scientific motivation behind them

    • access data (if hosted on Synapse)

    • make submissions

    • contact the challenge organizers and other participants

For challenge data, you have flexibility: you can either host the data directly on Synapse, or if the data is hosted elsewhere, you can link to it from Synapse by creating a File Link.

Keep reading to learn how to set up your Challenge, including how to create Teams, add Evaluation Queues and enable Challenge registration.

Challenge Team(s)

At a minimum, your Challenge requires a Synapse Team to be the “Registered Participants Team”. You can either use an existing Synapse Team or create a new one specifically for this challenge. We recommend a clear, descriptive name like "YOUR_CHALLENGE_NAME Participants Team" for easy identification.

If you have multiple organizers who are planning to actively manage the Challenge, we also suggest creating a separate “Organizers Team”. Using this Team will greatly streamline internal communication, simplify permission updates, and make sharing resources among your organizing group more efficient.

(plus) Learn how to create and manage Teams here.
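Teams can also be created with synapseclient. A minimal sketch, where the name and description are placeholders:

Code Block
languagepy
import synapseclient
from synapseclient import Team

syn = synapseclient.login()

team = syn.store(Team(
    name="YOUR_CHALLENGE_NAME Participants Team",  # placeholder name
    description="Registered participants for YOUR_CHALLENGE_NAME",
    canPublicJoin=True,  # set to False to require manager approval to join
))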

Challenge Project (aka the Website)

Your Challenge “website” is technically a Synapse Project with the following features:

  • Evaluation Queues

  • Registration button

(plus) To create a Synapse Project, see Creating a Project.
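As with Teams, the Project can be created with synapseclient; a one-line sketch with a placeholder name:

Code Block
languagepy
import synapseclient
from synapseclient import Project

syn = synapseclient.login()

project = syn.store(Project(name="YOUR_CHALLENGE_NAME"))  # placeholder name
print(project.id)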

Navigating your Challenge Project

Every Synapse Project includes several tabs relevant to managing your Challenge:

  • Wiki - this is where you can provide details about your Challenge.

  • Files - (if hosted on Synapse) this is where you can share challenge data with participants.

  • Tables - this is where you can create tables to view and monitor submissions.
    (plus) To add a leaderboard table into a Wiki page, see Embed a Submission View in a Wiki Page.

  • Challenge - (once enabled; see below) this is where you can create and manage queues for participants to submit their predictions or Docker images.

  • Discussion - this is where organizers and participants can interact and communicate with each other.

Enable Evaluation Queues

By default, Evaluation Queues (a Synapse feature for accepting submissions) are not enabled for Synapse Projects. To activate the Evaluation Queues feature:

  1. Click on the Project Tools menu in the top-right corner of your Project, followed by Run Challenge.

  2. A new window will appear, prompting you for a “Participant Team”. Enter the name of a Synapse Team you’ve designated as the participants team for your Challenge.

  3. Click Create Challenge to save.

You’ll then be directed to a new Challenge tab within your Project. Here, you can update the Registered Participants team as needed, and create or delete Evaluation Queues.

Info

The Challenge tab is only visible to Synapse users with "Admin" privileges to at least one Evaluation Queue.

(plus) For more detailed information on how to create and manage the queues, see Evaluation Queues.

Add Challenge Registration

Participant Registration

Challenge registration can be added to a Wiki page by leveraging the Join Team Button widget. To add challenge registration:

  1. Navigate to the Wiki page where you'd like the registration button to appear. We recommend using the main Wiki page, as it's often the first page new users see.

  2. Click the pencil icon to Edit Project Wiki. An editing window will open.

  3. Click on +Insert then select Join Team Button:

    [Image: adding a 'Join Team Button' widget to a Wiki page]

  4. Another window will display. Complete the fields as prompted, ensuring to enter the same Synapse Team that is currently designated as the “Registered Participants Team” for your Challenge.

  5. Before saving the widget configuration, make sure to enable the checkbox next to “Is this a challenge sign-up?”

  6. Click Save, and a new Markdown will be provided in the editing window. It will look something like this:

${jointeam?teamId=PARTICIPANT_TEAM_ID&isChallenge=true&isMemberMessage=Registered for CHALLENGE_NAME&text=Click Here to Register&isSimpleRequestButton=true&requestOpenText=Your registration is in progress%2E&successMessage=Your registration is in progress%2E}

  7. Click Preview to review the button placement; move the Markdown around as needed.

  8. Click Save to finalize the changes on the Wiki page.

Team Registration

Once participants have registered individually, they can also register their team for your Challenge. For example:

...

To add a team registration button to your Challenge:

  1. Navigate to the Wiki page where you’d like the team registration button to appear.

  2. Click on the pencil icon to Edit Project Wiki. An editing window will open.

  3. Add the following Markdown, replacing YOUR_CHALLENGE_ID with the ID of your Challenge (which you can find in the Challenges tab) and BUTTON_TITLE with the text you want displayed on the button:

${registerChallengeTeam?challengeId=YOUR_CHALLENGE_ID&buttonText=BUTTON_TITLE}

  4. Click Preview to review the button and its placement; move the Markdown around as needed.

  5. Click Save to finalize the changes on the Wiki page.

Upload Challenge Data

For easier file management, we recommend creating Folders first and then uploading the data into them, rather than directly uploading Files into the project. For example:

...

(plus) Learn how to create Folders and upload Files at Uploading and Organizing Data.

Notice how these Folders (and subsequently, the Files) are marked as Private, as indicated by the lock icons. This is important because, generally, only specific users (like registered participants) should have access to the challenge data; in the case of the "Groundtruth" Folder/Files, no participants should have access at all.

We'll cover how to set these permissions in more detail below!

Example Challenge Projects

...

Launching Your Challenge

To ensure your Challenge is discoverable, accessible, and open for registrations to anyone on the web, you will need to make your Project publicly viewable. By default, newly created Synapse Projects are only visible to the creator.

To make your Challenge publicly viewable:

  1. Click on the Project Tools menu, followed by Project Sharing Settings.

  2. Click Make public, then adjust the access level for All Synapse users from “Can download” to “Can view”.


Important! If your Project is hosting challenge data, please read the next section before making your Challenge public.

Important Considerations Before Launch

By default, all entities within your Synapse Project, like Files, Folders, and Submission Views, inherit the sharing settings of the main Project. However, for a challenge, you may only want registered participants to access and download the challenge data, not just anyone on the web.

Therefore, before your Challenge goes live, you may need to update the permissions for certain Synapse entities. This includes, but isn't limited to:

  • Challenge data

  • Evaluation Queues

By creating Local Sharing Settings for the data, you can ensure that the data won't become public, even after your main Challenge project does.

(plus) Learn more about Local Sharing Settings.

Manage Challenge Data Permissions

Whether your challenge data is organized into Folders or uploaded directly as Files, the steps are generally the same.

Info

Using Folders to organize your data is recommended, as setting the Local Sharing Settings on a Folder automatically applies those permissions to all Files within it, saving you the effort of individually setting permissions for each File.

To update the permissions for each Folder or File containing challenge data:

  1. Navigate to the Files tab of your Challenge project.

  2. For each Folder or File containing challenge data:

    1. Click on the Folder or File to navigate to its details page

    2. Click on the Folder Tools or File Tools menu, then select Folder Sharing Settings (for a folder) or File Sharing Settings (for a file).

    3. In the sharing settings window, click on + Create Local Sharing Settings.

    4. If necessary, revoke “Can download” access from all Synapse users and/or teams that should not have download permissions to the challenge data. Note: “Can edit” and “Administrator” will also have download access, so adjust users with those roles as needed as well.

    5. Under Add More, enter the same Synapse Team currently designated as the “Registered Participants Team” for your Challenge, and set their permissions level to “Can download”.

    6. Important: If the File is a ground truth file, do NOT share it with the Participants Team or anyone else (unless they specifically need access, e.g. an organizer).

    7. Once you have made your changes, click Save to apply the new sharing settings.

Here's an example of what sharing settings for a Folder containing challenge data might look like:

...
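Local Sharing Settings can also be applied with synapseclient. A sketch, where "syn456" (the Folder) and 3401234 (the Participants Team ID) are placeholders:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# "syn456" and 3401234 are placeholders for the Folder and the Participants Team ID.
# Setting permissions on the Folder creates Local Sharing Settings for it.
syn.setPermissions("syn456", principalId=3401234, accessType=["READ", "DOWNLOAD"])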


Manage Evaluation Queues Permissions

Unlike other child entities, Evaluation Queues do not inherit settings from your main Project. By default, new queues are only accessible to the creator.

To allow others to view results and make submissions, you will need to adjust their access settings:

  1. Navigate to the Challenge tab of your Challenge project.

  2. For each Evaluation Queue:

    1. Click on the three dots, followed by Modify Access.

    2. Click Make public. Ensure that "Anyone on the web" has “Can view” permissions. This will allow anyone on the web to see submissions and their results in a Submission View or leaderboard, without needing to sign into Synapse.

    3. Under Add More People, enter:

      1. the same Synapse Team designated as your Challenge's "Registered Participants Team" and set their permission level to “Can submit”.

      2. (if available) the Organizers Team for your Challenge, and set their permission level to “Can score”. This allows the organizing group members to download the submissions.

Note

Do NOT remove yourself as "Admin" - doing so can cause irreversible changes that the Synapse IT team will be unable to assist with.

Here's an example of what sharing settings for an Evaluation Queue might look like:

...
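Queue permissions can likewise be adjusted with synapseclient. A sketch granting roughly the “Can submit” level, where "9810678" (the evaluation ID) and 3401234 (the Participants Team ID) are placeholders:

Code Block
languagepy
import synapseclient

syn = synapseclient.login()

# "9810678" and 3401234 are placeholders for the evaluation ID and the team ID
evaluation = syn.getEvaluation("9810678")
syn.setPermissions(evaluation, principalId=3401234, accessType=["READ", "SUBMIT"])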

(plus) To learn how to evaluate submissions, see Evaluating Submissions.

...

Closing Your Challenge

After your Challenge has concluded, we recommend taking the following actions:

  1. Prevent new registrations.
    To prevent users from joining the Participants Team and accessing challenge data after the challenge has ended:

    1. From the Dashboard, click on the Teams icon from the left navigation bar.

    2. Find or search for the Synapse Team designated as the “Registered Participants Team” for your Challenge. Click on the Team to go to its team page.

    3. Click on the Team Actions menu, followed by Edit Team.

    4. Under Access, select “Team is locked, users may not join or request access. New users must be invited by a team manager.”

  2. Disable registration on your website.
    Remove or hide all Join Team Button widgets to disable Challenge registration directly on the website. You can replace the registration button with an informative alert like this:

    [Image: challenge-closed-banner.png]
    Code Block
    languagehtml
    <div class="alert alert-success">
    
    ###! Challenge closed 🏆
    
    Final results are [available here](YOUR_LINK_HERE).  Thank you to all who participated!
    
    </div>

To add this alert, paste the HTML above into the Wiki editing window.

  3. Stop active Evaluation Queue(s).
    If any Evaluation Queues are still active, be sure to stop them.

    For good measure, you can also remove “Can submit” permissions from the Participants team to prevent any post-challenge submissions.