This guide helps organizers create a space within Synapse to host a Challenge. Challenges are community competitions designed to crowd-source new computational methods for fundamental questions in systems biology and translational medicine. To learn more about Challenges and see examples of past and current projects, visit Challenges and Benchmarking.
...
> **Note:** This file is what links the evaluation queue to the orchestrator. Make note of its File ID for use later in Step 10.
7. Add an annotation called `ROOT_TEMPLATE` to the file (see /wiki/spaces/DOCS/pages/2667708522) by clicking File Tools > Annotations > Edit. The "Value" should be the path to the workflow script, written as:
...
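If you prefer to set the `ROOT_TEMPLATE` annotation from step 7 programmatically rather than through the web UI, here is a minimal sketch using the Python synapseclient; the file ID (`syn1234567`) and the workflow path are hypothetical placeholders to replace with your own values.

```python
import synapseclient

# Log in (assumes cached credentials or a Synapse config file).
syn = synapseclient.login()

# Fetch the current annotations on the workflow archive file,
# add ROOT_TEMPLATE, and store them back. "syn1234567" and the
# path below are placeholders, not real values.
annotations = syn.get_annotations("syn1234567")
annotations["ROOT_TEMPLATE"] = "path/to/workflow.cwl"
syn.set_annotations(annotations)
```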
8. Create a cloud compute environment with the required memory and volume specifications, then SSH into the instance.
If you are a Sage employee: follow our internal instructions at /wiki/spaces/CHAL/pages/2806087732.
If you are not a Sage employee: follow the instructions listed under "Setting up linux environment" to install and run Docker and `docker-compose` on the compute environment of your choice.
9. On the instance, clone the SynapseWorkflowOrchestrator repo if needed. Change directories to `SynapseWorkflowOrchestrator/` and create a copy of the `.envTemplate` file as `.env` (or simply rename it to `.env`):
...
When using `copyWiki`, it is important to specify the `destinationSubPageId` parameter. This ID can be found in the URL of the live site, where it is the integer following `.../wiki/`.

Once `copyWiki` has been used, DO NOT USE IT AGAIN! After this one-time copy, all changes to the live site should be synced over with challengeutils' `mirror-wiki`. More on updating the Wikis can be found under the Update the Challenge section below.
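For reference, the one-time copy might look like the following sketch, which uses `synapseutils.copyWiki` from the Python client; the staging project ID (`syn111`), live project ID (`syn222`), and subpage ID are hypothetical placeholders.

```python
import synapseclient
import synapseutils

syn = synapseclient.login()

# One-time copy of the staging project's wiki to the live project.
# "syn111", "syn222", and the destinationSubPageId below are
# placeholders. Remember: run this ONCE, then switch to mirror-wiki.
synapseutils.copyWiki(
    syn,
    entity="syn111",
    destinationId="syn222",
    destinationSubPageId=123456,
)
```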
...
Column Name | Description | Facet values?
---|---|---
`evaluationid` | Evaluation ID (rendered as the evaluation name) – recommended for SubmissionViews with multiple queues in scope | Recommended
`id` | Submission ID |
`createdOn` | Date and time of the submission (in Epoch, but rendered as a human-readable date) |
`submitterid` | User or team who submitted (user or team ID, rendered as the username or team name) | Recommended
`dockerrepositoryname` | Docker image name – recommended for model-to-data challenges | Not recommended
`dockerdigest` | Docker SHA digest – recommended for model-to-data challenges | Not recommended
`status` | Workflow status of the submission (one of […]) | Recommended
`submission_status` | Evaluation status of the submission (one of [None, …]) | Recommended
 | Validation errors for the predictions file (if any) | Not recommended
 | Synapse ID of the submission's logs folder | Not recommended
 | Synapse ID of the predictions file (if any) | Not recommended
(any annotations related to scores) | Submission annotations – the names used depend on the annotations added in the scoring step of the workflow |
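If you would rather create the SubmissionView programmatically, the following is a minimal sketch using the Python client's `SubmissionViewSchema`; the project ID (`syn123`), evaluation queue ID (`9614112`), and view name are hypothetical placeholders. The resulting view starts from the standard submission columns, which you can then adjust to match the table above.

```python
import synapseclient
from synapseclient import SubmissionViewSchema

syn = synapseclient.login()

# Create a SubmissionView over one or more evaluation queues.
# "syn123" (challenge project) and "9614112" (evaluation queue ID)
# are placeholders, not real values.
view = SubmissionViewSchema(
    name="Submission Dashboard",
    parent="syn123",
    scopes=["9614112"],
)
view = syn.store(view)
```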
...