...

  1. Create a workflow infrastructure GitHub repository for the Challenge.  We have created two templates in Sage-Bionetworks-Challenges that you may use as a starting point. The READMEs outline what will need to be updated within the scripts, but we will return to this later in Step 10.

    1. data-to-model-challenge-workflow (submission type: prediction files)

    2. model-to-data-challenge-workflow (submission type: Docker images)

  2. Create the Challenge site on Synapse.  This can easily be done with challengeutils:
    challengeutils create-challenge "challenge_name"

    This command will create two Synapse Projects: one staging site and one live site.  You may think of them as development and production, in that all edits must be done in the staging site, NOT live.  Changes to the live site will instead be synced over with challengeutils' mirror-wiki (more on this under Update the Challenge).

    Note: at first, the live site will be just one page providing a general overview of the Challenge.  There will also be a pre-register button that Synapse users can click if they are interested in the upcoming Challenge.


    For the initial deployment of the staging site to live, use synapseutils' copyWiki command, NOT mirror-wiki (more on this under Launch the Challenge).

    create-challenge will also create four Synapse Teams for the Challenge: "* Preregistrants", "* Participants", "* Organizers", and "* Admin", where * is the Challenge name.  Add users to the Organizers and Admin teams as needed.
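    Team members can also be invited programmatically with synapseclient's invite_to_team.  A minimal sketch, assuming you are an administrator of the team; the team name and username below are placeholders:

```python
import synapseclient

# Log in with cached credentials (or pass authToken=...)
syn = synapseclient.login()

# Look up the team by name and send an invitation
team = syn.getTeam("My Challenge Organizers")
syn.invite_to_team(team, user="some_synapse_username")
```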

  3. On the live site, go to the CHALLENGE tab and create as many Evaluation Queues as needed (e.g. one per sub-challenge) by clicking on Challenge Tools > Create Evaluation Queue.  By default, create-challenge will create an Evaluation Queue for writeups, which you will already see listed here.

    Important: the 7-digit number in parentheses following each Evaluation Queue name is its evaluation ID.


    You will need these IDs later for Step 9, so make note of them.

  4. While still on the live site, go to the FILES tab and create a new Folder called "Logs" by clicking on Files Tools > Add New Folder.

    Important: this will be where the participants' submission logs and prediction files are uploaded, so make note of its Synapse ID for later usage in Step 9.

  5. On the staging site, go to the FILES tab and create a new File by clicking on Files Tools > Upload or Link to a File > Link to URL.

    For "URL", enter the link address to the zipped download of the workflow infrastructure repository.  You may get this address by going to the repository and clicking on Code > right-clicking Download ZIP > Copy Link Address.



    Name the File whatever you like (we generally use "workflow"), then hit Save.

    Important: this File will be what links the Evaluation Queue to the orchestrator, so make note of its Synapse ID for later usage in Step 9.

  6. Add an Annotation to the File called ROOT_TEMPLATE by clicking on Files Tools > Annotations > Edit.  The "Value" will be the path to the workflow script, written as: {infrastructure workflow repo}-{branch}/path/to/workflow.cwl.  For example, this is the path to workflow.cwl of the model-to-data template repo: model-to-data-challenge-workflow-main/workflow.cwl

    Important: the ROOT_TEMPLATE annotation is what the orchestrator uses to determine which file in the repo is the workflow script.
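    The Annotations UI is the documented route, but the same annotation can be set programmatically with synapseclient.  A sketch, where "syn12345678" is a placeholder for the workflow File's Synapse ID:

```python
import synapseclient

syn = synapseclient.login()

# Fetch the File entity without downloading its content
workflow_file = syn.get("syn12345678", downloadFile=False)

# Set the ROOT_TEMPLATE annotation and store without bumping the version
workflow_file["ROOT_TEMPLATE"] = "model-to-data-challenge-workflow-main/workflow.cwl"
syn.store(workflow_file, forceVersion=False)
```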

  7. Create a cloud compute environment with the required memory and volume specifications.  Once it spins up, log into the instance and clone the orchestrator:

  8. While still on the instance, change directories to SynapseWorkflowOrchestrator/ and create a copy of the .envTemplate file as .env (or simply rename it to .env):

  9. Open .env and enter values for the following property variables:
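    A hypothetical .env fragment is sketched below; the variable names and values shown are illustrative, so consult the orchestrator's README for the authoritative list:

```
# Synapse credentials the orchestrator will run under
SYNAPSE_USERNAME=my-service-account
SYNAPSE_PASSWORD=********

# Synapse ID of the "Logs" Folder created in Step 4
WORKFLOW_OUTPUT_ROOT_ENTITY_ID=syn12345678

# JSON map from evaluation IDs (Step 3) to the workflow File (Step 5)
EVALUATION_TEMPLATES={"9614487": "syn87654321"}
```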

  10. Return to the workflow infrastructure repository and clone it onto your local machine.  Open the repo in your editor of choice and make the following edits to the scripts:

  11. On the instance, change directories to SynapseWorkflowOrchestrator/ and kick-start the orchestrator with:
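    If the orchestrator is run via Docker Compose (as its repository suggests), kicking it off might look like:

```shell
cd SynapseWorkflowOrchestrator/

# Build and start the orchestrator in the background
docker-compose up -d

# Follow the logs to confirm it is polling the Evaluation Queues
docker-compose logs -f
```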

  12. Go to the staging site and click on the TABLES tab.  Create a new Submission View by clicking on Table Tools > Add Submission View.  Under "Scope", add the Evaluation Queue(s) you are interested in monitoring (you may add more than one), then click Next.  On the next screen, select which information to display, then hit Save.  A Synapse table of the submissions and their metadata is now available for viewing and querying.
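    The resulting table can also be queried programmatically.  A sketch, where "syn24681012" is a placeholder for the Submission View's Synapse ID (asDataFrame requires pandas):

```python
import synapseclient

syn = synapseclient.login()

# Query the Submission View like any other Synapse table
results = syn.tableQuery("SELECT * FROM syn24681012")
print(results.asDataFrame().head())
```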

  13. On the live site, go to the CHALLENGE tab and share the appropriate Evaluation Queues with the Participants team, giving them "Can submit" permissions.

  14. Use the copyWiki command provided by synapseutils to copy over all pages from the staging site to the live site.  When using copyWiki, it is important to also specify the destinationSubPageId parameter.  This ID can be found in the URL of the live site, where it is the integer following .../wiki/, e.g.
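    A sketch of the copyWiki call, where the staging Project ID, live Project ID, and destinationSubPageId below are all placeholders:

```python
import synapseclient
import synapseutils

syn = synapseclient.login()

# Copy all wiki pages from the staging Project to the live Project.
# destinationSubPageId is the integer after .../wiki/ in the live site's URL.
synapseutils.copyWiki(
    syn,
    entity="syn11111111",          # staging Project
    destinationId="syn22222222",   # live Project
    destinationSubPageId=123456,
)
```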

  15. On the instance, enter: 

...