
...

Requirements

  • Synapse account

  • Python 3.7+

  • (for local testing) a CWL runner of choice, e.g. cwltool (see the install sketch after this list)

  • Access to a cloud compute service, e.g. AWS or GCP
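
One possible local setup for the Python and CWL requirements above; package names assume pip is available in your Python 3.7+ environment:

pip install synapseclient cwltool

# Quick sanity check of a workflow before registering it on Synapse
cwltool --validate workflow.cwl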

...

3. Update the quota of the Evaluation queue, such as the Duration (Round Start, Round End) and Submission Limits. Generally, there are no submission limits for write-ups, so that field can be left blank.

✏️ Note: Get the 7-digit Evaluation ID, as it will be needed later in Step 9.
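
If you prefer to update the quota programmatically, below is a minimal sketch using the Synapse Python client. It assumes the queue still exposes the legacy quota field on the Evaluation object (round-based queues may need to be configured in the web UI instead); the Evaluation ID and dates are placeholders.

import synapseclient

syn = synapseclient.login()

# 9615023 is a placeholder; use your queue's 7-digit Evaluation ID.
evaluation = syn.getEvaluation(9615023)

# Assumed legacy SubmissionQuota fields; omit submissionLimit entirely so
# that write-up submissions are unlimited.
evaluation.quota = {
    "firstRoundStart": "2024-01-01T00:00:00.000Z",
    "roundDurationMillis": 30 * 24 * 60 * 60 * 1000,  # one 30-day round
    "numberOfRounds": 1,
}

syn.store(evaluation)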

4. Go to the Sage Bionetworks Challenges repo group and create a New repo.

...

7. In the pop-up window, switch tabs to Link to URL. For "URL", enter the web address of the zipped archive download of the write-up workflow infrastructure repository. You can get this address by going back to the repository and clicking Code, then right-clicking Download ZIP > Copy Link Address. Give the File Link any Name you’d like:

...

Name the file (e.g. "writeup-workflow"), then click Save.

✏️ Note: This file is what links the evaluation queue to the orchestrator. Make note of this File ID for use later in Step 9.
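
If you would rather create this linked file with the Synapse Python client instead of the web UI, a minimal sketch is below; the repository URL and parent project ID are placeholders.

import synapseclient
from synapseclient import File

syn = synapseclient.login()

# Store the zipped repository archive as an external link rather than an
# upload (synapseStore=False keeps it as a URL reference).
zip_url = "https://github.com/example-org/my-writeup-repo/archive/refs/heads/main.zip"
writeup_file = File(
    path=zip_url,
    name="writeup-workflow",
    parent="syn87654321",  # placeholder for the challenge staging project
    synapseStore=False,
)
writeup_file = syn.store(writeup_file)
print(writeup_file.id)  # the File ID referenced later in Step 9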

8. Add an annotation to the file called ROOT_TEMPLATE by clicking on the annotations icon (or File Tools > Annotations > Edit):

...

The “Value” will be the path to the workflow script, written as:

{infrastructure workflow repo}-{branch}/{path/to/workflow.cwl}

For example:

my-writeup-repo-main/workflow.cwl
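
If you prefer to set the annotation with the Synapse Python client instead of the web UI, here is a minimal sketch (assuming synapseclient 2.x; the File ID and repository name are placeholders):

import synapseclient

syn = synapseclient.login()

# syn12345678 is a placeholder for the File ID noted in the step above.
annotations = syn.get_annotations("syn12345678")

# Point ROOT_TEMPLATE at the workflow script inside the zipped repo archive.
annotations["ROOT_TEMPLATE"] = "my-writeup-repo-main/workflow.cwl"

syn.set_annotations(annotations)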

...

Feel free to look at the other infrastructure files to see how the ROOT_TEMPLATE annotations are written.


...

9. There are two approaches to running the write-up workflow. You can either:

a) add the evaluation queue and workflow to an existing instance that is already running the Orchestrator, by adding another key-value pair to EVALUATION_TEMPLATES (see the sketch after this list), or

b) create a new instance (a t3.small would be sufficient) and set up the Orchestrator on that machine. See Steps 7-9 and 11 of Creating and Managing a Challenge for more details on how to set this up.
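
For approach (a), a hedged sketch of that key-value pair, assuming the Orchestrator reads EVALUATION_TEMPLATES from its .env file as a JSON map of Evaluation ID to Synapse File ID; all IDs below are placeholders.

# .env used by the Orchestrator (placeholder IDs)
# Each entry maps an Evaluation ID to the Synapse ID of its workflow template;
# the write-up queue (9615023) points at the ROOT_TEMPLATE-annotated file
# created above (syn12345678), alongside an existing main-queue entry.
EVALUATION_TEMPLATES={"9614999": "syn11111111", "9615023": "syn12345678"}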