...
- Synapse as a dashboard of all her projects, regardless of where the data lives or who the collaborators are (this is likely one of many projects she is switching among)
- Wiki as a notebook for her future self.
- Annotations to enhance her ability to later find the data / project if it goes dormant for a while.
- Ability to easily move projects between different work environments (PC, shared computational servers, cloud)
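The annotation idea above can be sketched as free-form key-value metadata attached to each project, with a later search filtering on those keys to recover a dormant project. A minimal illustration in plain Python (the project records and field names here are hypothetical, not the Synapse client API):

```python
# Sketch: projects carry key-value annotations, and a search over
# annotations recovers a dormant project later.
# (Hypothetical records; not the real Synapse client API.)

projects = [
    {"name": "glioma-expr", "annotations": {"disease": "glioma", "assay": "RNA-seq"}},
    {"name": "breast-cnv",  "annotations": {"disease": "breast cancer", "assay": "CNV"}},
]

def find_projects(projects, **criteria):
    """Return projects whose annotations match all given key-value pairs."""
    return [p for p in projects
            if all(p["annotations"].get(k) == v for k, v in criteria.items())]

matches = find_projects(projects, assay="RNA-seq")
print([p["name"] for p in matches])  # -> ['glioma-expr']
```

The point is that the search key set is open-ended: whatever terms Alice tags today are the terms she can query on months later.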
Ad hoc analysis - Collaborative
...
- Authorization controls over project contents
- Synchronize files among multiple environments with parallel concurrent use (different institutions' in-house systems, cloud offerings, etc.)
- Shared online collaborative workspace to pull key findings together from multiple people and document project status.
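The synchronization bullet above implies a per-file decision: which side has the newer copy, and when concurrent edits collide. A minimal sketch of that decision logic using content hashes (a hypothetical helper, not the Synapse client's actual sync implementation):

```python
# Sketch: decide per-file sync actions between a local environment and
# a shared store by comparing content hashes.
# (Hypothetical helper; not the Synapse client's sync implementation.)

import hashlib

def digest(data: bytes) -> str:
    """Content hash used to detect whether two copies differ."""
    return hashlib.md5(data).hexdigest()

def sync_actions(local, remote):
    """Map filename -> action, given {name: hash} views of each side."""
    actions = {}
    for name in set(local) | set(remote):
        if name not in remote:
            actions[name] = "upload"
        elif name not in local:
            actions[name] = "download"
        elif local[name] != remote[name]:
            actions[name] = "conflict"   # concurrent edits need a human
        else:
            actions[name] = "in-sync"
    return actions

local  = {"a.csv": digest(b"v2"), "b.csv": digest(b"v1")}
remote = {"a.csv": digest(b"v1"), "c.csv": digest(b"v1")}
print(sync_actions(local, remote))
```

Hash comparison (rather than timestamps) is what makes the check safe across institutions with unsynchronized clocks; the "conflict" case is where parallel concurrent use shows up.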
Reproducible Ad hoc analysis
...
An extension of this scenario in the case where both users are working in Amazon would include capturing the specifics of the environment used to run the analysis (AMI, size, etc) as additional parts of the provenance record. These environment descriptions could be stored as Files pointing to publicly-accessible AMIs, allowing anyone to execute the work (in their own AWS account). In fact, Alice may want to rerun the analysis on Amazon again before publication to ensure that her reviewer can step into her analysis, using her project as supplemental materials to her paper.
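The environment capture described above could be modeled as a provenance record whose inputs include a pointer to a public AMI alongside the code and data, so a reviewer can re-execute the work in their own AWS account. A minimal sketch in Python (the record structure, field names, and AMI id are illustrative assumptions, not the actual Synapse provenance schema):

```python
# Sketch: a provenance record that captures the compute environment
# (public AMI id, instance size) alongside code and data inputs.
# (Field names and values are illustrative, not the Synapse schema.)

def make_provenance(executed, used, environment):
    """Bundle what ran, what it consumed, and where it ran."""
    return {
        "executed": executed,        # script that ran
        "used": used,                # input data entities
        "environment": environment,  # public AMI + instance details
    }

record = make_provenance(
    executed="run_analysis.R",
    used=["syn_expression_data", "syn_clinical_data"],
    environment={"ami": "ami-0abcdef1234567890",  # hypothetical AMI id
                 "instance_type": "m5.xlarge"},
)
print(record["environment"])
```

Storing the environment as just another referenced entity is what lets the same provenance mechanism cover code, data, and compute uniformly.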
Benefits:
- Ease with which additional people can step in and review / contribute to the work
- Moves the project toward a publicly publishable state.
Pipelined Analysis
It turns out that Alice's paper is a hit and now she has lots of biologists asking for help running similar analyses on different data sets. She converges on a particular structure to capture the results of various intermediate stages of her analysis, e.g. to help out a new collaborator (Diane) in a new project:
...
If we have many of these sorts of objects, an extension to this use case is for Synapse to provide central storage and retrieval of these object definitions, and / or ways to autogenerate the objects and helper functions for them from existing Synapse data structures used as prototype instances.
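The autogeneration idea above could work by inspecting a prototype instance and emitting a class with one attribute per field, so that every new pipeline run produces a consistently structured result object. A minimal sketch in Python, using a plain dict as the prototype (purely illustrative; not an existing Synapse feature):

```python
# Sketch: given a prototype result object (here just a dict), generate
# a lightweight class whose attributes mirror the prototype's fields.
# (Illustrative only; not an existing Synapse feature.)

def class_from_prototype(name, prototype):
    """Build a class with one attribute per field of the prototype."""
    fields = list(prototype)

    def __init__(self, **kwargs):
        for f in fields:
            # fall back to the prototype's value when a field is omitted
            setattr(self, f, kwargs.get(f, prototype[f]))

    return type(name, (object,), {"__init__": __init__, "fields": fields})

# Prototype capturing hypothetical intermediate stages of the analysis.
prototype = {"normalized_data": None, "model_fit": None, "predictions": None}
StageResults = class_from_prototype("StageResults", prototype)

run = StageResults(normalized_data="norm.tsv", model_fit="fit.rds")
print(run.fields)  # -> ['normalized_data', 'model_fit', 'predictions']
```

Because the class is derived from a prototype instance rather than hand-written, Diane's new project gets the same field names as Alice's original, which is what makes downstream comparison across projects possible.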
Benefits:
- Consistent structures and hardened pipelines evolve out of ad hoc work, supporting common preprocessing
- Structures work for large-scale comparison of methods / data in public or private challenges