App-Server Integration
The goal of this document is to capture how integration between medical apps and the Bridge server should be done, based on discussions between Bridge engineers and YML engineers. It covers requirements gathering, design, QA, automation, and other future improvements (both process and technical).
Gathering Requirements and Design
Requirements gathering and design should be done in a design doc on a shared Confluence page (currently in the Bridge Confluence space). This doc should be accessible to app developers (including those from YML), Bridge developers (client and server), and, where applicable, researchers and Synapse developers. Where it's not practical to have external research teams reviewing and editing design docs, a technical-researcher liaison (such as Brian Bot) will represent the researchers, present requirements, and review designs on their behalf. Confluence is preferred over design by email thread or conference call, since it leaves a record and a "single source of truth" for the requirements and design decisions.
To avoid a "design by committee" approach, there should be one engineer leading the design who will do the primary writing and updating of the design doc. They will be responsible for reviewing the design with each stakeholder (app developer, UI designer, server engineer, researcher).
Design docs should include the following:
- overall goals of the feature (i.e., the big picture)
- general requirements (what the feature should look like, what kind of data is generated)
- design details (UI design, upload data format with examples, relevant identifiers, expected Synapse table columns and data)
- links to JIRA items for tracking work items
TODO: link example design docs
Design Docs and QA
Design docs should also be used to drive QA. Once the requirements are gathered and the design is finalized, the design doc should document:
- expected behaviors
- specific test cases that should be manually tested (repro steps and expected results)
- specific test cases that should be automated
This takes the guesswork out of QA and release validation. Specifically, instead of relying on word-of-mouth (unreliable) or on QA's interpretation of how the feature should work (subjective), we can determine whether something is working to spec. (Whether the spec itself should be changed to match expectations is a related but separate discussion.)
Design Docs for Existing Features
Many of the current features are poorly documented or aren't documented at all. This makes QA and release validation difficult, since there's no consensus on how a feature should behave. We should write design docs for existing features: at minimum for the Breast Cancer and Parkinson's apps, and ideally for all apps.
TODO: JIRA item for documenting existing features.
Identifiers, Schemas, and Data Formats
TODO: JIRA items for improvements for identifiers, schemas, and data formats.
Short-Term: Sync Identifiers Through Documentation
In the short term, to make sure the app developers and the Bridge team are in sync on identifiers, schemas, and data formats, these should all be documented on Confluence. (As noted before, Confluence is preferred, since it removes the ambiguity of word-of-mouth and of email chains.)
Identifiers are used as schema IDs on the server side, and also appear in the "item" field of the info.json file in uploads. These will be documented on the Activity Identifiers page. TODO: Fill in that wiki page.
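For illustration, an upload's info.json would reference the identifier roughly as follows. The "item" field is the one described above; the remaining fields are typical examples, not a definitive list:
{
    "item":"Tapping Activity",
    "appVersion":"version 1.0, build 7",
    "phoneInfo":"iPhone 6",
    "files":[
        {
            "filename":"tapping_results.json",
            "timestamp":"2015-08-14T10:21:00-07:00"
        }
    ]
}
The server matches this "item" value against a schema ID, which is why the two must stay in sync.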
Schemas and data formats should be documented on each feature's design doc (see previous section on Design Docs for Existing Features). These design docs will serve as a model for both app developers and Bridge engineers to code against.
Long-Term: Sync Identifiers Through Server Data
Long-term, instead of validating identifiers through documentation, we'll want the apps and the server to automatically sync identifiers. This could be achieved in the following ways:
- Tasks and schedules come from the server, and these include identifiers. This way, we can ensure that the server knows about a task before the app starts sending data for it. (See the sketch after this list.)
- Server-side surveys. If the surveys come from the server, then the server already knows the survey and question identifiers and the answer types.
- TODO: We'll need to find a way for non-survey data formats to come from the server. In particular, if the server says "Tapping Activity", how does the app know this is the tapping activity, and what happens if the app's results don't match what the server expects?
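As a rough sketch of the first approach, the task the server hands down could carry the identifier directly. The payload shape below is hypothetical, not the actual Bridge schedule format:
{
    "label":"Tapping Activity",
    "taskIdentifier":"Tapping Activity",
    "scheduledOn":"2015-08-14T00:00:00-07:00",
    "expiresOn":"2015-08-15T00:00:00-07:00"
}
The app could then reuse "taskIdentifier" as the "item" value in its info.json, so an upload never references an identifier the server doesn't already know about.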
QA and Release Validation
Manual QA
In addition to verifying that the app behaves correctly, QA also needs to validate that data is being sent to the Bridge server and that the server is correctly receiving and parsing it. (This applies not just to dedicated QA resources, but also to engineers making code changes.) Validating the data can be done with the following steps. (Note that App Core automatically calls the Upload Status API and dumps the results into the logs.)
Pre-requisites:
- Download and install an HTTP tool, such as Firefox Poster (https://addons.mozilla.org/en-US/firefox/addon/poster/) or Chrome Poster (https://chrome.google.com/webstore/detail/chrome-poster/cdjfedloinmbppobahmonnjigpmlajcd)
- This can theoretically also be done with tools like wget or curl, although you'd need to manage user sessions and cookies manually (e.g., with curl's -c/-b cookie-jar flags).
Steps:
- Open the iOS console logs and look for log messages that look like "Successfully called upload complete for upload ID 280dd6f4-6fb2-45b5-846a-9d582e30196b, check status at https://webservices-staging.sagebridge.org/api/v1/upload/280dd6f4-6fb2-45b5-846a-9d582e30196b/status".
- Using Poster, sign into Bridge using the same credentials used to log into the app. The request is:
POST https://webservices-staging.sagebridge.org/api/v1/auth/signIn
{
"username":"YourUsername",
"password":"YourPassword",
"study":"studyId"
}
- Current study IDs are asthma, breastcancer, cardiovascular, diabetes, parkinson. Note that these are different from the study names.
- If you have an existing session, you may need to sign out (GET https://webservices-staging.sagebridge.org/api/v1/auth/signOut) before you can sign in.
- Note that this URL points to the staging stack. If you need to test against Prod, the base URL is https://webservices.sagebridge.org/
- Take the URL from the logs in step 1 and paste it either into Poster or into the same browser session (with the same cookies). You should get a blob of JSON back, which will include a field saying "status":"succeeded", "status":"validation_in_progress", or "status":"validation_failed". (A sample response is sketched after these steps.)
- If it says "status":"succeeded", chances are, everything is good. (See TODO below for future improvements done by the Bridge team.)
- If it says "status":"validation_in_progress", you need to wait a bit longer, since server-side upload validation is done asynchronously. If it takes longer than ~30 seconds, something is likely wrong with the server, and you should reach out to the Bridge server team with your upload ID.
- If it says "status":"validation_failed". The messages in the JSON blob should give you a hint as to what the error is. If you need additional support, reach out to the Bridge server team with your upload ID.
- There may be status messages even if the upload succeeded. These are generally not a problem, but if you see them a lot, feel free to reach out to the Bridge server team.
- The upload status blob also contains a "record" blob, which in turn contains a "data" blob with the data the server recorded. Check this blob to make sure the data the server recorded matches the data you sent. Some fields in the data will just be GUIDs; these are attachment IDs, used for large data blobs (like accelerometer data) or freeform text (like study feedback).
- If you're switching apps, or if you want to clean out your Bridge session, call sign out (GET https://webservices-staging.sagebridge.org/api/v1/auth/signOut). This can be done in Poster or in the browser, whichever one has the session cookie.
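For reference, a "succeeded" upload status blob has roughly the following shape. The top-level fields are the ones described in the steps above; the specific data field names and GUID value are made-up placeholders:
{
    "status":"succeeded",
    "messages":[],
    "record":{
        "data":{
            "numberOfTaps":42,
            "accelerometerData":"f3a9c2d1-4a5b-4c6d-8e7f-0a1b2c3d4e5f"
        }
    }
}
Here "accelerometerData" is the kind of field that would hold an attachment ID rather than inline data.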
TODO: Build a web UI to get health data records from upload IDs. This will save engineers and QA from having to construct HTTP requests by hand. This may or may not serve as a basis for research participants (users) getting their own data back. JIRA item for this change.
QA Automation
TODO: JIRA items for automation improvements.
Automate Upload Validation
YML has a data verification server, which we may be able to leverage to automate upload validation. The data verification server can scan the logs for upload IDs, sign in to the Bridge server, and get the upload status on the app's behalf.
Ideally, SBBUploadManager could send a signal back to the data verification server with the upload ID, so the data verification server doesn't need to scan the logs.
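A minimal sketch of what that automation could look like, assuming Python with the requests library. The log path, credentials, and output handling are placeholders; the endpoints and log line format are the ones documented in the Manual QA section:
import re
import requests

BASE_URL = "https://webservices-staging.sagebridge.org"

# Matches the App Core log line quoted in the Manual QA steps above.
UPLOAD_ID_PATTERN = re.compile(
    r"Successfully called upload complete for upload ID ([0-9a-f\-]+)")

def check_uploads(log_path, username, password, study):
    # Scan the console logs for upload IDs.
    with open(log_path) as log_file:
        upload_ids = UPLOAD_ID_PATTERN.findall(log_file.read())

    # A Session keeps the Bridge session cookie across requests,
    # the same way Poster or a browser session would.
    session = requests.Session()
    sign_in = session.post(BASE_URL + "/api/v1/auth/signIn",
                           json={"username": username,
                                 "password": password,
                                 "study": study})
    sign_in.raise_for_status()

    # Fetch the validation status for each upload ID found. Validation
    # is asynchronous, so "validation_in_progress" may warrant a retry.
    for upload_id in upload_ids:
        status = session.get(
            BASE_URL + "/api/v1/upload/" + upload_id + "/status").json()
        print(upload_id, status.get("status"), status.get("messages"))

    # Sign out so this session doesn't collide with later manual testing.
    session.get(BASE_URL + "/api/v1/auth/signOut")
The same flow would work whether the data verification server scans the logs itself or receives upload IDs directly from SBBUploadManager.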
Full Test Automation
The ideal test automation should follow this workflow:
- Code is checked into git.
- An OS X Server builds and launches a simulator with the new code.
- UI automation tests the apps through the UI. (I believe Xcode 7 supports this.)
- Test bed injects data into HealthKit. TODO: JIRA item for investigating how to programmatically inject data into HealthKit.
- Apps have hooks that can only be accessed by the simulator. These hooks do things that aren't practical to do through the UI, like flushing HealthKit data uploads or other passive activities.
- Once the apps have been exercised, the test bed hooks into the data verification server (described in the previous section) to validate uploads.
App developers should also be able to kick off integration tests using code in their local dev branch.