...

  • User requests their data in the app, specifying the start date and end date. (App may or may not supply default start and end dates.)
  • The app calls the Bridge REST API with the start date and end date (requires user authentication).
  • Bridge Server writes the request to an internal SQS queue. This request contains the study ID, username, start date, and end date. (A minimal sketch of this enqueue step appears after this list.)
  • BUDD reads from the SQS queue, aggregates the requested data (which takes roughly a minute, depending on the amount of data), and sends an email to the user with a link to where they can download the data. This link will expire after 24 hours.
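
For illustration, here is a minimal sketch of what the enqueue step could look like with the AWS SDK for Java. The queue name, class name, and field values are placeholders and assumptions, not the actual Bridge Server code:

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

public class EnqueueUddRequestSketch {
    public static void main(String[] args) {
        AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

        // The request carries the study ID, username (email address), start date, and end date.
        String requestJson = "{"
                + "\"studyId\":\"api\","
                + "\"username\":\"example@example.org\","
                + "\"startDate\":\"2015-08-15\","
                + "\"endDate\":\"2015-08-19\""
                + "}";

        // "Bridge-UDD-Request-Queue" is a placeholder; the real queue URL comes from configuration.
        String queueUrl = sqs.getQueueUrl("Bridge-UDD-Request-Queue").getQueueUrl();
        sqs.sendMessage(queueUrl, requestJson);
    }
}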

...

BUDD

...

Internal Structure

The main entry point into BUDD is BridgeUddWorker (https://github.com/Sage-Bionetworks/BridgeUserDataDownloadService/blob/develop/src/main/java/org/sagebionetworks/bridge/udd/worker/BridgeUddWorker.java). This contains:

  • A loop which polls SQS for requests and parses them. (A wait time is configured so that while testing, if you Ctrl+C out of the process, you don't get a flood of "can't connect to SQS" errors.) A minimal sketch of this polling step appears after this list.
  • Gets the study from the DynamoDB Study table (because accounts and data are partitioned by study).
  • Gets the account from Stormpath by email address. (The code says username, but we recently changed Bridge Server so that all usernames are the same as the email address, and everything just keys off the email address.)
  • Gets the health ID from the user's account and queries DDB with the health ID to get the health code.
  • Queries DDB SynapseTables to get a list of all Synapse health data tables for that study and SynapseSurveyTables to get a list of all survey metadata tables for that study.
  • Calls the SynapsePackager (https://github.com/Sage-Bionetworks/BridgeUserDataDownloadService/blob/develop/src/main/java/org/sagebionetworks/bridge/udd/synapse/SynapsePackager.java) to download all of the user's data from Synapse (within the specified date range). The SynapsePackager does the following:
    • Queries each Synapse health data table for the user's data, filtered by health code and by upload dates within the start and end date (inclusive), and downloads the results as TSVs along with any attached file handles.
    • Runs SynapseDownloadSurveyTask (synapse/SynapseDownloadSurveyTask.java), which downloads the complete table of survey metadata as a TSV.
    • Zips up all files (TSVs and file handles) into a master zip file.
    • Uploads the master zip file to S3.
    • Creates a pre-signed URL for the master zip file and returns the pre-signed URL to BridgeUddWorker.
  • Emails the pre-signed URL back to the user.
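
For illustration, here is a minimal, self-contained sketch of the polling-and-parsing step, using the AWS SDK for Java and Jackson. The queue name is a placeholder, the field names are taken from the example SQS message in the Testing section below, and the study/account/packaging steps are left as comments; this is not the actual BridgeUddWorker code:

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class PollLoopSketch {
    public static void main(String[] args) throws Exception {
        AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
        ObjectMapper mapper = new ObjectMapper();

        // Placeholder queue name; the real queue URL comes from configuration.
        String queueUrl = sqs.getQueueUrl("Bridge-UDD-Request-Queue").getQueueUrl();

        while (true) {
            // Long-poll with a wait time so an idle loop doesn't spin.
            ReceiveMessageRequest receiveRequest = new ReceiveMessageRequest(queueUrl)
                    .withWaitTimeSeconds(20)
                    .withMaxNumberOfMessages(1);
            for (Message message : sqs.receiveMessage(receiveRequest).getMessages()) {
                JsonNode request = mapper.readTree(message.getBody());
                String studyId = request.get("studyId").textValue();
                String username = request.get("username").textValue();
                String startDate = request.get("startDate").textValue();
                String endDate = request.get("endDate").textValue();

                // ... look up the study, account, and health code; package the data;
                // email the pre-signed URL back to the user ...

                // Delete the message so it isn't redelivered.
                sqs.deleteMessage(queueUrl, message.getReceiptHandle());
            }
        }
    }
}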

Development

Local Development

Create a fork from https://github.com/Sage-Bionetworks/BridgeUserDataDownloadService. Follow the steps in https://github.com/Sage-Bionetworks/BridgeUserDataDownloadService/blob/develop/README.md. (If you're only planning on running the code but not on editing, you should be able to pull from the root fork directly.)

...

  • Downloads the uploads from S3.
  • Decrypts the uploads and writes them to a temp directory. (A new temp directory is created for every request.) Individual files are named in the format YYYY-MM-DD-UploadId.zip, so users can organize their uploads by date.
  • Errors in downloading or decrypting are written to an error.log in the temp directory, which is included in the master zip file.
  • Creates a master zip file called userdata-YYYY-MM-DD-to-YYYY-MM-DD-randomGuid.zip (start date and end date) and zips all upload files and error.log into the master zip file.
  • Uploads the master zip file to S3.
  • Creates a pre-signed URL for the master zip file, for HTTP GET only and with an expiration date 24 hours from now. (A minimal sketch of this step appears after this list.)
  • Deletes the temp files and temp directories.
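
For illustration, here is a minimal sketch of the upload and pre-signing steps above, using the AWS SDK for Java. The bucket name, key, and local path are placeholders; the real values come from BUDD's configuration and the request being processed:

import java.io.File;
import java.util.Date;

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

public class PresignMasterZipSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Placeholder bucket and key, following the naming scheme described above.
        String bucket = "org-sagebridge-userdata";
        String key = "userdata-2015-08-15-to-2015-08-19-someGuid.zip";

        // Upload the master zip file to S3.
        s3.putObject(bucket, key, new File("/tmp/" + key));

        // Create a pre-signed URL, HTTP GET only, expiring 24 hours from now.
        Date expiration = new Date(System.currentTimeMillis() + 24L * 60 * 60 * 1000);
        GeneratePresignedUrlRequest presignRequest = new GeneratePresignedUrlRequest(bucket, key)
                .withMethod(HttpMethod.GET)
                .withExpiration(expiration);
        System.out.println(s3.generatePresignedUrl(presignRequest));
    }
}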

...


Testing

First, make sure your test account has uploads for the time range you want to test with.

To test through the Bridge Server, use the following example request:

POST https://webservices.sagebridge.org/v3/users/self/emailData
{
  "startDate":"2015-08-15",
  "endDate":"2015-08-19",
  "type":"DateRange"
}
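
If you'd rather send this request from code, here is a rough sketch using Java 11's built-in HTTP client. The Bridge-Session header name and token value are assumptions; sign in first and use whatever session header and token your Bridge client normally sends:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EmailDataRequestSketch {
    public static void main(String[] args) throws Exception {
        String body = "{\"startDate\":\"2015-08-15\",\"endDate\":\"2015-08-19\",\"type\":\"DateRange\"}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://webservices.sagebridge.org/v3/users/self/emailData"))
                .header("Content-Type", "application/json")
                .header("Bridge-Session", "your-session-token") // assumed session header; authenticate first
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}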

To test against BUDD directly, log into the AWS Console, go to the SQS dashboard, and submit the following example request as an SQS message:

{
  "studyId":"api",
  "username":"dwayne.jeng+test01@sagebase.org",
  "startDate":"2015-07-23",
  "endDate":"2015-07-30"
}

Either method will send an email to your registered email address.

Deploy to Dev

Submit your code changes to your own personal fork. Create a pull request to the root fork. Once the pull request has been merged, Travis will automatically build and deploy to the dev server on Elastic Beanstalk.

Deploy to Staging/Prod

  • Create a workspace from the root fork (https://github.com/Sage-Bionetworks/BridgeUserDataDownloadService) if you don't already have one.
  • Make sure all branches are up to date (git pull as necessary).
  • Go to the staging branch (git checkout uat), merge from develop (git merge --ff-only develop).
  • Push back to GitHub (git push). This should trigger Travis to automatically build and deploy to the staging server on Elastic Beanstalk.

Similar steps for Prod.

Rolling Back Deployments

  • Log into the AWS Console and go to the Elastic Beanstalk Dashboard.
  • In the top nav bar drop down, go to Application Versions.
  • You'll see a list of versions named travis-[git commit hash]-[timestamp in epoch seconds]. Check the version you want to roll back to and click Deploy.
  • Select the environment from the drop down and click Deploy.

Access Logs

Logs can be found at https://logentries.com/. Credentials to the root Logentries account can be found at belltown:/work/platform/PasswordsAndCredentials/passwords.txt. Alternatively, get someone with account admin access to add your user account to Logentries.

If for some reason the logs aren't showing up in Logentries, file a support ticket with Logentries. The alternative steps to reach the logs are below:

  • Log into the AWS Console and go to the Elastic Beanstalk Dashboard.
  • Select the environment you want to view logs for.
  • Click on Logs in the left nav bar.
  • In the drop down (top right), click Request Logs.
    • Last 100 Lines will give you a link to a page with the logs on screen.
    • Full Logs will give you a link to a zip file you can download.

If this still doesn't work, you can SSH directly into the BUDD hosts (see below) and find the logs at /var/log/tomcat8/catalina.out.

Logging Into BUDD Hosts

You may need to be in the Fred Hutch intranet or logged into the Fred Hutch VPN for this to work.

  • Log into the AWS Console and go to the EC2 Dashboard.
  • Click on Instances in the left nav bar.
  • In the table, find the host(s) with the name Bridge-UDD-Worker-Dev (or whatever environment you want to log into). Select that host. (If there's more than one in the environment you want, select just one host.)
  • In the information panel on the bottom, find the Public DNS field. This is the hostname you want to SSH into. But first, you'll need the PEM file to log in.
  • Log into belltown and download the PEM files from /work/platform/PasswordsAndCredentials.
  • On your machine, run ssh -i [path to PEM file] ec2-user@[EC2 hostname]

You can save yourself some time with an entry in your ~/.ssh/config that looks like

host Bridge-UDD-Dev
     HostName ec2-52-20-91-245.compute-1.amazonaws.com
     User ec2-user
     IdentityFile ~/Bridge-UDD-Dev.pem

Now you can just run ssh Bridge-UDD-Dev.

Next Steps

Short/Medium-Term

Jira issues (sagebionetworks.jira.com):

  • BRIDGE-735 - Add User Data Download to iOS SDK
  • BRIDGE-761 - Log archiving and alarms
  • BRIDGE-762 - Monitoring
  • BRIDGE-763 - Refactor shared copy-pasted code into a shared package
  • BRIDGE-764 - Audit IAM credentials
  • BRIDGE-765 - Move Stormpath keys from env vars to a key management solution

Long-Term

  • Performance improvements - Multi-threading? Map-Reduce?
  • Web Portal - Better user interface than email?
  • Data visualization - More useful than raw JSON dump
  • Caching/De-duping - If the user requests the same data again, use the existing master zip file instead of generating a new one. Also helpful if their link expires and they want to get the data again.
  • Cleanup task to delete old user requests and user-requested data?

See Also

Original design doc: Design Doc