Developer Bootstrap

Getting Started

Install the following dependencies:

  1. Java Development Kit - Version 11: we use Amazon Corretto in production.
  2. Git: Set up your local git environment according to Sage’s GitHub Security guidance
  3. Maven 3
  4. Eclipse (or other preferred Java IDE, like IntelliJ IDEA, VS Code etc)
    Note: Spring Tools provides an Eclipse build that bundles a lot of useful plugins
  5. MySQL server (version 8.0.33, or check which version is currently used in production under the "EngineVersion" key; newer releases may have issues): see https://downloads.mysql.com/archives/community/ and select the same version as used in production. As an alternative, the official Docker image https://hub.docker.com/_/mysql can be used to start an instance. Mac users should follow Launching MySQL with Docker to start an instance through Docker Compose.

    There are also GUIs available for MySQL that can be useful for management, such as MySQL Workbench and HeidiSQL.

    Note: explicit_defaults_for_timestamp needs to be set to ON.
    Instructions for setting explicit_defaults_for_timestamp in MySQL Workbench:
    1. Open localhost connection. On the left panel, Under INSTANCE, choose "Options File"
    2. Under the "General" tab, scroll down to "SQL", make sure that explicit_defaults_for_timestamp is checked
    3. Click Apply
    4. Restart MySQL server
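    If you use the official Docker image instead of a native install, a minimal sketch of starting it with the flag already set is shown below (the image tag, container name, and root password are illustrative; the Launching MySQL with Docker page has the supported Docker Compose setup):

      docker run --name synapse-mysql -e MYSQL_ROOT_PASSWORD=root -p 3306:3306 -d mysql:8.0.33 --explicit-defaults-for-timestamp=ON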

See Developer Tools for links to the installers.

Setting up dev environment on Sage laptop – Windows x64

  1. Install Java JDK (64-bit, 11) – set up JAVA_HOME variable (JDK's install directory, not the /bin folder) and add JAVA_HOME/bin to PATH
    1. It is possible to have multiple JDK versions installed, just make sure that the IDE and maven are using 11 when building the project
  2. Install Eclipse (64-bit)
  3. Install Maven (most recent stable release); and add to PATH
  4. Install M2Eclipse plugin from http://download.eclipse.org/technology/m2e/releases
    1. Sometimes, fresh copies of Eclipse will not have all the dependencies installed.  Particularly, M2E may complain about SLF4J 1.6.2.
    2. Either reinstall Eclipse or search the marketplace for the module.  
  5. Point Eclipse to the JDK
    1. Open Eclipse and navigate to Window->Preferences
    2. Open the submenu Java->Installed JREs
    3. Click Add..->Standard VM->Directory, and input the JDK directory
    4. Then finish and check the JDK (uncheck the JRE)
    5. Make sure the path to the JDK is ahead of "%SystemRoot%\system32", since Eclipse will otherwise use some other Java VM when starting up Eclipse.
  6. Turn off indexing (at least on the project directory).
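To confirm the toolchain is wired up correctly, open a new command prompt and check the versions (a quick sanity check; the exact output will vary, but java -version should report JDK 11, and mvn -version should show Maven using that same JDK):

java -version
mvn -version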

Setting up dev environment on macOS (June 2018)

Make sure you have the Xcode Command Line Tools, which you can install with

xcode-select --install

I recommend using Homebrew to install Maven and Git. You can also install Eclipse and MySQLWorkbench with Homebrew if desired.

brew install maven git
brew tap homebrew/cask
brew install eclipse-ide mysqlworkbench

Using Homebrew to install the JDK or MySQL should be done with caution, because it can be difficult to configure Homebrew to use older versions of this software (which Synapse may require as new versions of the JDK and MySQL are released). You can access binaries for the older versions of Java and MySQL with an Oracle account (Google should lead you to these downloads easily).

Your ~/.bash_profile (Note: this will be ~/.zshenv if you are on macOS Catalina or later) should look something like this (version numbers may differ slightly, and locations may differ entirely if you installed your software differently; check your machine to make sure these folders exist):

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk11.jdk/Contents/Home/
export M2_HOME=/usr/local/Cellar/maven/3.5.3/libexec
export M2=$M2_HOME/bin

# The following may not be necessary: 
export PATH=${PATH}:/usr/local/mysql/bin 	# To fix MySQL installation after removing a previous installation via Homebrew

Using the steps above, you can configure your MySQL server with MySQLWorkbench. You may need to create a file at /etc/my.cnf to edit the configuration and set explicit_defaults_for_timestamp.
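For example, a minimal /etc/my.cnf written from the terminal (a sketch that assumes no configuration file exists yet; back up and merge into any existing file instead of overwriting it):

sudo tee /etc/my.cnf <<'EOF'
[mysqld]
explicit_defaults_for_timestamp = ON
EOF
# Restart the MySQL server afterwards for the setting to take effect.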

Though it might be dated, consider checking out Macintosh Bootstrap Tips for additional reference.

Synapse Platform Codebase

Get the Maven Build working

  1. Fork the Sage-Bionetworks Synapse-Repository-Services repository into your own GitHub account: https://help.github.com/articles/fork-a-repo

  2. Make sure you do not have any spaces in the path of your development directory, the parent directory for PLFM and SWC. For example: C:\cygwin64\home\Njand\Sage Bionetworks\Synapse-Repository-Services needs to become: C:\cygwin64\home\Njand\SageBionetworks\Synapse-Repository-Services

  3. Check out everything

    git clone https://github.com/[YOUR GITHUB NAME]/Synapse-Repository-Services.git
  4. Set up the Sage-Bionetworks Synapse-Repository-Services repository to be the "upstream" remote for your clone

    # change into local clone directory
    cd Synapse-Repository-Services
     
    # set Sage-Bionetworks' Synapse-Repository-Services as a remote called "upstream"
    git remote add upstream https://github.com/Sage-Bionetworks/Synapse-Repository-Services
  5. Download your forked repo's develop branch, then pull in changes from the central Sage-Bionetworks Synapse-Repository-Services develop branch

    # bring origin's develop branch down locally
    git checkout -b develop remotes/origin/develop
     
    # fetch and merge changes from the Sage-Bionetworks repo into your local clone
    git fetch upstream
    git merge upstream/develop

    Note: this is NOT how you should update your local repo in your regular workflow.  For that see the Git Workflow page.

  6. Stack Name
    • Your stack name will be dev
    • Your stack instance will be whatever you like; we recommend your Unix username
  7. Setup MySQL 
    1. Mac users should follow Launching MySQL with Docker to start an instance through Docker Compose.
    2. Start a command-line mysql session with "sudo mysql"
    3. Create a MySQL user named dev[YourUsername] with a password of platform.

      create user 'dev[YourUsername]'@'%' identified BY 'platform';
      grant all on *.* to 'dev[YourUsername]'@'%' with grant option;
    4. Create a MySQL schema named dev[YourUsername] and grant permissions to your user.

      create database `dev[YourUsername]`;
      # This might not be needed anymore
      grant all on `dev[YourUsername]`.* to 'dev[YourUsername]'@'%';
      
    5. Create a schema for the tables feature named dev[YourUsername]tables

      create database `dev[YourUsername]tables`;
      # This might not be needed anymore
      grant all on `dev[YourUsername]tables`.* to 'dev[YourUsername]'@'%';
    6. This allows legacy MySQL stored functions to work. It may not be necessary in the future if all MySQL functions are updated to MySQL 8 standards.

      set global log_bin_trust_function_creators=1;

      Note: Use SET PERSIST in order to keep the configuration parameter across MySQL restarts (e.g. if the MySQL service is not set up to start automatically). Alternatively, use MySQL Workbench's Options File to set this checkbox (under the Logging tab, under Binlog Options).
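      For example, from the command line (a sketch; SET PERSIST is MySQL 8 syntax and survives server restarts):

      sudo mysql -e "SET PERSIST log_bin_trust_function_creators = 1;"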

    7. This relaxes ONLY_FULL_GROUP_BY so that queries whose SELECT list does not match the GROUP BY/ORDER BY clause are allowed.

      SET GLOBAL sql_mode=(SELECT REPLACE(@@sql_mode,'ONLY_FULL_GROUP_BY',''));
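    A quick sanity check that the user, schemas, and settings above are in place (the credentials shown are the dev[YourUsername]/platform examples from this page):

      mysql -u dev[YourUsername] -pplatform -e "SHOW DATABASES LIKE 'dev%'; SELECT @@explicit_defaults_for_timestamp, @@log_bin_trust_function_creators;"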
  8. Setup AWS Resources
    The build expects certain AWS resources to exist. Use a Jenkins stack builder job to create them for you automatically. Note that you will need an IAM account before the build can provision resources on AWS. To create an IAM account, fork and edit this template. (See /wiki/spaces/IT/pages/392658945.)

    If you wish to delete any of the resources you created see the "Developer Bootstrap#Deleting CloudFormation Stacks" section.
    1. Log in to https://build-system-synapse.dev.sagebase.org and create a new job (click New Item in dashboard).
      You will need to be on the VPN to access the Jenkins server (directions for setting up the VPN can be found here), as well as an account on the Jenkins server (contact IT).
    2. Name the project stackbuilder-dev-[YourUsername] and copy settings from stackbuilder-repo-develop-build, then click OK.
    3. Click on the "Build" tab and modify the line
      " -Dorg.sagebionetworks.instance=YourUsername"\ 
      Then click Save.

       
      Note: This may be represented as a String Parameter: " -Dorg.sagebionetworks.instance=${INSTANCE}"\. In that case, change the value of the INSTANCE parameter to YourUsername.

    4. On the next page, click "Build with Parameters" in the left navigation then "Build". Proceed to the next steps while waiting for this build to complete.


  9. Create your maven settings file ~/.m2/settings.xml
    1. The settings file tells maven where to find your property file and what encryption key should be used to decrypt passwords.
    2. Use this settings.xml as your template
       

      settings.xml
      <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                            http://maven.apache.org/xsd/settings-1.0.0.xsd">
        <localRepository/>
        <interactiveMode/>
        <usePluginRegistry/>
        <offline/>
        <pluginGroups/>
        <servers/>
        <mirrors/>
        <proxies/>
        <profiles>
          <profile>
            <id>dev-environment</id>
            <activation>
              <activeByDefault>true</activeByDefault>
            </activation>
            <properties>
              <org.sagebionetworks.stackEncryptionKey>20c832f5c262b9d228c721c190567ae0</org.sagebionetworks.stackEncryptionKey>

              <org.sagebionetworks.developer>YourUsername</org.sagebionetworks.developer>
              <org.sagebionetworks.stack.instance>YourUsername</org.sagebionetworks.stack.instance>

              <org.sagebionetworks.stack.iam.id>yourAwsDevIamId</org.sagebionetworks.stack.iam.id>
              <org.sagebionetworks.stack.iam.key>yourAwsDevIamKey</org.sagebionetworks.stack.iam.key>

              <org.sagebionetworks.google.cloud.enabled>true</org.sagebionetworks.google.cloud.enabled>
              <!-- If not using Google Cloud, set the above setting to false. The following org.sagebionetworks.google.cloud.* settings may be omitted -->
              <org.sagebionetworks.google.cloud.client.id>googleCloudDevAccountId</org.sagebionetworks.google.cloud.client.id>
              <org.sagebionetworks.google.cloud.client.email>googleCloudDevAccountEmail</org.sagebionetworks.google.cloud.client.email>
              <org.sagebionetworks.google.cloud.key>googleCloudDevAccountPrivateKey</org.sagebionetworks.google.cloud.key>
              <org.sagebionetworks.google.cloud.key.id>googleCloudDevAccountPrivateKeyId</org.sagebionetworks.google.cloud.key.id>

              <org.sagebionetworks.stack>dev</org.sagebionetworks.stack>
            </properties>
          </profile>
        </profiles>
        <activeProfiles/>
      </settings>
    3. Everywhere you see "YourUsername", change it to your stack instance name (i.e. the username you used to set up MySQL and the AWS shared-resources)
    4. Get AWS IAM ID and Key from http://aws.amazon.com/iam/ (see whoever manages those accounts) and get admin privileges for AWS IAM
      1. Click on "My security credentials"
      2. On the AWS IAM credentials tab and under "Access keys for CLI, SDK, & API access", click on "Create access key" 
      3. The ID will look something like AKIAIOSFODNN7EXAMPLE, and the Key will look something like wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
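      Optionally, if you install the AWS CLI and configure it with this access key, you can confirm the key works before running the build (not required; just a sanity check):

        aws sts get-caller-identity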
    5. If using Google Cloud (recommended if you are working on file upload features), get the Google Cloud developer credentials from the shared LastPass entry titled 'synapse-google-storage-dev-key'. Paste the corresponding fields from the attached JSON file into the org.sagebionetworks.google.cloud elements. Note that the private key should be given on one line, with newlines encoded as '\n' and NOT '\\n'.
      1. If you don't need to use Google Cloud features, then set org.sagebionetworks.google.cloud.enabled to be false. The other Google Cloud parameters can be omitted.
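    Once the file is saved, you can confirm that Maven picks up the dev-environment profile by printing the effective settings (note that this will echo the property values, including your keys, to the console):

      mvn help:effective-settings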
    6. Once your settings file is set up, you can build everything via Maven, which will run all the tests and then install the resulting jars and wars in your local Maven repository.

      cd Synapse-Repository-Services
      mvn install
      
      1. Note that you might get an error that says Illegal access: this web application instance has been stopped already. This is normal. Scroll up to see whether the build succeeded or failed. (You may have to scroll through several pages of this error.)
      2. If tests fail, then you should run mvn clean install after debugging your failed build.
      3. Many tests assume you have a clean database. Before running tests, be sure to drop all tables in your database. Synapse will automatically recreate them as part of the tests.
      4. When you pull updates from upstream, be sure to run mvn clean install -DskipTests from the root project, which will rebuild everything. If necessary, run mvn test from the root to ensure all tests pass.
      5. Tests can also fail because IntelliJ IDEA sometimes does not configure itself appropriately. In that case, follow these steps:
        1. Go to the Synapse-Repository-Services repository and execute ls -a to find the '.idea' folder, which is IntelliJ IDEA's configuration folder.
        2. Execute 'rm -r .idea' to delete the configuration folder and re-configure IntelliJ IDEA (follow the 'Configuring IntelliJ IDEA' section again).
        3. From either the terminal in IntelliJ IDEA or your own terminal, execute 'mvn clean install -Dmaven.test.skip=true' to make sure all files are configured correctly. (skipTests is deprecated in Maven Surefire Plugin version 3.0.0-M3.)
      6. If the integration tests fail, try restarting the computer.  This resets the open connections that may have become saturated while building/testing.  
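      One way to reset to a clean database before running tests (destructive; the schema names are the dev[YourUsername] examples from the MySQL setup step above):

        mysql -u dev[YourUsername] -pplatform -e 'DROP DATABASE `dev[YourUsername]`; CREATE DATABASE `dev[YourUsername]`;'
        mysql -u dev[YourUsername] -pplatform -e 'DROP DATABASE `dev[YourUsername]tables`; CREATE DATABASE `dev[YourUsername]tables`;'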

  10. Take a look at the javadocs; they are in target/apidocs/index.html and target/testapidocs/index.html for each package

Configuring IntelliJ IDEA (June 2018)

IntelliJ IDEA is an IDE that can be easier to configure to work with the Synapse build than Eclipse. Follow this section if you want to use it instead of Eclipse. Otherwise, you can skip this section.

When you install IntelliJ IDEA, make sure you also install the Maven plugin (should install by default).

There are two settings that you should change in your preferences before importing the project: enable 'Build Project Automatically', and set your Maven home directory to the location of the Maven installation that you used to successfully build the codebase.

After configuring IntelliJ IDEA, you can import the project by pointing to the folder 'Synapse-Repository-Services' and selecting Maven from 'Import project from external model'. The default settings can be used to finish importing the project.

After this, you need to configure your CLASSPATH. You can do this by right-clicking the following folders: mark /lib-auto-generated/target/auto-generated-pojos/ as Sources Root (or Generated Sources Root), and mark /services/repository/src/main/webapp/WEB-INF/ as Sources Root. You may want to confirm that the other folders (listed below in Get your Eclipse Build working) have been marked as Sources Root, but IntelliJ IDEA should have done this automatically. This should complete your configuration, and you can run a few tests to confirm that your repository works.

Note that every time you re-import the above Maven projects, you will need to mark them again.



Get your Eclipse Build working

  1. Use the eclipse maven plugin to import all the projects
    1. Synapse-Repository-Services has several sub-projects. Eclipse creates a project for each. It's convenient to group these together into a Working Set called PLFM. (From the menu, select File|New|Java Working Set...)
  2. Configure build path
    1. Add the generated sources to the following projects. (right click and select Build-Path->Use as Source-Folder).  If you don't see those folders, refresh the projects until they appear. 

      1. /repository-managers/target/generated-sources/jjtree

      2. /repository-managers/target/generated-sources/javacc
      3. /lib-auto-generated/target/auto-generated-pojos
      4. /lib-table-query/target/generated-sources/javacc

    2. If you're missing ParseException, you need to confirm the above are added as source folders without any exclusion filters enabled.

  3. Right-click each project that has auto-generated source files, and under Properties->Build Path->Source, all exclusion filters should be cleared.
    1. Be sure to remove the "Excluded" filter on this folder
      1. /lib-auto-generated/target/auto-generated-pojos

Fixing Eclipse errors after Update

Occasionally, we will make dramatic changes to the PLFM project, like moving or renaming sub-projects. This can cause Eclipse to complain about missing dependencies even after a clean + build (Projects -> Clean... -> "Clean All Projects"). The following steps should help Eclipse resolve its dependency problems:

  1. Close Eclipse.
  2. Delete org.sagebionetworks from your local .m2 repository (e.g. for Windows users: C:\Users\<your username>\.m2\repository\org)
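    For example, on macOS or Linux (double-check the path before deleting anything):

    rm -rf ~/.m2/repository/org/sagebionetworks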
  3. Rebuild all sage artifacts in your local maven repository by running the following from the command line:

    mvn clean install -Dmaven.test.skip=true
    

    Since tests will be skipped this should only take a few minutes. Any build failures in integration-tests can be ignored since the project does not produce any artifacts.

  4. Open Eclipse.
  5. Refresh from the root folder (i.e. the root of your local clone). Either right-click the root folder and select "Refresh" or select the root folder and press F5.
  6. At this point you need to look for any sub-projects or folders that are not under version control and that you did not create. In Eclipse these will show up with a little '?' on the icon, or they will show up as untracked additions when you check your Git status. All of these projects and folders need to be deleted from your local machine.
  7. Next we need to detect any new projects. Here is the simplest way to do this in Eclipse:
    1. Right-click on the root folder and select "Import..."
    2. From the dialog select Maven->"Existing Maven Projects"
    3. Click next
    4. All existing projects should be greyed out and all new projects should automatically be selected. If not select the new projects.
    5. Click finish
  8. Believe it or not, Eclipse needs another refresh (F5).
  9. Now we need to force Eclipse to do a full clean and build of all projects.
    1. Select "Projects" from the main menu
    2. Select "Clean..."
    3. Select "Clean all Projects"
  10. We now need Eclipse to update its dependencies for all projects with errors. However, the order in which you do this is important. If project B depends on project A, then you must update project A first, then B. To update the maven dependencies for a project: right-click on the project and select Maven->"Update Dependencies"
  11. If you still have errors, update the dependencies in a different order. When you get the order correct, you should have no more missing-dependency errors.
  12. Note:  Ensure that source folders for auto-generated classes are on the projects' class paths.  For more information, see the section "Build Troubleshooting", below.

Service Development

  1. First make sure you have followed all the instructions to set up the Synapse Platform Codebase
  2. Run a local tomcat servlet container and send it a few requests
    • cd services/repository/
    • mvn tomcat:run
    • You know it's running successfully when the last line says "INFO: The server is running at http://localhost:8080/"
  3. Alternatively (preferably?), you can cd integration-tests/ and run mvn cargo:run. This should launch a Tomcat server on port 8080. When running the application using mvn cargo:run, the context for API requests will be relative to the version of the repository services. This endpoint will be set to http://localhost:8080/services-repository-${project.version}, where ${project.version} is develop-SNAPSHOT: http://localhost:8080/services-repository-develop-SNAPSHOT/.
    1. Note: try visiting http://localhost:8080/services-repository-develop-SNAPSHOT/repo/v1/version to see the version and stackInstance of the local project. The web page should load: {"version":"develop-SNAPSHOT","stackInstance":"<your_username>"}
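      The same check from the command line:

      curl http://localhost:8080/services-repository-develop-SNAPSHOT/repo/v1/version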
       
  4. To create users for testing, you can use the following script. (Note that the actual username and API key are in fact "migrationAdmin" and "fake". These are only valid on local builds.)

    export REPO_ENDPOINT=http://localhost:8080
    export ADMIN_USERNAME=migrationAdmin
    export ADMIN_APIKEY=fake
    export USERNAME_TO_CREATE=[username]
    export PASSWORD_TO_CREATE=[password]
    export EMAIL_TO_CREATE=[email]
    
    curl -s https://raw.githubusercontent.com/Sage-Bionetworks/CI-Build-Tools/master/dev-stack/create_user.sh | bash
  5. For API information, see https://rest-docs.synapse.org/rest/index.html. Use your favorite HTTP request client (curl, Postman, etc.) to construct HTTP requests. Particularly useful are the auth APIs: https://rest-docs.synapse.org/rest/index.html#org.sagebionetworks.auth.AuthenticationController
  6. See Repository Service API for curl examples.
  7. Take a look at the continuous build for this service http://sagebionetworks.jira.com/builds/browse/PLFM-REPOSVCTRUNK
  8. Learn more about the Tomcat Maven Plugin http://mojo.codehaus.org/tomcat-maven-plugin/
  9. See the Service API Design REST Details section wiki page for links to Spring MVC 3.0 info and other good stuff.
  10. See Configuring a Web Project to run in Tomcat or AWS if you want to run tomcat via eclipse

How to run the unit tests

To run all the unit tests:

mvn test

To run just one unit test:

mvn test -Dtest=DatasetControllerTest

How to run the unit tests against a local MySQL

See how to set up a local MySQL

Note that the default user for a locally installed MySQL is root and the default password is the empty string. You only need to use PARAM1 and PARAM2 below if you have configured your local MySQL differently.

To run all the unit tests:

mvn test -DJDBC_CONNECTION_STRING=jdbc:mysql://localhost/test2 [-DPARAM1=theMysqlUser -DPARAM2=theMysqlPassword]

To run just one unit test:

mvn test -DJDBC_CONNECTION_STRING=jdbc:mysql://localhost/test2 [-DPARAM1=theMysqlUser -DPARAM2=theMysqlPassword] -Dtest=DatasetControllerTest

Note that these all pass and should continue to do so.

How to run a local instance of the service

You can use the integration tests, described below, to run a local instance of the service.

How to debug a local instance using the Eclipse remote debugger

  1. In the integration-test/pom.xml file, uncomment lines 141 - 143 (starts with <cargo.jvmargs>). This enables remote debugging on the deployment stack.
  2. In Eclipse, navigate to the Debug Configurations.
    1. Create a new Remote Java Application configuration.
    2. On the 'Connect' tab, select the 'services-repository' project in the 'Project' field, and set the connection properties to use localhost port 8989.
    3. On the 'Source' tab, add all projects contained within Synapse-Repository-Services.
    4. Click 'Apply' to save the debug config.
  3. Start the repository services (mvn cargo:run in the 'integration-test' folder).
    1. (You should see a message that indicates that the repo services are in debug mode, listening on port 8989.)
  4. Start the debug configuration you just created in Eclipse.
    1. (The repo services should finish initializing.)
  5. Start the Portal.html in GWT dev mode in Eclipse.

Be sure to restore the integration-test/pom.xml file when you are done using remote debugging.

How to run the REST API documentation generator

See REST API Documentation Generation

Integration Test Development

How to debug the integration tests

  1. The wiki generator and several of our integration tests expect a subset of prod SageBioCurated data to exist in the service. To populate a local stack with that data:

    1. run the following single integration test:
      ~/platform/trunk/integration-test>mvn -Dit.test=IT100BackupRestoration verify
      
    2. and then, once the database is populated, restart the local stack in debug mode:
      ~/platform/trunk/integration-test>mvn cargo:run
      
  2. In Eclipse, set a breakpoint at the beginning of the test you want to debug
  3. You'll need the following VM args for the tests, or you can put these in your local properties file:

    -Dorg.sagebionetworks.auth.service.base.url=http://localhost:8080/services-authentication-0.10-SNAPSHOT/auth/v1
    -Dorg.sagebionetworks.repository.service.base.url=http://localhost:8080/services-repository-0.10-SNAPSHOT/repo/v1
    
  4. Debug all the tests in order. Some of the tests depend upon the fact that the data loader test has run and populated the repository service. The debugger will stop at the test where you set the break point and you can step through the test from there.

Build Troubleshooting

Several things to try if your local build is failing.

  1. Refresh all files (From within Eclipse, select the projects to be refreshed and either press F5 or right-click and select 'Refresh').
  2. If the build fails at services-repository, ensure that MySQL is running your development database.
  3. If it still fails at services-repository and you have a local property override file, your properties file may be incorrectly named. Ensure that the name is dev<your_username>.properties (e.g. devdeflaux.properties). (To be more specific, the properties file name should match the stack instance name in your Maven settings.xml file.)
  4. If a file or files in a project aren't 'seeing' something they should (if there's an error for an import statement, for instance), try deleting the project and re-importing it.
    1. Make sure NOT to delete the project contents on disk when doing this; delete only from inside Eclipse.
  5. If the build is still failing for no discernible reason, especially if it fails at lib-jdomodels, try dropping your development database, then recreating it.
    1. You can do this on Windows using the MySQL Command Line Client. Enter your password and run:

      DROP DATABASE <database_name>;
      CREATE DATABASE <database_name>;
      

Running a Dockerized Local Build


Prepare a script named local_build.sh in the root directory of the git clone, like this:

#!/bin/bash
# Your stack instance name (the same dev username used elsewhere on this page)
export user=
# Parent folder containing the .m2 Maven cache to reuse between builds
export m2_cache_parent_folder=
# Path to your Synapse-Repository-Services clone
export src_folder=
# Your AWS IAM credentials and the stack encryption key (same values as in settings.xml)
export org_sagebionetworks_stack_iam_id=
export org_sagebionetworks_stack_iam_key=
export org_sagebionetworks_stackEncryptionKey=
export rds_password=platform
BASEDIR=$(dirname "$0")
${BASEDIR}/docker_build.sh

This allows running a build without first installing Java, Maven or MySQL. You do need to install Docker first. To execute, run

./local_build.sh
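If running it fails with a permission error, the newly created script probably just needs the executable bit:

chmod +x local_build.sh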


How to create a Private Jenkins Build

With a private Jenkins build, you can run unit and integration tests in a controlled environment. In addition, you can integrate your build with your GitHub pull requests to demonstrate that your build has no failing tests (see Push Private Jenkins Build Status to Github as Commit Status).

To access the Jenkins build system you now need /wiki/spaces/IT/pages/352976898.

To create a private build on Jenkins, create two new jobs

  1. Connect to the VPN and log in to http://build-system-synapse.sagebase.org:8081/ 
  2. Create a new stack builder job to create the resources needed to run a private build, by copying an existing job of the form "stackbuilder-dev-<name>". This is the same as the Shared Resources build that you may have set up in step 8 of Synapse Platform Codebase; if you have already done that, you can skip creating a new stackbuilder job, since it is a one-off operation to prepare the environment.
    1. Modify the configuration of your new build to use your Stack Builder fork (or the main repository). Additionally, change the following parameter:
      "-Dorg.sagebionetworks.instance=p<user>"
    2. Note that you should keep your username short (at least until PLFM-5284 is resolved)
  3. Create a new job ( http://build-system-synapse.sagebase.org:8081/view/All/newJob ) 'copying' an existing job of the form "devp<name>".
    1. Edit your new job (click on your new job and then "Configure"). Set the following parameters in the "Execute shell" section: 
export user=p<user>
export org_sagebionetworks_stack_iam_id=
export org_sagebionetworks_stack_iam_key=
export org_sagebionetworks_stackEncryptionKey=
export rds_password=platform
export org_sagebionetworks_repository_database_connection_url=
export org_sagebionetworks_table_cluster_endpoint_0=
/var/lib/jenkins/workspace/${JOB_NAME}/jenkins_build.sh

Note that we add the prefix 'p' to the variable 'user', by convention, to allow local and Jenkins builds to run concurrently without colliding. This is important because, if a local build and a remote build run concurrently against the same stack (e.g. the one set up in step 8 of the Synapse Platform Codebase), they might consume shared resources (e.g. queues), leading to race conditions and failing tests. The values for org_sagebionetworks_stack_iam_id, org_sagebionetworks_stack_iam_key (your IAM ID and Key for your developer account), and org_sagebionetworks_stackEncryptionKey are the same for both local and Jenkins builds. You may point the Jenkins build to the feature branch of your private fork so that it builds whenever you push updates to GitHub. Also, make sure you change the notification email address to your own in both builds.

org_sagebionetworks_repository_database_connection_url and org_sagebionetworks_table_cluster_endpoint_0 refer to the stack AWS RDS resources created by the stack builder. You can find the endpoints in the RDS console; note that in this case, too, the database names will contain a reference to the instance created by the stack builder, so they should look something along the lines of:

dev-p<user>-db.blablabla.us-east-1.rds.amazonaws.com

dev-p<user>-table-0.blablabla.us-east-1.rds.amazonaws.com
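As a rough illustration of how those endpoints plug into the job's shell step (the exact JDBC URL format, including whether a schema name is appended, should be copied from an existing devp<name> job rather than from this sketch; the host names below are the placeholders shown above):

export org_sagebionetworks_repository_database_connection_url=jdbc:mysql://dev-p<user>-db.blablabla.us-east-1.rds.amazonaws.com/devp<user>
export org_sagebionetworks_table_cluster_endpoint_0=dev-p<user>-table-0.blablabla.us-east-1.rds.amazonaws.com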


Now, whenever you push changes to the specified branch of your private fork, Jenkins will build the stack and run the complete test suite, notifying you of any failures. Once you get a clean build, you are ready to create a pull request to the Sage-Bionetworks GitHub repository.

At the end of the local setup + the remote build job you should end up with at least two different stacks in AWS for your account, in the form of dev-<user> and dev-p<user>.

Note: when the stack changes (e.g. new queues are added, new AWS services are set up, etc.), you might need to run the stack builder again before running your build, to avoid test failures caused by new or updated resources in AWS. To prevent this issue, you can set up your private build to always run the stack builder before the build itself:

  1. Open the configuration of your job http://build-system-synapse.sagebase.org:8081/job/devp<user>/configure
  2. In the "Build" section, "Add build step..." or type "Trigger":
  3. Configure the trigger to invoke your stack builder and select "Block until the triggered projects finish their builds":
  4. Move the trigger before any other build step

Now, every time the build runs, the stack builder will be executed first. Make sure that your stack builder job is up to date (e.g. if you use a fork of the Stack Builder, make sure to keep it in sync with the latest upstream develop).

Optionally, you might want to parameterize both your stack builder and your build to take as input the branch to run off (for example, if you need a particular stack builder setup for a branch of the repository). You can add parameters to be passed to the stack builder job in the trigger; a parameter value can be prefixed with the dollar sign ($) to reference a parameter defined in the current build:


An example of such a configuration can be found here: http://build-system-synapse.sagebase.org:8081/job/devpmarco/configure

Note how the job has 3 parameters:

  • INSTANCE: with a default value pointing to the private build instance user; this can be changed on demand to start a job on a different instance
  • BRANCH: with a default value, usually develop. This can be changed on demand to build a different branch (e.g. when developing on a feature branch just change the default value to the name of the branch)
  • STACK_BRANCH: with a default value, usually develop. This can be changed on demand to run on a different branch of the stack builder.

The stack builder job itself is parameterized with a BRANCH parameter (http://build-system-synapse.sagebase.org:8081/job/stackbuilder-dev-marco/configure), the value of this parameter is passed from the upstream job as a "Predefined parameter" (See pic above).

Deleting CloudFormation Stacks

  1. Verify the name of the CloudFormation stack you wish to delete at https://console.aws.amazon.com/cloudformation under the "Stack Name" column
  2. Go to http://build-system-synapse.sagebase.org:8081/job/Delete%20CloudFormation%20Stack/build?delay=0sec
  3. Enter the "Stack Name" you wish to delete as the parameter of the build and click the "Build" button

Push Private Jenkins Build Status to Github as Commit Status

Create a Github Token.

Go to: https://github.com/settings/tokens

Click "Generate new token" button.

Only check "repo:status" and "public_repo".

Create the token and use it in your private Jenkins build's configuration:

export user=p<user>
export org_sagebionetworks_stack_iam_id=
export org_sagebionetworks_stack_iam_key=
export org_sagebionetworks_stackEncryptionKey=
export rds_password=platform
export github_token=<token-created-from-github>

/var/lib/jenkins/workspace/${JOB_NAME}/jenkins_build.sh


NOTE: You will need to be a member of the Sage GitHub Organization (https://github.com/orgs/Sage-Bionetworks/teams/synapse-developers) in order for your token to work. Reach out to IT if you are not yet a member.