Automate your dependency management using an update tool

Software often consists not just of your own code but also of third-party libraries and other software, each with its own update cycle and new versions released every now and then with vulnerability fixes and new features. The question is: what is your dependency management strategy and how do you automate it?

Fortunately, automated dependency updates for multiple languages are a solved problem, as there are several update tools to help you: Renovate, Dependabot (GitHub), Greenkeeper ($), Depfu ($) and Dependencies.io ($), to name some alternatives. In this blog post I will concentrate on using Renovate and integrating it with GitLab CI.

Renovate your dependencies

Renovate is an open source tool which works with most Git hosting platforms (public or self-hosted), and it's possible to host Renovate Bot yourself. It's installable via npm/yarn or Docker Hub.
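
For example, if you want to try the bot locally first, either installation route is a one-liner (assuming you have npm or Docker available):

$ npm install -g renovate
# or pull the official image from Docker Hub
$ docker pull renovate/renovate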

In short, the idea and workflow of dependency update tools is the following:

  1. Check for updates: the tool pulls down your dependency files and looks for any outdated or insecure requirements.
  2. Open pull requests: if any of your dependencies are out of date, the tool opens individual pull requests to update each one.
  3. Review and merge: you check that your tests pass, scan the included changelog and release notes, and then hit merge with confidence.

Now you just run the dependency update tool on a regular basis in your continuous integration, watch the pull requests fly in and keep your dependencies secure and up to date.

The manual chore of checking for updates, looking for changelogs, making changes, running tests, writing pull requests and more is now reduced to reviewing pull requests with better confidence about what has changed.

Self-hosted in GitLab CI

Renovate Bot is a Node.js application, so you have a couple of alternative ways to run it in your CI/CD environment. You can use a Node Docker image which installs and runs Renovate, or you can use Renovate Bot's own Docker image as I chose to do. We are using the docker-in-docker approach of running the Renovate Docker container, which means starting Docker containers from within another Docker container.

First create an account for the bot on the GitLab instance (the best choice) or use your own account. Then generate a personal access token with the api scope so that Renovate can access the repositories and create the branches and merge requests containing the dependency updates.

Then create a repository for the configuration; Renovate will use that repo's CI pipelines. Paste your GitLab token under CI / CD > Variables as a new variable named RENOVATE_TOKEN. Set it to protected and masked to hide the token from the CI logs and to only expose it to pipelines running on protected branches (your master branch is protected by default).

You'll also need a GitHub access token with the repo scope so that Renovate can read sources and changelogs of dependencies hosted on GitHub. It doesn't matter much which GitHub account is used; it's only needed because GitHub's rate limiting would otherwise block your bot from making unauthenticated requests. Paste it as another variable with the name GITHUB_COM_TOKEN.

To configure Renovate we need to add three files to our repository:

config.js for configuring renovate:

module.exports = {
  platform: 'gitlab',
  endpoint: 'https://gitlab.com/api/v4/',
  assignees: ['your-username'],
  baseBranches: ['master'],
  labels: ['renovate', 'dependencies', 'automated'],
  onboarding: true,
  onboardingConfig: {
    extends: ['config:base'],
  },
};

repositories.txt for repositories we want to check:

openpatch/ui-core
openpatch/template

.gitlab-ci.yml to run renovate:

default:
  image: docker:19
  services:
    - docker:19-dind

# Because our GitLab runner doesn’t have TLS certs mounted and runs on K8s
variables:
  DOCKER_HOST: tcp://localhost:2375
  DOCKER_DRIVER: overlay2
  DOCKER_TLS_CERTDIR: ""

renovate:
  stage: build
  script:
    - docker run -e RENOVATE_TOKEN="$RENOVATE_TOKEN" -e GITHUB_COM_TOKEN="$GITHUB_COM_TOKEN" -v $PWD/config.js:/usr/src/app/config.js renovate/renovate:13 $(cat repositories.txt | xargs)
  only:
    - master

Now everything is in place: when you run the pipeline, Renovate will check the repositories listed in repositories.txt and create merge requests if a dependency needs to be updated.

The first merge request to a repository is "Configure Renovate", which helps you to understand and configure settings before regular merge requests begin.

As a last step, create a Pipeline Schedule to run the pipeline every x hours or once a day or whatever you like. You can do this in the bot's config project/repository under CI / CD > Schedules by creating a new schedule and choosing how often to run your bot.
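
If you prefer the command line over the UI, the same schedule can also be created through GitLab's Pipeline Schedules API; a rough sketch where the project ID, token and cron expression are placeholders to fill in:

$ curl --request POST \
    --header "PRIVATE-TOKEN: <your-access-token>" \
    --data-urlencode "description=Run Renovate" \
    --data-urlencode "ref=master" \
    --data-urlencode "cron=0 */6 * * *" \
    "https://gitlab.com/api/v4/projects/<project-id>/pipeline_schedules"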

You can also reduce noise by Package Grouping and Automerging. Here’s an example of grouping eslint themed packages and automerging them if tests pass.

Project’s renovate.json:

{
    "packageRules": [
        {
          "packagePatterns": [ "eslint" ],
          "groupName": "eslint",
          "automerge": true,
          "automergeType": "branch"
        }
      ]
}

Summary

Congratulations! You've now automated dependency updating with GitLab CI. Just wait for the merge requests and see if your test suites pass. If you really trust your test suite, you can even let Renovate auto-merge the requests when the pipeline succeeds.

Visual Studio Code Extensions for better programming

Visual Studio Code has become "The Editor" for many in software development, and it has many extensions you can use to extend its functionality and customize it for your needs. Here's a short list of the extensions I use for frontend (React, JavaScript, Node.js), backend (GraphQL, Python, Node.js, Java, PHP, Docker) and database (PostgreSQL, MongoDB) development.

General

editorconfig
Attempts to override user/workspace settings with settings found in .editorconfig files.

Visual Studio IntelliCode
Provides AI-assisted development features for Python, TypeScript/JavaScript and Java developers in Visual Studio Code, with insights based on understanding your code context combined with machine learning.

GitLens
Visualize code authorship at a glance via Git blame annotations and code lens, seamlessly navigate and explore Git repositories, gain valuable insights via powerful comparison commands, and so much more.
Git Blame
See Git Blame information in the status bar for the currently selected line.

Local History
Plugin for maintaining local history of files.

Language and technology specific

ESLint
Integrates ESLint into VS Code.

Prettier
Opinionated code formatter which enforces a consistent style by parsing your code and re-printing it with its own rules that take the maximum line length into account, wrapping code when necessary.

Python
Linting, Debugging (multi-threaded, remote), Intellisense, Jupyter Notebooks, code formatting, refactoring, unit tests, and more.

PHP Intelephense
PHP code intelligence for Visual Studio Code is a high performance PHP language server packed full of essential features for productive PHP development.

Java Extension Pack
Popular extensions for Java development and more.

Docker
Makes it easy to build, manage, and deploy containerized applications from Visual Studio Code. It also provides one-click debugging of Node.js, Python, and .NET Core inside a container.

Markdown All in One
All you need to write Markdown (keyboard shortcuts, table of contents, auto preview and more)
Markdownlint
Includes a library of rules to encourage standards and consistency for Markdown files.
Markdown Preview Enhanced
Provides you with many useful functionalities such as automatic scroll sync, math typesetting, mermaid, PlantUML, pandoc, PDF export, code chunk, presentation writer, etc.

Prettify JSON
Prettify ugly JSON inside VSCode.

PlantUML
Rich PlantUML support for Visual Studio Code.

HashiCorp Terraform
Syntax highlighting and autocompletion for Terraform

Database

PostgreSQL
Query tool for PostgreSQL databases. While there is a database explorer it is NOT meant for creating/dropping databases or tables. The explorer is a visual aid for helping to craft your queries.

MongoDB
Makes it easy to work with MongoDB.

GraphQL
Adds syntax highlighting, validation, and language features like go to definition, hover information and autocompletion for graphql projects. This extension also works with queries annotated with gql tag.
GraphQL for VSCode
VSCode extension for GraphQL schema authoring & consumption.
Apollo GraphQL for VS Code
Rich editor support for GraphQL client and server development that seamlessly integrates with the Apollo platform.

Javascript

Babel
JavaScript syntax highlighting for ES201x, React JSX, Flow and GraphQL.

Jest
Use Facebook's Jest with pleasure.

npm
Supports running npm scripts defined in the package.json file and validating the installed modules against the dependencies defined in the package.json.

User Interface specific

indent-rainbow
Simple extension to make indentation more readable

Rainbow Brackets
Rainbow colors for the round brackets, the square brackets and the squiggly brackets.

vscode-icons
Icons for filetypes in file browser.

Other tips

VScode Show Full Path in Title Bar
With Code open, press Command+, to open Settings and set: "window.title": "${activeEditorLong}${separator}${rootName}"

Slow integrated terminal in macOS
codesign --remove-signature "/Applications/Visual Studio Code.app/Contents/Frameworks/Code Helper (Renderer).app"

Running static analysis tools for PHP

We all write bug-free code, but analyzing your code is still an important part of software development in case there has, for some reason, been some mishap with typing. Here's a short introduction to running static analysis on PHP code.

Static analysis tools for PHP

The curated list of static analysis tools for PHP shows you many options for doing analysis. Too many, you say? Yes, but fortunately you can start with a couple of tools and continue according to your specific needs.

You can run the different analysis tools by installing them with Composer, or you can use the Toolbox, which helps to discover and install the tools and can be used as a Docker container.

First, fetch the docker image with static analysis tools for PHP:

$ docker pull jakzal/phpqa:<your php version>
e.g.
$ docker pull jakzal/phpqa:php7.4-alpine

PHPMD: PHP Mess Detector

One of the tools provided in the image is PHPMD, which aims to be a PHP equivalent of the well-known Java tool PMD. PHPMD can be seen as a user-friendly and easy-to-configure frontend for the raw metrics measured by PHP Depend.

It looks for several potential problems within the source code, like:

  • Possible bugs
  • Suboptimal code
  • Overcomplicated expressions
  • Unused parameters, methods, properties

You can install PHPMD with Composer: composer require phpmd/phpmd. Then run it with e.g. ./vendor/bin/phpmd src html unusedcode --reportfile phpmd.html.

Or run the command below, which runs PHPMD in a Docker container and mounts the current working directory as /project.

docker run -it --rm -v $(pwd):/project -w /project jakzal/phpqa:php7.4-alpine \
    phpmd src html cleancode,codesize,controversial,design,naming,unusedcode --reportfile phpmd.html

You can also write your own custom rule set to reduce false positives, e.g. phpmd.test.xml:

<?xml version="1.0"?>
<ruleset name="VV PHPMD rule set"
         xmlns="http://pmd.sf.net/ruleset/1.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://pmd.sf.net/ruleset/1.0.0
                     http://pmd.sf.net/ruleset_xml_schema.xsd"
         xsi:noNamespaceSchemaLocation="
                     http://pmd.sf.net/ruleset_xml_schema.xsd">
    <description>
        Custom rule set that checks my code.
    </description>
<rule ref="rulesets/codesize.xml">
    <exclude name="CyclomaticComplexity"/>
    <exclude name="ExcessiveMethodLength"/>
    <exclude name="NPathComplexity"/>
    <exclude name="TooManyMethods"/>
    <exclude name="ExcessiveClassComplexity"/>
    <exclude name="ExcessivePublicCount"/>
    <exclude name="TooManyPublicMethods"/>
    <exclude name="TooManyFields"/>
</rule>
<rule ref="rulesets/codesize.xml/TooManyFields">
    <properties>
        <property name="maxfields" value="21"/>
    </properties>
</rule>
<rule ref="rulesets/cleancode.xml">
    <exclude name="StaticAccess"/>
    <exclude name="ElseExpression"/>
    <exclude name="MissingImport" />
</rule>
<rule ref="rulesets/controversial.xml">
    <exclude name="CamelCaseParameterName" />
    <exclude name="CamelCaseVariableName" />
    <exclude name="Superglobals" />
</rule>
<rule ref="rulesets/design.xml">
    <exclude name="CouplingBetweenObjects" />
    <exclude name="NumberOfChildren" />
</rule>
<rule ref="rulesets/design.xml/NumberOfChildren">
    <properties>
        <property name="minimum" value="20"/>
    </properties>
</rule>
<rule ref="rulesets/naming.xml">
    <exclude name="ShortVariable"/>
    <exclude name="LongVariable"/>
</rule>
<rule ref="rulesets/unusedcode.xml">
    <exclude name="UnusedFormalParameter"/>
</rule>
<rule ref="rulesets/codesize.xml/ExcessiveClassLength">
    <properties>
        <property name="minimum" value="1500"/>
    </properties>
</rule>
</ruleset>

Then run your analysis with:

docker run -it --rm -v $(pwd):/project -w /project jakzal/phpqa:php7.4-alpine phpmd src html phpmd.test.xml unusedcode --reportfile phpmd.html

You get a list of the found issues formatted to an HTML file:

PHPMD Report

PHPStan - PHP Static Analysis Tool

"PHPstan focuses on finding errors in your code without actually running it. It catches whole classes of bugs even before you write tests for the code. It moves PHP closer to compiled languages in the sense that the correctness of each line of the code can be checked before you run the actual line."

Install with Composer: composer require --dev phpstan/phpstan
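
After the Composer install you can run the locally installed binary from vendor/bin; a minimal sketch, assuming your sources live in src:

$ ./vendor/bin/phpstan analyse --level 1 src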

Or run it in a Docker container:

docker run -it --rm -v $(pwd):/project -w /project jakzal/phpqa:php7.4-alpine phpstan analyse --level 1 src

By default you will get a report to the console, formatted as a table with errors grouped by file and colorized for human consumption.

PHPStan report

By default PHPStan performs only the most basic checks; you can pass a higher rule level through the --level option (0 is the loosest and 8 is the strictest) to analyse the code more thoroughly. Start with 0 and increase the level as you go, fixing the possible issues.

PHPStan found some more issues which PHPMD didn't find, but the output of PHPStan could be better. There's a web UI for browsing the found errors, and you can click to open your editor of choice on the offending line, but you have to pay for it: PHPStan Pro costs 7 EUR per month for individuals and 70 EUR for teams.

VS Code extension for PHP

If you're using Visual Studio Code for PHP programming there are some extensions to help you.

PHP Intelephense
PHP code intelligence for Visual Studio Code that provides better IntelliSense than the VS Code built-in and also does some signature checking etc. The extension also has a premium version with some additional features.

Short notes on tech 49/2020

Week 49, 2020

Development and Operations

Using SSL certificates from Let’s Encrypt in your Kubernetes Ingress via cert-manager
Walkthrough of the process of automating the issuance and renewal of certificates provided by Let's Encrypt for Kubernetes Ingress using the cert-manager add-on. (from cloudseclist.com)

Use Amazon EC2 Mac Instances to Build & Test macOS, iOS, ipadOS, tvOS, and watchOS Apps
"Powered by Mac mini hardware and the AWS Nitro System, you can use Amazon EC2 Mac instances to build, test, package, and sign Xcode applications for the Apple platform including macOS, iOS, iPadOS, tvOS, watchOS, and Safari." The downside of this is that "The instances are launched as EC2 Dedicated Hosts with a minimum tenancy of 24 hours" which is due Apple EULA and thus one CI build costs about $26. And what I read from HN the real viable option is still to use MacStadium.

Tools of the trade

cloudquery
"cloudquery transforms your cloud infrastructure into queryable SQL tables for easy monitoring, governance and security." (from cloudseclist.com)

k8s-security-policies
"Repository providing a security policies library that is used for securing Kubernetes clusters configurations. The security policies are created based on CIS Kubernetes benchmark and rules defined in Kubesec.io." (from cloudseclist.com)

alyssaxuu/screenity
"Screenity is a feature-packed screen and camera recorder for Chrome. Annotate your screen to give feedback, emphasize your clicks, edit your recording, and much more." (from Weekend Reading)

Miscellaneous

Why Apple's replacement for Intel processors works really, really well
"They added Intel's memory-ordering to their CPU. When running translated x86 code, they switch the mode of the CPU to conform to Intel's memory ordering."

What software and hardware I use

There was a discussion in the Koodiklinikka Slack about what software people use and how people have made "/uses" pages for that purpose. Inspired by Wes Bos's /uses from the "Syntax" podcast, here's my list.

Check my /uses page to see what software and hardware I use for full-stack development in JavaScript, Node.js, Java, Kotlin, GraphQL, PostgreSQL and more. The list excludes the tools from different customers like using GitLab, Rocket.Chat, etc.

For more choices check uses.tech.

Generating JWT and JWK for information exchange between services

Securely transmitting information between services and handling authorization can be achieved using JSON Web Tokens. JWTs are an open, industry standard RFC 7519 method for representing claims securely between two parties. Here's a short explanation of what they are and what they're used for, and a guide to generating the needed pieces.

"JSON Web Token (JWT) is an open standard (RFC 7519) that defines a compact and self-contained way for securely transmitting information between parties as a JSON object. This information can be verified and trusted because it is digitally signed. JWTs can be signed using a secret (with the HMAC algorithm) or a public/private key pair using RSA or ECDSA."

jwt.io

You should read the introduction to JWT to understand its role, and there's also a handy JWT Debugger for testing things. For more detailed info you can read the JWT Handbook.

In short, authorization and information exchange are some scenarios where JSON Web Tokens are useful. They essentially encode any sets of identity claims into a payload, provide some header data about how it is to be signed, then calculate a signature using one of several algorithms and append that signature to the header and claims. JWTs can also be encrypted to provide secrecy between parties. When a server receives a JWT, it can guarantee the data it contains can be trusted because it's signed by the source.
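
To get a feel for the structure, you can peek inside a token on the command line. A rough sketch, assuming the token is in the JWT environment variable and jq is installed; the parts are base64url encoded, so the decode may still need padding fixes:

# Print the decoded claims (the second dot-separated part) of a JWT;
# base64url characters are mapped back to plain base64 first
$ echo "$JWT" | cut -d '.' -f2 | tr '_-' '/+' | base64 -d 2>/dev/null | jq .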

Usually two algorithms are supported for signing JSON Web Tokens: RS256 and HS256. RS256 generates an asymmetric signature, which means a private key must be used to sign the JWT and a different public key must be used to verify the signature.

JSON Web Key

JSON Web Key (JWK) provides a mechanism for distributing the public keys that can be used to verify JWTs. The specification is used to represent the cryptographic keys used for signing RS256 tokens. This specification defines two high level data structures: JSON Web Key (JWK) and JSON Web Key Set (JWKS):

  • JSON Web Key (JWK): A JSON object that represents a cryptographic key. The members of the object represent properties of the key, including its value.
  • JSON Web Key Set (JWKS): A JSON object that represents a set of JWKs. The JSON object MUST have a keys member, which is an array of JWKs. The JWKS is a set of keys containing the public keys that should be used to verify any JWT.

In short, the service signs JWT tokens with its private key (in this case in PKCS12 format) and the receiving service checks the signature with the public key, which is in JWK format.

Generating keys and certificate for JWT

In this example we are using JWTs for information exchange as they are a good way of securely transmitting information between parties. Because JWTs can be signed (for example, using public/private key pairs), you can be sure the senders are who they say they are. Additionally, as the signature is calculated using the header and the payload, you can also verify that the content hasn't been tampered with.

Generate the keys and certificate for JWT with OpenSSL; in this case self-signed is enough. First the private key:

$ openssl genrsa -out private.pem 4096

Generate the public key from the previously generated private key in case pem-jwk needs it; it isn't needed otherwise:

$ openssl rsa -in private.pem -out public.pem -pubout

If you try to insert private and public keys to PKCS12 format without a certificate you get an error:

openssl pkcs12 -export -inkey private.pem -in public.pem -out keys.p12
unable to load certificates

Generate a self-signed certificate with the aforementioned key, valid for 10 years. This certificate isn't used for anything as such, because the counterpart is a JWK with just the public key, no certificate.

$ openssl req -key private.pem -new -x509 -days 3650 -subj "/C=FI/ST=Helsinki/O=Rule of Tech/OU=Information unit/CN=ruleoftech.com" -out cert.pem

Convert the above private key and certificate to PKCS12 format

$ openssl pkcs12 -export -inkey private.pem -in cert.pem -out keys.pfx -name "my alias"

Check the keystore:

$ keytool -list -keystore keys.pfx
OR
$ keytool -v -list -keystore keys.pfx -storetype PKCS12
Enter keystore password:  
Keystore type: PKCS12
Keystore provider: SUN
Your keystore contains 1 entry
1, Jan 18, 2019, PrivateKeyEntry,
Certificate fingerprint (SHA-256): 0D:61:30:12:CB:0E:71:C0:F1:A0:77:EB:62:2F:91:9B:55:08:FC:3B:A5:C8:B4:C7:B4:CD:08:E9:2C:FD:2D:8A

If you didn't set an alias for the key when creating the PKCS12 file, you can change it:

keytool -changealias -alias "original alias" -destalias "my awesome alias" -keystore keys.pfx -storetype PKCS12 -storepass "password"

Now we finally get to the part where we generate the JWK. The final result is a JSON file which contains the public key from the earlier created certificate in JWK format so that the service can accept the signed tokens.

The JWK is in the following format:

{
  "keys": [
    …,
    {
      "kid": "something",
      "kty": "RSA",
      "use": "sig",
      "n": "…base64 public key values…",
      "e": "…base64 public key values…"
    }
  ]
}

Convert the PEM to JWK format with e.g. pem-jwk or with pem_to_jwks.py. The key itself is in PKCS12 format. The public key's values n and e are extracted from the private key with the following commands; the jq part extracts the public parts and excludes the private parts.

$ npm install -g pem-jwk
$ ssh-keygen -e -m pkcs8 -f private.pem | pem-jwk | jq '{kid: "something", kty: .kty , use: "sig", n: .n , e: .e }'

...

To check things, you can do the following.

Extract a private key and certificates from a PKCS12 file using OpenSSL:

$ openssl pkcs12 -in keys.p12 -out keys_out.txt

The private key, certificate, and any chain files will be parsed and dumped into the "keys_out.txt" file. The private key will still be encrypted.

To extract just the private key from p12 (key is still encrypted):

$ openssl pkcs12 -in keys.p12 -nocerts -out privatekey.pem

Decrypt the private key:

$ openssl rsa -in privatekey.pem -out privatekey_uenc.pem

Now if you convert the PEM to JWK you should get the same values as before.
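
For example, running the same pem-jwk conversion against the decrypted key should produce the same n and e values as earlier (the kid is again just a placeholder):

$ ssh-keygen -e -m pkcs8 -f privatekey_uenc.pem | pem-jwk | jq '{kid: "something", kty: .kty , use: "sig", n: .n , e: .e }'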

More to read: JWTs? JWKs? ‘kid’s? ‘x5t’s? Oh my!

Keep Maven dependencies up to date

Software development projects usually come with lots of dependencies, and keeping them up to date can be burdensome if done manually. Fortunately there are tools to help you. For Node.js projects there are e.g. npm-check and npm-check-updates, and for Maven projects there are the OWASP Dependency-Check and Versions Maven plugins. Here's a short introduction to setting up your Maven project to automatically check dependencies for vulnerabilities and for outdated dependencies.

OWASP/Dependency-Check

OWASP dependency-check is an open source solution for the OWASP Top 10 2013 entry "A9 - Using Components with Known Vulnerabilities".

Dependency-check can currently be used to scan Java and .NET applications to identify the use of known vulnerable components. The dependency-check plugin is, by default, tied to the verify or site phase depending on if it is configured as a build or reporting plugin.

The example below can be executed using mvn verify:

<project>
     ...
     <build>
         ...
         <plugins>
             ...
<plugin> 
    <groupId>org.owasp</groupId> 
    <artifactId>dependency-check-maven</artifactId> 
    <version>5.0.0-M3</version> 
    <configuration>
        <failBuildOnCVSS>8</failBuildOnCVSS>
        <skipProvidedScope>true</skipProvidedScope> 
        <skipRuntimeScope>true</skipRuntimeScope> 
    </configuration> 
    <executions> 
        <execution> 
            <goals> 
                <goal>check</goal> 
            </goals> 
        </execution> 
    </executions> 
</plugin>
            ...
         </plugins>
         ...
     </build>
     ...
</project>

The example fails the build for CVSS greater than or equal to 8 and skips scanning the provided and runtime scoped dependencies.
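
For reference, a quick sketch of producing the report; the goal can also be invoked directly without running the full verify phase (assuming the plugin is configured in the pom as above):

$ mvn verify
# or run only the dependency check goal
$ mvn org.owasp:dependency-check-maven:check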

Versions Maven Plugin

The Versions Maven Plugin is the de facto standard way to manage versions of artifacts in a project's POM. From high-level comparisons between remote repositories up to low-level timestamp-locking for SNAPSHOT versions, its massive list of goals allows us to take care of every aspect of our projects involving dependencies.

The example configuration of versions-maven-plugin:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>versions-maven-plugin</artifactId>
    <version>2.7</version>
    <configuration>
        <allowAnyUpdates>false</allowAnyUpdates>
        <allowMajorUpdates>false</allowMajorUpdates>
        <allowMinorUpdates>false</allowMinorUpdates>
        <processDependencyManagement>false</processDependencyManagement>
    </configuration>
</plugin>

You could use goals that modify the pom.xml as described in the usage documentation, but often it's easier to check versions manually as you might not be able to update all of the suggested dependencies.

The display-dependency-updates goal will check all the dependencies used in your project and display a list of those dependencies with newer versions available.

Check new dependencies with:

mvn versions:display-dependency-updates

Check new plugin versions with:

mvn versions:display-plugin-updates

Summary

Using OWASP Dependency-Check in your continuous integration build flow to automatically check dependencies for vulnerabilities, and periodically running the Versions Maven Plugin to check for outdated dependencies, helps you to keep your project up to date and secure. Small but important things to remember while developing and maintaining a software project.

Automate versioning and changelog with release-it on GitLab CI/CD

It's said that you should automate all the things, and one of those things could be versioning your software. Incrementing the version number in e.g. your package.json is easy, but it's even easier when you bundle it into your continuous integration and continuous deployment process. There are different tools you can use to achieve this, and in this article we are using release-it. Other options are for example standard-version and semantic-release.

🚀 Automate versioning and package publishing

Using release-it with CI/CD pipeline

Release It is a generic CLI tool to automate versioning and package publishing related tasks. Its installation requires npm, but a package.json is not needed. With it you can i.a. bump the version (in e.g. package.json), create a git commit, tag and push, create a release at GitHub or GitLab, generate a changelog and make a release from any CI/CD environment.

Here is an example setup of how to use release-it in a Node.js project with GitLab CI/CD.

Install and configure release-it

Install release-it with npm init release-it, which asks you questions, or manually with npm install --save-dev release-it.

For example, the package.json can look like the following, where the commit message has been customized to have "v" before the version number and npm publish is disabled (although private: true should be enough for that). You could add [skip ci] to "commitMessage" for i.a. GitLab CI/CD to skip running the pipeline on the release commit, or use the Git Push option ci.skip.

package.json
{
  "name": “example-frontend",
  "version": "0.1.2",
  "private": true,
  "scripts": {
    ...
    "release": "release-it"
  },
  "dependencies": {
    …
  },
  "devDependencies": {
    ...
    "release-it": "^12.4.3”
},
"release-it": {
    "git": {
      "tagName": "v${version}",
      "requireCleanWorkingDir": false,
      "requireUpstream": false,
      "commitMessage": "Release v%s"
    },
    "npm": {
      "publish": false
    }
  }
}

Now you can run npm run release from the command line:

npm run release
npm run release -- patch --ci

In the latter command, things are run without prompts (--ci) and the patch increment increases the 0.0.x number.
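
The increment argument isn't limited to patch; as a sketch, minor, major, or an explicit version should work the same way:

$ npm run release -- minor --ci
$ npm run release -- major --ci
$ npm run release -- 1.2.0 --ci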

Using release-it with GitLab CI/CD

Now it's time to combine release-it with GitLab CI/CD. Adding a release-it stage is quite straightforward, but you need to do a couple of things. First, in order to push the release commit and tag back to the remote, we need the CI/CD environment to be authenticated with the original host, and we use SSH and a public key for that. You could also use a private token with HTTPS.

  1. Create SSH keys, as we are using the Docker executor: ssh-keygen -t ed25519 (see the sketch after this list).
  2. Create a new SSH_PRIVATE_KEY variable in "project > repository > CI / CD Settings" and paste the content of the private key you created into the Value field.
  3. In your "project > repository > Repository" settings, add a new deploy key where Title is something descriptive and Key is the content of the public key you created.
  4. Tap "Write access allowed".

Now you're ready for Git activity against your repository in the CI/CD pipeline. Your .gitlab-ci.yml release stage could look like the following.

image: docker:19.03.1

stages:
  - release

Release:
  stage: release
  image: node:12-alpine
  only:
    - master
  before_script:
    - apk add --update openssh-client git
    # Using Deploy keys and ssh for pushing to git
    # Run ssh-agent (inside the build environment)
    - eval $(ssh-agent -s)
    # Add the SSH key stored in SSH_PRIVATE_KEY variable to the agent store
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
    # Create the SSH directory and give it the right permissions
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    # Don't verify Host key
    - '[[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config'
    - git config user.email "gitlab-runner@your-domain.com"
    - git config user.name "Gitlab Runner"
  script:
    # See https://gist.github.com/serdroid/7bd7e171681aa17109e3f350abe97817
    # Set remote push URL
    # We need to extract the ssh/git URL as the runner uses a tokenized URL
    # Replace start of the string up to '@'  with git@' and append a ':' before first '/'
    - export CI_PUSH_REPO=$(echo "$CI_REPOSITORY_URL" | sed -e "s|.*@\(.*\)|git@\1|" -e "s|/|:/|" )
    - git remote set-url --push origin "ssh://${CI_PUSH_REPO}"
    # runner runs on a detached HEAD, checkout current branch for editing
    - git reset --hard
    - git clean -fd
    - git checkout $CI_COMMIT_REF_NAME
    - git pull origin $CI_COMMIT_REF_NAME
    # Run release-it to bump version and tag
    - npm ci
    - npm run release -- patch --ci --verbose

We are running release-it here with the patch increment. If you want to skip the CI pipeline on the release-it commit, you can use the ci.skip Git Push option via git.pushArgs in package.json, which tells GitLab CI/CD not to create a CI pipeline for the latest push. This way we don't need to add [skip ci] to the commit message.

And now you're ready to run the pipeline with the release stage and enjoy automated patch updates to your application's version number. You also get GitLab Releases if you want.

Setting up the script step was not so clear, but fortunately people on the Internet had done it earlier and Google found a working gist and a comment on a GitLab issue. Interacting with Git in GitLab CI/CD could be easier, and there are some feature requests for that, like allowing runners to push via their CI token.

Customizing when pipelines are run

There are some more options for GitLab CI/CD pipelines if you want to run pipelines after you've tagged your version. Here's a snippet of running the "release" stage on commits to the master branch and skipping it if the commit message is for a release.

Release:
  stage: release
  image: node:12-alpine
  only:
    refs:
      - master
    variables:
      # Run only on master and commit message doesn't start with "Release v"
      - $CI_COMMIT_MESSAGE !~ /^Release v.*/
  before_script:
    ...
  script:
    ...

Now we can build a new container for deploying our application after it has been tagged and version bumped. We are also reading the package.json version for tagging the image.

variables:
  PACKAGE_VERSION: $(cat package.json | grep version | head -1 | awk -F= "{ print $2 }" | sed 's/[version:,\",]//g' | tr -d '[[:space:]]')

Build dev:
  before_script:
    - export VERSION=`eval $PACKAGE_VERSION`
  stage: build
  script:
    - >
      docker build
      --pull
      --tag your-docker-image:latest
      --tag your-docker-image:$VERSION.dev
      .
    - docker push your-docker-image:latest
    - docker push your-docker-image:$VERSION.dev
  only:
    refs:
      - master
    variables:
      # Run only on master and commit message starts with "Release v"
      - $CI_COMMIT_MESSAGE =~ /^Release v.*/

Using release-it on detached HEAD

In the previous example we checked out the current branch for editing as the runner runs on a detached HEAD. You can use the detached HEAD as shown below, but the downside is that you can't create GitLab Releases from the pipeline as it fails with "ERROR Response code 422 (Unprocessable Entity)". This is because (I suppose) it doesn't do the git push as it's done manually with git.

Then the .gitlab-ci.yml is the following:

...
script:
    - export CI_PUSH_REPO=$(echo "$CI_REPOSITORY_URL" | sed -e "s|.*@\(.*\)|git@\1|" -e "s|/|:/|" )
    - git remote set-url --push origin "ssh://${CI_PUSH_REPO}"
    # gitlab-runner runs on a detached HEAD, create a temporary local branch for editing
    - git checkout -b ci_processing
    # Run release-it to bump version and tag
    - npm ci
    - DEBUG=release-it:* npm run release -- patch --ci --verbose --no-git.push
    # Push changes to originating branch
    # Always return true so that the build does not fail if there are no changes
    - git push --follow-tags origin ci_processing:${CI_COMMIT_REF_NAME} || true

Reset Hasura migrations and squash files

Using GraphQL for creating APIs is nowadays popular and there are different tools you can use. One of them is Hasura, which is an open-source engine that gives you realtime GraphQL APIs on new or existing Postgres databases. Hasura is quite easy to work with, but if your GraphQL schemas change a lot it creates plenty of migration files. This has some unwanted consequences (for example slowing down hasura migrate apply or even blocking it). Here are some notes on how to reset the state and create new migrations from the state that is on the server.

Note: from Hasura 1.0.0 onwards squashing is easier with the hasura migrate squash command, which is still in preview. But before Hasura 1.0.0 you have to squash migrations manually, and this blog post explains how. The result is the same: multiple migrations squashed into a single one.
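
For reference, with the newer CLI the squash is roughly the following; the starting version is a placeholder and, as the command is in preview, check the current CLI help for the exact flags:

$ hasura migrate squash --name "init" --from <start-migration-version>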

The Hasura documentation provides a good guide on how to squash migrations, but in practice there are a couple of other things you may need to address. So let's combine the steps Hasura gives with some extra steps.

Reset Hasura migrations

First make a backup branch:

  1. $ git checkout master
  2. Create a backup branch:
    $ git checkout -b backup/migrations-before-resetting-20XX-XX-XX
  3. Update the backup branch to origin:
    $ git push origin backup/migrations-before-resetting-20XX-XX-XX

We are assuming you have a local Hasura running on Docker with something like the following docker-compose.yml:

version: "3.6"
services:
  postgres:
    image: postgres:11-alpine
    restart: always
    ports:
      - "5432:5432"
    volumes:
      - db_data:/var/lib/postgresql/data
    command: postgres -c max_locks_per_transaction=2000
  graphql-engine:
    image: hasura/graphql-engine:v1.0.0-beta.6
    ports:
      - "8080:8080"
    depends_on:
      - "postgres"
    restart: always
    environment:
      HASURA_GRAPHQL_DATABASE_URL: postgres://postgres:@postgres:5432/postgres
      HASURA_GRAPHQL_ENABLE_CONSOLE: "true" # set to "false" to disable console
      HASURA_GRAPHQL_ADMIN_SECRET: changeme
      HASURA_GRAPHQL_ENABLED_LOG_TYPES: startup, http-log, webhook-log, websocket-log, query-log
volumes:
  db_data:

Create a local instance of Hasura with up-to-date migrations:

  1. $ docker-compose down -v
  2. $ docker-compose up
  3. $ hasura migrate apply --endpoint=http://localhost:8080 --admin-secret=changeme

Reset migrations to master:

  1. git checkout master
  2. git checkout -b reset-hasura-migrations
  3. rm -rf migrations/*

Reset the migration history on the server. In the Hasura SQL console, http://localhost:8080/console:

TRUNCATE hdb_catalog.schema_migrations;

Set up fresh migrations by taking the schema and metadata from the server. By default init only takes the public schema if others are not specified with the --schema "your schema" parameter. Note down the version for later use.

  1. Create migration file:
    $ hasura migrate create "init" --from-server
  2. Mark the migration as applied on this server:
    $ hasura migrate apply --version "<version>" --skip-execution
  3. Verify status of migrations, should show only one migration with Present status:
    $ hasura migrate status
  4. You have brand new migrations now!

Resetting migrations on other environments

  1. Checkout the reset branch on local machine:
    $ git checkout -b reset-hasura-migrations
  2. Reset the migration history on remote server. On Hasura SQL console:
    TRUNCATE hdb_catalog.schema_migrations;
  3. Apply migration status to remote server:
    $ hasura migrate apply --version "<version>" --skip-execution

Local environment Hasura status

Other developers, please refer to these instructions in order to get the backend into the same state.

Option 1: Keep old data

  1. Checkout the backup branch on local machine:
    $ git checkout backup/migrations-before-resetting-20XX-XX-XX
  2. Reset the migration history on local server. On Hasura SQL console:
    TRUNCATE hdb_catalog.schema_migrations;
  3. Apply migration status to local server:
    $ hasura migrate apply --version "<version>" --skip-execution

Option 2: Remove all and start from beginning

  1. Clean up the old docker volumes:
    $ docker-compose down -v
  2. Start up services:
    $ docker-compose up
  3. Checkout master:
    $ git checkout master
  4. Apply migrations:
    $ hasura migrate apply --endpoint=http://localhost:8080 --admin-secret=changeme

Possible extra steps

Now your Hasura migrations and database tables are in one init migration file, but sometimes things don't work out when applying it to an empty database. We are using the Hasura audit-trigger and had to reorder the SQL clauses generated by migrate init and add some missing parts.

  1. Move schema creations after audit clauses
  2. Move audit.audit_table(target_table regclass) to last audit clause and copy it from audit.sql
  3. Add pg_trgm extension as done previously (fixes "operator does not exist: text <% text" in public.search_customers_by_name)
  4. Drop session constraints / index before creating new
  5. Create session table only if not exists

Tracking vulnerabilities and keeping Node.js packages up to date

Software evolves quickly and new versions of libraries are released, but how do you keep track of updated dependencies and vulnerable libraries? Managing dependencies has always been somewhat of a pain point, but it's an important part of software development, as it's better to be tracking vulnerabilities and running fresh packages than being pwned.

There are a couple of tools for JavaScript projects which use npm to manage dependencies: some to check for new versions and some to track vulnerabilities. Here's a short introduction to npm audit, depcheck, npm-check-updates and npm-check to help you on your way.

If your project is using yarn, adjust your workflow accordingly. There are for example yarn audit and yarn-check to match the npm tools. And it goes without saying: don't use npm if your project uses yarn.

Running security audit with npm audit

From version 6 onwards npm comes bundled with the audit command, which checks for vulnerabilities in your dependencies and runs automatically when you install a package with npm install. You can also run npm audit manually on your locally installed packages to conduct a security audit and produce a report of dependency vulnerabilities and suggested patches.

The npm audit command submits a description of the dependencies configured in your package to your default registry and asks for a report of known vulnerabilities. It checks direct dependencies, devDependencies, bundledDependencies, and optionalDependencies, but does not check peerDependencies.

If your npm registry doesn't support npm audit, like Artifactory, you can pass in the --registry flag to point to public npm. The downside is that now you can't audit private packages that are on the Artifactory registry.

$ npm audit --registry=https://registry.npmjs.org

"Running npm audit will produce a report of security vulnerabilities with the affected package name, vulnerability severity and description, path, and other information, and, if available, commands to apply patches to resolve vulnerabilities."

Example: partial output of npm audit run

Using npm audit is also useful in continuous integration as it returns a non-zero exit code if security vulnerabilities are found.
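
A small sketch of a CI-friendly invocation; newer npm versions have an --audit-level flag so the non-zero exit code triggers only at the chosen severity:

$ npm audit --audit-level=high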

For more information read npm's Auditing dependencies for security vulnerabilities.

Updating packages with npm outdated

It's recommended to regularly update the local packages your project depends on to improve your code as improvements to its dependencies are made. In your project root directory, run the update command and then outdated. There should not be any output.

$ npm update
$ npm outdated 
Example of results from npm outdated

You can also update globally-installed packages. To see which global packages need to be updated run outdated first with --depth=0.

$ npm outdated -g --depth=0
$ npm outdated -g

For more information read updating packages downloaded from the registry.

Check updates with npm-check-updates

Your package.json contains dependencies with a semantic versioning policy, and to find newer versions of package dependencies than what your package.json allows, you need tools like npm-check-updates. It can upgrade your package.json dependencies to the latest versions, ignoring specified versions, while maintaining your existing semantic versioning policies.

Install npm-check-updates globally with:

$ npm install -g npm-check-updates 

And run it with:

$ ncu

The result shows any newer dependency versions for the project in the current directory. See the documentation for i.a. configuration files and options for filtering and excluding dependencies (a small example follows below).

Example of results from ncu
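
For example, filtering and excluding can also be done with command line options; a sketch with made-up package name patterns:

$ ncu --filter "react*"    # check only packages matching the pattern
$ ncu --reject "eslint*"   # leave matching packages out of the check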

And finally you can run ncu -u to upgrade the package.json.

Check updates with npm-check

A similar tool to npm-check-updates is npm-check, which additionally gives more information about the version changes available and also lets you interactively pick which packages to update instead of an all-or-nothing approach. It checks for outdated, incorrect, and unused dependencies.

Install npm-check globally with:

$ npm i -g npm-check

Now you can run the command inside your project directory:

$ npm-check
Or
$ npm-check --registry=https://registry.npmjs.org

It will display all possible updates with information about the type of update, project URL, commands, and will attempt to check if the package is still in use. You can easily parse through the results and see what packages might be safe to update. When updates are required it will return a non-zero response code that you can use in your CI tools.

The check for unused dependencies uses depcheck and isn't able to foresee all the ways dependencies can be used, so be wary of carelessly removing packages.

To see an interactive UI for choosing which modules to update run:

$ npm-check -u

Analyze dependencies with depcheck

Your package.json is filled with dependencies, and some of them might be useless or some might even be missing from package.json. Depcheck is a tool for analyzing the dependencies in a project to see how each dependency is used, which dependencies are useless, and which dependencies are missing. It not only recognizes the dependencies in JavaScript files, but also supports i.a. React JSX and TypeScript.

Install depcheck with:

$ npm install -g depcheck
And with additional syntax support for Typescript
$ npm install -g depcheck typescript

Run depcheck with:

$ depcheck [directory]
Example of results from depcheck

Summary

tl;dr;

  1. Use npm audit in your CI pipeline
  2. Update dependencies with npm outdated
  3. Check new versions of dependencies with either npm-check-updates or npm-check
  4. Analyze dependencies with depcheck