Automate validating code changes with Git hooks

What could be more annoying than committing code changes to the repository and noticing afterwards that the formatting isn’t right or tests are failing? Your Continuous Integration build shows rain clouds and you need to get back to the code and fix minor issues, with extra commits polluting the Git history. Fortunately, with small enhancements to your development workflow you can prevent all this hassle by checking your changes automatically before committing them. The answer is to use Git hooks, for example a pre-commit hook that runs linters and tests.

Git Hooks

Git hooks are scripts that Git executes before or after events such as commit, push and receive. They’re a built-in feature and run locally. Hook scripts are limited only by a developer’s imagination. Some example hook scripts include:

  • pre-commit: Check the commit for linting errors.
  • pre-receive: Enforce project coding standards.
  • post-commit: Email team members of a new commit.
  • post-receive: Push the code to production.

Every Git repository has a .git/hooks folder with an example script for each hook you can bind to. You’re free to change or replace these scripts as necessary, and Git will execute them when the corresponding events occur.

Git hooks can greatly increase your productivity as a developer, as you can automate tasks and ensure that your code is ready for committing or pushing to a remote repository.

For more reading about Git hooks you can check the missing Git hooks documentation, read the basics and see a tutorial on how to use Git hooks with local Git clients and Git servers.
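Before reaching for any tooling, a hook can be written by hand. A minimal sketch (run in a scratch directory here, and the TODO check is just an illustration):

```shell
# Demonstrate in a scratch directory; in a real repository .git/hooks already exists.
cd "$(mktemp -d)"
mkdir -p .git/hooks

# A hook is just an executable script named after the event it handles.
cat > .git/hooks/pre-commit <<'EOF'
#!/bin/sh
# Block the commit if staged changes still contain TODO markers.
if git diff --cached | grep -q 'TODO'; then
    echo "Commit blocked: staged changes contain TODO" >&2
    exit 1
fi
EOF
chmod +x .git/hooks/pre-commit
```

Git runs the script on every `git commit` in that repository; a non-zero exit status aborts the commit.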

Pre-commit

One productive way to use Git hooks is the pre-commit framework, which manages and maintains multi-language pre-commit hooks. Also read these tips for using a pre-commit hook.

Pre-commit is nice, for example, for running linters to ensure that your changes conform to coding standards. All you need is to install pre-commit and then add hooks.
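Once installed, the framework reads its hooks from a .pre-commit-config.yaml in the repository root; a minimal sketch (the rev shown is illustrative):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v2.3.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
```

Running `pre-commit install` then writes the hook into .git/hooks for you.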

Installing pre-commit, ktlint and ktlint’s Git pre-commit hook on macOS with Homebrew:

$ brew install pre-commit
$ brew install ktlint
$ ktlint --install-git-pre-commit-hook

For example, the pre-commit hook that runs ktlint with the auto-correct option looks like the following in the project’s .git/hooks/pre-commit. The “export PATH=/usr/local/bin:$PATH” line is there so that SourceTree finds Git on macOS.

#!/bin/sh
# https://github.com/shyiko/ktlint pre-commit hook
# Make sure GUI clients such as SourceTree find git and ktlint on macOS:
export PATH=/usr/local/bin:$PATH
# Lint (and auto-correct) the staged Kotlin files:
git diff --name-only --cached --relative | grep '\.kts\?$' | xargs ktlint -F --relative .
# Abort the commit on lint errors; otherwise re-stage the auto-corrected files.
if [ $? -ne 0 ]; then exit 1; else git add .; fi

The main disadvantage of using pre-commit and local Git hooks is that the hooks are kept within the .git directory and never reach the remote repository. Each contributor has to install them manually in their local repository, which is easily overlooked.

Maven projects

The Githook Maven plugin deals with the problem of providing hook configuration in the repository and automates the installation. It binds to the Maven project’s build process, and configures and installs local Git hooks.

It keeps a mapping between the hook name and the script by creating a corresponding file in .git/hooks for each hook, containing the given script, during the Maven project’s initial lifecycle phase. It’s good to notice that the plugin overwrites existing hooks.

Usage Example:

<build>
    <plugins>
	<plugin>
	    <groupId>org.sandbox</groupId>
	    <artifactId>githook-maven-plugin</artifactId>
	    <version>1.0.0</version>
	    <executions>
	        <execution>
	            <goals>
	                <goal>install</goal>
	            </goals>
	            <configuration>
	                <hooks>
	                    <pre-commit>
	                         echo running validation build
	                         exec mvn clean install
	                    </pre-commit>
	                </hooks>
	            </configuration>
	        </execution>
	    </executions>
	</plugin>
    </plugins>
</build>

Git hooks for Node.js projects

In Node.js projects you can define scripts in package.json and run them with npm, which enables another approach to running Git hooks.

🐶 Husky is Git hooks made easy for Node.js projects. It keeps existing user hooks, supports GUI Git clients and all Git hooks.

Husky is installed like any other npm library:

npm install husky --save-dev

The following configuration in your package.json runs the lint command (e.g. eslint with --fix) when you try to commit, and runs lint and tests (e.g. mocha, jest) when you try to push to the remote repository.

"husky": {
   "hooks": {
     "pre-commit": "npm run lint",
     "pre-push": "npm run lint && npm run test"
   }
}
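The pre-commit and pre-push hooks above assume matching npm scripts in the same package.json; a sketch of the scripts section (the exact tools, eslint and jest here, are examples):

```json
"scripts": {
  "lint": "eslint --fix .",
  "test": "jest"
}
```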

Another useful tool is lint-staged which utilizes husky and runs linters against staged git files.
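A minimal setup might route Husky’s pre-commit hook through lint-staged so only the staged files are linted (the glob and commands below are illustrative, in the 2019-era configuration style):

```json
"husky": {
  "hooks": {
    "pre-commit": "lint-staged"
  }
},
"lint-staged": {
  "*.js": ["eslint --fix", "git add"]
}
```

This keeps commits fast on large codebases, since unchanged files are never linted.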

Summary

Make your development workflow easier by automating all the things. Check your changes before committing them with pre-commit, Husky or the Githook Maven plugin. You get better code and commit quality for free, and your team is happier.

This article was originally published at 15.7.2019 on Gofore’s blog.

Monthly notes 44

Summer holidays are over and it’s time to get back to work and monthly notes. I spent almost the whole of August enjoying nature: mountain biking, hiking and coaching young mountain bikers. Less computers, more relaxing. This month’s notes are about writing great Docker images, validating code using Git hooks, log management, a story about the npm registry, working remotely and effective Kotlin. Happy reading.

Issue 44, 6.9.2019

Microservices

How to write great Docker container images
It’s easy with these great tips and examples. I would add: use a small base image like Alpine Linux if possible. (from @walokra)

Kubernetes: A Detailed Example of Deployment of a Stateful Application
The article gives an overview of Kubernetes by covering “What are the design principles and architecture of Kubernetes?” and “How to use Kubernetes, with a simple example.” (from @java)

Software Development

Automate validating code changes with Git hooks
What could be more annoying than committing code changes and noticing afterwards that the formatting isn’t right or tests are failing? Read these tips on how to automate validating code changes with Git hooks and make your flow smooth.

Fast log management for your apps
Nicolas Frankel talked at Berlin Buzzwords about logging. A good overview of the issue. TL;DR: no computation in logs, the filesystem matters, asynchronous vs. reliability, no expensive metadata, schema on write, send JSON.

Use morning hours for open source and improving

You’re better at your work when you’re improving your technical craft.

JavaScript

Story of money and ownership and control
“the economics of open source [in JavaScript, Node.js and npm]”. Important points of view on the problems with a (privately controlled) [npm] package registry. (from @walokra)

Team work

11 Best Practices for Working Remotely
Good tips for working remotely. The biggest hurdles are communication, social opportunities and loneliness and isolation. “With consistent effort, you can overcome the challenges of remote work and create a healthy, happy, productive environment for yourself and for your team.” (from @dunjardl)

If you ever have to lead a remote dev team…
The Remote Workflow: simple, transparent, predictable, frictionless. (from @ThePracticalDev)

Books

Effective Kotlin beta release
Adding this to my reading list! “First official version of Effective Kotlin is finally in distribution (as an ebook)”. Having read Effective Java, I expect this book to be totally worth it.

Something different

Watch 14 minutes of new Cyberpunk 2077 gameplay footage
A new look at different gameplay styles for the upcoming open-world RPG.

Monthly notes 43

Issue 43, 25.7.2019

Microservices

How to write great container images
The article shows the principles of what the writer considers “Dockerfile best practices”, and walks through them with a real example. I would add: use a small base image like Alpine Linux if possible.

Micro Frontends
The article describes breaking up frontend monoliths into many smaller, more manageable pieces, and how this architecture can increase the effectiveness and efficiency of teams working on frontend code. As well as talking about the various benefits and costs, it covers some of the implementation options that are available, and dives deep into a full example application that demonstrates the technique.

Performance

Performance Analysis Methodology
Informative presentation of Performance Analysis Methodology by Brendan Gregg at LISA ’12. Focus on the USE method which all staff can use for identifying common bottlenecks and errors. Check for: Utilization, Saturation, Errors. (from walokra)

Fast log management for your apps
You’ve migrated your application to Reactive Microservices to get the last ounce of performance from your servers. But what about logs? Logs can be one of the few roadblocks on the road to ultimate performance. Nicolas Frankel shows in his talk at Berlin Buzzwords 2019 some insider tips and tricks, taken from experience, to put you on the track toward fast(er) log management.

JavaScript

single-spa
A javascript framework for front-end microservices.

Node.js Memory Management in Container Environments
Best practices for managing memory in container-based Node apps. (from JavaScript Daily)

CTU JavaScript Guide
Opinionated guide to ground rules for an application’s JavaScript code, such that it’s highly readable and consistent across different developers on a team. The focus is put on quality and coherence across the different pieces of your application.

Security

Nginx Admin’s Handbook
nginx is a powerful web server but with great power comes great responsibility (to configure it for security and performance). “Nginx Admin’s Handbook” is a good collection of rules, helpers, notes and papers, best practices and recommendations to achieve it. (from walokra)

GOTCHA: Taking phishing to a whole new level
Without X-FRAME-OPTIONS you can build a UI redressing attack that allows attackers to extract valuable information from API endpoints. tl;dr: extract chars with CSS, add a captcha form, scramble chars, get the user to fill in the password-captcha.

Staying Safe on GitHub: The Ultimate GitHub Security Tools Roundup
Nice overview to #security tools for #GitHub repositories. GitHub Security Alerts is provided by default, additionally use one of these: Snyk, WhiteSource Bolt, Sonatype DepShield. (from walokra)

Something different

It’s Summer and there are plenty of national parks in Finland. Go and create your Summer adventure in the wilderness. From the Southern Archipelago to the Northern Fells: Pallas-Yllästunturi, UKK, Pyhä-Luosto, Koli, Nuuksio.

Ignoring files and folders in Subversion with propset

Before committing code to the Subversion repository we always set the svn:ignore property on the directory to prevent some files and directories from being checked in. You usually want to exclude the IDE project files and the target/ directory.

It’s useful to put all the ignored files and directories into a file: .svnignore. Your .svnignore could look like this:

*.iml
target/*

Put the .svnignore file in the project folder and commit it to your repository so the ignored files are shared between committers.

Now reference the file with the -F option:
$ svn propset svn:ignore -F .svnignore .

Of course, I hope everyone has by now moved to Git and uses .gitignore for the same purpose.

Monthly notes 42

Midsummer is a couple of days away and it’s time to take a short break from work and enjoy the Summer nights and nature. And if you have time, here is a short list of articles to read and videos to watch from the React Finland 2019 conference.

Issue 42, 20.6.2019

Software Development

Consulting or con-$ulting
A theory on how Hertz’s inexperience in buying software — combined with Accenture’s incompetence to deliver it — flushed $32M+ down the drain. “The lack of transparency and technical expertise combined with the lack of ownership/responsibility was ultimately the reason why Hertz managed to blow tens of millions USD, instead of just a couple.” Lessons learned: “If you are buying software for tens of millions, you must have an in-house technical expert as part of the software development process”.

Don’t stop writing code comments
“You should write comments which matter.” (from @nicolas_frankel)

Improve your technical craft
More knowledge helps you do your work more efficiently and in less time. In knowledge work, time spent at the office doesn’t equal productivity or a job well done.

Or rather than one hour per day, batch it all on e.g. Fridays when things are slow.

React Finland 2019 presentations
29 videos from React Finland 2019 conference. Some picks: Automation and Exploratory testing, More Accessible React Apps, guide to building your design system infrastructure.

Databases

Be careful with CTE in PostgreSQL
PostgreSQL doesn’t inline common table expressions (the WITH clause); it materializes them and is thus unable to utilize indexes => expensive. Good to know if you’re used to Oracle, which doesn’t materialize CTEs by default. (from walokra)

UX

Can’t Unsee
“The devil is in the details”. A game where your attention to details earns you a lot of coins. Fun game which teaches you some UX rules and attention to details. With 5780 coins I’m a beginner :/ (or need glasses :))

It feels fine on my phone
“You literally can’t afford desktop or iphone levels of JS if you’re trying to make good web experiences for anyone but the world’s richest users, and that likely means re-evaluating your toolchain.”

“Gap between “what I get when I trade in my phone every 2 years” and the true low-end is now a gaping chasm, and even 5 years ago, that wasn’t true.”

Something different

Arofly Link brings power meters under $200 using a tire pressure monitor
Forget power meters in cranks and pedals, here’s Arofly Link. “Unlike most power meters that measure the actual force being put into your bike’s drivetrain between the pedals and the rear hub, Arofly takes it one step further apparently measuring power where the rubber meets the road.”

Monthly Notes 41

Issue 41, 31.5.2019

Software development

Gitmoji
If not considering the issue on Bamboo with this (thread), using emojis in Git commit messages is a nice idea. There’s even a cool emoji guide for your commit messages. Going to take this into use 😊 (from walokra)

Happy Friday, Don’t push to production?
Good thread of how you should treat your deploys to production. You should deploy often and have good CI/CD practices but the overall question isn’t black or white. “Nothing goes wrong until it does, and then you’d want your people available.”  “If you’re scared of pushing to production on Fridays, I recommend reassigning all your developer cycles off of feature development and onto your CI/CD process and observability tooling for as long as it takes to ✨fix that✨.” (from walokra)

Sleep quality and stress level matter and after 24 hours awake
“Your sleep quality and stress level matter far, far more than the languages you use or the practices you follow. Nothing else comes close”. Good notes on why sleeping and rest matter (thread) 😴 There’s always more work to do, take care of yourself first! (from walokra)

Software architecture

Why software architects fail – and what to do about it
Looks at some of the most common pitfalls that ensure you’ll come up with a disaster, and discusses how they can be avoided.

Cloud

“High” levels of ☁️ spending at Lyft
A continuation of the Internet discussion on whether Lyft’s spending on AWS is too high and whether they could do better on-premises.

Frontend

TSLint in 2019
“Once we consider ESLint feature-complete w.r.t. TSLint, we will deprecate TSLint and help users migrate to ESLint”

User Experience

Printable A3 posters for Laws of UX
(from JonYablonski)

Something different

“Work starts from problems and learning starts from questions. Work is creating value and learning is creating knowledge. Both work and learning require the same things: interaction and engagement.” (from EskoKilpi)

Best Practices for Version Control in 8 steps

Using version control is an essential part of modern software development, and using it efficiently should be part of every developer’s toolkit. Knowing the basic rules makes it even more useful. Here are some best practices to help you on your way.

tl;dr

  1. Commit logical changesets (atomic commits)
  2. Commit Early, Commit Often
  3. Write Reasonable Commit Messages
  4. Don’t Commit Generated Sources
  5. Don’t Commit Half-Done Work
  6. Test Before You Commit
  7. Use Branches
  8. Agree on a Workflow
Simplified Git Flow (source: buildazure)

Commit logical changesets (atomic commits)

A commit should be a wrapper for related changes. Make sure your change reflects a single purpose: the fixing of a specific bug, the addition of a new feature, or some particular task. Small commits make it easier for other developers to understand the changes and roll them back if something went wrong.

Your commit will create a new revision number which can forever be used as a “name” for the change. You can mention this revision number in bug databases, or use it as an argument to merge should you want to undo the change or port it to another branch. Git makes it easy to create very granular commits.

So if you make many changes to multiple logical components at the same time, commit them in separate parts. That way it’s easier to follow the changes and their history. Working on features A, B and C while fixing bugs 1, 2 and 3 should produce at least six commits.
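The idea can be sketched with a throwaway repository (file names and ticket ids are invented); in day-to-day work, `git add -p` lets you stage hunks selectively when changes are already mixed in one file:

```shell
# Set up a scratch repository for the demonstration.
cd "$(mktemp -d)"
git init -q
git config user.name demo
git config user.email demo@example.com

echo 'rounding fix' > billing.txt
echo 'export button' > ui.txt

# Two logical changes, two commits -- not one mixed commit.
git add billing.txt
git commit -q -m "PROJ-1 Fix rounding error in invoice totals"
git add ui.txt
git commit -q -m "PROJ-2 Add invoice export button"

git log --oneline
```

Either commit can now be reverted or cherry-picked without dragging the other change along.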

Commit Early, Commit Often

It is recommended to commit code to version control often which keeps your commits small and, again, helps you commit only related changes. It also allows you to share your code more frequently with others.

It’s easier for everyone to integrate changes regularly and avoid merge conflicts. Having a few large commits and sharing them rarely, in contrast, makes conflicts hard to solve.

“If the code isn’t checked into source control, it doesn’t exist.”

Coding Horror

Write Reasonable Commit Messages

Always write a reasonable comment on your commit. It should be short and descriptive and tell what was changed and why.

Begin your message with a short summary of your changes (up to 50 characters as a guideline). Separate it from the following body by including a blank line.

It is also useful to add some prefix to your message like Fix or Add, depending on what kind of changes you did. Use the imperative, present tense (“change”, not “changed” or “changes”) to be consistent with generated messages from commands like git merge.

If you’re fixing a bug or implementing a feature that has a JIRA ticket, add the ticket identifier as a prefix.

For example: “Fix a few bugs in the interface. Add an ID field. Remove a couple of unnecessary functions. Refactor the context check.” or “Fix bad allocations in image processing routines”.

Not like this: “Fixed some bugs.”

The body of your message should provide detailed answers to the following questions: What was the motivation for the change? How does it differ from the previous implementation?
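Putting these rules together, a full message might look like this (the ticket id and details are invented for illustration):

```
PROJ-123 Fix rounding error in invoice totals

Totals were computed with floating point arithmetic and could drift
by a cent on large invoices. Compute in integer cents instead. This
differs from the previous implementation only in the rounding of the
final sum.
```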

“If the changes you made are not important enough to comment on, they probably are not worth committing either.”

loop label

Don’t Commit Generated Sources

Don’t commit files which are generated dynamically or which are user dependent, like the target folder, IDEA’s .iml files or Eclipse’s .settings and .project files. They depend on the user’s preferences and don’t relate to the project’s code.

Also, a project’s binary files and Javadocs don’t belong in version control.
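In Git, these exclusions are typically listed in a .gitignore file at the project root, for example:

```
target/
*.iml
.settings/
.project
*.class
```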

Don’t Commit Half-Done Work

You should only commit code when it’s completed. Split a feature’s implementation into logical chunks and remember to commit early and often. Use branches, or consider using Git’s stash feature if you need a clean working copy (to check out a branch, pull in changes, etc.).
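Git’s stash in action, sketched in a throwaway repository:

```shell
# Scratch repository with one committed file.
cd "$(mktemp -d)"
git init -q
git config user.name demo
git config user.email demo@example.com
echo 'v1' > app.txt
git add app.txt
git commit -q -m "Initial commit"

# Half-done work in progress...
echo 'half-done' >> app.txt

# Need a clean working copy? Stash instead of committing unfinished work.
git stash push -q
cat app.txt              # back to the committed state

# ...and bring the work in progress back later.
git stash pop -q
```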

On the other hand, you should never leave the office without committing your changes.

“It’s better to have a broken build in your working repository than a working build on your broken hard drive.”

loop label

Test Before You Commit

You should only commit code which is tested and passes tests. And this includes code formatting with linters. Write tests and run tests to make sure the feature or bug fix really is completed and has no side effects (as far as one can tell).

Having your code tested is even more important when it comes to pushing / sharing your code with others.

Use Branches

Branching is one of Git’s most powerful features – and this is not by accident: quick and easy branching was a central requirement from day one. Branches are the perfect tool to help you avoid mixing up different lines of development.

You should use branches extensively in your development workflows: for new features, bug fixes and ideas.
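A topic-branch round trip, sketched in a throwaway repository (the branch name is illustrative):

```shell
# Scratch repository with an initial commit.
cd "$(mktemp -d)"
git init -q
git config user.name demo
git config user.email demo@example.com
echo 'base' > app.txt
git add app.txt
git commit -q -m "Initial commit"

# New work happens on a topic branch, not on the main line.
git checkout -q -b feature/export
echo 'export' >> app.txt
git commit -q -a -m "Add export feature"

# Merge back once the feature is done ("-" returns to the previous branch).
git checkout -q -
git merge -q feature/export
```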

Agree on a Workflow

Git lets you pick from a lot of different workflows: long-running branches, topic branches, merge or rebase, git-flow.

Which one you choose depends on a couple of factors: your project, your overall development and deployment workflows and (maybe most importantly) on your and your teammates’ personal preferences. However you choose to work, just make sure to agree on a common workflow that everyone follows.

Atlassian has a good article comparing workflows to suit your needs; it covers the centralized, feature branch, gitflow and forking workflows.

Summary

Using version control is, fortunately, an acknowledged best practice and part of software development. Using even a couple of the above practices makes working with the code much more pleasant. Adopting at least “Commit logical changesets” and “Write Reasonable Commit Messages” helps a lot.

Restore single table from full MySQL database dump

Playing with data in databases is sometimes tricky, but when you get down to it, it’s just a couple of lines on the command line. Some time ago we switched from Piwik PRO to Matomo, and of course we wanted to migrate the logs. We couldn’t just use the full MySQL / MariaDB database dump and go with it, as the table names and the schema were different (Piwik PRO 3.1.1 -> Matomo 3.5.1). In short, we needed to export a couple of tables and rename them to match the new instance, similarly as discussed on Stack Overflow.

There’s a VisitExport plugin for Piwik/Matomo which lets you export and import log tables with PHP and JSON files, but it didn’t seem a usable approach for our use case, with tables being 500 MB or so.

The more practical solution was to simply create a dump of the tables we wished to restore separately.

1. Dump specific table of the database

mysqldump -u 'user' -p'password' database mytable > mytable.sql

And as I was playing with Docker

docker exec mariadb-container bash -c 'mysqldump -uroot -p$MYSQL_ROOT_PASSWORD database mytable' > ./database_dump_mytable_`date +%F`.sql

2. Change the table name before importing

sed -e 's/`mytable`/`mytable_restored`/g' mytable.sql > mytable_restored.sql

3. Import the new table

mysql -u [user] -p'password' database < mytable_restored.sql

And with Docker

docker exec -i mariadb-container mysql -uroot -p'password' database < mytable_restored.sql

Or you can dump the whole database and use sed to extract the tables having a prefix or a suffix

1. Dump the database

mysqldump -u 'user' -p'password' database > dump.sql

2. Extract table

sed -n -e '/DROP TABLE.*`mytable`/,/UNLOCK TABLES/p' dump.sql > mytable.sql

3. Change the table name before importing

sed -e 's/`mytable`/`mytable_restored`/g' mytable.sql > mytable_restored.sql

4. Import the new table

mysql -u [user] -p'password' database < mytable_restored.sql

And with good luck you have now restored a single table from a full MySQL / MariaDB database dump.
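The rename step can be sanity-checked on a small fragment of dump output before touching the real 500 MB file (the fragment below is made up):

```shell
cd "$(mktemp -d)"
# A tiny stand-in for the real dump.
printf 'DROP TABLE IF EXISTS `mytable`;\nCREATE TABLE `mytable` (id INT);\n' > mytable.sql

# The same rename as in step 2 above.
sed -e 's/`mytable`/`mytable_restored`/g' mytable.sql > mytable_restored.sql

grep -c 'mytable_restored' mytable_restored.sql   # prints 2
```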

Monthly notes 40

Refactoring, computer science concepts on the day job, doing better code reviews, battling CSS and watching cat videos. That’s Monthly notes for April. Not much, so enjoy slowly :)

Issue 40, 4.2019

Learning

Refactoring.Guru
Refactoring.Guru makes it easy for you to discover everything you need to know about refactoring, design patterns, SOLID principles and other smart programming topics.

Microservices

CompSci and My Day Job
Rob Conery talked at NDC London 2019 about computer science concepts he used in his day job without actually knowing them. All of this changed as he put together the first two volumes of The Imposter’s Handbook. He talks about what he has learned and applied to the applications created in his day job, giving you more tools under your belt to help you do your job better.

Software development

Code Review: How can we do it better?
Fun Fun Function talks about how to become a better code reviewer and reviews some code sent in by listeners. General rules for pull requests: make everything readable by humans: title, description, commit comments and, most importantly, your code. DRY, KISS.

Dev perception

“However, none of the [Formula One] teams used any of the big modern frameworks. They’re mostly WordPress & Drupal, with a lot of jQuery. It makes me feel like I’ve been in a bubble in terms of the technologies that make up the bulk of the web.”

When we’re evaluating technologies for appropriateness, I hope that we will do so through the lens of what’s best for users, not what we feel compelled to use based on a gnawing sense of irrelevancy driven by the perceived popularity of newer technologies.

Engineering guide to writing correct User Stories
“Agile people are obsessed with writing user stories. And it is a powerful instrument indeed. But, from my practice, a lot of people are doing it wrong…” (from @PracticalDev)

Tweet threads to read

It’s Friday. Pushing to production?
They say Kubernetes is simple?

Frontend

CSSBattle!
CSS code-golfing is here! Use your CSS skills to replicate targets with the smallest possible code. Check out the targets and put your CSS skills to the test.

Tools of the trade

rvpanoz/luna
Luna – npm management through a modern UI

Something different

Why the Human Mind Can Become More Motivated After Watching Cute Animal Videos
“…it turns out that taking a break to view some cuteness might actually benefit your work. There’s a lot we’re still learning, but according to some research, looking at cute animals is associated with a boost in focus and fine motor skills.” (from Weekend Reading)

Code quality metrics for Kotlin project on SonarQube

Code quality in software development projects is important and a good metric to follow. Code coverage, technical debt, vulnerabilities in dependencies and conformance to code style rules are a couple of things you should follow. There are some de facto tools you can use to visualize these, and one of them is SonarQube. Here’s a short technical note on how to set it up for a Kotlin project and visualize metrics from different tools.

In addition to the analysis SonarQube’s default plugins provide, we are also using Detekt for static source code analysis and OWASP Dependency-Check to detect publicly disclosed vulnerabilities in project dependencies.

Visualizing Kotlin project metrics on SonarQube

SonarQube is a nice graphical tool for visualizing different metrics of your project. Lately it has also started to support Kotlin, with the SonarKotlin plugin and the sonar-kotlin plugin. Compared to a typical Java project you need some extra settings to get things working. It’s also good to notice that Kotlin support isn’t quite there yet, and sonar-kotlin provides better information, e.g. when it comes to code coverage.

Steps to integrate reporting to Sonar with a Maven build:

  • Add configuration in the project pom.xml: Surefire, Failsafe, JaCoCo, Detekt, Dependency-Check
  • Run Sonar in Docker
  • Maven build with sonar:sonar option
  • Check Sonar dashboard
SonarQube project overview

Configure Kotlin project

Configure your Kotlin project built with Maven to have test reporting and static analysis. We use Surefire to run unit tests, Failsafe for integration tests, and JaCoCo to generate reports for e.g. SonarQube. See the full pom.xml in the example project (coming soon).

Test results reporting

pom.xml

<properties> 
<sonar.coverage.jacoco.xmlReportPaths>${project.build.directory}/site/jacoco/jacoco.xml</sonar.coverage.jacoco.xmlReportPaths> 
</properties> 

<build> 
    <plugins>
        <plugin>
            <groupId>org.jacoco</groupId>
            <artifactId>jacoco-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>default-prepare-agent</id>
                    <goals>
                        <goal>prepare-agent</goal>
                    </goals>
                </execution>
                <execution>
                    <id>pre-integration-test</id>
                    <goals>
                        <goal>prepare-agent-integration</goal>
                    </goals>
                </execution>
                <execution>
                    <id>jacoco-site</id>
                    <phase>verify</phase>
                    <goals>
                        <goal>report</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <skipTests>${unit-tests.skip}</skipTests>
                <excludes>
                    <exclude>**/*IT.java</exclude>
                    <exclude>**/*IT.kt</exclude>
                    <exclude>**/*IT.class</exclude>
                </excludes>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>integration-test</goal>
                        <goal>verify</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <skipTests>${integration-tests.skip}</skipTests>
                <includes>
                    <include>**/*IT.class</include>
                </includes>
                <runOrder>alphabetical</runOrder>
            </configuration>
        </plugin>
    </plugins> 

    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-failsafe-plugin</artifactId>
                <version>2.22.1</version>
            </plugin>
            <plugin>
                <groupId>org.jacoco</groupId>
                <artifactId>jacoco-maven-plugin</artifactId>
                <version>0.8.3</version>
            </plugin>
        </plugins>
    </pluginManagement>

... 
</build> 

Static code analysis with Detekt

Detekt static code analysis is configured as an AntRun task; there’s also an unofficial Maven plugin for Detekt. It’s good to notice that there are some “false positive” findings from Detekt, and you can either customize the detekt rules or suppress findings if they are intentional, e.g. with @Suppress("MagicNumber").

Detekt code smells

pom.xml

<properties> 
    <sonar.kotlin.detekt.reportPaths>${project.build.directory}/detekt.xml</sonar.kotlin.detekt.reportPaths> 
</properties> 

<build> 
... 
<plugins> 
<plugin> 
    <groupId>org.apache.maven.plugins</groupId> 
    <artifactId>maven-antrun-plugin</artifactId> 
    <version>1.8</version> 
    <executions> 
        <execution> 
            <!-- This can be run separately with mvn antrun:run@detekt --> 
            <id>detekt</id> 
            <phase>verify</phase> 
            <configuration> 
                <target name="detekt"> 
                    <java taskname="detekt" dir="${basedir}" 
                          fork="true" 
                          failonerror="false" 
                          classname="io.gitlab.arturbosch.detekt.cli.Main" 
                          classpathref="maven.plugin.classpath"> 
                        <arg value="--input"/> 
                        <arg value="${basedir}/src"/> 
                        <arg value="--filters"/> 
                        <arg value=".*/target/.*,.*/resources/.*"/> 
                        <arg value="--report"/> 
                        <arg value="xml:${project.build.directory}/detekt.xml"/> 
                    </java> 
                </target> 
            </configuration> 
            <goals> 
                <goal>run</goal> 
            </goals> 
        </execution> 
    </executions> 
    <dependencies> 
        <dependency> 
            <groupId>io.gitlab.arturbosch.detekt</groupId> 
            <artifactId>detekt-cli</artifactId> 
            <version>1.0.0-RC14</version> 
        </dependency> 
    </dependencies> 
</plugin> 
</plugins> 
... 
</build> 
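Instead of suppressing findings in code, the Detekt rules can also be customized in a YAML configuration file passed to the CLI with an extra `--config` argument; a hypothetical sketch (rule names follow Detekt’s default rule sets, values are illustrative):

```yaml
# Hypothetical detekt.yml: tune the MagicNumber rule instead of
# suppressing individual findings in code.
style:
  MagicNumber:
    active: true
    ignoreNumbers: ['-1', '0', '1', '2']
```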

Dependency checks

Dependency check with the OWASP Dependency-Check Maven plugin

OWASP Dependency-Check

pom.xml

<properties> 
    <dependency.check.report.dir>${project.build.directory}/dependency-check</dependency.check.report.dir> 
    <sonar.host.url>http://localhost:9000/</sonar.host.url> 
    <sonar.dependencyCheck.reportPath>${dependency.check.report.dir}/dependency-check-report.xml</sonar.dependencyCheck.reportPath>
    <sonar.dependencyCheck.htmlReportPath>${dependency.check.report.dir}/dependency-check-report.html</sonar.dependencyCheck.htmlReportPath>
</properties> 

<build> 
... 
<plugins> 
<plugin> 
    <groupId>org.owasp</groupId> 
    <artifactId>dependency-check-maven</artifactId> 
    <version>4.0.2</version> 
    <configuration> 
        <format>ALL</format> 
        <skipProvidedScope>true</skipProvidedScope> 
        <skipRuntimeScope>true</skipRuntimeScope> 
        <outputDirectory>${dependency.check.report.dir}</outputDirectory> 
    </configuration> 
    <executions> 
        <execution> 
            <goals> 
                <goal>check</goal> 
            </goals> 
        </execution> 
    </executions> 
</plugin> 
</plugins> 
... 
</build>

Configuring the Sonar scanner to run with Maven

pom.xml

<build> 
... 
    <pluginManagement> 
        <plugins> 
            <plugin> 
                <groupId>org.sonarsource.scanner.maven</groupId> 
                <artifactId>sonar-maven-plugin</artifactId> 
                <version>3.6.0.1398</version> 
            </plugin> 
        </plugins> 
    </pluginManagement> 
... 
</build> 

Running Sonar with the Kotlin plugin

Create a SonarQube server with Docker

$ docker run -d --name sonarqube -p 9000:9000 -p 9092:9092 sonarqube 

There’s also an OWASP Docker image for SonarQube which adds several community plugins to enable SAST, but for our purposes the “plain” SonarQube image works nicely.

Use the Kotlin plugin which comes bundled with SonarQube (SonarKotlin), or install the sonar-kotlin plugin, which presents the analysis results differently. If you want to use sonar-kotlin with the official SonarQube Docker image, you first have to remove the bundled SonarKotlin plugin.

Using sonar-kotlin

$ git clone https://github.com/arturbosch/sonar-kotlin 
$ cd sonar-kotlin 
$ mvn package  
$ docker exec -it sonarqube sh -c "ls /opt/sonarqube/extensions/plugins" 
$ docker exec -it sonarqube sh -c "rm /opt/sonarqube/extensions/plugins/sonar-kotlin-plugin-1.5.0.315.jar" 
$ docker cp target/sonar-kotlin-0.5.2.jar sonarqube:/opt/sonarqube/extensions/plugins 
$ docker stop sonarqube 
$ docker start sonarqube 

Adding dependency-check-sonar-plugin to SonarQube

$ curl -JLO https://github.com/SonarSecurityCommunity/dependency-check-sonar-plugin/releases/download/1.2.1/sonar-dependency-check-plugin-1.2.1.jar 
$ docker cp sonar-dependency-check-plugin-1.2.1.jar sonarqube:/opt/sonarqube/extensions/plugins 
$ docker stop sonarqube 
$ docker start sonarqube 

Run tests on the project and scan with Sonar

The verify phase runs your tests and should generate, among other things, jacoco.xml under target/site/jacoco and detekt.xml under target.

$ mvn clean verify sonar:sonar

Access Sonar via http://localhost:9000/

Code quality metrics? So what?

You now have metrics on Sonar to show to stakeholders, but what should you do with those numbers?

One use case is to set up a quality gate in SonarQube: a set of conditions that must be met before the project can be released into production. Ensuring the quality of new code, while gradually fixing issues in existing code, is a good way to maintain a healthy codebase over time. A quality gate lets you define rules that each batch of new code is validated against on subsequent analyses. By default the gate fails if: coverage on new code is less than 80%; the percentage of duplicated lines on new code is greater than 3%; or the maintainability, reliability, or security rating on new code is worse than A.
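As a rough sketch, the default gate amounts to a predicate like the following (the data class and helper names are hypothetical, the thresholds are the defaults listed above):

```kotlin
// Hypothetical sketch of SonarQube's default quality gate:
// the gate fails when any condition on new code is violated.
data class NewCodeMetrics(
    val coveragePercent: Double,
    val duplicatedLinesPercent: Double,
    val maintainabilityRating: Char, // 'A' (best) .. 'E' (worst)
    val reliabilityRating: Char,
    val securityRating: Char
)

fun qualityGatePasses(m: NewCodeMetrics): Boolean =
    m.coveragePercent >= 80.0 &&
        m.duplicatedLinesPercent <= 3.0 &&
        m.maintainabilityRating == 'A' &&
        m.reliabilityRating == 'A' &&
        m.securityRating == 'A'

fun main() {
    val metrics = NewCodeMetrics(85.0, 1.2, 'A', 'A', 'A')
    println(qualityGatePasses(metrics)) // prints "true"
}
```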