SonarQube

How to set up a SonarQube agent for static code analysis


I have a GitHub repository that contains a working Docker Compose example here

But there are a few things you should know about setting up a Docker version of SonarQube, and some errors you may encounter.

Setting up the database

Assuming you are using MS SQL Server, we need to honour the requirement for a specific collation: one that is case sensitive (CS) and accent sensitive (AS).

You could create a database like:

CREATE DATABASE sonar COLLATE SQL_Latin1_General_CP1_CS_AS;

Or alter an existing one:


ALTER DATABASE sonar COLLATE SQL_Latin1_General_CP1_CS_AS;

You can check the collation you have set by running:

SELECT name, database_id, create_date, compatibility_level, collation_name FROM sys.databases;

You can list all collations with:

SELECT * FROM ::fn_helpcollations()

I also found that my database was created in a closed state, which I had to remedy with:

ALTER DATABASE sonar SET AUTO_CLOSE OFF;

You can also check the version of MS SQL Server that is running with:

SELECT SERVERPROPERTY('productversion'), SERVERPROPERTY('productlevel'), SERVERPROPERTY('edition')

Docker

The docker run command would be:

docker container run -d -p 9000:9000 --name sonarserver sonarqube:8.2-community
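
Once the container is up, you can check it has started before moving on; this is just a generic sanity check:

# follow the logs until startup completes
docker container logs -f sonarserver

# the web UI should answer on port 9000 once SonarQube is ready
curl -I http://localhost:9000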

The docker-compose.yml configuration would look like:

version: '3.3'
services:

  app:
    image: sonarqube:8.2-community
    container_name: sonarqube-app
    restart: always
    environment:
      - SONARQUBE_JDBC_URL=${SONARQUBE_DATABASE_CONNECTIONSTRING}
      - SONARQUBE_JDBC_USERNAME=${SONARQUBE_DATABASE_USERNAME}
      - SONARQUBE_JDBC_PASSWORD=${SONARQUBE_DATABASE_PASSWORD}
    networks:
      - sonarnet

# the network referenced by the service above needs a top-level definition
networks:
  sonarnet:

I set the environment variables on the host; they can be pulled in by docker run (when configured with -e flags) and by docker-compose, as illustrated above.

Note the single quotes, which stop the shell from treating the ; in the connection string as a command separator:

export SONARQUBE_DATABASE_CONNECTIONSTRING='jdbc:sqlserver://my-database.database.windows.net:1433;databaseName=mytest;'

export SONARQUBE_DATABASE_USERNAME=dbadmin

export SONARQUBE_DATABASE_PASSWORD=dbadmin-password
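
With those variables exported, the same values could be passed straight to docker run with -e flags (a sketch reusing the container name and image from above), or the compose stack can simply be brought up:

docker container run -d -p 9000:9000 --name sonarserver \
  -e SONARQUBE_JDBC_URL="${SONARQUBE_DATABASE_CONNECTIONSTRING}" \
  -e SONARQUBE_JDBC_USERNAME="${SONARQUBE_DATABASE_USERNAME}" \
  -e SONARQUBE_JDBC_PASSWORD="${SONARQUBE_DATABASE_PASSWORD}" \
  sonarqube:8.2-community

# or, using the docker-compose.yml above
docker-compose up -d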

SSL

To enable SSL, I used CertBot.

First, we need to make sure our A and CNAME records are set correctly with the DNS provider.

The A record should point to the IP address of the machine hosting the service.

A CNAME record maps any subdomains, such as www, to that domain.

eg. www / CNAME / <domain name>
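
You can verify the records have propagated before running CertBot, for example with dig; the domain below is a placeholder for your own:

# the A record should return the host's public IP
dig +short A sonarqube.example.com

# the CNAME should resolve to the domain it points at
dig +short CNAME www.sonarqube.example.com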

On the NGINX host we need to make sure both CertBot and the Certbot-NGINX plugin are installed.

apt-get update -y && \
apt-get upgrade -y && \
apt-get install -y python3 python3-pip && pip3 install pip --upgrade && pip3 install certbot-nginx

We can now run the CertBot wizard which will issue an SSL certificate and edit the NGINX configuration file to match.

certbot --nginx
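
Let's Encrypt certificates are short-lived, so it is worth confirming that renewal will work once the certificate is issued:

certbot renew --dry-run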

I had issues when NGINX was not configured to serve on port 80; the firewall also needs ports 80 and 443 open for CertBot to complete registration correctly.
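
As an example, on an Ubuntu host using ufw the ports could be opened like this (adapt to whatever firewall you are running):

ufw allow 80/tcp
ufw allow 443/tcp
ufw reload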

Configuring for C#

  1. Install the dotnet global tool:

dotnet tool install --global dotnet-sonarscanner

  2. Initialise the scanner. PROJECT_KEY is created with the SonarQube project; PROJECT_TOKEN is a security token created in the security section of a SonarQube user profile:

dotnet sonarscanner begin /k:"<PROJECT_KEY>" /d:sonar.login="<PROJECT_TOKEN>" /d:sonar.host.url="<SONARQUBE_URL>"

  3. Build the C# project/solution:

dotnet build <sln>/<.csproj>

  4. Upload the analysis (a combined sketch of all four steps follows):

dotnet sonarscanner end /d:sonar.login="<PROJECT_TOKEN>"
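
Putting the steps together, a full scan might look something like this; the key, token, host URL, and solution name are placeholders for your own values:

# placeholder values - substitute your own
export SONAR_PROJECT_KEY=my-project
export SONAR_PROJECT_TOKEN=xxxxxxxx
export SONAR_HOST_URL=https://sonarqube.example.com

dotnet sonarscanner begin /k:"${SONAR_PROJECT_KEY}" /d:sonar.login="${SONAR_PROJECT_TOKEN}" /d:sonar.host.url="${SONAR_HOST_URL}"
dotnet build MySolution.sln
dotnet sonarscanner end /d:sonar.login="${SONAR_PROJECT_TOKEN}"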

Troubleshooting

Issue: The web application [ROOT] appears to have started a thread

WARN  web[][o.a.c.l.WebappClassLoaderBase] The web application [ROOT] appears to have started a thread named [elasticsearch[Pip the Troll][transport_client_worker][T#1]{New I/O worker #1}] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
 sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
 sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
 sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
 sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
 org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68)
 org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:434)
 org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212)
 org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
 org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
 org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
 org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
 java.lang.Thread.run(Thread.java:745)

Resolution

This is related to the Elasticsearch requirements, as documented here.

You can check the current settings on the Linux host by running:

sysctl vm.max_map_count
sysctl fs.file-max
ulimit -n
ulimit -u

You can align them with these four commands:

sysctl -w vm.max_map_count=524288
sysctl -w fs.file-max=131072
ulimit -n 131072
ulimit -u 8192
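
Note that sysctl -w and ulimit only affect the running session. One way to make the kernel settings survive a reboot, assuming a systemd-style distribution, is a sysctl drop-in file (the ulimit values would normally go in /etc/security/limits.conf or the service definition):

# persist the Elasticsearch kernel requirements (the file name is arbitrary)
echo "vm.max_map_count=524288" >> /etc/sysctl.d/99-sonarqube.conf
echo "fs.file-max=131072" >> /etc/sysctl.d/99-sonarqube.conf

# reload all sysctl configuration files
sysctl --system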

Issue: Web server startup failed: Unable to determine database dialect to use within sonar with dialect null jdbc url

2020.09.10 17:18:54 ERROR web[][o.s.s.p.PlatformImpl] Web server startup failed: Unable to determine database dialect to use within sonar with dialect null jdbc url "jdbc:sqlserver://kaml-test.database.windows.net:1433;database=sonar;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;"
2020.09.10 17:18:54 WARN  web[][o.a.c.l.WebappClassLoaderBase] The web application [ROOT] appears to have started a thread named [elasticsearch[_client_][[timer]]] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:\n java.base@11.0.6/java.lang.Thread.sleep(Native Method)\n app//org.elasticsearch.threadpool.ThreadPool$CachedTimeThread.run(ThreadPool.java:574)
2020.09.10 17:18:54 WARN  web[][o.a.c.l.WebappClassLoaderBase] The web application [ROOT] appears to have started a thread named [elasticsearch[_client_][scheduler][T#1]] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:\n java.base@11.0.6/jdk.internal.misc.Unsafe.park(Native Method)\n java.base@11.0.6/java.util.concurrent.locks.LockSupport.parkNanos(Unknown Source)\n java.base@11.0.6/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(Unknown Source)\n java.base@11.0.6/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(Unknown Source)\n java.base@11.0.6/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(Unknown Source)\n java.base@11.0.6/java.util.concurrent.ThreadPoolExecutor.getTask(Unknown Source)\n java.base@11.0.6/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)\n java.base@11.0.6/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)\n java.base@11.0.6/java.lang.Thread.run(Unknown Source)

Resolution

A big, scary error with a relatively small, timid answer: quotation marks around the JDBC URL.

Bad:

- SONARQUBE_JDBC_URL="jdbc:sqlserver://10.1.2.3/sonar"

Good:

- SONARQUBE_JDBC_URL=jdbc:sqlserver://10.1.2.3/sonar

Issue: Why aren't my dotnet projects scanning?

Resolution

I would need further information, but two reasons spring to mind:

Are you scanning individual projects without a ProjectGuid element in the project's PropertyGroup?

When you build a single project, SonarQube needs a bit of help. This comes from adding a ProjectGuid element with a unique GUID value to the PropertyGroup so that SonarQube can track the projects.

SonarQube can analyse the results of a solution build with no extra help, as it will use the project GUIDs from the solution file.

eg:

<PropertyGroup>
  <TargetFramework>netcoreapp3.1</TargetFramework>
  <ProjectGuid>xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx</ProjectGuid>
</PropertyGroup>

Are you running SonarQube Community Edition? Remember that it can only scan the master branch.