Jenkins and GitHub integration with webhooks

This post is about integrating GitHub and Jenkins so that the two applications can send messages to each other and update each other's status.

Use case: whenever a new pull request is created or an existing one is modified (i.e. a new commit is pushed) on GitHub in my repo, I want a Jenkins job to be triggered for that branch, and that job should build the project for which the PR was created. After the job has finished, the result should be displayed on GitHub next to the commit which started the whole cycle.

Sounds like a dream, doesn't it? Note that in the GitHub repository I had several projects under different folders, so when a new PR / branch is created, it is not obvious at first glance which project to build in Jenkins (the build uses Maven and its modules).

After some investigation, here is a guide on how to set this up. There are probably other solutions as well, but this one works like a charm.

You need to install the following plugins for Jenkins:

  1. GitHub plugin
  2. Generic Webhook Trigger plugin
  3. Plain Credentials plugin

(I have tried out the GitHub Pull Request Builder plugin as well, but it has some security concerns and it is up for adoption, which is never good news, so I didn't want to use it.)

Used versions: Jenkins 2.118, GitHub plugin 1.29, Generic Webhook Trigger plugin 1.32

  1. To connect Jenkins and GitHub you will need a token which can be generated in GitHub. Log in to GitHub (with the account which was already added as a contributor to the repository and which will be used from Jenkins!) and under your user's settings you can find the "Developer settings" menu. Under it there is the "Personal access tokens" option. Generate a token and save it for later use. Select the admin:repo_hook and repo scopes.
  2. You need to set up a webhook on your GitHub repository as well. The webhook is for the Generic Webhook Trigger plugin and should be set for pull request events. The URL can be found on the plugin's wiki page (it is typically JENKINS_URL/generic-webhook-trigger/invoke). You can set up webhooks under your repo's "Settings" menu; the content type should be application/json.
  3. Next you need to create credentials under Jenkins using your GitHub token. Go to Jenkins / Credentials and, most likely under system / global credentials, you can find the Add credentials link. Click on it and choose "Secret text" as Kind (thanks to the Plain Credentials plugin). The secret is the token; the ID and Description can be set freely.
  4. Next you need to configure the GitHub plugin. Go to Manage Jenkins and find the GitHub section where you can add servers. Add one and select a fancy name for it. Note that credentials will be needed; select the one which was created in the previous step. Untick the Manage hooks option as you have already set up the hooks. Test the connection, it should work at this point.
  5. Now you can create your job. Create a freestyle job. Find the "GitHub project" option and set your repo's URL. Then set up git as SCM. Then enable Generic Webhook Trigger as the Build trigger, and here you can create variables which will get their values from the JSON webhook messages coming from GitHub. Examples:

variable name: pr_number, expression: $.number, JSONPath

variable name: sender, expression: $.sender.login, JSONPath

variable name: branch, expression: $.pull_request.head.ref, JSONPath

You can get as much data as you want from the pull request JSON message. The structure of the general GitHub pull request message is here. You can easily find information on how to extract values from a JSON document with Google.
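For orientation, here is a heavily trimmed sketch of such a pull request payload, showing only the fields used in the examples above (the values are made up, and the real message contains many more fields):

{
  "action": "opened",
  "number": 42,
  "pull_request": {
    "title": "PROJECTA: fix the login page",
    "head": {
      "ref": "feature/fix-login",
      "sha": "1a2b3c4d5e6f..."
    }
  },
  "sender": {
    "login": "somedeveloper"
  }
}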

Once you have set all the variables you want to use, you can create a shell script step to build anything using these variables. For example: you can get the branch name, the user who created / modified the pull request, and the title of the PR, which can be used as an indicator of what to build, i.e. when the title contains a specific project code, you can check out the branch and call "mvn clean install -pl projectname". The main point is that building something requires some information, and this information can be found in the webhook message. This is the reason to use the Generic Webhook Trigger plugin: with this plugin you can inspect and use the content of the JSON message, which you cannot do with the plain GitHub plugin.
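As a minimal sketch of such a shell step (assuming you also defined a pr_title variable with the expression $.pull_request.title, and that your PR titles contain project codes like PROJECTA), it could look like this:

# The Generic Webhook Trigger plugin exposes the resolved variables
# as environment variables inside the build step.
echo "PR #$pr_number by $sender on branch $branch: $pr_title"

# Decide what to build based on the project code in the PR title.
case "$pr_title" in
  *PROJECTA*) mvn clean install -pl projectA ;;
  *PROJECTB*) mvn clean install -pl projectB ;;
  *)          echo "No known project code in the title, building everything"
              mvn clean install ;;
esac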

You could ask at this point why on earth we set up the GitHub plugin at all. The answer is that this plugin can send feedback to GitHub after the job has built the project. This is the last step in setting up the job correctly.

6. Under Post-Build Actions there is a step called "Set GitHub commit status (universal)". Choose this and configure it: the SHA can be entered manually as a variable coming from the JSON message (define a variable for $.pull_request.head.sha), set your repo and leave the other values at their defaults.
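For the record, the plugin essentially calls GitHub's commit status API behind the scenes; done by hand it would look roughly like this (the repo name and token are placeholders):

# Report a build result for the commit that triggered the job
curl -X POST \
  -H "Authorization: token YOUR_PERSONAL_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"state": "success", "context": "jenkins/pr-build", "description": "Build finished"}' \
  https://api.github.com/repos/youruser/yourrepo/statuses/$sha

You don't need to script this yourself, the post-build step does it for you.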

Save the job and we are ready!

Now you can test your configuration. Create a pull request (don't forget to choose a proper title or name) in GitHub and check whether the Jenkins job is triggered. Then push a new commit to the PR and check that, after the job has run, the commit gets the build result in GitHub (open your PR and the Commits tab).
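If you don't want to create throwaway pull requests while experimenting, you can also replay a recorded payload against the trigger endpoint with curl (payload.json here is a delivery you saved from the webhook's "Recent Deliveries" page on GitHub, and the Jenkins URL is a placeholder):

# Re-send a recorded pull_request payload to the Generic Webhook Trigger endpoint
curl -X POST \
  -H "Content-Type: application/json" \
  -d @payload.json \
  http://your-jenkins-host/generic-webhook-trigger/invoke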

I hope this guide helps you to integrate Jenkins and GitHub a bit more easily. Enjoy, and long live DevOps!


How to start only a single Jenkins job after commits when using one huge GitHub repository

Let's assume you have a single GitHub repository where you store all your code for all your products and / or projects. You have a CI application as well, e.g. Jenkins. Under Jenkins you have a job for each of your products and projects. Dozens of jobs which do a lot of work on an average day. The jobs are configured to build the different products based on Maven modules, so each job builds only its part of the code.

If you want to follow the continuous integration guidelines, the GitHub repository is polled by all Jenkins jobs every minute or so to detect changes in the code. Whenever e.g. a new comment is added in the GitHub repository or a typo is fixed, ALL the Jenkins builds will be started as a result of polling. Is this good? Absolutely not. When you change the code of a project, only that specific project should be built. Building all of them is just a waste of time and energy, not to mention the delay: when you want a fresh build NOW but a build is already running because of a previous code change on some other project, it can be very annoying to wait… yes, you can have a coffee in the meantime if your build is short enough, but after the third one you will be really upset.

One solution for the problem above would be to create a controller job in Jenkins which can decide what to build based on the information in the last commit. GitHub has a feature called webhooks. Webhooks can notify Jenkins whenever there is a new commit on your huge repository. If you have installed the GitHub plugin on Jenkins (I'm sure you have), you can set up the URL where Jenkins listens for the notifications (by default the plugin listens at JENKINS_URL/github-webhook/), so you can integrate GitHub and Jenkins easily.

On the new controller job's configuration page you have to set up the "Source Code Management" section and you have to select "GitHub hook trigger for GITScm polling" under the "Build triggers" section. This means GitHub will push a notification to Jenkins whenever there is a new commit and the job will be started immediately. Try to test it: put a simple echo 'Hello World' shell command in it as an "Execute shell" build step and then change something (e.g. a comment) in the GitHub code repository. The build should be started within a couple of seconds.

The next step is to find out what was changed in the commit. The controller job will clone the GitHub repo, so you will have the last commit in it. Create an "Execute shell" build step and put this command into it:

changes="$(git diff-tree --no-commit-id --name-only -r $(git log --format="%H" -n 1))"

This command will give you back the names of all the files which were changed, plus their paths under the repo. This is valuable information, as based on it you can decide which project was changed and which job to start. You can start a job from the command line with curl; there are different ways to set it up. This bash code is just an example of how to decide which project's job to trigger:

# check whether any of the changed files lives under the projectA/ folder
if echo "$changes" | grep -q "^projectA/"; then
    echo "Project A was modified"
    curl -I -X POST --user testuser:testpw http://jenkins.mycompany.com/job/mybuild/build
fi

And here we are, you have triggered one single job with one commit and not all of them. And the proper one. Don’t forget to switch off polling in each of your jobs as from now on they will be started by the controller job.

With this process you can start multiple jobs if multiple projects were changed in the same commit (which is rare but might happen).
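A possible sketch of that (the job names and credentials are made up for the example):

# Collect the changed top-level directories and trigger one job per affected project
changed_dirs="$(echo "$changes" | cut -d/ -f1 | sort -u)"

for dir in $changed_dirs; do
  case "$dir" in
    projectA) job=projectA-build ;;
    projectB) job=projectB-build ;;
    *) continue ;;
  esac
  echo "Triggering $job because $dir changed"
  curl -I -X POST --user testuser:testpw "http://jenkins.mycompany.com/job/$job/build"
done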

Now the load on your Jenkins node is reduced significantly, and if you are using Jenkins slave nodes, maybe some of them can be switched off, saving money for your company, so it is time to go to your boss and tell him how efficient you are. 😉

 

NRPE client – how to set it up?

Nagios vs NRPE

Recently I had to set up a Nagios server to monitor all our Linux based VMs (virtual machines, test environments) and after setting up the server side I started to look into how to set up the remote clients.

There are at least two general, widely used solutions for this, called the NRPE and NSCA plugins. I chose the first one and started to check some install guides available in the wild. However, I faced many problems during installation, so I decided to share my adventures.


I found this and this as install guides. We use CentOS 6.6 and I had many different problems with these processes. First, NRPE 3.1.x is not available as an rpm yet, so I cannot use yum to install it. Second, when I started to go through the "hard" way of installing (using make), I faced many problems like missing files… it is really strange that these guides work for so many others.

Anyway, after a while I managed to hunt down every file I needed, so here is a guide on how to set up the NRPE client on a remote CentOS Linux node:

  1. create nagios user and nagios group, add the user to the group
  2. wget the Nagios plugins tarball (you can find it on the Nagios site; the version number keeps growing, of course)
  3. wget the NRPE tarball
  4. extract the Nagios plugins tarball (tar xvf)
  5. extract the NRPE tarball (tar xvf)
  6. yum install gcc glibc glibc-common openssl-devel -y
  7. cd into the extracted Nagios plugins directory
  8. ./configure
  9. make
  10. make install
  11. chown -R nagios:nagios /usr/local/nagios
  12. cd into the extracted NRPE directory
  13. ./configure
  14. make all
  15. make install
  16. make install-init
  17. download an NRPE service (init) script
  18. remove the -n flag from line 30 if you want to use SSL (-n disables SSL)
  19. copy the script to the client into the /etc/init.d folder and make it executable
  20. create /usr/local/nagios/etc/nrpe.cfg on the clients and put your remote commands into it (a minimal example follows this list)
  21. service nrpe start
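A minimal nrpe.cfg could look like this (the allowed host IP and the check thresholds are only examples, adjust them to your environment):

# /usr/local/nagios/etc/nrpe.cfg - minimal example
log_facility=daemon
pid_file=/var/run/nrpe.pid
server_port=5666
nrpe_user=nagios
nrpe_group=nagios
# the IP address of your Nagios server goes here
allowed_hosts=127.0.0.1,10.0.0.5

# the commands the Nagios server is allowed to run on this client
command[check_load]=/usr/local/nagios/libexec/check_load -w 15,10,5 -c 30,25,20
command[check_disk]=/usr/local/nagios/libexec/check_disk -w 20% -c 10% -p /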

That's it; with this procedure everything now works for me on many clients. I have automated the installation using Ansible and it takes a couple of minutes to set up any remote host with NRPE.
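For reference, here is the same sequence condensed into a shell sketch (the version numbers, download URLs and the service script file name are placeholders, substitute the current releases and your own files):

# run as root on the client
groupadd nagios
useradd -g nagios nagios
yum install -y gcc glibc glibc-common openssl-devel wget

# Nagios plugins
wget https://nagios-plugins.org/download/nagios-plugins-X.Y.Z.tar.gz
tar xzf nagios-plugins-X.Y.Z.tar.gz
cd nagios-plugins-X.Y.Z
./configure && make && make install
chown -R nagios:nagios /usr/local/nagios
cd ..

# NRPE
wget https://github.com/NagiosEnterprises/nrpe/releases/download/nrpe-X.Y.Z/nrpe-X.Y.Z.tar.gz
tar xzf nrpe-X.Y.Z.tar.gz
cd nrpe-X.Y.Z
./configure && make all && make install && make install-init
cd ..

# service script (from step 17), configuration, start
cp nrpe-service-script /etc/init.d/nrpe && chmod 755 /etc/init.d/nrpe
vi /usr/local/nagios/etc/nrpe.cfg
service nrpe start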

Enjoy!

The server side is a bit of a different story; maybe I will summarize that as well in the next post.