
Integrating RTC with Jenkins using multiple slaves

I am trying to integrate RTC builds using Jenkins, but there is one specific issue I'm not able to figure out: how to set up a generic, cross-platform "load directory".

 

Our setup is as follows. We have Jenkins installed and set up as a master on a common tools server. The master is configured so that it does not perform any builds, and then we have several slaves that are a mix of Linux and Windows build hosts. We're running Jenkins 1.656 and Jazz 6.0.1 fix004; the Jenkins RTC plugin is 1.2.0.

 

Our developers have gotten used to using the RTC Eclipse client for all their activity, including submitting build requests (we're currently set up to use the RTC and Rational Build Forge Agent model, which we're trying to move off of because it is too limited for our needs). So we'd like to keep using the RTC build definition as the primary interface for interacting with "builds". What I can't figure out is how to set the "load directory" in RTC on the build definition and in Jenkins on each respective slave.

 

Most of our builds are written such that we "should" be able to build on any platform. So it stands to reason that the slave needs to set the "load directory"; the slave is the authority for its own file system. If I have a Windows slave, the "load directory" might be c:/rtc/ssd/src, and on Linux it might be /opt/rtc/ssd/src. Now let's say in Jenkins I've set up a "node property" called SRC_BASE, which sets the appropriate value depending on the slave. It stands to reason that in RTC I need to consume that property when setting up my "Load Options" in the build definition. So again, let's say I set a new property SRC_HOME which is set to "${SRC_BASE}/proj-name".

 

You should notice that the build definition doesn't care what the actual base is; it's just appending to it so that each project is isolated uniquely on whichever slave it ends up running on.
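
For reference, the intended layering looks roughly like this (the values are taken from the question above; the exact property syntax is an assumption):

# Jenkins node property, set per slave:
SRC_BASE=c:/rtc/ssd/src        (Windows slave)
SRC_BASE=/opt/rtc/ssd/src      (Linux slave)

# RTC build definition property, slave-agnostic:
SRC_HOME=${SRC_BASE}/proj-name

# RTC build definition load directory (Load Options):
team.scm.fetchDestination=${SRC_HOME}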

 

In the testing I've done so far, I don't see how to bridge this gap. I end up with this:

Using build definition configuration.
 
Substituted the following build property variables:
        team.scm.fetchDestination = ${SRC_HOME}   -->   team.scm.fetchDestination = ${SRC_BASE}/ccn/${buildResultUUID}
 
Fetching files from workspace "env.cfg.ccn.bld-repo-wksp".
RTC Checkout : Fetching files to fetch destination "/home/dcbuilder/workspace/env.cfg.ccn.bld/${SRC_BASE}/ccn/${buildResultUUID}" ...
RTC Checkout : Fetching Completed

 

Notice how SRC_BASE is not coming across from Jenkins, specifically from the node configuration (and buildResultUUID isn't being resolved either). I have already set up the recommended "build parameters" and some additional ones, but the problem is that the build definition can't access those node-level properties when they are needed for the "load directory".

 

I put together a small shell script to execute which just echoes various values, and I get this:

BUILD_TIMESTAMP = 20160502-142857386
BUILD_NUMBER = 136
BUILD_DISPLAY_NAME = #136
JOB_NAME = env.cfg.ccn.bld
BUILD_TAG = jenkins-env.cfg.ccn.bld-136
ANT_HOME = /opt/apache-ant-1.9.6
JAVA_HOME = /opt/jdk1.7.0_79
/opt/jdk1.7.0_79/bin/java
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
SRC_BASE = /opt/rtc/ssd/src
PKG_BASE = /opt/rtc/spindle/pkg
CCN_SRC_HOME = /opt/rtc/ssd/src/ccn/_eoT2sRCTEeakJpQrbzYzDQ
buildResultUUID = _eoT2sRCTEeakJpQrbzYzDQ
buildDefinitionId = env.cfg.ccn.bld-def
buildLabel =
repositoryAddress: https://jazzcm.dcsr.site:9445/ccm/
buildResultUUID: _eoT2sRCTEeakJpQrbzYzDQ
RTCBuildResultUUID: _eoT2sRCTEeakJpQrbzYzDQ
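
For reference, a minimal sketch of the kind of echo script described above (the original script is not shown in the post; the variable names are reconstructed from the output):

#!/bin/sh
# Print the Jenkins- and RTC-provided values visible to the build environment.
echo "BUILD_TIMESTAMP = ${BUILD_TIMESTAMP}"
echo "BUILD_NUMBER = ${BUILD_NUMBER}"
echo "BUILD_DISPLAY_NAME = ${BUILD_DISPLAY_NAME}"
echo "JOB_NAME = ${JOB_NAME}"
echo "BUILD_TAG = ${BUILD_TAG}"
echo "ANT_HOME = ${ANT_HOME}"
echo "JAVA_HOME = ${JAVA_HOME}"
which java
java -version
echo "SRC_BASE = ${SRC_BASE}"
echo "PKG_BASE = ${PKG_BASE}"
echo "CCN_SRC_HOME = ${CCN_SRC_HOME}"
echo "buildResultUUID = ${buildResultUUID}"
echo "buildDefinitionId = ${buildDefinitionId}"
echo "buildLabel = ${buildLabel}"
echo "repositoryAddress: ${repositoryAddress}"
echo "buildResultUUID: ${buildResultUUID}"
echo "RTCBuildResultUUID: ${RTCBuildResultUUID}"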

 

So the environment that gets instantiated seems to have access to these properties, just not the build definition. Can I resolve this, or what am I doing wrong with this setup? Is there another way I should try to do this?

2 votes

Comments

 I would be interested in any experiences as well. 


I set up Jenkins and used a relative path, so the load location was a separate folder within the job. I don't use the build properties. It would be good to get some feedback from others on how they handle this setup.

I have heard from a customer that they have issues with multiple slaves. They trigger many builds on the same stream/build definition, which gets their build repository workspace into race conditions, e.g. parallel accepts on different slaves.

It would be great if people could share their experience.

It might also be a good idea to use the Jenkins forums to get other users' perspectives.

Thanks, Ralph; as stated below, I think I can work with using the relative path Jenkins sets.


Accepted answer

Hi Rob,

The "node property" that is set in Jenkins is not available in RTC build definition.

When using a relative path in the build definition's load directory, it is resolved based on the Jenkins job's workspace directory. I believe this should help in your scenario.

Thanks,
Sridevi
Rob Leach selected this answer as the correct answer

1 vote

Comments

Yes, now that I think about your answer, I can understand and appreciate the principle of the solution. It's probably less ideal than I was hoping for, but certainly workable. I'll just need to adjust how we set up the build location on each node using the relative location that Jenkins sets. Thanks!

One thing to note: the relative path is relative to the build job name. So in my situation, on a Linux slave host the starting point is /home/dcbuilder/workspace/<build-job-name>


I was hoping to have a common starting point. Given the above discovery, I could use ../ in the path I define, which would establish a common location across all build jobs. More to digest....
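
To illustrate how that would resolve on a Linux slave (the "common" folder name below is an assumption, not from the thread):

# Load directory in the build definition:   ../common/ccn
# Jenkins job workspace on the slave:       /home/dcbuilder/workspace/env.cfg.ccn.bld
# Effective fetch destination:              /home/dcbuilder/workspace/common/ccn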

 Okay, another piece of information. You can actually influence the working directory location. When configuring the node (slave), the "Remote root directory" setting will determine the root working directory for the configured slave.

So this can be changed to whatever you like. Expanding the help for this configurable setting will provide a lot more details.
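
As an illustration only (the /opt/rtc value is an assumption, not from the thread):

# Node (slave) "Remote root directory":   /opt/rtc
# Default job workspace on that slave:    /opt/rtc/workspace/<build-job-name>
# A relative load directory then resolves under that workspace.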

1 vote

Thanks for sharing some useful setup instructions, Rob!


One other answer

Another problem I've found with the Jenkins integration is that RTC build definition properties with a dot in their name (like team.scm.fetchDestination) are not passed to Jenkins (or at least not correctly). So I always need to set up a new property on the build definition, like teamScmFetchDestination=${team.scm.fetchDestination}, in order to make it work properly.
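
A sketch of that workaround in use (the echo step below is an assumption; only the property rename itself comes from this answer):

# RTC build definition property:
teamScmFetchDestination=${team.scm.fetchDestination}

# Jenkins shell build step on the slave:
echo "Fetch destination: ${teamScmFetchDestination}"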

Regarding paths, I usually work in one of two ways. On the build definition I set "." as the load folder, since the repository workspace is always loaded into the Jenkins job's workspace (unless a different one is defined in the job, so you could probably just use the previously mentioned property, but that seems a little too confusing to me). On the other hand, if I need platform-specific paths, I usually create multiple build engines pointing at the same Jenkins server, each containing the properties for its environment, and then associate the build definition with the appropriate one. This works if you have strictly separate build definitions for the different environments; otherwise you can use labels in Jenkins to achieve similar behavior.

Michele.

0 votes

Comments

I think the issue with properties that have a period in them is that Jenkins (or RTC) passes those properties to the environment (shell) on the build host, and I'm certain that in *NIX environments, at least, variable names can't contain periods. So that's probably the issue you're running up against.
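
A quick shell illustration of that restriction (the commands and paths here are assumptions, not from the post):

$ team.scm.fetchDestination=/tmp/src
bash: team.scm.fetchDestination=/tmp/src: command not found
$ team_scm_fetchDestination=/tmp/src      # underscores are legal in variable names
$ echo "$team_scm_fetchDestination"
/tmp/src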


I'm going to try relative paths based on the Jenkins working directory. That should work okay.

Hi Michele,

For properties with a dot, can you try replacing "." with "_" when accessing them from the environment variables at the Jenkins end?

Thanks,
Sridevi

@Rob yes, you're probably right, I never thought about it.

@Sridevi Unfortunately I cannot do anything on the Jenkins side, as the properties are not set there at all, so the only way is to manage it on the RTC side.

I believe that when the build properties from RTC are copied to environment variables at the Jenkins end, the "." is replaced with a "_". So to access the "team.scm.fetchDestination" property set in the RTC build definition, you might have to look up the environment variable "team_scm_fetchDestination" on the Jenkins side. When I get a chance, I will test it and post a comment here.

Thanks,
Sridevi

Question asked: May 02 '16, 5:22 p.m.

Question was seen: 4,528 times

Last updated: May 04 '16, 7:21 a.m.
