Integrating RTC with Jenkins using multiple slaves
I am trying to integrate RTC builds using Jenkins, but there is one specific issue I'm not able to figure out: how to set up a generic, cross-platform "load directory".
Our setup is as follows. We have the Jenkins software installed and set up as a master on a common tools server; the master is configured so that it does not perform any builds. We then have several slaves that are a mix of Linux and Windows build hosts. We're running Jenkins 1.656 and Jazz 6.0.1 fix004; the Jenkins RTC plugin is 1.2.0.
Our developers have gotten used to using the RTC Eclipse client for all their activity, including submitting build requests (we're currently set up to use RTC with the Rational Build Forge Agent model, which we're trying to move off of because it is too limited for our needs). So we'd like to keep using the RTC build definition as the primary interface for interacting with "builds". What I can't figure out is how to coordinate the "load directory" between the RTC build definition and each respective Jenkins slave.
Most of our builds are written such that we "should" be able to build on any platform. So it stands to reason that the slave needs to set the "load directory"; the slave is the authority for its own file system. If I have a Windows slave, the "load directory" might be c:/rtc/ssd/src, and on Linux it might be /opt/rtc/ssd/src. Now let's say that in Jenkins I've set up a "node property" called SRC_BASE which sets the appropriate value depending on the slave. It stands to reason that in RTC I need to consume that property when setting up my "Load Options" in the build definition. So, again, let's say I set a new property SRC_HOME which is set to "${SRC_BASE}/proj-name".
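To make the intent concrete, here is a sketch in shell (SRC_BASE, SRC_HOME, and "proj-name" are the hypothetical names from above, supplied by the node configuration rather than by RTC):

    #!/bin/sh
    # Sketch only: SRC_BASE would be injected per slave as a Jenkins node
    # property (c:/rtc/ssd/src on Windows, /opt/rtc/ssd/src on Linux).
    SRC_BASE="${SRC_BASE:-/opt/rtc/ssd/src}"  # fall back to the Linux example
    SRC_HOME="${SRC_BASE}/proj-name"          # per-project isolation
    echo "Resolved SRC_HOME: ${SRC_HOME}"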
Notice that the build definition doesn't care what the actual base is; it just appends to it so that each project is isolated uniquely on whichever slave it ends up running on.
The testing I've done so far doesn't show me how to bridge this gap. I end up with this:

    Using build definition configuration.
    Substituted the following build property variables:
    team.scm.fetchDestination = ${SRC_HOME} --> team.scm.fetchDestination = ${SRC_BASE}/ccn/${buildResultUUID}
    Fetching files from workspace "env.cfg.ccn.bld-repo-wksp".
    RTC Checkout : Fetching files to fetch destination "/home/dcbuilder/workspace/env.cfg.ccn.bld/${SRC_BASE}/ccn/${buildResultUUID}" ...
    RTC Checkout : Fetching Completed
Notice how SRC_BASE is not coming across from Jenkins, specifically from the node configuration (and buildResultUUID isn't substituted either). I have already set up the recommended "build parameters" plus some additional ones, but the problem is that the build definition can't access those node-level properties when they're needed for the "load directory".
I put together a small shell script to execute, which just echoes various stuff, and I get this:

    BUILD_TIMESTAMP = 20160502-142857386
    BUILD_NUMBER = 136
    BUILD_DISPLAY_NAME = #136
    JOB_NAME = env.cfg.ccn.bld
    BUILD_TAG = jenkins-env.cfg.ccn.bld-136
    ANT_HOME = /opt/apache-ant-1.9.6
    JAVA_HOME = /opt/jdk1.7.0_79
    /opt/jdk1.7.0_79/bin/java
    java version "1.7.0_79"
    Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
    Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
    SRC_BASE = /opt/rtc/ssd/src
    PKG_BASE = /opt/rtc/spindle/pkg
    CCN_SRC_HOME = /opt/rtc/ssd/src/ccn/_eoT2sRCTEeakJpQrbzYzDQ
    buildResultUUID = _eoT2sRCTEeakJpQrbzYzDQ
    buildDefinitionId = env.cfg.ccn.bld-def
    buildLabel =
    repositoryAddress: https://jazzcm.dcsr.site:9445/ccm/
    buildResultUUID: _eoT2sRCTEeakJpQrbzYzDQ
    RTCBuildResultUUID: _eoT2sRCTEeakJpQrbzYzDQ
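The script is essentially just a series of echo statements along these lines (a reconstructed sketch based on the output above, not the exact script):

    #!/bin/sh
    # Diagnostic sketch: dump the variables Jenkins exposes to a build step.
    echo "BUILD_TIMESTAMP = ${BUILD_TIMESTAMP}"
    echo "BUILD_NUMBER = ${BUILD_NUMBER}"
    echo "JOB_NAME = ${JOB_NAME}"
    echo "JAVA_HOME = ${JAVA_HOME}"
    "${JAVA_HOME}/bin/java" -version
    echo "SRC_BASE = ${SRC_BASE}"
    echo "CCN_SRC_HOME = ${CCN_SRC_HOME}"
    echo "buildResultUUID = ${buildResultUUID}"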
So the environment that gets instantiated seems to have access to these properties; the build definition just doesn't. Can I resolve this, or am I doing something wrong with this setup? Is there another way I should try to do this?
Accepted answer
Hi Rob,
The "node property" that is set in Jenkins is not available in RTC build definition. When using a relative path in the build definition's load directory, it is resolved based on the Jenkins job's workspace directory. I believe this should help in your scenario. Thanks, Sridevi Rob Leach selected this answer as the correct answer
Comments
Rob Leach
commented May 03 '16, 10:09 a.m.
Yes, now that I think about your answer, I understand and appreciate the principle of the solution. It's probably less ideal than I was hoping for, but certainly workable. I'll just need to adjust how we set up the build location on each node using the relative location that Jenkins sets. Thanks!
Rob Leach
commented May 03 '16, 11:46 a.m.
One thing to note: the relative path is resolved under a directory named after the build job. So, in my situation on a Linux slave host, the starting point is /home/dcbuilder/workspace/<build-job-name>
I was hoping to have a common starting point. Given the above discovery, I could use ../ in the path I define, which would establish a common location across all build jobs. More to digest...
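For instance (a sketch; "common-src" is a hypothetical shared folder name), a load directory of "../common-src" would resolve to a sibling of the per-job folders:

    # ../common-src relative to /home/dcbuilder/workspace/<build-job-name>
    # resolves to /home/dcbuilder/workspace/common-src, shared by every
    # job on that slave.
    echo "Common load root: $(cd "${WORKSPACE}/.." && pwd)/common-src"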
Rob Leach
commented May 03 '16, 2:14 p.m.
Okay, another piece of information: you can actually influence the working directory location. When configuring the node (slave), the "Remote root directory" setting determines the root working directory for that slave.
So this can be changed to whatever you like. Expanding the help for this setting provides a lot more detail.
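A quick way to confirm the effect (a sketch; /opt/jenkins is just a hypothetical value for "Remote root directory"):

    # With "Remote root directory" set to /opt/jenkins on the slave, a
    # build step there should report a workspace under the new root,
    # e.g. /opt/jenkins/workspace/<job-name>.
    echo "Workspace on this node: ${WORKSPACE}"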
Thanks for sharing some useful setup instructions, Rob!
One other answer
Another problem I've found with the Jenkins integration is that RTC build definition properties with a dot in the name (like team.scm.fetchDestination) are not passed to Jenkins (or at least not passed correctly). So I always need to set up a new property on the build definition, like teamScmFetchDestination=${team.scm.fetchDestination}, in order to make it work properly.
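In shell terms, the workaround looks like this (a sketch; teamScmFetchDestination is the dot-free alias defined on the build definition as described above):

    #!/bin/sh
    # The dotted RTC property doesn't come across usably, but the
    # dot-free alias defined on the build definition does.
    echo "Fetch destination: ${teamScmFetchDestination}"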
Regarding paths, I usually work in two ways. On the build definition I set "." as the load folder, since the repository workspace is always loaded into the Jenkins job's workspace (unless it's defined in the job, so you could probably just use the previously mentioned property, but that seems a bit too confusing to me). On the other hand, if I need a specific platform-dependent path, I usually create multiple build engines referring to the same Jenkins server, each containing the properties for its environment, and then associate the build definition with the appropriate one. This works if you have strictly separated build definitions for the different environments; otherwise, you can use labels in Jenkins to achieve similar behavior.

Michele.

Comments
Rob Leach
commented May 03 '16, 10:12 a.m.
Hi Michele, I think the issue with "properties" that have a period in them is that Jenkins (or RTC) passes those properties into the environment (shell) on the build host, and I'm certain that in *NIX environments, at least, variable names can't contain periods. So that's probably the issue you're running up against (see the sketch after this comment).
I'm going to try relative paths based on the Jenkins working directory. That should work okay.
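As a quick illustration of the naming restriction (a sketch; POSIX shells only allow letters, digits, and underscores in variable names):

    #!/bin/sh
    # A dotted name is not a legal shell variable name, so it can't be
    # exported into the build's environment as-is.
    name="team.scm.fetchDestination"
    case "$name" in
      *[!A-Za-z0-9_]*) echo "'$name' is NOT a legal shell variable name" ;;
      *)               echo "'$name' is a legal shell variable name" ;;
    esac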
Michele Pegoraro
commented May 04 '16, 6:10 a.m.
I believe that when the build properties from RTC are copied to environment variables on the Jenkins end, the "." is replaced with a "_". So to access the "team.scm.fetchDestination" property set in the RTC build definition, you might have to look up the environment variable "team_scm_fetchDestination" on the Jenkins side. When I get a chance, I will test it and post a comment here.
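If that mapping holds, the lookup on the Jenkins side would be something like (a sketch; untested, per the comment above):

    #!/bin/sh
    # Assumed mapping: '.' in the RTC property name becomes '_' in the
    # environment variable seen by the Jenkins build step.
    echo "team_scm_fetchDestination = ${team_scm_fetchDestination}"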
Comments
I would be interested in any experiences as well.
Thanks Ralph; as stated in my comments above, I think I can work with using the relative path Jenkins sets.