scm command line problems with multi-platform build


Michael Gray (86126) | asked Apr 12 '10, 1:10 p.m.
I'm working on getting a multi-platform build going that uses RTC
to drive Build Forge builds through the RTC-BF integration. The Build
Forge project in turn executes build steps in parallel across build machines
for various OSes (Windows, Linux, Solaris, AIX, zLinux, ...), some of
which involve the scm command line. I understand that the scm command
line as shipped doesn't support platforms other than Windows and Linux,
but following this tech note, I've got it mostly working:

http://www-01.ibm.com/support/docview.wss?uid=swg21417308

Using scm.sh in place of scm (the Linux binary) seems to work for the most
part, but startup is slow because a new JVM is launched for every scm
invocation, and on Solaris, which lacks an IBM JDK, the following
error is thrown each time scm.sh is invoked:

The IBM Class Sharing Adaptor will not work in this configuration.
You are not running on J9

The lscm script is supposed to address both of these issues by launching
a single JVM that acts as a server/daemon for subsequent scm command
line invocations, and it seems to work, except... once I run lscm on
one machine, it won't work from any of the others in my build farm.
While lscm is running on one system, attempts to run it on other
systems result in the following Java stack trace:

$ /opt/RTC-2.0.0.2/scmtools/eclipse/lscm -v
java.net.ConnectException: A remote host refused an attempted connect operation.
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:391)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:252)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:239)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:389)
at java.net.Socket.connect(Socket.java:556)
at java.net.Socket.connect(Socket.java:504)
at java.net.Socket.<init>(Socket.java:412)
at java.net.Socket.<init>(Socket.java:226)
at com.ibm.team.filesystem.cli.minimal.client.HttpSwitchingClient.negotiate(HttpSwitchingClient.java:88)
at com.ibm.team.filesystem.cli.minimal.client.FrontEndClient.run(FrontEndClient.java:299)
at com.ibm.team.filesystem.cli.minimal.client.FrontEndClient.run(FrontEndClient.java:125)
at com.ibm.team.filesystem.cli.minimal.client.FrontEndClient.main(FrontEndClient.java:236)

After some investigation, I found what I think is the smoking gun, but I
don't know what to do about it. Our *nix accounts share a common
NFS auto-mounted home directory, and in it I found that lscm
creates a subdirectory under ~/.jazz-scm/daemons which apparently holds
state information about the scm daemon process. The existence of
this information seems to confuse attempts to launch lscm on other
systems that share the home directory.
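A quick way to check that hypothesis is to look at whatever lscm has left under the shared home. The directory name comes from my observation above; the format of its contents is undocumented, so this only shows whether state from another machine is present:

```shell
# Exploratory check: list any daemon state lscm has written under the
# shared home directory. The ~/.jazz-scm/daemons path is as observed in
# the post; the meaning of the files inside is undocumented.
DAEMON_DIR="${HOME}/.jazz-scm/daemons"
if [ -d "${DAEMON_DIR}" ]; then
    ls -l "${DAEMON_DIR}"
else
    echo "no daemon state at ${DAEMON_DIR}"
fi
```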

Is there some way to direct lscm to store this state information either
in a local directory (/tmp?) or use a per-machine sub-directory in
~/.jazz-scm/daemons so we can have multiple lscm daemon
processes active for a single user (on different machines) at the
same time?

I noticed from some RTC work items that there seems to be a plan to
support the scm command line on more platforms eventually -- is
there any ETA on that?

3 answers



Michael Gray (86126) | answered Apr 12 '10, 1:51 p.m.
I noticed scm has a --config option:
    --config arg - Specify the location of the configuration directory.
I gather I could use that to point at a local config directory on
each build machine, but that takes away the benefit of having
a common, home-directory-based config for settings, cached scm login
credentials, etc. What I may really be looking for is a --daemon-config
option. I don't see one in the scm -help output; any ideas? ;-)
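For what it's worth, a per-host --config directory can be derived from the hostname so each build machine keeps its own daemon state. The ~/.jazz-scm-&lt;host&gt; naming below is my own convention, not anything documented:

```shell
# Hypothetical per-host config layout: one directory per build machine,
# keyed on hostname, so lscm daemon state is never shared over NFS.
HOST=$(hostname)
CONFIG="${HOME}/.jazz-scm-${HOST}"
mkdir -p "${CONFIG}"
# Invoke lscm against the per-host directory (shown with echo here,
# since lscm may not be on the PATH in this sketch):
echo lscm --config "${CONFIG}" status
```

The downside, as noted above, is that settings and cached credentials are no longer shared across machines.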

Rick Patterson (40148) | answered Apr 16 '10, 11:34 a.m.
JAZZ DEVELOPER
Hi

Just to report that we have exactly the same situation as described here. We are not in production yet, but we plan to have RTC clients on many UNIXes, as many as 20 or so, all sharing the same NFS-mounted home directory as documented here.

I have not tried the lscm script yet (I have only used scm.sh), but I'm glad to have read about this before running into it myself. We are currently working with RTC 2.0.0.2.

Michael Gray (86126) | answered Apr 16 '10, 1:21 p.m.
I haven't tried yet, but I think something like the following should work
as a stopgap solution:
    export JAZZ_SCM_CONFIG=/tmp/${USER}$$
    cp -rp ~/.jazz-scm ${JAZZ_SCM_CONFIG}
    lscm --config ${JAZZ_SCM_CONFIG} ...
I think a better solution, though, would be to store the contents
of .jazz-scm/daemons locally on the build machines rather than in the
caller's home directory, which may be shared across multiple
systems...
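A sketch of that idea (the paths and the copy-then-replace approach are my assumptions, not tested against lscm internals): seed a per-host local copy from the shared NFS home, then recreate the daemons subdirectory locally so machines don't see each other's daemon state:

```shell
# Sketch: per-host local config seeded from the shared NFS home, with
# the daemons subdirectory recreated on local disk. All paths here are
# assumptions for illustration.
LOCAL="/tmp/jazz-scm-${USER:-$(id -un)}-$(hostname)"
mkdir -p "${LOCAL}"
# Seed from the shared config if it exists, to keep settings and
# cached login credentials.
if [ -d "${HOME}/.jazz-scm" ]; then
    cp -rp "${HOME}/.jazz-scm/." "${LOCAL}/"
fi
# Drop any daemon state that came along from other machines.
rm -rf "${LOCAL}/daemons"
mkdir -p "${LOCAL}/daemons"
# Invoke lscm against the local copy (shown with echo in this sketch):
echo lscm --config "${LOCAL}" status
```

Unlike the $$-based stopgap above, this keeps one config per host rather than one per shell, so the daemon can be reused across invocations on the same machine.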
