When to accept incoming change set?


Anders Truelsen (16212020) | asked Dec 01 '10, 5:47 a.m.
Coming from ClearCase we're used to the notion of a recommended baseline. The fact that a baseline is recommended shows the team that it is safe to rebase to.

How do we achieve the same in RTC? That is, how do I tell whether an incoming change set is safe to accept?

Encouraging the team to test their work prior to delivery might help improve quality on our integration stream, but in real life (our version of it, anyway) errors do make it to the integration stream.

2 answers



Daniel Toczala (88211514) | answered Dec 01 '10, 9:10 a.m.
FORUM MODERATOR / JAZZ DEVELOPER
The issue is one of code quality and your philosophy behind using streams. Many people think of streams as a way to isolate your workspace from work and changes being done elsewhere in the development team. I like to think about it a bit more deeply, and consider WHY I am using streams. I often think of streams as being hierarchical (in RTC they don't need to be, they just exist). As you move up the hierarchy, from a team stream, to an integration stream, to a release candidate stream, you are moving up in confidence and software stability. There should be a testing hurdle that needs to be met to move any change "up" this hierarchy. So by the time a change makes it to the top of my hierarchy, it has been built and tested at least 3 or 4 times, and more likely about a dozen times.

So in my mind there are two ways to handle this issue.

Approach #1) Buyer beware - don't accept anything into your workspace unless it is part of a baseline from a higher level stream. People only create/deliver baselines to streams when code has been tested and validated. So only accept a new baseline (or snapshot) from your parent stream. This is great because you ensure that your workspace doesn't get a bunch of bad changes applied to it. The bad piece of this is that merges can become larger, since you are holding off change and applying it in rather large increments.
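In command-line terms, Approach #1 means flowing only validated baselines or snapshots down from the parent stream, rather than every outstanding change set. A rough sketch with the RTC SCM command line (the `lscm` subcommands are real, but the option names and the argument form vary between releases, so treat the exact flags and all names here as illustrative):

```
# Log in once, caching credentials under the repository nickname "myrepo"
# (server URL and user are placeholders)
lscm login -r https://jazz.example.com/ccm -n myrepo -u builder

# See which snapshots (whole-configuration baselines) exist on the
# parent stream, and pick one that has been built and validated
lscm list snapshots "Integration Stream" -r myrepo

# Accept only up to that validated baseline -- in the Eclipse client
# this is accepting the baseline from the Pending Changes view; the
# CLI argument form for naming a specific baseline is version-dependent
lscm accept -r myrepo <validated-baseline-id>
```

The trade-off Dan describes shows up directly here: the longer the gap between baselines you accept, the larger each accept becomes, and the larger the potential merge.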

Approach #2) Developers need to be disciplined - accept any incoming changes from your parent stream. Limit deliveries to the stream and have a thorough review process for changes being applied to the stream. BUILD QUALITY INTO YOUR DELIVERY PROCESS. Each delivery should include some form of testing documentation. It could be more formal with links to test plans, test cases, unit test results, and test execution results. It could be less formal with just some links to unit testing information.

Agile and Continuous Integration teams will resonate more with the second approach; it stresses surfacing issues/risks more quickly and addressing them immediately.
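At the command line, a disciplined Approach #2 cycle might look like this (again a sketch: the `lscm` subcommands are real, but the flags are illustrative and the build command is a placeholder for whatever your project uses):

```
# Take everything incoming from the parent stream -- frequent accepts
# keep each merge small
lscm accept -r myrepo

# Build and test locally so problems surface before anything flows
# back up (substitute your own build/test command)
mvn verify

# Deliver only after the change sets are associated with a reviewed
# work item, so the testing evidence travels with the change
lscm deliver -r myrepo
```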

Geoffrey Clemm (30.1k33035) | answered Dec 02 '10, 11:53 p.m.
FORUM ADMINISTRATOR / FORUM MODERATOR / JAZZ DEVELOPER
I agree with Dan ... a couple of additional comments.

Approach 1: One way to implement that is to introduce a "recommended" stream. When you have a configuration of the team stream that you want to recommend, just deliver that configuration to the "recommended" stream. A given user can then decide whether to accept changes from the recommended stream, or to accept changes from the team stream. (You always deliver to the team stream.)
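A sketch of that flow with the SCM command line (the stream and workspace names are placeholders, and the exact options vary by release, so treat the flags as illustrative):

```
# Team lead, once the team stream's configuration has been validated:
# snapshot it for the record, then deliver that configuration to the
# "recommended" stream
lscm create snapshot "Team Integration Workspace" -n "Validated" -r myrepo
lscm deliver -r myrepo -t "Recommended Stream"

# A cautious developer points their workspace at the recommended
# stream and accepts from there instead of from the team stream
lscm set flowtarget "My Workspace" "Recommended Stream" -r myrepo
lscm accept -r myrepo
```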

The advantage of accepting changes from the recommended stream is that you are probably getting safer, stabler code.

The advantage of accepting changes from the team stream is that you decrease the risk of needing to merge (when there are changes in the team stream that have not yet been recommended but that modify the same files you need to modify).

Approach 2: With 3.0, you can now enforce, via the process, that a user accepts all changes from a stream before delivering to that stream. That is a great way to help ensure that people aren't tossing untested configurations into the team stream.

Cheers,
Geoff

