How do we tell if the data returned by JazzMon is what we should expect, and are there descriptions of the different Async Tasks?


Michael Walker (99215201157) | asked Aug 16 '12, 5:33 p.m.
Hi,

I downloaded, configured, and ran the JazzMon tool. It was very easy to set up and run; it probably took longer to download than to start using. Very cool.

After running the tool for a while I gathered and analyzed the data and viewed the output files in Excel. A couple of questions:

1.  Is there any guidance on what a good target value is for these Web Services and Async Tasks? For example, if Async Task A has a 5-second average response time that is consistent across all the snapshots, how do we know 5 seconds is a good average for that task? Maybe it should be 2 seconds. I understand there could be other variables in play that affect the time, but a good starting point would be helpful (a rough consistency check is sketched after these questions).

2.  The PDF file included gives some description of the different Web Services, but what about the Async Tasks? Some are self-explanatory, but for others we have no idea how to relate them to RTC. While it's good to know whether the averages are consistent, it would also be good to know what they represent in the tool, so that if something does go wrong we know where to look.
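
As an aside, here is a rough sketch of how that consistency across snapshots could be quantified. The CSV layout, file name, and column names below are assumptions for illustration, not JazzMon's actual export format:

    # Rough sketch: check how steady each Async Task's average response time
    # is across snapshots. Assumes the eAvg data was exported to a CSV with
    # one row per task and one column per snapshot (hypothetical layout).
    import pandas as pd

    snaps = pd.read_csv("asyncTasks_eAvg.csv", index_col="task")

    mean_ms = snaps.mean(axis=1)        # average response time per task
    cv = snaps.std(axis=1) / mean_ms    # coefficient of variation per task

    # A low cv means the task is consistent between snapshots; it still says
    # nothing about whether a steady 5-second average is actually good.
    report = pd.DataFrame({"avg_ms": mean_ms, "cv": cv}).sort_values("cv")
    print(report)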

Comments
sam detweiler commented Aug 16 '12, 6:45 p.m. | edited Aug 17 '12, 10:11 a.m.

We are asking the same questions.

IBM provides two baseline sets of data to compare your data against:

200 users, 8 hours
200 users, 24 hours

But just having these numbers doesn't tell us much about them.

We see quite a bit of difference in the mix of services called. We really haven't paid attention to the async services list yet.

I just organized our data for our two servers using the eCnt, eAvg and eTot sheets (web service call counts, total elapsed times, and average elapsed times), captured every 15 minutes for a day. I sorted on the second time period (just before lunch), most called to least, and sliced off the calls that fall under 1000 per 15 minutes (keeping roughly one call per second and higher).

Then I took those into the total ranking (highest total time) and compared our elapsed times with the IBM elapsed times.

That leaves about 14 calls for each server, of which 8 are the same. 6-8 are source-code-service based, and the rest are repository services.
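
For anyone wanting to reproduce that slicing, a minimal sketch follows. It assumes the eCnt/eAvg/eTot sheets were exported to CSV files with one row per web service and one column per 15-minute interval; the file names and layout are hypothetical, not JazzMon's actual output:

    # Sketch of the workflow above: keep the busy services, rank by total
    # elapsed time, and compare our averages against an IBM baseline export.
    # All file names and the column layout are assumptions for illustration.
    import pandas as pd

    cnt = pd.read_csv("eCnt.csv", index_col="service")  # calls per interval
    avg = pd.read_csv("eAvg.csv", index_col="service")  # avg elapsed ms
    tot = pd.read_csv("eTot.csv", index_col="service")  # total elapsed ms

    # Sort on the second time period, most called to least, and keep only
    # services with at least 1000 calls per 15 minutes (~1 call/second).
    period = cnt.columns[1]
    busy = cnt[cnt[period] >= 1000].sort_values(period, ascending=False)

    # Rank the busy services by total elapsed time over the day.
    ranking = tot.loc[busy.index].sum(axis=1).sort_values(ascending=False)

    # Compare our average elapsed times against a (hypothetical) export of
    # the IBM 200-user baseline data.
    baseline = pd.read_csv("ibm_200user_eAvg.csv", index_col="service")
    compare = pd.DataFrame({
        "ours_ms": avg.loc[ranking.index].mean(axis=1),
        "ibm_ms": baseline.mean(axis=1).reindex(ranking.index),
    })
    print(compare.head(14))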


Chetna Warade commented Aug 16 '12, 10:13 p.m.
JAZZ DEVELOPER

How do we tell if the data returned by JazzMon is what we should expect? That question applies to any monitoring tool used for performance measurement, whether it is monitoring IOPS, network traffic, or system statistics like CPU and memory. What provides useful insight is how the data compares and contrasts against accepted baselines/benchmarks. User experience and system statistics are a few qualifiers that can be used to determine whether the time spent/average response times need to be improved further by fine-tuning or hardware upgrades.

The current state of the tool is analogous to a thermometer, which can measure temperature but cannot prescribe over-the-counter medicine or antibiotics if it is above acceptable thresholds. As the product matures, hopefully there will be better answers to these questions.
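
To make that baseline comparison concrete, here is a hedged sketch of flagging services that run well above a baseline; the file names and the 1.5x threshold are assumptions for illustration, not JazzMon or IBM conventions:

    # Sketch: flag services whose average elapsed time is well above an
    # accepted baseline (the thermometer reading, not the prescription).
    # File names and the 1.5x threshold are illustrative assumptions.
    import pandas as pd

    ours = pd.read_csv("our_eAvg.csv", index_col="service").mean(axis=1)
    base = pd.read_csv("baseline_eAvg.csv", index_col="service").mean(axis=1)

    ratio = (ours / base).dropna()
    for service, r in ratio[ratio > 1.5].sort_values(ascending=False).items():
        print(f"{service}: {r:.1f}x baseline")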


Michael Walker commented Aug 17 '12, 1:46 p.m.

I would think at least a value range should be provided for these metrics. DB2 performance tools provide these, and they're helpful to know. The data is only good if we know how to read it.

I see that in the Data folder of the zip file you provide .txt files of the output from the jazz.net repositories. Are these output values considered good for 200 users?


sam detweiler commented Aug 17 '12, 6:22 p.m.

Based on my view, I think the numbers are what they HAVE available, and the performance on jazz.net is almost acceptable.
