Out Of Memory Exception in large RTC SCM Merges


Harry Koehnemann (30125238) | asked Jul 10 '14, 12:58 p.m.
We are getting OOM exceptions when doing large merges with RTC streams. The merge will involve 40-50 components and several hundred (possibly 1000+) change sets. When successful, the merge takes ~10 seconds. When the OOM error occurs, the merge takes 30+ seconds and kills Eclipse. After relaunching, though, we find the merge actually completed successfully. We are using 4.0.3.

Also, the failure is somewhat intermittent. It happens frequently, but not all the time. Not sure if it is related to server load, but that is something we are investigating. The fact that the process takes only ~10 seconds makes me think it is not a server-side issue, as dashboards and plans seem to be more resource-intensive. The issue seems to be the Eclipse client running out of memory while rendering.

We have tried increasing the -Xmx and -Xms JVM settings up to 1G with no luck. I will attach the Eclipse core dump if I can figure out how...
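
For reference, here is roughly the relevant portion of our eclipse.ini (a minimal sketch; the surrounding options in a real install will differ, and the values shown are simply the ones we have been trying). Note that Eclipse only passes settings to the JVM if they appear after the -vmargs line:

    -vmargs
    -Xms1024m
    -Xmx1024m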

Thoughts?  Anyone seen this issue before?  

Comments
Harry Koehnemann commented Jul 10 '14, 1:01 p.m. | edited Jul 10 '14, 1:02 p.m.

How do we add an attachment on this forum? Thanks. We can bold, bullet, quote, and paste from Word, but can't attach a file??

One answer



Paul Ellis (1.3k613) | answered Jul 11 '14, 4:56 a.m.
JAZZ DEVELOPER
Hi Harry,

It sounds like you will probably require a PMR to investigate this, as the issue may go deeper than a quick forum Q&A, but usually the Eclipse log should let you know whether the OOM is server side or client side.
Everything you write implies it is client side, so it sounds like you're working in the right direction by using 1024m in the eclipse.ini as your max heap. You may need to increase this further, and ensure that other applications are not consuming all the memory and causing you to swap heavily.

If you could let us know the following information too, along with any pertinent errors from the log (with any confidential information obfuscated), then we might be able to assist you further:
  1. What operating system is in use?

  2. What is the amount of system memory on the client machine and free memory while the user is working?

  3. What is the load on that client machine?

  4. Have they increased the default amount of memory available to the operating system (the 3GB switch on XP)?

  5. Have they increased the -Xmx setting for the Eclipse client? (A quick way to verify the effective value is sketched below.)
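
For point 5, a quick way to sanity-check the heap value actually reaching a JVM (as opposed to what is written in eclipse.ini) is to print Runtime.maxMemory() from a small scratch class run with the same -Xmx argument; this is only an illustrative sketch, not an RTC tool. Inside Eclipse itself, the heap status indicator (Preferences > General > Show heap status) reports the same numbers.

    // HeapCheck.java - run with the same VM arguments as the Eclipse client,
    // e.g. "java -Xmx1024m HeapCheck", and compare the reported maximum.
    public class HeapCheck {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap available to this JVM: "
                    + (maxBytes / (1024 * 1024)) + " MB");
        }
    }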


We used to see this issue quite a lot on XP due to its memory restrictions, especially when moving from 3.x to 4.x. I assume your clients are at the same 4.0.3 version as your server?

Kind regards,
Paul

Comments
Harry Koehnemann commented Jul 11 '14, 7:39 p.m.

Thanks Paul. Here is the information for your questions:

1) OS - Windows 7
2) 8GB
3) ~80% utilized when running the RTC merge
5) Yes, to 1024m

I would love to post the Eclipse core file, but I don't see how to upload a file on this forum. Ideas?


sam detweiler commented Jul 11 '14, 8:07 p.m.

There is no file attachment support here.

You should probably open a PMR with support; they have upload support.

1024m may not be enough; I would double it.


Paul Ellis commented Jul 14 '14, 4:28 a.m.
JAZZ DEVELOPER

Ditto to what Sam says. As you're not on the restrictive XP, you should be able to give the client more of your 8GB of RAM.
If you're unhappy about jumping straight to 2048m, then increments between 1GB and 2GB should help you feel your way to the solution... I am afraid I have nothing more scientific for now. A sketch of the eclipse.ini change follows.
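
Concretely, the change is just the -Xmx line after -vmargs in eclipse.ini, for example from

    -Xmx1024m

to something like

    -Xmx1536m

or straight to -Xmx2048m as Sam suggests (these particular values are only illustrations; keep -Xms no larger than -Xmx).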

You might be able to use JTSMon to identify which services are heavily in use on the server, but for the client, I am afraid that if you cannot find any pertinent messages for us to work back from, I'd have to recommend you contact Support.
I am afraid that without a PMR we could not look at the core file.
