
Error in running SVN2RTC import

Using RTC 3.0.1.1, I am trying to import an SVN dump file of 33 GB. The importer fails with an out-of-memory error after the branch and user selection step. Can anyone help?

!ENTRY com.ibm.team.internal.repository.rcp.util.FileChannelUtil 2 900 2013-09-29 12:07:23.872
!MESSAGE Channel unexpectedly closed, suspect being interrupted
!STACK 0
java.lang.Throwable
    at com.ibm.team.internal.repository.rcp.util.FileChannelUtil.ensureOpen(FileChannelUtil.java:53)
    at com.ibm.team.internal.repository.rcp.util.FileChannelUtil.readUninterrupted(FileChannelUtil.java:196)
    at com.ibm.team.internal.repository.rcp.dbhm.BTreeHeap$FileHeapInputStream.read(BTreeHeap.java:434)
    at com.ibm.team.internal.repository.rcp.streams.UnsynchronizedBufferedInputStream.fillBuffer(UnsynchronizedBufferedInputStream.java:155)
    at com.ibm.team.internal.repository.rcp.streams.UnsynchronizedBufferedInputStream.read(UnsynchronizedBufferedInputStream.java:48)
    at java.io.DataInputStream.readBoolean(DataInputStream.java:246)
    at com.ibm.team.internal.repository.rcp.dbhm.DiskBackedHashMap.getTableEntry(DiskBackedHashMap.java:503)
    at com.ibm.team.internal.repository.rcp.dbhm.CachedDiskBackedHashMap.getTableEntry(CachedDiskBackedHashMap.java:126)
    at com.ibm.team.internal.repository.rcp.dbhm.DiskBackedHashMap.getEntry(DiskBackedHashMap.java:482)
    at com.ibm.team.internal.repository.rcp.dbhm.CachedDiskBackedHashMap.getEntry(CachedDiskBackedHashMap.java:109)
    at com.ibm.team.internal.repository.rcp.dbhm.DiskBackedHashMap.get(DiskBackedHashMap.java:155)
    at com.ibm.team.scm.client.importz.svn.internal.SVNRepositoryTree.getFolder(SVNRepositoryTree.java:129)
    at com.ibm.team.scm.client.importz.svn.internal.SVNRepositoryTree.getRoot(SVNRepositoryTree.java:362)
    at com.ibm.team.scm.client.importz.svn.internal.ui.SVNFolder2ProjectMappingPage$SVNRepositoryContentProvider.getElements(SVNFolder2ProjectMappingPage.java:147)
    at org.eclipse.jface.viewers.StructuredViewer.getRawChildren(StructuredViewer.java:959)
    at org.eclipse.jface.viewers.ColumnViewer.getRawChildren(ColumnViewer.java:703)
    at org.eclipse.jface.viewers.AbstractTreeViewer.getRawChildren(AbstractTreeViewer.java:1330)
    at org.eclipse.jface.viewers.TreeViewer.getRawChildren(TreeViewer.java:390)
   ---------------------------------
!ENTRY org.eclipse.core.jobs 4 2 2013-09-30 02:38:40.449
!MESSAGE An internal error occurred during: "Import from SVN".
!STACK 0
java.lang.OutOfMemoryError
    at java.lang.StringBuilder.<init>(StringBuilder.java:75)
    at com.ibm.team.filesystem.rcp.ui.internal.util.PathUtils.appendPath(PathUtils.java:58)
    at com.ibm.team.scm.client.importz.svn.internal.SVNRevisionTree.addCopy(SVNRevisionTree.java:434)
    at com.ibm.team.scm.client.importz.svn.internal.SVNRevisionTree.addCopy(SVNRevisionTree.java:435)
----------------------
!ENTRY org.eclipse.core.jobs 4 2 2013-09-30 02:38:40.464
!MESSAGE An internal error occurred during: "Loading News from 'Build Events for My Teams on ipsvm00155.swg.usma.ibm.com'".
!STACK 0
java.lang.OutOfMemoryError
    at sun.nio.ch.Net.localInetAddress(Native Method)
    at sun.nio.ch.Net.localAddress(Net.java:193)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:173)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:87)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:80)
--------------------------------------

One answer

 Hi Sridhar.

The SVN import process can consume a lot of Java heap space, especially for a large dump file. What are your heap settings ("-Xmx" in the eclipse.ini file)? Try increasing the heap size until you no longer get the OutOfMemoryError.

-Matt


Comments

Thanks Matt for your prompt reply. Below are the settings in the .ini file (1536m). Any idea how much I should increase this for a dump file of around 35 GB?
--launcher.XXMaxPermSize
1024m
-vmargs
-Xms1536m
-Xmx1536m

It can really vary depending on the dump file. I've seen situations where 8g or even 12g was needed. This of course requires a 64-bit machine and JVM, and sufficient RAM.


-Matt
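
For reference, a sketch of what the same block in eclipse.ini might look like with a larger heap. The 8g values below are only illustrative, assuming a 64-bit JVM and enough physical RAM; the right number depends on the dump file:

--launcher.XXMaxPermSize
1024m
-vmargs
-Xms8g
-Xmx8g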

https://jazz.net/library/article/650#Configuration_the_memory_usage_d

There's also the ability to configure how much memory the importer can use. Setting it to use less memory will negatively impact performance. So only limit memory usage if your machine doesn't have enough RAM.
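
If you want to double-check that the client JVM is actually picking up the new -Xmx value, here is a minimal sketch using only standard Java APIs (this class is not part of the RTC importer, just a quick verification tool):

// HeapCheck.java - prints the maximum heap size the running JVM will use.
// Launch it with the same -Xmx value you put in eclipse.ini to confirm it is applied.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}

For example, "java -Xmx8g HeapCheck" should print roughly 8192 MB (the exact figure can be slightly lower depending on the JVM).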

Question details

Question asked: Sep 30 '13, 3:05 a.m.

Question was seen: 3,970 times

Last updated: Sep 30 '13, 10:46 a.m.
