ERROR during build process
Hello, I have been trying to use a build definition to launch a build job.
After several attempts, I first had to adjust the heap size. This is the command I used:

    jbe -verbose -vm "C:\jdk1.5.0_22\bin\java" -repository https://<myhost>:<myport>/jazz/ -userId <user> -pass <password> -engineId <engine> -Xms1024m -Xmx8192m -server

Eventually I got this error:

    pre-build "com.ibm.team.build.jazzscm" in progress
    java.lang.IllegalStateException: 0
        at com.ibm.team.internal.repository.rcp.dbhm.BTreeHeap.doFree(BTreeHeap.java:341)
        at com.ibm.team.internal.repository.rcp.dbhm.BTreeHeap.free(BTreeHeap.java:283)
        at com.ibm.team.filesystem.client.internal.PersistentHeapManager$AutoClosingPersistentFileHeap.free(PersistentHeapManager.java:128)
        at com.ibm.team.internal.repository.rcp.dbhm.DiskBackedHashMap.freeObject(DiskBackedHashMap.java:406)
        at com.ibm.team.internal.repository.rcp.dbhm.DiskBackedHashMap$Entry.setValue(DiskBackedHashMap.java:972)
        at com.ibm.team.internal.repository.rcp.dbhm.CachedDiskBackedHashMap$CachedEntry.flush(CachedDiskBackedHashMap.java:390)
        at com.ibm.team.internal.repository.rcp.dbhm.CachedDiskBackedHashMap.flushEntry(CachedDiskBackedHashMap.java:197)
        at com.ibm.team.internal.repository.rcp.dbhm.CachedDiskBackedHashMap.flushCache(CachedDiskBackedHashMap.java:166)
        at com.ibm.team.internal.repository.rcp.dbhm.PersistentDiskBackedHashMap.persist(PersistentDiskBackedHashMap.java:145)
        at com.ibm.team.internal.repository.rcp.dbhm.PersistentDiskBackedHashMap.close(PersistentDiskBackedHashMap.java:139)
        at com.ibm.team.filesystem.client.internal.Store.close(Store.java:58)
        at com.ibm.team.filesystem.client.internal.SharingMetadata2.close(SharingMetadata2.java:1756)
        at com.ibm.team.filesystem.client.internal.MetadataChangeTracker.close(MetadataChangeTracker.java:312)
        at com.ibm.team.filesystem.client.internal.copyfileareas.CopyFileAreaStore.internalClose(CopyFileAreaStore.java:2755)
        at com.ibm.team.filesystem.client.internal.copyfileareas.CopyFileAreaManager.deregister(CopyFileAreaManager.java:253)
        at com.ibm.team.filesystem.client.internal.copyfileareas.CopyFileAreaManager.shutdown(CopyFileAreaManager.java:463)
        at com.ibm.team.filesystem.client.FileSystemCore.shutDown(FileSystemCore.java:98)
        at com.ibm.team.build.internal.engine.JazzScmPreBuildParticipant.preBuild(JazzScmPreBuildParticipant.java:204)
        at com.ibm.team.build.internal.engine.BuildLoop.invokePreBuildParticipants(BuildLoop.java:628)
        at com.ibm.team.build.internal.engine.BuildLoop$2.run(BuildLoop.java:466)
        at java.lang.Thread.run(Thread.java:595)
3 answers
Hi Enrico,
Which version of RTC are you using? Errors like this usually indicate that the metadata for the 'sandbox' (i.e. the loaded workspace on disk) has been corrupted. When this happens, try deleting the load directory manually. That assumes you are running with "Delete directory before loading" unchecked in the build definition; if not, try simply running the build again.

Regarding heap size, you may need to bump it up, particularly if you are using a Sun VM. However, the way you are specifying the options will have no effect: you need to add -vmargs between the program arguments and the VM arguments. The value you have for -Xmx is also very large, and -Xms can sometimes cause more problems than it solves (e.g. by delaying finalization). Try instead:

    jbe -verbose -vm "C:\jdk1.5.0_22\bin\java" -repository https://<myhost>:<myport>/jazz/ -userId <user> -pass <password> -engineId <engine> -vmargs -Xmx512m -server

Also note that we generally recommend running with the JDK included in the RTC Eclipse client. For more details, see: https://jazz.net/wiki/bin/view/Main/BuildFAQ#WhichJDK
Hi Nick,
thank you very much, it works now. I am working on version 2.0.0.2 iFix2, and I think both of the problems you highlighted were present. I removed the folders manually and passed the JVM options in the new way, and everything worked perfectly. Again, thank you very much for your support.

Best regards,
Enrico
Good to hear. Thanks for letting me know.