Is there a way to split the same RTC Java query results into batches for different threads or processes?
The requirement is a bit odd: the query is very simple and I cannot find a way to split the query itself, so I am looking at splitting the results as a possible option.
Here is the query: fetch all DEFECT work items with a processing status of CLASSIFIED (a custom attribute) within a specified PROJECTAREA. That by itself does not seem to offer a way of splitting the query across multiple entities.

So here is the use case/requirement: I need to implement parallelism in my batch program so that multiple sub-jobs can split and work on the results list from one RTC query. Ideally, if the query were more complex, I would split it so that each sub-job runs a part of it, but that is not possible here.

Here is what I am hoping is feasible and where I need help (sketched in code below):
- Run the query and determine the number of results available, say 3000 items, in some sorted order.
- Split the results among my parallel jobs, so if I have 3 jobs, each of them fetches 1000 results, with a flow such as this:
  - JOB1 - Rerun the query and fetch the first 1000 results (1 to 1000).
  - JOB2 - Rerun the query (or, if the results can be stored on the RTC side, reuse them) and fetch the next 1000 results (1001 to 2000).
  - JOB3 - Rerun the query (or, if the results can be stored on the RTC side, reuse them) and fetch the next 1000 results (2001 to 3000).

Is this possible? Is there a mechanism to store query results in a sorted order so that multiple entities can fetch different parts/regions of the results? I am a bit lost. Please help.
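To illustrate the split being described, here is a minimal sketch, assuming the query can report its total result count up front; the job count and the 3000-item total are just the example numbers from the question:

```java
// Sketch of the desired split: divide `total` results among `jobs` sub-jobs.
// Assumes the query returns results in the same sorted order on every rerun,
// so that index ranges are stable across jobs.
int total = 3000; // e.g. reported by the query
int jobs = 3;
int perJob = (total + jobs - 1) / jobs; // ceiling division

for (int job = 0; job < jobs; job++) {
    int from = job * perJob + 1;                  // 1-based start index
    int to = Math.min((job + 1) * perJob, total); // inclusive end index
    System.out.println("JOB" + (job + 1) + " fetches results " + from + " to " + to);
}
```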
One answer
Ralph Schoon (63.5k●3●36●46)
| answered Jun 28 '13, 3:00 a.m.
FORUM ADMINISTRATOR / FORUM MODERATOR / JAZZ DEVELOPER
I have described what I think is possible here: https://rsjazz.wordpress.com/2012/10/29/using-work-item-queris-for-automation/. You can probably get the unresolved result set and then pass the results in batches to other processes as well. That is all I know.
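For reference, a minimal sketch of how the unresolved result set for such a query might be obtained with the RTC plain Java client API, loosely along the lines of the linked post. The custom attribute ID "processing.status", the work item type ID "defect", and the surrounding setup (a logged-in ITeamRepository, an IProjectArea, a progress monitor) are assumptions for illustration, not details taken from this thread:

```java
import org.eclipse.core.runtime.IProgressMonitor;

import com.ibm.team.process.common.IProjectArea;
import com.ibm.team.repository.client.ITeamRepository;
import com.ibm.team.workitem.client.IAuditableClient;
import com.ibm.team.workitem.client.IQueryClient;
import com.ibm.team.workitem.client.IWorkItemClient;
import com.ibm.team.workitem.common.expression.AttributeExpression;
import com.ibm.team.workitem.common.expression.IQueryableAttribute;
import com.ibm.team.workitem.common.expression.IQueryableAttributeFactory;
import com.ibm.team.workitem.common.expression.QueryableAttributes;
import com.ibm.team.workitem.common.expression.Term;
import com.ibm.team.workitem.common.model.AttributeOperation;
import com.ibm.team.workitem.common.model.IWorkItem;
import com.ibm.team.workitem.common.query.IQueryResult;
import com.ibm.team.workitem.common.query.IResult;

public class DefectQuerySketch {

    /** Returns the unresolved result set for: type == defect AND processing status == CLASSIFIED. */
    public static IQueryResult<IResult> findClassifiedDefects(ITeamRepository repo,
            IProjectArea projectArea, IProgressMonitor monitor) throws Exception {

        IAuditableClient auditableClient =
                (IAuditableClient) repo.getClientLibrary(IAuditableClient.class);
        IWorkItemClient workItemClient =
                (IWorkItemClient) repo.getClientLibrary(IWorkItemClient.class);
        IQueryClient queryClient = workItemClient.getQueryClient();

        IQueryableAttributeFactory factory = QueryableAttributes.getFactory(IWorkItem.ITEM_TYPE);
        IQueryableAttribute type = factory.findAttribute(projectArea,
                IWorkItem.TYPE_PROPERTY, auditableClient, monitor);
        IQueryableAttribute status = factory.findAttribute(projectArea,
                "processing.status", auditableClient, monitor); // assumed custom attribute ID

        Term term = new Term(Term.Operator.AND);
        term.add(new AttributeExpression(type, AttributeOperation.EQUALS, "defect")); // assumed type ID
        // Assumes the custom attribute holds a string value; if it is an enumeration,
        // the value must be the literal's Identifier rather than a String.
        term.add(new AttributeExpression(status, AttributeOperation.EQUALS, "CLASSIFIED"));

        // Unresolved results contain only work item handles, which keeps the initial
        // fetch cheap and makes it easy to hand the handles out to sub-jobs in batches.
        IQueryResult<IResult> unresolved = queryClient.getExpressionResults(projectArea, term);
        unresolved.setLimit(Integer.MAX_VALUE); // lift the default result cap
        return unresolved;
    }
}
```

Because the unresolved result set only carries handles, it can be divided up before the sub-jobs resolve the full work items.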
Comments
mark owusu-ansah
commented Jun 28 '13, 11:04 a.m.
Ralph,
Ralph Schoon
commented Jun 28 '13, 11:17 a.m.
| edited Jun 28 '13, 11:18 a.m.
FORUM ADMINISTRATOR / FORUM MODERATOR / JAZZ DEVELOPER
Mark, have you looked at the post? The section "Process Paged Results" should explain how you can get paged results and how you could send each paged result subset over to some thread. I think I remember that you can paginate unresolved results as well.
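To make the paged idea concrete without depending on the exact paging API described in the post, here is a hedged sketch that approximates pages by chunking the unresolved result iterator manually and handing each chunk to a worker thread. It could be fed with the result of the hypothetical findClassifiedDefects(...) from the earlier sketch; the page size and thread count are arbitrary:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.eclipse.core.runtime.IProgressMonitor;

import com.ibm.team.workitem.common.query.IQueryResult;
import com.ibm.team.workitem.common.query.IResult;

public class PagedProcessingSketch {

    /** Chunks an unresolved result set into fixed-size "pages" and processes each page on a worker thread. */
    public static void processInPages(IQueryResult<IResult> unresolved, IProgressMonitor monitor)
            throws Exception {

        final int pageSize = 1000;
        ExecutorService pool = Executors.newFixedThreadPool(3);

        // The result set is iterated on this thread only; each completed page is copied
        // and submitted to the pool, so the workers never touch the iterator itself.
        List<IResult> page = new ArrayList<>(pageSize);
        while (unresolved.hasNext(monitor)) {
            page.add(unresolved.next(monitor));
            if (page.size() == pageSize || !unresolved.hasNext(monitor)) {
                final List<IResult> batch = new ArrayList<>(page);
                page.clear();
                pool.submit(() -> {
                    for (IResult result : batch) {
                        // result.getItem() is a work item handle; resolve it and do the real work here
                    }
                });
            }
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```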
Ralph Schoon
commented Jun 28 '13, 11:29 a.m.
FORUM ADMINISTRATOR / FORUM MODERATOR / JAZZ DEVELOPER
I will have to see if I can find the code and make it available for download; that will take a while. But the post shows the main things you need to know: you can get paged results that contain a number of items, and you can process each page independently.
mark owusu-ansah
commented Jun 28 '13, 11:51 a.m.
Thanks Ralph,
Ralph Schoon
commented Jun 28 '13, 11:59 a.m.
FORUM ADMINISTRATOR / FORUM MODERATOR / JAZZ DEVELOPER
I think what I did back then was pass each page to a separate thread. I have some code up here: https://hub.jazz.net/project/rschoon/Jazz%20In%20Flight in com.ibm.js.team.workitem.automation.examples. There is a class SynchronizeAttributesParallel, but it is pretty much a mess and all commented out. I wrote the code at a trade fair and had no time to consolidate it.
mark owusu-ansah
commented Sep 17 '13, 9:47 p.m.
So I am back to trying to get this to work. Increasingly I realize I need the parallelism for performance. I understand the scoped paged results better now, but they do not fit my use case so nicely. So, my use case:
mark owusu-ansah
commented Sep 17 '13, 9:51 p.m.
Rather long-winded, but I hope it makes sense. There are other, different approaches, but they are too sloppy. For example: from the master job, get all work item IDs, split the results, and write them to separate files, then send each job one filename (see the sketch below).
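For completeness, a rough sketch of that file-based alternative, assuming the master job has already collected all the work item IDs; the round-robin split, the file names, and the job count are arbitrary choices for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class SplitIdsToFiles {

    /** Splits the collected work item IDs round-robin into one file per sub-job. */
    public static List<Path> split(List<Integer> workItemIds, int jobs) throws IOException {
        List<List<String>> buckets = new ArrayList<>();
        for (int i = 0; i < jobs; i++) {
            buckets.add(new ArrayList<>());
        }
        for (int i = 0; i < workItemIds.size(); i++) {
            buckets.get(i % jobs).add(String.valueOf(workItemIds.get(i)));
        }

        List<Path> files = new ArrayList<>();
        for (int i = 0; i < jobs; i++) {
            Path file = Paths.get("workitems-job" + (i + 1) + ".txt");
            Files.write(file, buckets.get(i)); // one work item ID per line
            files.add(file);
        }
        return files;
    }
}
```

Each sub-job would then be started with one of the returned filenames and process only the IDs listed in its file.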
Sorry, I have no more information to provide. I used the pages to do work in parallel threads and it just worked well for me. Otherwise you would have to iterate the unresolved results and pass a collection of those entries to the thread to work on them in parallel.
showing 5 of 8 comments