How does RTC derive the "Expected Story Points" in the Progress bar in a plan view?
In 6.0.2 we have several agile teams sharing a PA, with each team having its own team area. Team A has 120 points planned for the current sprint. The progress report shown when hovering over their Sprint Backlog plan progress bar says "Expected story points" is 70 and that they are behind by 27.72 points. They have completed 42/120 points.
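As a side note, the numbers suggest the hover rounds the expected value: being behind by 27.72 points with 42 completed implies an underlying expected value of about 69.72, displayed as 70. A quick sanity check of my own arithmetic (not an RTC API, just the figures from above):

```python
# Figures reported by the Team A hover tooltip.
completed = 42
behind_by = 27.72

# "Behind by" should be expected minus completed, so the
# underlying expected value can be reconstructed from it.
implied_expected = completed + behind_by

print(round(implied_expected, 2))  # 69.72
print(round(implied_expected))     # 70, matching the displayed value
```

The fractional result hints that "expected" is computed from something continuous (e.g. elapsed time) rather than from whole completed items, but that is only my inference.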
Team B uses the exact same Ranked List view and the same Sprint Backlog plan type. They have 68 points planned for this sprint, but the hover says they have 0 expected story points. Because there are no expected points, their progress bar shows green when it should really be mostly red: they have completed only 6 points so far.
The differences between the teams, as far as I can see, are that Team A has estimated 100% of their items and uses estimated Features as parent items for their stories (the Features are ranked), while Team B has estimated 88% of their sprint backlog, does not use Features as parents, and ranks their stories directly.
The question is: how does RTC come up with the "expected" story points for the sprint? Is it based on the rate at which stories are getting closed?
I know that the progress calculation is:

RealTimeProgress = RealTimeDone / (RealTimeDone + RealTimeLeft)

where RealTimeDone is the work on resolved items and RealTimeLeft is the remaining estimate on unresolved items.
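To make the formula concrete, here is a minimal sketch of that ratio, plus my own hypothesis for "expected" points as a time-proportional share of the plan. The `expected_points` function is purely an assumption on my part, not documented RTC behavior:

```python
def real_time_progress(done: float, left: float) -> float:
    """RealTimeProgress = RealTimeDone / (RealTimeDone + RealTimeLeft).

    done: points on resolved items; left: remaining estimate on
    unresolved items. Returns 0.0 when nothing is planned.
    """
    total = done + left
    return done / total if total else 0.0


def expected_points(planned: float, elapsed_days: float, sprint_days: float) -> float:
    """Hypothetical: expected points grow linearly with sprint time.

    This is my guess at how a fractional value like 69.72 could arise;
    RTC's actual mechanism is exactly what I am asking about.
    """
    return planned * elapsed_days / sprint_days


# Team A's numbers: 42 of 120 points done, so 78 points left.
print(real_time_progress(42, 78))   # 0.35
```

Under that assumption, Team A's 69.72 expected points out of 120 planned would correspond to roughly 58% of the sprint having elapsed, but I cannot confirm that is what RTC does.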
But this doesn't explain why one team shows an "Expected" number of points while the other shows none, when both teams have planned points and are progressing through the same sprint. I have searched this forum, IBM Support, and the IBM Tech Notes without finding an explanation. I would be very grateful for any insight into this mechanism.