Why does rich hover in RQM repeatedly ask for OAuth authentication?
Hi,
We have made a rich hover implementation in a system (let's call it X). To view this web page, the user has to be logged in using Basic authentication.
In RTC, the rich hover works fine. When hovering over the link, I get a Basic authentication login prompt (if I'm not already logged in to the other system), and after logging in I can see the small and large preview pages.
However, in RQM we have a problem. When hovering over the link, we get the text "Log in to X to view this content". If I click on Log in, I get the Basic authentication prompt and am then sent to the OAuth authentication page for X.
After authorizing the connection, this page closes and I hover over the link again. This time I'm sent directly to the OAuth authentication page for X. I don't need to do anything here; I just close the page, and after that the hover works. If I have more than one link, this second authorization page shows up once for each link.
So first of all, why is OAuth involved in this at all? As far as I can see, the rich hover should be strictly client-side; at least it works that way in RTC.
The fetching of the compact rendering XML (using the application/x-jazz-compact-rendering Accept header) is, of course, done server-side, but that resource is not password-protected, so no OAuth should be necessary there either.
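For reference, that hover fetch boils down to a plain GET asking for the compact rendering media type. A minimal sketch (not the actual RQM/RTC code; the URI and function name are made up):

```python
# Minimal sketch of the rich-hover fetch: a GET requesting the
# OSLC compact rendering representation of a resource.
import urllib.request

COMPACT_RENDERING_TYPE = "application/x-jazz-compact-rendering"

def build_compact_rendering_request(resource_uri: str) -> urllib.request.Request:
    """Build the GET request a Jazz client would issue for a hover preview."""
    req = urllib.request.Request(resource_uri)
    req.add_header("Accept", COMPACT_RENDERING_TYPE)
    return req

req = build_compact_rendering_request("https://x.example.com/defects/123")
print(req.get_full_url(), req.get_header("Accept"))
```

Since that representation is not password-protected on our side, this request should succeed without any credentials at all.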
Second: if RQM requires some server-side communication using OAuth, why do I get the second authorization page that does nothing?
We have tried using both Firefox 3.6 and IE 8 and the behavior is the same.
Best regards,
Mattias
13 answers
The error message is
"Error: Unable to load /qm/proxy?uri=<URI>%3Foslc_cm.properties%3Dcalm%253AaffectsExecutionResult status:401"
So RQM tries to GET a defect and, as far as I can see, it does not send any Authorization header.
But wait a second here. You are talking about being logged in to the external system. Is this call perhaps made using AJAX directly in the web browser? I had been thinking this was a server-side call. If so, at least I can change where I look. It's still a bit strange in that case that no Authorization header or JSESSIONID cookie is sent with this call, but at least I know where to continue the investigation. :)
Unfortunately I still don't really get it.
If I try to save the defect, I get the error message above. But if I access
<URI>?oslc_cm.properties=calm%3AaffectsExecutionResult in the web browser (specifying application/json as the Accept header), I get a JSON representation of the defect in that system. This means that I am logged in to that system, yet I still get the 401 error if I save the execution result.
If I understand correctly, "/qm/proxy" is used because AJAX does not allow the browser to send requests to another server (at least not over HTTPS)? But if that's true, I suppose the actual call to the external system is made server-side after all? And if so, the Authorization header or JSESSIONID cookie has to be forwarded so that the external system knows I am logged in.
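My understanding of that indirection can be sketched like this (host and function names are made up; only the URI encoding is taken from the error message):

```python
# Sketch of the /qm/proxy indirection: browser script cannot call the
# external server directly (same-origin restriction), so it asks its own
# Jazz server to fetch the external URI, which is passed percent-encoded
# in the query string.
from urllib.parse import quote

def proxied_uri(jazz_base: str, external_uri: str) -> str:
    return f"{jazz_base}/qm/proxy?uri={quote(external_uri, safe='')}"

print(proxied_uri(
    "https://jazz.example.com",
    "https://x.example.com/defect/42?oslc_cm.properties=calm%3AaffectsExecutionResult",
))
```

The double encoding this produces (%3F, %3D, %253A) matches the shape of the URI in the 401 error message above.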
Maybe this can give more information:
If I connect an existing defect to the execution result, I see a request to the external system with the following headers:
accept:application/x-oslc-compact+xml, application/x-jazz-compact-rendering; q=0.5
host:<host>
connection:Keep-Alive
user-agent:JazzHttpClient
Note that the x-oslc-compact+xml documents are not password-protected, so this works fine.
I then try to save, and the request to the external system has the following headers, which generate the error message:
user-agent:Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 (.NET CLR 3.5.30729)
accept:application/json
accept-language:en-us,sv-se;q=0.8,sv;q=0.5,en;q=0.3
accept-encoding:gzip,deflate
accept-charset:ISO-8859-1,utf-8;q=0.7,*;q=0.7
content-type:application/x-www-form-urlencoded
x-requested-with:XMLHttpRequest
referer:https://<jazz>/qm/web/console/MHDev4
content-length:0
host:<host>
connection:Keep-Alive
If I instead click on the link, the request is sent with these headers (JSESSIONID and Authorization masked):
host:<host>
user-agent:Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 (.NET CLR 3.5.30729)
accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
accept-language:en-us,sv-se;q=0.8,sv;q=0.5,en;q=0.3
accept-encoding:gzip,deflate
accept-charset:ISO-8859-1,utf-8;q=0.7,*;q=0.7
keep-alive:115
connection:keep-alive
referer:https://<jazz>/qm/web/console/MHDev4
cookie:JSESSIONID=XXXXXXXXXXXXXXXXXXXXXXXX; WLOBJECTTYPE=Eriref
authorization:Basic xxxxxxxxxxxxxxxxxxx
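To make the difference explicit, here is a small sketch comparing the two captured header sets (values shortened; only the headers relevant to authentication are kept):

```python
# Failing save via /qm/proxy versus working direct click on the link.
failing = {
    "accept": "application/json",
    "content-type": "application/x-www-form-urlencoded",
    "x-requested-with": "XMLHttpRequest",
}
working = {
    "accept": "text/html,application/xhtml+xml,...",
    "cookie": "JSESSIONID=...",
    "authorization": "Basic ...",
}

missing = sorted(set(working) - set(failing))
print(missing)  # ['authorization', 'cookie'] -- the credentials never reach the external system
```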
I think I have found the reason why RQM doesn't send the OAuth headers.
When RQM made the unauthorized request, we answered with a 401 Unauthorized response, but we did not send a WWW-Authenticate header back. As a result, RQM just popped up an Unauthorized dialog.
After adding this header, RQM actually pops up the OAuth authorization page.
So now I think I have all the information required to respond to the GET and POST/PUT messages :)
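The fix on the X side can be sketched as a minimal WSGI handler (the realm value and response bodies are illustrative, not our real implementation):

```python
# Answer unauthenticated requests with 401 *and* a WWW-Authenticate header,
# so the Jazz consumer knows to start the OAuth flow instead of just
# showing a plain "Unauthorized" dialog.
def app(environ, start_response):
    if "HTTP_AUTHORIZATION" not in environ:
        start_response("401 Unauthorized", [
            # Without this header, RQM has no way to know which
            # authentication scheme to initiate.
            ("WWW-Authenticate", 'OAuth realm="https://x.example.com"'),
            ("Content-Type", "text/plain"),
        ])
        return [b"Authentication required"]
    start_response("200 OK", [("Content-Type", "application/json")])
    return [b"{}"]
```

The key point is only the header on the 401 response; once it is present, RQM pops up the OAuth authorization page as expected.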