Bug 587528 - HTTPS + HTTP proxy + connection problem = infinite wait
Status: RESOLVED FIXED
Product: libsoup
Classification: Core
Component: HTTP Transport
Version: 2.27.x
Hardware: Other, OS: All
Priority: Normal, Severity: normal
Target Milestone: ---
Assigned To: libsoup-maint@gnome.bugs
QA Contact: libsoup-maint@gnome.bugs
Depends on:
Blocks:
 
 
Reported: 2009-07-01 15:33 UTC by Daniel LACROIX
Modified: 2009-12-19 14:19 UTC
See Also:
GNOME target: ---
GNOME version: 2.25/2.26


Attachments
Patch to correct the problem (1.54 KB, patch)
2009-07-01 15:41 UTC, Daniel LACROIX (rejected)
Patch to correct the problem (2.85 KB, patch)
2009-07-06 09:35 UTC, Daniel LACROIX
Program to help reproducing the bug (3.42 KB, text/plain)
2009-07-08 14:51 UTC, Daniel LACROIX

Description Daniel LACROIX 2009-07-01 15:33:32 UTC
Please describe the problem:
If a proxy is set up with libsoup and HTTPS is used, libsoup uses HTTP tunneling via the HTTP CONNECT method.

The problem is that if the proxy fails to connect (by returning HTTP 503, or by closing the connection before sending any HTTP response to the CONNECT), the waiting request never returns.

As a consequence, for an application using the GStreamer souphttpsrc element or libsoup directly, if a proxy is set up for HTTPS and the connection fails, the application stalls.

Steps to reproduce:
1. run the following command:
gst-launch souphttpsrc proxy=http://myproxy.net:8080/ location=https://myhttpssite.com/ ! fakesink dump=true

myproxy.net must be a working HTTP/HTTPS proxy.
myhttpssite.com can be a bogus address, to make the problem visible.
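The hang can also be reproduced without gst-launch or a real proxy. Below is a minimal sketch in Python (the host name myhttpssite.com is carried over from the steps above; the listening port is chosen by the OS, both are stand-ins) of the CONNECT exchange against a proxy that refuses the tunnel. The client receives a 503 status line, which it must treat as a fatal error rather than something to keep waiting on:

```python
import socket
import threading

# One-shot fake proxy that fails every CONNECT with 503 (a stand-in for a
# failing proxy like myproxy.net:8080 in the steps above).
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
port = srv.getsockname()[1]
srv.listen(1)

def failing_proxy():
    conn, _ = srv.accept()
    conn.recv(4096)                      # discard the CONNECT request
    conn.sendall(b"HTTP/1.1 503 Service Unavailable\r\n\r\n")
    conn.close()

threading.Thread(target=failing_proxy, daemon=True).start()

# What an HTTPS-over-proxy client does first: open a TCP connection to the
# proxy and ask for a tunnel to the origin host.
cli = socket.create_connection(("127.0.0.1", port), timeout=5)
cli.sendall(b"CONNECT myhttpssite.com:443 HTTP/1.1\r\n"
            b"Host: myhttpssite.com:443\r\n\r\n")
reply = cli.recv(4096)
cli.close()

# Any status other than 2xx means the tunnel was refused; the client must
# report an error instead of waiting for a tunnel that will never exist.
print(reply.split(b"\r\n")[0].decode())
```

This is only the wire-level exchange; the bug is that libsoup's request callback is never invoked in this situation.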


Actual results:
If your proxy returns anything other than HTTP 200 for the HTTP CONNECT, you will wait forever.

Expected results:
The command should return. Depending on the problem (DNS timeout, connection refused...) that can take a short or a long time, but it has to return.

Does this happen every time?
Yes: it never returns if the address is bogus...

Other information:
Comment 1 Daniel LACROIX 2009-07-01 15:41:26 UTC
Created attachment 137694 [details] [review]
Patch to correct the problem

This patch corrects two things.

When using soup_session_queue_message, or queue_message directly, the given callback was never used/called, which does not seem right. This is corrected at least for the tunnel_connected callback.

The code of the tunnel_connected callback is changed to cancel the pending request if the CONNECT message fails.

This is my first time using libsoup; the patch works for me, but it is perhaps not the right way of doing things in libsoup, so a review is needed.
Comment 2 Dan Winship 2009-07-01 16:08:44 UTC
In both cases, the extra work is being done by SoupSession's subclasses, not by SoupSession itself; the callback gets called from soup-session-async.c:final_finished(), and the tunnel error handling happens in soup-session-async.c:tunnel_connected().

Also, tests/proxy-test tests this case (trying to CONNECT through a proxy that returns "403 Forbidden"), and it works fine there...

Try setting a breakpoint on soup-session-async.c:tunnel_connected (NOT the one in soup-session.c), and make sure it's being called at the right point, and see what it's doing?
Comment 3 Daniel LACROIX 2009-07-01 18:22:15 UTC
I have tested the proxy-test program without problems. I created a new Apache proxy that always returns 403, and it works.

I then tried using my local squid 3.0 proxy in proxy-test, and it never returns.

I traced the network activity and saw a difference.

Here is the Apache proxy:
--------------------------------------
CONNECT 127.0.0.1:47525 HTTP/1.1
Host: 127.0.0.1:47525

HTTP/1.1 403 Forbidden
Date: Wed, 01 Jul 2009 18:04:31 GMT
Content-Length: 216
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access 127.0.0.1:47525
on this server.</p>
</body></html>
---------------------------------------

And here is Squid:

---------------------------------------
CONNECT 127.0.0.1:47525 HTTP/1.1
Host: 127.0.0.1:47525

HTTP/1.0 403 Forbidden
Server: squid/3.0.STABLE8
Mime-Version: 1.0
Date: Wed, 01 Jul 2009 17:46:58 GMT
Content-Type: text/html
Content-Length: 1382
Expires: Wed, 01 Jul 2009 17:46:58 GMT
X-Squid-Error: ERR_ACCESS_DENIED 0
X-Cache: MISS from localhost
X-Cache-Lookup: NONE from localhost:3128
Via: 1.0 localhost (squid/3.0.STABLE8)
Proxy-Connection: close

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
...
---------------------------------------

squid (at least 3.0) replies with HTTP/1.0 to the HTTP/1.1 request. That seems to be the problem. I'll investigate further, but even if squid is in the wrong here, it is still not nice of libsoup to never return.

Another case that causes a problem for me, and is not covered by proxy-test, is when the CONNECT request gets no response at all and the connection closes. For me that also freezes.
Comment 4 Daniel LACROIX 2009-07-05 10:20:06 UTC
I have done more tests. The problem has nothing to do with HTTP/1.0 vs. HTTP/1.1.

In fact it happens when the CONNECT returns an HTTP response without a "Content-Length", with a bad "Content-Length", or when the connection closes before the end of the response.

Example (bad content length):
-----------------------------------
CONNECT www.testsite.com:443 HTTP/1.1
Host: www.testsite.com
User-Agent: get libsoup/2.27.2

HTTP/1.1 403 Forbidden
Date: Wed, 01 Jul 2009 18:04:31 GMT
Content-Length: 1976
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head><title>403 Forbidden</title></head><body>
Forbidden
</body></html>
-----------------------------------

Example (no content length):
-----------------------------------
CONNECT www.testsite.com:443 HTTP/1.1
Host: www.testsite.com
User-Agent: get libsoup/2.27.2

HTTP/1.1 403 Forbidden
Date: Wed, 01 Jul 2009 18:04:31 GMT
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head><title>403 Forbidden</title></head><body>
Forbidden
</body></html>
-----------------------------------

Example (connection close):
-----------------------------------
CONNECT www.testsite.com:443 HTTP/1.1
Host: www.testsite.com
User-Agent: get libsoup/2.27.2

HTTP/1.0 200 Connection established
-----------------------------------
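All three failure modes share one property: the client never sees a complete, self-consistent response it can finish reading. As a sketch of the "bad content length" case (Python; status line and header values borrowed from the first example above, everything else a hypothetical stand-in), a naive reader trusts the header, keeps waiting for promised body bytes that never arrive, and without a timeout would block forever:

```python
import re
import socket
import threading
import time

# Fake proxy listener (ephemeral port chosen by the OS).
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
port = srv.getsockname()[1]
srv.listen(1)

def bad_proxy():
    # Answer the CONNECT with a Content-Length that promises more body
    # bytes than are ever sent, then hold the connection open.
    conn, _ = srv.accept()
    conn.recv(4096)
    conn.sendall(b"HTTP/1.1 403 Forbidden\r\n"
                 b"Content-Length: 1976\r\n"
                 b"Content-Type: text/html\r\n\r\n"
                 b"Forbidden")
    time.sleep(2)          # keep the socket open so the client never sees EOF
    conn.close()

threading.Thread(target=bad_proxy, daemon=True).start()

cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"CONNECT www.testsite.com:443 HTTP/1.1\r\n"
            b"Host: www.testsite.com\r\n\r\n")
cli.settimeout(0.5)        # a reader with no timeout would block here forever

raw = cli.recv(4096)
head, _, body = raw.partition(b"\r\n\r\n")
declared = int(re.search(rb"Content-Length: (\d+)", head).group(1))

stalled = False
try:
    while len(body) < declared:      # naive read loop: trust the header
        chunk = cli.recv(4096)
        if not chunk:                # EOF would at least end the wait
            break
        body += chunk
except socket.timeout:
    stalled = True                   # no EOF, no data: the reader hangs

print("reader stalled waiting for promised bytes:", stalled)
```

The "no content length" and "connection close" cases stall for the analogous reason: the message framing never tells the reader it is done.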

I looked into why this happens and found the cause.

In soup-session-async.c:tunnel_connected, if SOUP_STATUS_IS_SUCCESSFUL fails, pending messages are cancelled by calling soup-session.c:soup_session_connection_failed.

In soup_session_connection_failed, the host is looked up in the priv->conns hash table. But in the CONNECT case, the association is never written to the hash table. Why? Because in soup-session.c:soup_session_make_connect_message the connect message is queued using soup-session.c:queue_message, while the association is made in soup-session-async.c:queue_message.

Changing soup-session.c:soup_session_make_connect_message to queue the message with SOUP_SESSION_GET_CLASS (session)->queue_message (session, msg, tunnel_connected, NULL); corrects the problem.

But with the "connection close" test case, it then raises some WARNING and CRITICAL messages:

(process:10445): GLib-GObject-WARNING **: instance of invalid non-instantiatable type `<invalid>'

(process:10445): GLib-GObject-CRITICAL **: g_signal_emit_valist: assertion `G_TYPE_CHECK_INSTANCE (instance)' failed

I'm still looking into where the problem is...
Comment 5 Daniel LACROIX 2009-07-06 09:35:21 UTC
Created attachment 137907 [details] [review]
Patch to correct the problem

This patch should correct the connect_failed problem. A special soup_session_connection_failed is provided for tunnel_connect failures.
Comment 6 Daniel LACROIX 2009-07-08 14:51:25 UTC
Created attachment 138037 [details]
Program to help reproducing the bug

A program to simulate the proxy.
1. Run the command: ./server-simul.pl 2345 connectsuccessclose
2. Use the test get command as a client: http_proxy=http://127.0.0.1:2345 ./get https://nosite.com/

It will never return...
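For anyone without the attached Perl script at hand, the "connectsuccessclose" behaviour can be sketched in a few lines of Python (a hypothetical stand-in, not the authoritative reproducer): the proxy reports a successful tunnel and then immediately closes it, so the client sees EOF right after "success":

```python
import socket
import threading

# Stand-in for "./server-simul.pl 2345 connectsuccessclose": accept one
# CONNECT, claim the tunnel is established, drop the connection before any
# TLS bytes can flow. Port is chosen by the OS rather than fixed at 2345.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
port = srv.getsockname()[1]
srv.listen(1)

def connect_success_close():
    conn, _ = srv.accept()
    conn.recv(4096)
    conn.sendall(b"HTTP/1.0 200 Connection established\r\n\r\n")
    conn.close()                     # the "established" tunnel dies here

threading.Thread(target=connect_success_close, daemon=True).start()

cli = socket.create_connection(("127.0.0.1", port), timeout=5)
cli.sendall(b"CONNECT nosite.com:443 HTTP/1.1\r\n"
            b"Host: nosite.com:443\r\n\r\n")
status = cli.recv(4096).split(b"\r\n")[0].decode()
eof = cli.recv(4096)                 # b"" : EOF right after "success"
print(status)
print("tunnel closed immediately:", eof == b"")
```

A correct client has to surface this EOF as an error; the bug is that libsoup instead leaves the request pending forever.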
Comment 7 Dan Winship 2009-07-08 17:15:18 UTC
(In reply to comment #6)
> Created an attachment (id=138037) [edit]
> Program to help reproducing the bug

OK, yes, I can reproduce the bug with this test program.

The problem is that soup-session.c:connection_disconnected() is being called before soup_session_connection_failed(), and so the connection is no longer recorded in priv->conns when soup_session_connection_failed() is called.

Still thinking about whether or not your patch is the best solution.
Comment 8 Dan Winship 2009-12-19 14:19:38 UTC
Fixed in git. Sorry about the delay.