Opened 8 years ago

Closed 7 years ago

#498 closed enhancement (not a bug)

Many simultaneous http connections of browsers are bad

Reported by: anonfag123 Owned by:
Priority: minor Milestone:
Component: apps/i2ptunnel Version: 0.8.7
Keywords: Cc:
Parent Tickets:

Description

This is not strictly a bug in I2P, but a problem that I think I2P could actually deal with.

I think it is generally not good for the I2P network when a browser downloads from an eepsite over many simultaneous connections, but it is especially a problem for the eepsites themselves:

For example, the postman tracker seems to block clients that open too many connections at once.
The problem: assume there is a torrent detail page with about 5 preview pictures. An unmodified Firefox will try to fetch them all at once and almost certainly get blocked by postman. With the Firefox settings changed to allow only one download at a time, it fetches those pictures sequentially, one after another, and there is no problem.

I understand that it's not really I2P's job to manage the browser's connections, but I suggest it deal with them nonetheless. The problem is that I don't see an obvious way to do it.
The proxy could accept all connections but only process them sequentially, telling the browser to keep the other connections alive. But I imagine this would be ugly all in all.

So I see there are good reasons to WONTFIX this, but then at least I'd like to see, in some well-visible document, the tip to set "network.http.max-connections-per-server" to 1 in Firefox's about:config.
With that setting I2P is much more pleasant to use, at least for me.
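For reference, the same setting can be put in a Firefox user.js file (the value 1 is the suggestion above; comment 4 below suggests 3-4 as a less strict workaround):

```javascript
// Firefox user.js fragment - caps parallel connections per site.
// 1 is the strict value suggested in this ticket; 3-4 may be enough.
user_pref("network.http.max-connections-per-server", 1);
```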

Subtickets

Change History (5)

comment:1 in reply to: ↑ description Changed 8 years ago by SemiRandom

Replying to anonfag123:
From my experience with my eepsite, this is actually a limitation of the requesting user's I2P router and not something that happens at the eepsite server's end, whether postman or any other site. My eepsite serves files for download, and if any one user tries to download more than about 5 or 6 files at one time, their traffic chokes and not much gets transferred. But if one user downloads 5 files at once, and while that's going, another user also downloads 5 files at once, my server has no problem sending data out on all 10 transfers. Or one user may be choked trying to grab 7 things at once, while 3 other users go on downloading 1-5 files each perfectly well.

In any case, I support the idea of telling people how to set their web browser to limit the number of simultaneous connections, to avoid getting stuck on busy pages. And it's not just about fetching pages - people need to be aware that the connection limit applies to all HTTP transfers, whether embedded or clicked or whatever. Although I also think that telling them to limit it to 1 at a time is unnecessarily strict; as I said, I often see people download 5 files in parallel without issues.

Or, of course, it would be even better to just remove/increase the limit in the router, or maybe even make it an explicit "max HTTP connections" setting that would be visible and adjustable if needed. But I doubt this issue comes up often enough to justify that effort, so just educating people about it is probably sufficient.

comment:2 follow-up: Changed 8 years ago by zzz

  • Component changed from unspecified to apps/i2ptunnel
  • Milestone changed from 0.8.8 to 0.9

I wasn't aware that postman limited simultaneous connections but he might. One other problem with the pictures on the detail pages is they aren't scaled, so it _looks_ like they are really slow to download but it's because they are being scaled on the browser side. You may wish to enter a ticket on postman's site for these issues.

I2P's congestion control, in general, does need work. The dropping by participating routers in tunnels is too aggressive, and it hurts our throughput.

Believe it or not, I worked on a patch 6 years ago to support HTTP/1.1 pipelining. See http://zzz.i2p/pipeline.html . In retrospect it was pretty crappy - I didn't really know what I was doing back then. At the time I abandoned it because the Firefox version of the day didn't enable pipelining by default, so the whole effort seemed pointless, and I wasn't at all sure that pipelining would help anyway.

I don't know what the state of current browsers is w.r.t. pipelining, but if Chrome or Firefox or something does enable it by default it might be worth looking into it again for I2P.

There is currently no limit on the client-side of i2ptunnel for the number of simultaneous connections.

comment:3 Changed 8 years ago by SemiRandom

Replying to zzz:
There may be no intentional client-side limit on HTTP connections, but my observations stand. 5 simultaneous file downloads from my eepsite by one client work fine, 8 stall and fail. The eepsite server itself will serve at least 20-30 files at once without issue, but if any one client tries to establish more than about 5 or 6 simultaneous connections, those connections freeze and quit transferring data, while the rest continue working. A global limit on connections happening on the client side seems more likely than an accidental yet carefully counted per-client limit in the eepsite's I2P router, but whichever end it's happening on, the effect is real. I think that's what the OP is seeing, not any intentional limit enforced by postman.

comment:4 in reply to: ↑ 2 Changed 8 years ago by anonfag123

Replying to zzz:

I wasn't aware that postman limited simultaneous connections but he might. One other problem with the pictures on the detail pages is they aren't scaled, so it _looks_ like they are really slow to download but it's because they are being scaled on the browser side. You may wish to enter a ticket on postman's site for these issues.

Right, I didn't write that because I somehow forgot:

When connecting to a site with many pictures it doesn't simply get slow - the connections really get stuck. Furthermore, for a few minutes I can't establish any new connections to the site the previous connections choked on, and all I get is the router's "unreachable" page.

I first thought that might be an artificial limit somewhere, but through the comments I now realize that maybe the choked connections are just waiting until their TTL expires, and the I2P router is somehow prioritizing those old connections over new ones.

I don't know what the state of current browsers is w.r.t. pipelining, but if Chrome or Firefox or something does enable it by default it might be worth looking into it again for I2P.

I believe what they do is called pipelining, but in any case it definitely has something to do with simultaneous HTTP connections, since limiting them in the browser makes the problem nearly go away.

And yes, 1 may not be enough, especially when occasionally one connection dies and you have to wait for it to time out before you can load other items on that eepsite.
From the other comments, 3 or 4 seem more appropriate as a workaround.

comment:5 Changed 7 years ago by zzz

  • Milestone 0.9 deleted
  • Resolution set to not a bug
  • Status changed from new to closed

Interesting discussion, but it appears there's no proposal to change our code, only to add a recommendation to our website somewhere - and there's no agreement on what that recommendation should be. Even so, a note on our website would be like shouting into the wind.

Also, changes to the WRED code in 0.8.12 seem to have helped a lot.

Since pipelining is off by default in (all?) browsers, some sort of transparent conversion to 1.1 pipelining or even SPDY in the HTTP client and server proxies would be the only way to do it. And that sounds really hard.

So I think the best solution is just to continue making I2P better and fix congestion issues.

Closing as not-a-bug, if you want to continue the discussion the best place is probably zzz.i2p.

Note: See TracTickets for help on using tickets.