This is where a load balancer helps. Requests sent around the time
the GC starts will be stuck on that server, but later ones can be
routed to other servers.
We use a "least connections" load balancing strategy. Each connection
represents a request in progress, so this is equivalent to equalizing
each server's queue of outstanding requests.
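
A minimal sketch of the idea in Java (the class and method names here
are made up for illustration; our actual balancer is not this code):

    import java.util.List;
    import java.util.concurrent.atomic.AtomicInteger;

    // Hypothetical least-connections picker: send each new request to
    // the backend with the fewest requests currently in flight.
    class LeastConnections {
        static class Backend {
            final String host;
            final AtomicInteger inFlight = new AtomicInteger();
            Backend(String host) { this.host = host; }
        }

        private final List<Backend> backends;
        LeastConnections(List<Backend> backends) { this.backends = backends; }

        Backend acquire() {
            Backend best = backends.get(0);
            for (Backend b : backends) {
                if (b.inFlight.get() < best.inFlight.get()) best = b;
            }
            best.inFlight.incrementAndGet(); // one more request in progress
            return best;
        }

        void release(Backend b) {
            b.inFlight.decrementAndGet(); // request finished
        }
    }

A server stuck in a GC pause keeps its connections open, so its
in-flight count stays high and new requests naturally flow elsewhere.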

Also, only use as much heap as you really need. A larger heap
means longer GC pauses.
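
For example (the 2g figure is purely illustrative; measure your real
working set first, and start.jar here is the Jetty launcher shipped
with the Solr example):

    java -Xms2g -Xmx2g -jar start.jar

Setting -Xms equal to -Xmx also avoids pauses from the JVM resizing
the heap.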

wunder

On 2/4/09 1:59 PM, "Yonik Seeley" <ysee...@gmail.com> wrote:

> On Wed, Feb 4, 2009 at 3:12 PM, Otis Gospodnetic
> <otis_gospodne...@yahoo.com> wrote:
>> I'd be curious if you could reproduce this in Jetty....
> 
> All application threads are blocked... it's going to be the same in
> Jetty or Tomcat or any other container that's pure Java.  There is an
> OS-level listening queue with a certain depth (configurable in both
> Tomcat and Jetty, and passed down to the OS when listen() is called
> for the socket).  If too many connections are initiated without being
> accepted, new ones will start being rejected (see the ServerSocket
> sketch below).
> 
> See UNIX man pages for listen() and connect() for more details.
> 
> For Tomcat, the config param you want is "acceptCount"
> http://tomcat.apache.org/tomcat-6.0-doc/config/http.html
> 
> Increasing this helps ensure that connections don't get rejected
> while a long GC is in progress (example configuration below).
> 
> -Yonik
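
To make the listen() backlog concrete in Java terms: the depth is the
second argument to the ServerSocket constructor (the port and depth
below are made up). While a GC pause keeps accept() from running,
connections pile up in that OS queue; past its depth the OS starts
refusing them:

    import java.net.ServerSocket;
    import java.net.Socket;

    public class BacklogDemo {
        public static void main(String[] args) throws Exception {
            // Second argument is the listen() backlog: how many pending
            // connections the OS will queue before refusing new ones.
            ServerSocket server = new ServerSocket(8080, 100);
            while (true) {
                // Connections wait in the OS queue until accept() runs;
                // during a GC pause, nothing calls accept().
                Socket s = server.accept();
                s.close();
            }
        }
    }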
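
And on the Tomcat side, acceptCount is an attribute on the HTTP
Connector in conf/server.xml; a sketch, with the value of 500 chosen
only for illustration:

    <Connector port="8080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               acceptCount="500" />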
