[ACCEPTED] How to set a limit on the number of concurrent requests in a servlet? (servlets)
I'd suggest writing a simple servlet Filter. Configure it in your web.xml to apply to the path for which you want to limit the number of concurrent requests. The code would look something like this:
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;

public class LimitFilter implements Filter {
    private int limit = 5;
    private int count;
    private final Object lock = new Object();

    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        try {
            boolean ok;
            synchronized (lock) {
                // reserve a slot; the matching decrement happens in the finally block
                ok = count++ < limit;
            }
            if (ok) {
                // let the request through and process as usual
                chain.doFilter(request, response);
            } else {
                // handle limit case, e.g. return status code 429 (Too Many Requests)
                // see http://tools.ietf.org/html/rfc6585#page-3
                ((HttpServletResponse) response).sendError(429, "Too Many Requests");
            }
        } finally {
            synchronized (lock) {
                count--;
            }
        }
    }

    public void init(FilterConfig filterConfig) throws ServletException { }

    public void destroy() { }
}
Or alternatively you could just put this logic into your HttpServlet. It's just a bit cleaner and more reusable as a Filter. You might want to make the limit configurable through the web.xml rather than hard coding it (see the sketch below the reference).
Ref.: see the definition of HTTP status code 429 (RFC 6585).
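For the configurable limit, the web.xml wiring might look like the following sketch; the filter name, the URL pattern, and the "limit" init-param name are illustrative assumptions, not part of the original answer.

<filter>
    <filter-name>limitFilter</filter-name>
    <filter-class>LimitFilter</filter-class>
    <init-param>
        <param-name>limit</param-name>
        <param-value>5</param-value>
    </init-param>
</filter>
<filter-mapping>
    <filter-name>limitFilter</filter-name>
    <url-pattern>/limited/*</url-pattern>
</filter-mapping>

The configured value can then be read in the filter's init method, for example:

public void init(FilterConfig filterConfig) throws ServletException {
    // read the configured limit instead of hard coding it (assumed param name "limit")
    String configured = filterConfig.getInitParameter("limit");
    if (configured != null) {
        limit = Integer.parseInt(configured);
    }
}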
I've thought about using a static counter to keep track of the number of requests, but it would raise a problem of race conditions.
If you use an AtomicInteger for the counter, you will not have the problem of race conditions.
Another way would be to use the Java Executor Framework (which comes with Java 1.5). There you are able to limit the number of running threads and block new ones until a thread is free (see the sketch after the pseudocode below).
But I think the counter would work and be the easiest solution.
Attention: put the counter release in a finally block!
// pseudo code made concrete with java.util.concurrent.atomic.AtomicInteger
final AtomicInteger counter;
...
// atomically claim a slot, or fail once the limit is reached
while (true) {
    int v = counter.get();
    if (v >= max) return FAILURE; // FAILURE is a placeholder for your rejection path
    if (counter.compareAndSet(v, v + 1)) break;
}
try {
    doStuff();
} finally {
    // always release the slot, even if doStuff() throws
    counter.decrementAndGet();
}
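For the Executor Framework alternative mentioned above, a minimal sketch could use a fixed thread pool, which never runs more than the configured number of tasks at once; additional tasks wait in the queue until a thread is free. The class name, the pool size of 5, and the handle method are assumptions for illustration; inside a servlet you would still have to hand the work off and wait for it (or use asynchronous processing), so the counter or a Semaphore is usually the simpler fit.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class LimitedExecutor {
    // at most 5 tasks run concurrently; further tasks queue until a thread is free
    private final ExecutorService pool = Executors.newFixedThreadPool(5);

    public void handle(Runnable requestHandler) {
        pool.execute(requestHandler);
    }

    public void shutdown() {
        pool.shutdown();
    }
}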
You might want to have a look at Semaphore.
Semaphores are often used to restrict the number of threads that can access some (physical or logical) resource.
Or even better, try to figure it out with the server settings. That would of course be server-dependent.
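A minimal sketch of a Semaphore-based filter might look like this; the class name, the limit of 5, and the choice of tryAcquire() (reject immediately rather than block; use acquire() to wait instead) are assumptions:

import java.io.IOException;
import java.util.concurrent.Semaphore;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;

public class SemaphoreFilter implements Filter {
    // at most 5 requests inside chain.doFilter() at any time
    private final Semaphore permits = new Semaphore(5);

    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        if (permits.tryAcquire()) {
            try {
                chain.doFilter(request, response);
            } finally {
                permits.release();
            }
        } else {
            // reject instead of queueing
            ((HttpServletResponse) response).sendError(429, "Too Many Requests");
        }
    }

    public void init(FilterConfig filterConfig) throws ServletException { }

    public void destroy() { }
}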
If you are serving static files, it's unlikely that the server will crash. The bottleneck would be the network throughput, and it degrades gracefully: when more requests come in, each still gets served, just a little bit slower.
If you set a hard limit on total requests, remember to set a limit on requests per IP. Otherwise, it's easy for one bad guy to issue N requests, deliberately read the responses very slowly, and totally clog your service. This works even if he's on a dial-up connection and your server network has vast throughput.
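For the per-IP limit, a rough sketch might track an active-request counter per client address; the class name, the limit of 20, and the map-based bookkeeping are assumptions, and a real implementation would also need to evict idle entries:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

public class PerIpLimiter {
    private static final int LIMIT_PER_IP = 20; // assumed limit
    private final ConcurrentMap<String, AtomicInteger> active = new ConcurrentHashMap<>();

    /** Returns true if the request may proceed; call release(ip) in a finally block. */
    public boolean tryAcquire(String ip) {
        AtomicInteger counter = active.computeIfAbsent(ip, k -> new AtomicInteger());
        if (counter.incrementAndGet() > LIMIT_PER_IP) {
            counter.decrementAndGet();
            return false;
        }
        return true;
    }

    public void release(String ip) {
        AtomicInteger counter = active.get(ip);
        if (counter != null) {
            counter.decrementAndGet();
        }
    }
}

In a filter you would call tryAcquire(request.getRemoteAddr()) before chain.doFilter and release in a finally block, exactly as with the global counter above.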