Jetty VS Tomcat Performance Comparison

Jetty VS Tomcat Overview

Jetty and Tomcat are open source servlet containers; both provide an HTTP server, an HTTP client, and a javax.servlet container. In this article, we take a quick look at the differences between Jetty and Tomcat and give a general idea of which one is the better fit.

You may think it makes little sense to compare the two containers: Tomcat is clearly discussed far more often than Jetty, and it offers developers a wealth of options. There is no doubt many of us start with Tomcat during development because it is easy-going and free. It is foremost a free application server that provides full web server functionality; it can be stripped down for embedding, or built up into a full J2EE server.

Jetty is a uniformly excellent tool with a particular focus. It has been around since 1998 and claims to be a “100% Java HTTP Server and Servlet Container”. It is foremost a set of software components that offer HTTP and servlet services. Jetty can be installed as a standalone application server, or easily embedded in an application or framework as an HTTP component. It can run as a simple servlet engine, as a feature-rich servlet engine, or as part of a full JEE environment.
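To illustrate the embedding claim, here is a minimal sketch of starting Jetty programmatically, using the Jetty 6-era org.mortbay API (the 6.1.x line appears later in this article); the class and servlet names are placeholders, and the jetty and servlet-api jars must be on the classpath:

```java
// Minimal embedded-Jetty sketch (Jetty 6-era API; names are illustrative,
// and this needs the jetty-6.x and servlet-api jars on the classpath).
import org.mortbay.jetty.Server;
import org.mortbay.jetty.servlet.Context;

public class EmbeddedJettyExample {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);                      // listen on port 8080
        Context context = new Context(server, "/", Context.SESSIONS);
        // Map a servlet class (your own HttpServlet subclass) to a path.
        context.addServlet(MyServlet.class, "/servlet/*");
        server.start();                                        // serve until stopped
        server.join();
    }
}
```

The whole server is just an object you construct and start, which is what makes Jetty easy to drop into an application or a test harness.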

Let’s take a look at Jetty VS Tomcat:

Popularity:

The following figure gives us a general idea of which Java containers / app servers are used the most.

The results come from a survey of more than 1,000 developers, who reported which container they use in enterprise production; participants not currently using one could choose the container they had used before, or the one they expected to be best. The figure also shows that Tomcat and Jetty are the biggest winners among open source containers, with Tomcat the clear leader over all others.

Their Features and Advantages:

Jetty Features:

  • Full-featured and standards-based.
  • Embeddable and Asynchronous.
  • Open source and commercially usable.
  • Dual licensed under Apache and Eclipse.
  • Flexible and extensible, Enterprise scalable.
  • Strong support for tools, applications, devices, and cloud computing.
  • Low maintenance cost.
  • Small and Efficient.

Tomcat Features:

  • Well-known open source project under Apache.
  • Easy to embed in your applications, e.g. inside JBoss.
  • Implements Servlet 3.0, JSP 2.2 and JSP-EL 2.2.
  • Strong, widely used, and commercially friendly.
  • Easily integrated with other frameworks such as Spring.
  • Flexible and extensible, Enterprise scalable.
  • Faster JSP parsing.
  • Stable.

Jetty VS Tomcat Performance benchmark

Test Environment:

  • CPU: Intel Core 2 Duo T6400 2.0GHz
  • RAM: 2GB
  • JDK: Sun JVM 1.6
  • OS: Ubuntu

I wrote the code below to benchmark the two containers. It is fairly simple, but it gives us a general idea. The servlet URL is /servlet/TestRuning.

  import java.io.IOException;
  import java.io.PrintWriter;

  import javax.servlet.http.HttpServlet;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;

  public class TestRuning extends HttpServlet {
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
        throws IOException {
      PrintWriter out = response.getWriter();
      String aStr = request.getParameter("a");
      String bStr = request.getParameter("b");

      // Fall back to 100 iterations when a parameter is missing or malformed.
      int a = 100;
      int b = 100;
      try {
        a = Integer.parseInt(aStr);
        b = Integer.parseInt(bStr);
      } catch (NumberFormatException excep) {
        System.err.println("err:" + excep.getMessage());
      }

      // Spin through a * b integer divisions and report the elapsed time.
      int sum = 0;
      long s = System.currentTimeMillis();
      for (int i = 0; i < a; ++i) {
        for (int ii = 0; ii < b; ++ii) {
          sum = a / b;
        }
      }
      long e = System.currentTimeMillis();

      out.println(e - s);
      out.flush();
      out.close();
    }
  }

We now deploy this application to both Tomcat and Jetty, each with its default configuration and the same JRE version.

wapproxy@ubuntu:~$ ps -ef | grep java

wapproxy  2076     1  1 11:28 ?        00:00:03 /usr/lib/jvm/java-6-openjdk/jre/bin/java -Djetty.home=/home/wapproxy/jetty -Djava.io.tmpdir=/tmp -jar /home/wapproxy/jetty/start.jar /home/wapproxy/jetty/etc/jetty-logging.xml /home/wapproxy/jetty/etc/jetty.xml
wapproxy  2185  1398  8 11:30 pts/0    00:00:02 /usr/lib/jvm/java-6-openjdk/jre/bin/java -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.util.logging.config.file=/home/wapproxy/Tomcat/conf/logging.properties -Djava.endorsed.dirs=/home/wapproxy/Tomcat/endorsed -classpath :/home/wapproxy/Tomcat/bin/bootstrap.jar -Dcatalina.base=/home/wapproxy/Tomcat -Dcatalina.home=/home/wapproxy/Tomcat -Djava.io.tmpdir=/home/wapproxy/Tomcat/temp org.apache.catalina.startup.Bootstrap start
wapproxy  2329  2309  0 11:31 pts/1    00:00:00 grep --color=auto java

Tomcat listens on port 8888 and Jetty on port 8080. Then we run the load test:
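The reports below are Apache Bench (ab) output. A run along the following lines — request count and concurrency taken from the “Complete requests” and “Concurrency Level” fields, host and path from the report headers — would produce them:

```shell
# 5000 requests at concurrency 1 against the Jetty instance on port 8080
ab -n 5000 -c 1 "http://172.31.36.158:8080/jt_jt/servlet/TestRuning?a=100000&b=100000"

# Same load against the Tomcat instance on port 8888
ab -n 5000 -c 1 "http://172.31.36.158:8888/jt_jt/servlet/TestRuning?a=100000&b=100000"
```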

This is the Jetty performance report:

Server Software:        Jetty(6.1.22)
Server Hostname:        172.31.36.158
Server Port:            8080

Document Path:          /jt_jt/servlet/TestRuning?a=100000&b=100000
Document Length:        2 bytes

Concurrency Level:      1
Time taken for tests:   8.715 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      445000 bytes
HTML transferred:       10000 bytes
Requests per second:    573.72 [#/sec] (mean)
Time per request:       1.743 [ms] (mean)
Time per request:       1.743 [ms] (mean, across all concurrent requests)
Transfer rate:          49.86 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   1.1      0       5
Processing:     0    1   7.1      0      50
Waiting:        0    1   7.1      0      50
Total:          0    2   7.2      0      50

Percentage of the requests served within a certain time (ms)
  50%      0
  66%      0
  75%      0
  80%      0
  90%      5
  95%      5
  98%     45
  99%     50
 100%     50 (longest request)

This is the Tomcat performance report:

Server Software:        Apache-Coyote/1.1
Server Hostname:        172.31.36.158
Server Port:            8888

Document Path:          /jt_jt/servlet/TestRuning?a=100000&b=100000
Document Length:        3 bytes

Concurrency Level:      1
Time taken for tests:   4.070 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      650000 bytes
HTML transferred:       15000 bytes
Requests per second:    1228.50 [#/sec] (mean)
Time per request:       0.814 [ms] (mean)
Time per request:       0.814 [ms] (mean, across all concurrent requests)
Transfer rate:          155.96 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   1.2      0       5
Processing:     0    0   1.7      0      45
Waiting:        0    0   1.7      0      45
Total:          0    1   2.1      0      45

Percentage of the requests served within a certain time (ms)
  50%      0
  66%      0
  75%      0
  80%      0
  90%      5
  95%      5
  98%      5
  99%      5
 100%     45 (longest request)

The following is the key data from our testing.

jetty 8080 Requests per second:    573.72 [#/sec] (mean)
tomcat 8888  Requests per second:    1228.50 [#/sec] (mean)

As we can see, Tomcat handles about 1228 requests per second while Jetty manages only about 573, so at least by this test’s statistics, Tomcat does better.
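To put the gap in concrete terms, the ratio of the two “Requests per second” means works out to roughly 2.1x in Tomcat’s favor:

```java
// Quick arithmetic check of the two mean throughput figures reported by ab.
public class ThroughputRatio {
    public static void main(String[] args) {
        double tomcatRps = 1228.50; // Tomcat, port 8888
        double jettyRps = 573.72;   // Jetty, port 8080
        double ratio = tomcatRps / jettyRps;
        // Tomcat served about 2.14x as many requests per second in this run.
        System.out.printf("Tomcat/Jetty throughput ratio: %.2f%n", ratio);
    }
}
```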

More testing on Tomcat

Concurrent Requests   Waiting Time (ms)   Handling Time (ms)   Throughput (req/s)
1                     0.422               0.422                2370.37
5                     1.641               0.328                3047.62
10                    3.125               0.313                3200
20                    6.563               0.328                3047.62
40                    12.5                0.313                3200
60                    20.625              0.344                2909.09
80                    25                  0.313                3200
100                   34.375              0.344                2909.09
200                   596.875             2.984                335.08
300                   618.75              2.063                484.85
400                   1006.25             2.516                397.52

More testing on Jetty

Concurrent Requests   Waiting Time (ms)   Handling Time (ms)   Throughput (req/s)
1                     6.391               6.391                156.48
5                     11.484              2.297                435.37
10                    19.063              1.906                524.59
20                    25.625              1.281                780.49
40                    31.875              0.797                1254.9
60                    394.688             6.578                152.02
80                    445                 5.563                179.78
100                   178.125             1.781                561.4
200                   1396.875            6.984                143.18
300                   932.813             3.109                321.61
400                   2612.813            6.531                153.11

This is a simple test run over a short period, so it shows only one aspect of performance. If you have any questions, please let me know.

ASJAVA.COM

Comments:

  1. Peter Veentjer

    To get more reliable measurements, you need to test for a longer period than just a few seconds. A few minutes gives the environment the time to let the JIT do its magic for example.

  2. Seriously?

    If Jetty can do 573 requests each second and tomcat can do 1228 requests each second, then how exactly is “It clear point out the processing speed of jetty is double of Tomcat” correct???

    1. admin (author)

      True, I made a typing mistake. Thank you for pointing it out.

  3. Simone Bordet

    Your “benchmark” is flawed in many ways; Peter suggested one, but there are other flaws.
    You think Google would have chosen Jetty if it’s twice as slow as Tomcat ?
    Perhaps they made serious benchmarks and it turned out they got different results.

    1. lawrence

      So if Google chooses Jetty, then that means the test results are by default erroneous? Maybe Google is just plain stupid in their Jetty choice, or maybe it has to do with political reasons, or maybe Google tweaked their hardware so Jetty could be happy. Who knows… These little tests have shown Jetty is the clear loser, I’d say.

  4. nullone

    Are you really biased?

    Only Jetty has so-called powered feature:
    Full-featured and standards-based.?
    Strong Tools, Application, Devices and Cloud computing supported?

    How many people just run the web server test for seconds?

    What is software version number? What is their configuration? What is your OS/JDK/RAM/GC setup?

    Or you just want to make a sound?

    1. admin (author)

      Thanks for your comments. Jetty and Tomcat both have strong support for tools, applications, devices and cloud computing. For this case I just did a single test, and it reported that Tomcat performed better than Jetty. I think there are some optimization options which weren’t enabled. My case ran on Window Ubuntu/JDK6/2G RAM.

  5. foobar

    While this whole test is pretty flawed, it does give one valuable insight: no matter which container you choose, processing a full request cycle in 1 or 2 ms is pretty much negligible compared to typical response times in the 3-digit range.

    PS: and what’s the number crunching for? The chosen container certainly won’t have an influence on that code.

  6. Edward Capriolo

    Testing anything with the defaults is strange because you are not sure which features are on and off by default. Also, default settings are typically made to work in small-memory, low-CPU environments. Not great for showing how well something truly performs.

  7. Billy Bob

    > My case ran on Window Vista/JDK6/3G RAM.

    I actually think you ran it on ubuntu …

    wapproxy@ubuntu:~$ ps -ef | grep java
    wapproxy 2076 1 1 11:28 ? 00:00:03 /usr/lib/jvm/java-6-openjdk/jre/bin/java -Djetty.home=/home/wapproxy/jetty -Djava.io.tmpdir=/tmp -jar /home/wapproxy/jetty

  8. John

    In my own benchmark tests, Tomcat has much better raw performance for short bursts of traffic. The problem with Tomcat is that it uses a lot more memory than Jetty. What ends up happening is that Tomcat ends up choking itself because of garbage collection.

    If this guy ran his benchmark for 20 minutes, he would see Jetty take over in performance. Jetty is kind of like the tortoise in the classic story of “The Tortoise and the Hare”. It conserves its resources for sustainable, consistent, and reliable performance. Tomcat uses as much resources as it can until the system literally runs out.

    I’ve done tests with 100 concurrent connections pounding the server and Jetty maintains a 0% error rate after 1 hour whereas Tomcat will have a 30% error rate in the same amount of time.

    1. lawrence

      Garbage collection can be seriously tweaked by assigning a few arguments to your JVM. That should not be an issue. If Jetty uses less memory, it means it does less, or needs to start reading code from elsewhere when specific things are required…. It is as simple as that. I’m sure the Tomcat developers – after 20 or more years – know their business… No offense intended.

    2. lawrence

      No offense, if you speak of 30% error rate while jetty has 0%, don’t take me wrong please but i do not believe you. i have a Tomcat 7.0.42 on my Mac, not even running APR and it spins like a kitten. Responses are very fast and no errors at all….. Server has 16GB of ram and tomcat gets 2 gigs. Machine is an i5 3.1Ghz

  9. Richard

    I’ve only just found this… It’s very very interesting… Good work!
    Of course Tomcat is just a servlet container, where Jetty is more or less a full app server.
    But, nowadays the vast majority of enterprise apps are constructed using pojo frameworks like Spring that make no use of app server functionality.
    I have been saying for years, that if you don’t need a container.. use Tomcat
    But people keep buying Weblogic (Not that there’s anything wrong with Weblogic, I like it) it’s just that for apps that don’t use a container, Tomcat is the right choice.
    I’m not affiliated in any way to the Tomcat project.

    There is one other argument, which is that Glassfish is an OSGi container. That gives it something that neither Tomcat nor Jetty have.
