Ran the same test, 200 requests over about 2 seconds (my test machine is a laptop). The weird thing is that the percentiles are all multiples of 16...
Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Finished 200 requests
Server Software:
Server Hostname: localhost
Server Port: 80
Document Path: /
Document Length: 11 bytes
Concurrency Level: 1
Time taken for tests: 2.456 seconds
Complete requests: 200
Failed requests: 0
Total transferred: 17800 bytes
HTML transferred: 2200 bytes
Requests per second: 81.44 [#/sec] (mean)
Time per request: 12.279 [ms] (mean)
Time per request: 12.279 [ms] (mean, across all concurrent requests)
Transfer rate: 7.08 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0    1.9      0     16
Processing:     0   12    9.1     16     31
Waiting:        0    3    6.3      0     16
Total:          0   12    9.3     16     31
Percentage of the requests served within a certain time (ms)
50% 16
66% 16
75% 16
80% 16
90% 16
95% 31
98% 31
99% 31
100% 31 (longest request)
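A plausible explanation for the 16 ms buckets: the wall clock the tool reads only advances in coarse steps, so every latency gets rounded to a multiple of the tick. The classic Windows timer tick is about 15.6 ms, which would show up as "16" everywhere. A quick, hedged way to probe your own machine's tick (just a sketch, not part of ab):

```python
import time

def clock_granularity(ticks=100):
    """Estimate the smallest step time.time() can advance by."""
    deltas = []
    prev = time.time()
    while len(deltas) < ticks:
        now = time.time()
        if now > prev:                 # only count forward ticks
            deltas.append(now - prev)
            prev = now
    return min(deltas)

print("clock tick: %.6f s" % clock_granularity())
```

If this prints something near 0.015–0.016 s, the percentiles above are just clock quantization, not real latency plateaus.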
Alright, it's off by almost 10x, so my test sux. The client code must be wrong. Thinking about it quickly, it's probably not that easy to send a lot of requests in parallel... I'll use ab in the future. Thanks a lot for having checked that.
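For what it's worth, a hand-rolled client doesn't have to be wrong, it just has to measure per-request wall time carefully and drive concurrency explicitly. A minimal sketch (the tiny local server and all names here are hypothetical stand-ins, only to make the example self-contained; it mirrors the ab run above: 200 requests, concurrency 1):

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for the server under test: serves an 11-byte body.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello world"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):      # silence per-request logging
        pass

def fetch(url):
    """Time one GET request, in milliseconds."""
    start = time.perf_counter()        # high-resolution timer, not time.time()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000

def bench(url, requests=200, concurrency=1):
    """Fire `requests` GETs with `concurrency` worker threads."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(fetch, [url] * requests))

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

times = sorted(bench(url, requests=200, concurrency=1))
print("median %.1f ms, max %.1f ms" % (times[len(times) // 2], times[-1]))
server.shutdown()
```

Using `time.perf_counter()` instead of the wall clock also sidesteps the coarse-tick quantization seen in the percentiles above.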