
How to speed up the botlist application

Botlist is a little slow. Even under the best tests, anything with a database connection slows the page to about 100ms a request.  That means under the best conditions it is getting about 10 pages a second. Ideally, I would like 100 pages per second.

This page renders (just text) in about 100ms a request.

http://www.botspiritcompany.com/botlist/spring/pipes/botverse_pipes.html

I have tried a couple of things with MySQL, like enabling the query cache.  I am using Hibernate (J2EE stuff) and I can do caching there as well.

Other than that, I can't think of anything.  I probably have to tune Tomcat too.
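
For what it's worth, turning on Hibernate's second-level cache is roughly this. It's only a minimal sketch, assuming Hibernate 3.x with the EHCache provider; the class name is made up and none of it comes from the actual botlist code:

// Minimal sketch: enable Hibernate's second-level cache and query cache.
// Assumes Hibernate 3.x with EHCache; HibernateCacheSetup is a made-up class name.
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateCacheSetup {
    public static SessionFactory buildSessionFactory() {
        Configuration cfg = new Configuration().configure(); // reads hibernate.cfg.xml
        cfg.setProperty("hibernate.cache.use_second_level_cache", "true");
        cfg.setProperty("hibernate.cache.use_query_cache", "true");
        cfg.setProperty("hibernate.cache.provider_class",
                "org.hibernate.cache.EhCacheProvider");
        return cfg.buildSessionFactory();
    }
}

Each mapped class that should be cached also needs a cache element (e.g. <cache usage="read-write"/>) in its mapping, and a query only uses the cache if setCacheable(true) is called on it.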
Permalink Bot Berlin 
July 3rd, 2007 10:06am
This page takes basically 0ms a request, served by the same server but without any database connectivity. Hmm.

http://www.botspiritcompany.com/botlist/company/botlist_faq.htm
Permalink Bot Berlin 
July 3rd, 2007 10:09am
Active content vs. static content.

You'll see large sites redirect all their static content to come from a subdomain to free up bandwidth & CPU on the servers that have active content.
Permalink xampl 
July 3rd, 2007 10:17am
I hear that using Apache to serve up the static stuff could speed things up.  Let mongrel or lighty handle the dynamic.
Permalink Aaron 
July 3rd, 2007 10:21am
"Let mongrel or lighty handle the dynamic."

It is a J2EE application, using Tomcat.  Tomcat is fast, but the Java libraries are hideously slow.
Permalink Bot Berlin 
July 3rd, 2007 10:22am
Oh.  I thought it was a Rails app.

I'm no help here.  Sorry, you're screwed.

:)
Permalink Aaron 
July 3rd, 2007 10:26am
"Im no help here.  Sorry, you're screwed."

ah, it is just for fun anyway.
Permalink Bot Berlin 
July 3rd, 2007 10:29am
Hmm, are these good tests?  I basically just wrote an application that makes a request to the server and times it.  Even Google is a little slow.

attempting request to=http://www.google.com
single request time=93 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=63 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=47 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=47 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=31 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=62 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=31 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=31 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=47 ms -- from Thread-0
attempting request to=http://www.google.com
single request time=47 ms -- from Thread-0
All requests time=531 ms
attempting request to=http://www.botspiritcompany.com
single request time=109 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=110 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=141 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=141 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=93 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=94 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=110 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=94 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=109 ms -- from Thread-0
attempting request to=http://www.botspiritcompany.com
single request time=94 ms -- from Thread-0
All requests time=1156 ms
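
The timing client itself is nothing fancy. A minimal sketch of the kind of single-threaded client that produces output like the above; the URL, the request count, and the class name are just examples, not the exact code I ran:

// Minimal sketch: fetch a URL repeatedly and print per-request and total times.
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class RequestTimer implements Runnable {
    private final String target;
    private final int requests;

    public RequestTimer(String target, int requests) {
        this.target = target;
        this.requests = requests;
    }

    public void run() {
        long total = 0;
        for (int i = 0; i < requests; i++) {
            System.out.println("attempting request to=" + target);
            long start = System.currentTimeMillis();
            try {
                URLConnection conn = new URL(target).openConnection();
                InputStream in = conn.getInputStream();
                byte[] buf = new byte[4096];
                while (in.read(buf) != -1) { /* drain the response body */ }
                in.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
            long elapsed = System.currentTimeMillis() - start;
            total += elapsed;
            System.out.println("single request time=" + elapsed
                    + " ms -- from " + Thread.currentThread().getName());
        }
        System.out.println("All requests time=" + total + " ms");
    }

    public static void main(String[] args) {
        // Hit the server ten times in a row from a single thread.
        new Thread(new RequestTimer("http://www.botspiritcompany.com", 10), "Thread-0").start();
    }
}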
Permalink Bot Berlin 
July 3rd, 2007 10:57am
Are those basically pings or web responses?

I average 18ms pings to Google. :P
Permalink JoC 
July 3rd, 2007 11:17am
>You'll see large sites redirect all their static content to come from a subdomain to free up bandwidth & CPU on the servers that have active content.

It doesn't sound like bot's issue is that the machine is overloaded serving static content. However, taking that a step further and making active content static (e.g. caching)... well, that's pure resource-saving gold.
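
A minimal sketch of that idea in servlet terms, assuming an in-memory cache keyed by request URI with a short TTL. The class name and the 60-second TTL are made up, it assumes the page writes its output through getWriter(), and a real version would also deal with headers, parameters, and invalidation:

// Minimal sketch: serve rendered pages from an in-memory cache so dynamic
// content behaves like static content for repeat requests.
import java.io.CharArrayWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

public class PageCacheFilter implements Filter {

    private static final long TTL_MS = 60 * 1000; // assumed 60-second lifetime

    private static class Entry {
        final String body;
        final long created;
        Entry(String body, long created) { this.body = body; this.created = created; }
    }

    private final Map<String, Entry> cache = new ConcurrentHashMap<String, Entry>();

    public void init(FilterConfig config) { }
    public void destroy() { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        String key = request.getRequestURI();

        Entry cached = cache.get(key);
        if (cached != null && System.currentTimeMillis() - cached.created < TTL_MS) {
            response.getWriter().write(cached.body);   // no database hit at all
            return;
        }

        // Let the page render normally, but capture what it writes.
        final CharArrayWriter buffer = new CharArrayWriter();
        HttpServletResponseWrapper wrapper = new HttpServletResponseWrapper(response) {
            public PrintWriter getWriter() {
                return new PrintWriter(buffer);
            }
        };
        chain.doFilter(request, wrapper);

        String body = buffer.toString();
        cache.put(key, new Entry(body, System.currentTimeMillis()));
        response.getWriter().write(body);              // send it on to the real client
    }
}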
Permalink DF 
July 3rd, 2007 11:19am
> Ideally, I would like 100 pages per second.


So you need to serve 6000 pages a minute? Not likely. To get to those levels you need to load balance requests across multiple servers.
Permalink son of parnas 
July 3rd, 2007 12:24pm
"So you need to serve 6000 pages a minute? Not likely. To get to those levels you need to load balance requests across multiple servers."

I could probably get that locally on the static pages.  But not on my production server and not with the dynamic code.

It looks like 200ms a request is the best I can do over a remote connection (i.e. what a typical browser would see).

That seems like crap though.  5 requests a second?
Permalink Bot Berlin 
July 3rd, 2007 12:58pm
Possibly more.  Requests may not be handled serially.
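
Per-request latency and overall throughput are not the same thing. One quick way to see the difference, sticking with the same sort of timing client, is to fire requests from several threads at once. A minimal sketch; the thread count, request count, URL, and class name are just examples:

// Minimal sketch: measure throughput by issuing requests from several threads.
import java.io.InputStream;
import java.net.URL;

public class ConcurrentTimer {

    // Fetch one page and throw the bytes away; only the elapsed time matters.
    static void fetch(String target) throws Exception {
        InputStream in = new URL(target).openConnection().getInputStream();
        byte[] buf = new byte[4096];
        while (in.read(buf) != -1) { /* drain */ }
        in.close();
    }

    public static void main(String[] args) throws Exception {
        final String target = "http://www.botspiritcompany.com";
        final int threads = 5;
        final int requestsPerThread = 10;

        Thread[] workers = new Thread[threads];
        long start = System.currentTimeMillis();
        for (int i = 0; i < threads; i++) {
            workers[i] = new Thread(new Runnable() {
                public void run() {
                    for (int j = 0; j < requestsPerThread; j++) {
                        try {
                            fetch(target);
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }
            }, "Thread-" + i);
            workers[i].start();
        }
        for (Thread t : workers) {
            t.join(); // wait for every worker to finish
        }
        long elapsed = System.currentTimeMillis() - start;
        int total = threads * requestsPerThread;
        System.out.println(total + " requests in " + elapsed + " ms = "
                + (total * 1000L / Math.max(elapsed, 1)) + " requests/sec");
    }
}

If the server really does handle requests in parallel, the requests/sec figure here will come out noticeably higher than one divided by the single-request latency.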
Permalink xampl 
July 3rd, 2007 1:02pm
Crazy on Tap gets a crazy 1 request per second.

(these are based on my performance tests)
Permalink Bot Berlin 
July 3rd, 2007 1:06pm
Excuse me.

One request takes 1 to 1.5 seconds.
Permalink Bot Berlin 
July 3rd, 2007 1:06pm
If you want to boost the speed of execution (not development), ditch the Java and start writing in C++.  C++ beats out even C (I checked).  Hook it up with FastCGI and you'll really be styling.
Permalink Clay Dowling 
July 3rd, 2007 1:45pm
there's an attribute in some XML config file somewhere ...

performance='fast'
Permalink strawdog soubriquet 
July 3rd, 2007 2:05pm
