High ActiveRecord CPU Utilization


Brian Adkins

When running a test that primarily involves loading up a few MySQL
tables with ActiveRecord objects, I was surprised to see the Ruby CPU
utilization at 93% and the MySQL CPU utilization at 7%. I would expect
this workload to be heavier on MySQL than that.

I would think inserts (particularly with updating several foreign key
indices) would tax the database more than Ruby.

Has this been other folks' experience? Is running in the test
environment incredibly different than production with respect to CPU
utilization? I suppose my next step is to run in production to see what
kind of results I get.

I'm running the test from the root of my Rails project via:

ruby test/unit/foo.rb
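
Output of this shape can be produced with Ruby's bundled profile
library, for example:

ruby -rprofile test/unit/foo.rb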

Here's part of the profiler output:

% time   name
7.96 ActiveRecord::ConnectionAdapters::Quoting.quote
5.61 ActiveRecord::Base#read_attribute
5.15 ActiveRecord::Base#column_for_attribute
4.25 ActiveRecord::Base#connection
3.74 Hash#[]
3.58 Array#each
3.30 ActiveRecord::ConnectionAdapters::MysqlAdapter#quote
3.16 ActiveRecord::ConnectionAdapters::Column#type_cast
2.84 Module#===
2.65 ActiveRecord::Base#clone_attribute_value
2.29 ActiveRecord::Base#write_attribute
2.24 Kernel.class
2.22 Hash#each
2.08 String#to_s
2.03 ActiveRecord::Base#quote_value
1.59 Kernel.send
1.55 Array#include?
1.52 Kernel.==
1.48 ActiveRecord::Base#unserializable_attribute?
1.39 Class#read_inheritable_attribute
1.34 Kernel.clone
1.29 ActiveRecord::Callbacks.notify
1.15 ActiveRecord::Base#method_missing
1.08 ActiveRecord::Base#columns_hash
1.08 ActiveRecord::Base#respond_to?
1.08 ActiveRecord::Callbacks.callback
0.99 Kernel.eval
0.95 Symbol#===
0.90 Observable.notify_observers
0.88 ActiveRecord::ConnectionAdapters::Column#text?
0.85 Hash#[]=
0.85 Class#inheritable_attributes
0.83 Kernel.kind_of?
0.81 ActiveRecord::Base#convert_number_column_value
....
 

Brian Adkins

Brian said:
When running a test that primarily involves loading up a few MySQL
tables with ActiveRecord objects, I was surprised to see the Ruby CPU
utilization at 93% and the MySQL CPU utilization at 7%. I would expect
this workload to be heavier on MySQL than that.
[...]

I just moved my test code into a controller and ran it via:

mongrel_rails start -e production

Similar CPU characteristics except that Mongrel wasn't able to fully
utilize my dual core CPU (I suppose because Rails code is serialized due
to its lack of thread safety).

So the unit test (1093 records -> table1, 1093 records -> table2, 1
record -> table3) took 5.5 seconds to complete and the identical test in
a controller with Mongrel in production mode took 27.4 seconds!

Yeah, I know I can have a cluster of Mongrel processes, and that's how I
run for real, but I'm still a little bummed with these results :(

I've switched my company's development from 100% Java to 100% Ruby, and
I still believe that was a good decision because of productivity gains
and joy, but I do miss some of the runtime performance of Java and the
ease with which I could spin up a thread to do some background processing.
I'm glad BackgrounDRB has been provided, but it's not quite the same.

Hopefully future versions of Ruby/Rails will provide some more runtime
performance and concurrency - I'd be glad if I could just fork in Rails
without trouble, but I don't think that's the case.

For now, I don't have more customers than a Core 2 Duo can handle, so
it's not exactly on the critical path for me yet :) In fact, I'm glad
MySQL isn't on the critical path because overcoming that seems much more
difficult than having a bunch of Apache/Mongrel processes running.

Brian
 

Sam Smoot

Brian said:
When running a test that primarily involves loading up a few MySQL
tables with ActiveRecord objects, I was surprised to see the Ruby CPU
utilization at 93% and the MySQL CPU utilization at 7%. I would expect
this workload to be heavier on MySQL than that.
[...]


Ruby is a slowish language right now, but this isn't really Ruby's
fault. Rails is just incredibly slow, and the problem only seems to be
getting worse. You could spend some time trying to speed it up, but
there are some really broad design decisions that make that pretty
difficult. If you can live with the performance, then I guess I'd just
do so with the expectation that a future release will improve
performance. If you can't, you might try looking into some of the
alternatives? It's a tough choice...
 

Brian Adkins

Sam said:
Brian said:
[...]

Ruby is a slowish language right now, but this isn't really Ruby's
fault.

Yeah, I remember when I first saw the computer language shootout stats!
I was bummed that my new favorite language performed so poorly, but then
I recalled the early days of Java (I started with 1.02) and gained some
perspective.
Rails is just incredibly slow, and the problem only seems to be
getting worse. You could spend some time trying to speed it up, but
there are some really broad design decisions that make that pretty
difficult. If you can live with the performance, then I guess I'd just
do so with the expectation that a future release will improve
performance.

I'm pretty sure I can live with the performance - I think a few fast web
server machines in front of a fast MySQL machine will do fine, and I'm
not close to needing that yet.
If you can't, you might try looking into some of the
alternatives? It's a tough choice...

When I switched from Java to Ruby, I knew I was giving up some runtime
performance and gaining much in other areas. Then I discovered Lisp and
realized, "wow, it's powerful *and* fast" :) However, I think it would
take too much work for me to get a Lisp environment to the point of
being as productive for me as Ruby on Rails is currently (reminds me of
Python web dev several years ago, maybe even earlier), and I really do enjoy
programming in Ruby.

Python is a fair amount quicker than Ruby, but I like Ruby better, and
the speed difference doesn't appear to be huge - I did get used to the
whitespace, but I still don't like it in principle. It's interesting
that I learned Python first, but at the time (2 to 3 years ago), I didn't
feel the web frameworks were ready, so I jumped into Ruby via Rails
(common story) and then discovered that I like the feel of the language
better - it has some warts, but it's still a blast to program in. The
Python frameworks seem to have progressed significantly since then.

I just took a look at Smalltalk, but despite Seaside's success, I don't
think it's quite ready (plus I just got away from an IDE heavy
environment with Eclipse), and despite the super duper IDE capabilities,
I don't feel it warrants the learning effort for me now - maybe later.

So all in all, I'm pretty darn happy with Ruby/Rails at present. I have
been on a language research blitz recently though (many, many hours) - I
think it's motivated by Ruby's strengths and not its weaknesses. The
logic goes something like this, "I was surprised about how much better
Ruby is than Java, so I wonder if I could make that kind of jump again"
- I guess I'm just greedy, not to mention susceptible to "the grass is
always greener on the other side of the fence".

I must say that everything I've read about Paul Graham's Arc indicates I
would be very pleased with it. No idea when it will be completed, but
I'd say it has a great shot at getting some traction.

If there's an alternative I haven't mentioned that you feel is actually
viable as a contender, feel free to pass it on.

Brian
 

Bill Kelly

From: "Brian Adkins said:
If there's an alternative I haven't mentioned that you feel is actually
viable as a contender, feel free to pass it on.

Not sure about a viable alternative; but from what you've said
I thought you might find some of Kirk Haines' posts in the last
month or two on the Eventmachine mailing list interesting:
http://rubyforge.org/pipermail/eventmachine-talk/
http://rubyforge.org/pipermail/eventmachine-talk/2007-February/000395.html
http://rubyforge.org/pipermail/eventmachine-talk/2007-March/000494.html
http://rubyforge.org/pipermail/eventmachine-talk/2007-March/000510.html



Regards,

Bill
 

khaines

Brian said:
When running a test that primarily involves loading up a few MySQL tables
with ActiveRecord objects, I was surprised to see the Ruby CPU utilization at
93% and the MySQL CPU utilization at 7%. I would expect this workload to be
heavier on MySQL than that.

I would think inserts (particularly with updating several foreign key
indices) would tax the database more than Ruby.

Has this been other folks' experience? Is running in the test environment
incredibly different than production with respect to CPU utilization? I
suppose my next step is to run in production to see what kind of results I
get.

No, that isn't surprising. Any ORM trades CPU utilization outside of the
database for convenience in the data representation. AR is fairly
heavyweight in that regard, so it's doing a lot of work to give you the
API that it does. Latency to a database can be a significant bottleneck
to some applications, but relatively speaking, CPU utilization by the db
will usually be a small fraction of the CPU utilization of the ORM-using
application that is talking to the db.


Kirk Haines
 

Ben Bleything

Hey Brian, could you start this conversation on the Rails list? It's a
pretty narrow topic.

For what it's worth, I'm using AR extensively outside rails and would
have missed this had it started on the Rails list. I like it when AR
discussions happen here ;)

Ben
 

Brian Adkins

Kirk said:
No, that isn't surprising. Any ORM trades CPU utilization outside of
the database for convenience in the data representation. AR is fairly
heavyweight in that regard, so it's doing a lot of work to give you the
API that it does. Latency to a database can be a significant bottleneck
to some applications, but relatively speaking, CPU utilization by the db
will usually be a small fraction of the CPU utilization of the ORM-using
application that is talking to the db.

I agree about ORMs trading CPU for convenience, but I've used other ORMs
(such as Hibernate with Java), and I don't recall the CPU ratio being
quite so high. In fact, my prior experience has always been that a piece
of code like the one I'm using would be database-bound, not CPU-bound. I
mean, seriously, all the code is doing is a bunch of database inserts!

Also, since I'm not using stored procedures with Rails, MySQL has to
work harder to parse the statements, so if stored procedures were used,
the Rails/MySQL CPU ratio would be even higher than 93/7.

Frankly, it's not worth my time to whip up a Java/Hibernate example for
comparison, but I'm pretty darn curious now - both at what the CPU ratio
would be as well as the time for completion.

I'm willing to live with it, but that doesn't mean I have to like it :)
 

Brian Adkins

Ben said:
For what it's worth, I'm using AR extensively outside rails and would
have missed this had it started on the Rails list. I like it when AR
discussions happen here ;)

Ben

I never saw Jeremy's message :( I don't know if the problem is with my
ISP (Bellsouth DSL) or my mail reader (Thunderbird), but this isn't the
first time that I've missed a relevant message. It makes me wonder how
many other posts aren't getting through.

Anyway, Jeremy, I actually did search for a rails newsgroup and
Thunderbird didn't show any newsgroups with rails in the name except for
(msn.onstage.motorsite.offroad.favtrails) :) So I did try, then I
searched this newsgroup and noticed a fair amount of Rails related
posts, so I thought it would be ok to post it here. Also, in this case,
even though Rails may be slow, I do think the speed of Ruby is an
important factor in my results.

If you can point me to the Rails newsgroup, I'll try subscribing even if
it doesn't show in Thunderbird. If you're referring to a mailing list
instead, that is less desirable to me, but I could make use of it for
Rails posts that are less relevant to Ruby.

Brian
 

Sam Smoot

Brian said:
I agree about ORMs trading CPU for convenience, but I've used other ORMs
(such as Hibernate with Java), and I don't recall the CPU ratio being
quite so high. In fact, my prior experience has always been that a piece
of code like the one I'm using would be database-bound, not CPU-bound. I
mean, seriously, all the code is doing is a bunch of database inserts!
[...]

I'd definitely echo the perception that AR is unreasonably taxing on the
CPU. Because it is. Performance just hasn't been a focus for it. Load up
1,000 AR records with 10 attributes and you've got 10,000 strings to
represent the column names. That's just one example. I'd expect Og to be
faster, but that's just a guess since I haven't actually benchmarked it.
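
A rough way to see the allocation side of this (just a sketch I haven't
benchmarked; the connection details are placeholders, it assumes a
populated members table like Brian's, and it counts all new strings,
not just column names):

require 'rubygems'
gem 'activerecord'
require 'active_record'

# Placeholder connection details -- point this at your own database.
ActiveRecord::Base.establish_connection(
  :adapter => 'mysql', :database => 'yourapp_development',
  :username => 'root', :password => '')

class Member < ActiveRecord::Base; end

GC.disable
before = ObjectSpace.each_object(String) { }  # count of live String objects
members = Member.find(:all, :limit => 1000)
after = ObjectSpace.each_object(String) { }
puts "roughly #{after - before} new strings for #{members.size} records"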

I wasn't trying to imply looking at other languages, just maybe other
techniques. As I understand it, Og+Rails has some issues right now
because ActiveSupport and Facets step on each other (that's just what I
gather from the nitro-general list). But I imagine contributing to
resolve the issues between Og & Rails probably isn't as great an
undertaking as making AR faster. Depending on Og's performance, that may
or may not be worthwhile, I suppose.

One other thing to consider though: MySQL is the fastest mainstream
database I know of by a large margin, especially concerning INSERT and
UPDATE performance. So if you were using Oracle or MSSQL before, that
would skew the results a bit more.
 

Avdi Grimm

Brian said:
If you can point me to the Rails newsgroup, I'll try subscribing even if
it doesn't show in Thunderbird. If you're referring to a mailing list
instead, that is less desirable to me, but I could make use of it for
Rails posts that are less relevant to Ruby.

The best place to look for ruby-related mailing lists and newsgroups
is http://www.ruby-forum.com/. You can find a link to the Rails
mailing list from there. You may find it easier to use the ruby-forum
frontend, though, because it's a very high-volume list.

That said, I concur that ActiveRecord has wider applicability than
just Rails, and I find this an interesting topic of discussion.
 

zdennis

Brian said:
When running a test that primarily involves loading up a few MySQL
tables with ActiveRecord objects, I was surprised to see the Ruby CPU
utilization at 93% and the MySQL CPU utilization at 7%. I would expect
this workload to be heavier on MySQL than that.

What is your script doing? Can you post it?


Has this been other folks' experience? Is running in the test
environment incredibly different than production with respect to CPU
utilization?

I think it depends more on what you're doing and how you're doing it.
I've seen CPU and memory issues with AR before, but these have all
been fixed by understanding how and when to do things in AR (and in
Ruby). This also depends on the hardware differences between your
development, test and production environments. Disk speed, memory and
CPU(s) can have a lot to do with the differences between the environments.
I suppose my next step is to run in production to see what
kind of results I get.

I'm running the test from the root of my Rails project via:

What your program is doing and how it is doing it will give more
insight into your issue than just profiler output. Have you isolated
the problem to a particular block of code?

Zach
 

M. Edward (Ed) Borasky

Avdi said:
The best place to look for ruby-related mailing lists and newsgroups
is http://www.ruby-forum.com/. You can find a link to the Rails
mailing list from there. You may find it easier to use the ruby-forum
frontend, though, because it's a very high-volume list.

That said, I concur that ActiveRecord has wider applicability than
just Rails, and I find this an interesting topic of discussion.

Yes, I'd like to see this discussion continue on the Ruby list, because
there are some things to be learned from experiences with ActiveRecord.
So, some questions for Brian:

1. What's your platform, and what version of Ruby are you running?
2. Is it possible for you to abstract a subset of your application as a
benchmark, suitable for profiling?
3. Is it "simple enough" that it will "probably work" with the recent
jRuby implementation of Rails?
 

Brian Adkins

zdennis said:
What is your script doing? Can you post it?

I created a smaller test that I could post that exhibits the same
characteristics:

class PerfTestController < ApplicationController
  def index
    t1 = Time.now
    3000.times do
      member = Member.new
      member.first_name = 'Fred'
      member.last_name = 'Flintstone'
      member.address1 = '123 High St.'
      member.city = 'Reykjavik'
      member.state = 'Michigan'
      member.email = '(e-mail address removed)'
      member.save!
    end
    t2 = Time.now
    puts "Time elapsed = #{t2-t1}"
  end
end

That took 35.7 seconds (84 inserts per second) on a dual core 2 GHz AMD
Opteron. It pegged Mongrel and MySQL didn't break a sweat.

I just ran another test with a short ruby program inserting records
directly using the mysql gem and it only took 1.6 seconds (1,875 inserts
per second!), and the CPU utilization was as it should be - the MySQL
CPU was ten times as much as Ruby. So it definitely appears that
Rails/ActiveRecord is about 22 times slower than a straight Ruby
program - wow!
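
A direct-insert script of that sort looks roughly like this (the
connection details and e-mail address here are placeholders):

require 'mysql'

# Placeholder credentials -- substitute your own host/user/database.
db = Mysql.real_connect('localhost', 'root', '', 'myapp_development')

sql = "insert into members (created_at, updated_at, first_name, last_name,
       address1, city, state, email)
       values (now(), now(), 'Fred', 'Flintstone', '123 High St.',
       'Reykjavik', 'Michigan', 'fred@example.com')"

t1 = Time.now
3000.times { db.query(sql) }
puts "Time elapsed = #{Time.now - t1}"

db.close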

This result makes me feel much better since the performance of Ruby
seems fine. The fact that Rails/ActiveRecord is way slow isn't hurting
me yet, and there is hope it can be sped up since it doesn't appear to
be an inherent problem with Ruby.

Here's the schema for Member:

create table members (
  id int not null auto_increment,
  created_at datetime not null,
  updated_at datetime not null,
  first_name varchar(30) null,
  last_name varchar(30) null,
  address1 varchar(50) null,
  address2 varchar(50) null,
  city varchar(30) null,
  state varchar(5) null,
  email varchar(100) null,
  home_phone varchar(25) null,
  primary key(id)
) engine=InnoDB;

zdennis said:
I think it depends more on what you're doing and how you're doing it.
I've seen CPU and memory issues with AR before, but these have all
been fixed by understanding how and when to do things in AR (and in
Ruby). This also depends on the hardware differences between your
development, test and production environments. Disk speed, memory and
CPU(s) can have a lot to do with the differences between the environments.

What your program is doing and how it is doing it will give more
insight into your issue than just profiler output. Have you isolated
the problem to a particular block of code?

I respectfully disagree. It was the profiler output that showed me where
the time was being spent. I truncated the profiler output before it even
got to my code - it was all Rails code.
 

Brian Adkins

M. Edward (Ed) Borasky said:
Yes, I'd like to see this discussion continue on the Ruby list, because
there are some things to be learned from experiences with ActiveRecord.
So, some questions for Brian:

1. What's your platform, and what version of Ruby are you running?

$ ruby -v
ruby 1.8.4 (2005-12-24) [i486-linux]

$ uname -a
Linux airstream 2.6.17-11-generic #2 SMP Thu Feb 1 19:52:28 UTC 2007
i686 GNU/Linux

$ rails -v
Rails 1.2.1

I'm running Ubuntu 6.10

2. Is it possible for you to abstract a subset of your application as a
benchmark, suitable for profiling?

Well, the "application" in this case is just a simple test for
benchmarking :) See my previous post for the Rails controller code
(what little there is of it).
3. Is it "simple enough" that it will "probably work" with the recent
jRuby implementation of Rails?

I would hope so - it doesn't get much simpler.
 

Devin Mullins

Brian said:
I created a smaller test that I could post that exhibits the same
characteristics:

3000.times do
  member = Member.new
  member.first_name = 'Fred'
  ...
  member.save!
end

Is this being run in development mode, still? Dev mode does database
reflection pretty often -- you might be getting pinged by that.
 

Brian Adkins

Devin said:
Is this being run in development mode, still? Dev mode does database
reflection pretty often -- you might be getting pinged by that.

Nope, it's running in the production environment.

mongrel_rails start -e production
 

M. Edward (Ed) Borasky

Brian said:
M. Edward (Ed) Borasky said:
Yes, I'd like to see this discussion continue on the Ruby list,
because there are some things to be learned from experiences with
ActiveRecord. So, some questions for Brian:

1. What's your platform, and what version of Ruby are you running?

$ ruby -v
ruby 1.8.4 (2005-12-24) [i486-linux]

A couple of suggestions here:

1. Download the latest Ruby source -- 1.8.6 pre something.
2. Compile it for your architecture -- set CFLAGS = "-O2 -march=xxx"

where "xxx" is your architecture -- it's an AMD64 of some kind, right?

If you're running a 64-bit system, make sure you have a recent
64-bit kernel and GCC 4.1 -- older compilers suck wet dog fur on the 64
bit machines.

That should get you somewhere in the 10 - 30 percent speed improvement
over a 486-compiled Ruby 1.8.4. It might be more, but just doing the
1.8.6 and the -O2 / march= stuff is pretty much mandatory.
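
Concretely, from an unpacked Ruby source tree, something along these
lines (the -march value is only an example -- use the one that matches
your CPU):

$ export CFLAGS="-O2 -march=opteron"
$ ./configure --prefix=/usr/local
$ make && sudo make install
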
$ uname -a
Linux airstream 2.6.17-11-generic #2 SMP Thu Feb 1 19:52:28 UTC 2007
i686 GNU/Linux

That should be OK -- most likely it's user time anyhow, not kernel time.
$ rails -v
Rails 1.2.1

I'm running Ubuntu 6.10

That should be fine too. The compiler is probably more important.
Well, the "application" in this case is just a simple test for
benchmarking :) See my previous post for the Rails controller code
(what little there is of it).

Yeah, I saw that.


I would hope so - it doesn't get much simpler.

I know Charles Oliver Nutter reads this list -- he's looking for tests
for the latest jRuby/Rails.
 

Brian Adkins

M. Edward (Ed) Borasky said:
Brian said:
M. Edward (Ed) Borasky said:
1. What's your platform, and what version of Ruby are you running?

$ ruby -v
ruby 1.8.4 (2005-12-24) [i486-linux]
A couple of suggestions here:

1. Download the latest Ruby source -- 1.8.6 pre something.
2. Compile it for your architecture -- set CFLAGS = "-O2 -march=xxx"

where "xxx" is your architecture -- it's an AMD64 of some kind, right?

If you're running a 64-bit system, make sure you have a recent
64-bit kernel and GCC 4.1 -- older compilers suck wet dog fur on the 64
bit machines.

That should get you somewhere in the 10 - 30 percent speed improvement
over a 486-compiled Ruby 1.8.4. It might be more, but just doing the
1.8.6 and the -O2 / march= stuff is pretty much mandatory.

Interesting. Do you mind if I ask where you got the 10 to 30% figure?
Stability is more important to me than raw speed, so I'd prefer to not
use anything newer than Ruby 1.8.5-p12.

I'm running a 32 bit kernel because 64 bit was, let's say, problematic.
My gcc is 4.1.2.
 

Jeremy Kemper

Brian said:
I created a smaller test that I could post that exhibits the same
characteristics:
[...]


Hi Brian,

I wrapped this up in a simple script that anyone with MySQL or SQLite
and the AR gem can run. It benchmarks AR create vs using the db
connection directly. See attached.

Excerpted results on a new MacBook Pro:
                 user     system      total        real
raw quoted   0.460000   0.000000   0.460000 (  0.480184)
create       2.760000   0.080000   2.840000 (  3.225227)

(Nearly 7 times slower.) I haven't tried profiling the methods yet.

In my experience with typical Rails apps, you'll hit a wall with ERB
template rendering much sooner than with Active Record creation. This
is an interesting pursuit nonetheless -- I'm interested to see what
you all come up with.

Best regards,
jeremy

[Attachment: ar_bench.rb -- a script that benchmarks Active Record
create against raw inserts through the database connection.]
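
In outline, the comparison in ar_bench.rb boils down to something like
this (a condensed sketch rather than the full attachment; the connection
settings and e-mail address are placeholders, and the sketch assumes a
members table like Brian's already exists, while the attached script
also works with SQLite):

require 'rubygems'
require 'benchmark'
gem 'activerecord'
require 'active_record'

class Member < ActiveRecord::Base
  # Placeholder connection -- adjust for your setup.
  establish_connection :adapter => 'mysql', :database => 'member_bench',
                       :username => 'root', :password => ''

  FIXTURE = { :first_name => 'Fred', :last_name => 'Flintstone',
              :address1 => '123 High St.', :city => 'Reykjavik',
              :state => 'Michigan', :email => 'fred@example.com' }

  # The full Active Record path: instantiate, type-cast, quote, save.
  def self.create_fixture
    create! FIXTURE
  end

  # The raw path: quote the values ourselves and hand SQL to the connection.
  def self.insert_quoted_fixture
    keys, values = [], []
    FIXTURE.each do |key, value|
      keys   << connection.quote_column_name(key)
      values << connection.quote(value, columns_hash[key.to_s])
    end
    connection.insert "insert into #{table_name} (#{keys * ','}) " +
                      "values (#{values * ','})"
  end
end

n = (ARGV.shift || 3000).to_i
Benchmark.bmbm do |x|
  x.report('raw quoted') { n.times { Member.insert_quoted_fixture } }
  Member.delete_all
  x.report('create')     { n.times { Member.create_fixture } }
  Member.delete_all
end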
 
