MySQLdb


Daniel Bowett

I have just started playing around with MySQLdb for a project I am planning.

As a test I have written a script that executes 3000 insert statements
on a table. The table contains 10 fields with a mix of text and numbers
- it's a product table for a website, e.g. UPC, ProductName, Price, etc.

The problem I have is that it takes just over two minutes to execute the
3000 insert statements, which seems really slow! I am running it on a
machine with a 1.5 GHz Pentium M processor and a gig of RAM. I don't think
the machine is to blame for the speed because during execution the
processor sits at about 10% and there is loads of free RAM.

Does anyone know if this sort of speed sounds right?

Cheers,

Dan.
 

nobody

Daniel said:
The problem I have is that it takes just over two minutes to execute the
3000 insert statements, which seems really slow!

Are you creating a new DB connection for every insert?

I just did a test on my system (Athlon 2500+): 3000 rows with an
auto_increment field and a randomly generated 128-character field. 1.9
seconds.
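
For reference, a rough sketch of the two connection patterns -- the products
table and its columns below are assumptions for illustration, not taken from
Daniel's script:

import MySQLdb

# Slow pattern: a new connection (and its handshake/authentication) per insert.
for i in range(3000):
    db = MySQLdb.Connect(user="me", passwd="my password", db="test")
    c = db.cursor()
    c.execute('INSERT INTO products (UPC, ProductName) VALUES (%s, %s)',
              ('0000%05d' % i, 'Product %d' % i))
    db.close()

# Faster pattern: open one connection and reuse one cursor for all 3000 inserts.
db = MySQLdb.Connect(user="me", passwd="my password", db="test")
c = db.cursor()
for i in range(3000):
    c.execute('INSERT INTO products (UPC, ProductName) VALUES (%s, %s)',
              ('0000%05d' % i, 'Product %d' % i))
db.close()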
 

Dennis Lee Bieber

Daniel said:
As a test I have written a script that executes 3000 insert statements
on a table. The table contains 10 fields with a mix of text and numbers
- it's a product table for a website, e.g. UPC, ProductName, Price, etc.

How many indices?

Daniel said:
The problem I have is that it takes just over two minutes to execute the
3000 insert statements, which seems really slow!

I recall reading that, for some RDBMSs, when doing such batch
inserts, they recommend turning off the indices at the start, doing the
inserts, then reactivating the indices -- apparently it is faster to
rebuild an index after the data has been inserted than to continually
update the index.
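
In MySQL this is typically done with ALTER TABLE ... DISABLE KEYS / ENABLE KEYS
(which affects non-unique indexes on MyISAM tables); a rough sketch, with the
products table name and the rows list assumed for illustration:

import MySQLdb

db = MySQLdb.Connect(user="me", passwd="my password", db="test")
c = db.cursor()
rows = [('123456789012', 'Example product', 9.99)]   # assumed sample data

c.execute('ALTER TABLE products DISABLE KEYS')   # stop updating non-unique indexes
for row in rows:
    c.execute('INSERT INTO products (UPC, ProductName, Price) VALUES (%s, %s, %s)',
              row)
c.execute('ALTER TABLE products ENABLE KEYS')    # rebuild the indexes in one pass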

 

Daniel Bowett

Dennis said:
How many indices?

I recall reading that, for some RDBMSs, when doing such batch
inserts, they recommend turning off the indices at the start, doing the
inserts, then reactivating the indices -- apparently it is faster to
rebuild an index after the data has been inserted than to continually
update the index.

UPC is my only index - it's a varchar with 20 characters. I am only
opening the connection once, then doing 3000 execute statements in a for
loop.

I do have two "TEXT" fields in the table which contain the long and
short descriptions. The average length of the long description is about
167 characters, the longest is 1800 characters. Is this what's making it
slow?
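
For context, the table being described might look something like this sketch --
only UPC, ProductName, Price and the two description fields are named in the
thread, and the remaining columns are omitted:

import MySQLdb

db = MySQLdb.Connect(user="me", passwd="my password", db="test")
c = db.cursor()
# Hypothetical schema based on Daniel's description (varchar(20) UPC as the
# only index, plus two TEXT description fields); the other columns are guesses
# and are left out here.
c.execute("""
    CREATE TABLE products (
        UPC              VARCHAR(20) NOT NULL,
        ProductName      VARCHAR(255),
        Price            DECIMAL(10,2),
        ShortDescription TEXT,
        LongDescription  TEXT,
        INDEX (UPC)
    )
""")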
 

Dennis Lee Bieber

Daniel said:
I do have two "TEXT" fields in the table which contain the long and
short descriptions. The average length of the long description is about
167 characters, the longest is 1800 characters. Is this what's making it
slow?

Varying-length fields /might/ have an effect -- I don't have the
experience to know. Sorry.

 

fedor

Hi Daniel,

You should probably take a look at the executemany method of the cursor.
Your insert times might drop by a factor of 20. Here's an example.

Cheers,

Fedor

import time
import MySQLdb


db = MySQLdb.Connect(user="me", passwd="my password", db="test")
c = db.cursor()
n = 0
tic = time.time()
for i in range(3000):
    n += c.execute('INSERT INTO testtable VALUES (%s)', (i,))
toc = time.time()
t1 = toc - tic
print 'separate sql statements: %s, inserted %s records' % (t1, n)


tic = time.time()
n = c.executemany('INSERT INTO testtable VALUES (%s)',
                  [(i,) for i in range(3000)])
toc = time.time()
t2 = toc - tic
print 'all at once %s inserted %s records' % (t2, n)

OUTPUT>>>
separate sql statements: 0.571248054504, inserted 3000 records
all at once 0.0253219604492 inserted 3000 records
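
Note that Fedor's snippet assumes an existing single-column table, and that on a
transactional storage engine (e.g. InnoDB) a db.commit() would be needed to make
the inserted rows permanent; roughly:

import MySQLdb

db = MySQLdb.Connect(user="me", passwd="my password", db="test")
c = db.cursor()
c.execute('CREATE TABLE testtable (n INT)')   # one-column table assumed above
c.executemany('INSERT INTO testtable VALUES (%s)', [(i,) for i in range(3000)])
db.commit()   # a no-op on MyISAM, required on transactional engines like InnoDB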
 
