Hello All,

I wanted to thank Roger Binn for his email. He had the answer to my issue with writing speed, and it actually made an incredible difference in performance. I didn't have to go all the way to changing the synchronous mode (for my app). Previously, I was inserting one record at a time; the key was to write them all in a single transaction. I moved up to a 13 MB file and wrote it to the db in seconds. Now the issue is the 120 MB of RAM consumed by PyParse to read in a 13 MB file. If anyone has thoughts on that, it would be great. Otherwise, I will repost under a more specific subject.

Thanks,
Kevin
db.execute("begin")
while i < TriNum
db.execute("""insert into TABLE(V1_x)
values(%f),""" (data))
i = i + 1
db.execute("commit")