Brian Kelley
I am trying to use threads and MySQLdb to retrieve data from multiple
asynchronous queries.

My basic strategy is as follows: create two cursors, attach them to the
appropriate database, and then spawn worker functions to execute SQL
queries and process the results.

This works occasionally, but it fails a lot and takes Python down with
it. Sometimes it also loses the connection to the database, and
sometimes I get the error "Commands out of sync; You can't run this
command now", which makes me suspicious. Of course, I could be doing
things completely wrong. If I can't have multiple cursors, by the way,
that's just fine with me; I just thought that I could.

If I have only one thread, or no threads at all, it works just fine. I
have tried using thread-safe Queues to bundle the results, and also
plain lists, with the same outcome.

Can anyone spot anything in the toy code attached below that would
cause this effect? Thanks for any input.
import MySQLdb, thread, time

def cursoriterate(cursor, buffer=100):
    # yield records from the cursor, fetching `buffer` rows at a time
    res = cursor.fetchmany(buffer)
    while res:
        for record in res:
            yield record
        res = cursor.fetchmany(buffer)

def worker(cursor, sql, result):
    try:
        print "executing", sql
        cursor.execute(sql)
        output = []
        for record in cursoriterate(cursor):
            output.append(record)
        result.append(output)
        print "done"
    except:
        # just for testing
        result.append(None)
        raise

for i in range(100):
    sql = ("select target, result, evalue from BLAST_RESULT "
           "where evalue < 0.001")
    db = MySQLdb.connect(user="mergedgraph", host="localhost")
    cursor = db.cursor()
    cursor.execute("USE HPYLORI_YEAST")
    cursor2 = db.cursor()
    cursor2.execute("USE HPYLORI_YEAST")

    result = []
    thread.start_new_thread(worker, (cursor, sql, result))
    thread.start_new_thread(worker, (cursor2, sql, result))
    while len(result) < 2:
        time.sleep(1)
    print "results are full"
    res = result.pop()
    res2 = result.pop()
    if res:
        print len(res)
    if res2:
        print len(res2)
    cursor.close()
    cursor2.close()
    db.close()
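For what it's worth, the Queue-based hand-off I mentioned above looks roughly like this. It is only a sketch of the pattern: the MySQL cursor is replaced by a hard-coded list of fake rows, so no database is needed, and the worker just collects the rows it is handed and puts them on the queue.

```python
import threading
try:
    import Queue as queue   # Python 2 module name
except ImportError:
    import queue            # Python 3 module name

def worker(rows, out):
    # stand-in for the real cursor-consuming worker: gather all
    # rows into a list and hand the list back via the queue
    out.put([r for r in rows])

# fake rows in place of a real BLAST_RESULT cursor
fake_rows = [("target", "result", 0.0005)] * 10

out = queue.Queue()
t1 = threading.Thread(target=worker, args=(fake_rows, out))
t2 = threading.Thread(target=worker, args=(fake_rows, out))
t1.start()
t2.start()
t1.join()
t2.join()

# one result list per worker; Queue.get blocks until one is available
res = out.get()
res2 = out.get()
```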