Accessing a database from a multithreaded application


Alan Kemp

Hi,

I have a problem that is half Python, half design. I have a
multithreaded network server working; each client connection spawns a new
thread that deals with that client for as long as it stays connected
(think ftp-style rather than http-style connections here). Each thread
gets passed a reference to the main server so it can access things like
the list of connected clients, global data, etc.

Now I want to add a database to store usernames and other user data. I
grabbed a copy of pysqlite2, which I have used successfully in the past
in single-threaded applications. My initial plan was for the server to
create a single connection and cursor, and then use some sort of mutex
to control access to them from the various threads. However, it turns
out you can only use a connection or cursor from the thread on which it
was created...
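For the record, this restriction can be demonstrated in a few lines. A minimal sketch using the stdlib sqlite3 module (which pysqlite2 later became), shown in modern Python 3 syntax:

```python
import sqlite3
import threading

# Connection created on the main thread...
conn = sqlite3.connect(":memory:")
errors = []

def use_from_other_thread():
    # ...used from a different thread: sqlite3 refuses this by default
    # (check_same_thread=True) and raises ProgrammingError.
    try:
        conn.execute("SELECT 1")
    except sqlite3.ProgrammingError as e:
        errors.append(e)

t = threading.Thread(target=use_from_other_thread)
t.start()
t.join()
```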

Can someone suggest a better (i.e., valid) strategy for this? Should I
be using a Queue to pass db requests/results across
the thread boundary (erg, that sounds nasty)? Should each client
thread create its own connection/cursor to the database? Would that
even work, or won't there be locking issues?

Any suggestions or pointers in the direction of more information would
be greatly appreciated.

Thanks for your time,

Alan
 

Frithiof Andreas Jensen

Alan Kemp said:
> Can someone suggest a better (i.e., valid) strategy for this?

Pass the connection to the thread as a parameter and use it to create a
cursor local to the thread. You may have to create a connection per thread
as well - in some database implementations connections are bound to the
creating thread just as cursors are. Cursors are throwaway objects anyway.
> Should I be using a Queue to pass db requests/results across
> the thread boundary (erg, that sounds nasty)?

Maybe - it is safest to limit the number of threads/cursors/connections that
can be created by people you do not know.
> Should each client thread create its own connection/cursor to the
> database? Would that even work, or won't there be locking issues?

Yes, and maybe; one connection/cursor for each thread will always work.
> Any suggestions or pointers in the direction of more information would
> be greatly appreciated.

Maybe use queues to pass the requests to a pool of worker threads; there may
be a limit to how many connections/cursors can be created at the same time --
and some DoS tool will find it.
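The queue approach is less nasty than it sounds. A minimal sketch in modern Python 3 syntax (the module is `queue` rather than Python 2's `Queue`): one dedicated worker thread owns the connection, client threads submit (sql, params, reply-queue) tuples and block on the reply. The helper names here are made up for illustration.

```python
import queue
import sqlite3
import threading

db_requests = queue.Queue()
_SHUTDOWN = object()  # sentinel telling the worker to exit

def db_worker():
    # The worker owns the connection; no other thread ever touches it,
    # so sqlite's same-thread rule is satisfied.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    while True:
        item = db_requests.get()
        if item is _SHUTDOWN:
            break
        sql, params, reply = item
        try:
            rows = conn.execute(sql, params).fetchall()
            conn.commit()
            reply.put(("ok", rows))
        except sqlite3.Error as e:
            reply.put(("error", e))
    conn.close()

def db_call(sql, params=()):
    # Called from any client thread: send the request, wait for the answer.
    reply = queue.Queue()
    db_requests.put((sql, params, reply))
    status, result = reply.get()
    if status == "error":
        raise result
    return result

worker = threading.Thread(target=db_worker)
worker.start()

db_call("INSERT INTO users (name) VALUES (?)", ("alan",))
rows = db_call("SELECT name FROM users")

db_requests.put(_SHUTDOWN)
worker.join()
```

A side benefit: since a single Queue feeds the worker, the number of open connections stays fixed no matter how many clients connect, which addresses the DoS concern above.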
 

Christoph Zwerschke

Alan said:
> I have a problem that is half Python, half design. I have a
> multithreaded network server working; each client request spawns a new
> thread which deals with that client for as long as it is connected
> (think ftp-style rather than http-style connections here). Each thread
> gets passed a reference to the main server to access things like the
> list of connected clients, global data, etc.
> ...
> Can someone suggest a better (i.e., valid) strategy for this?

Have a look at DBUtils (http://www.webwareforpython.org/DBUtils).

Basically, there are two possibilities: Persistent connections that are
bound to your server threads, or a pool of connections that are
independent from the server threads. I prefer the first solution if the
number of server threads stays constant. If the server regularly creates
and destroys threads, I prefer pooling. DBUtils supports both.
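The first idea - a persistent connection bound to each server thread - can be sketched without DBUtils using `threading.local` and the stdlib sqlite3 module. This illustrates the concept only, not DBUtils' actual API; the database path and helper name are made up for illustration:

```python
import os
import sqlite3
import tempfile
import threading

db_path = os.path.join(tempfile.mkdtemp(), "app.db")
init = sqlite3.connect(db_path)
init.execute("CREATE TABLE users (name TEXT)")
init.commit()
init.close()

_local = threading.local()

def get_connection():
    # First call in a thread opens a connection; later calls in the
    # same thread reuse it, so the connection lives as long as the
    # thread does - a "persistent" per-thread connection.
    if not hasattr(_local, "conn"):
        _local.conn = sqlite3.connect(db_path)
    return _local.conn

conns = []

def handler():
    conn = get_connection()
    assert get_connection() is conn  # reused within the same thread
    conn.execute("INSERT INTO users (name) VALUES (?)", ("x",))
    conn.commit()
    conns.append(conn)

threads = [threading.Thread(target=handler) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

This works well while the set of server threads is stable; if threads come and go, the connections die with them, which is where a thread-independent pool pays off instead.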

I plan to write a document describing these ideas and the usage of DBUtils
in detail. For now you will need to make do with the inline docstrings.

-- Christoph
 
