bob gailer wrote:
Kent Johnson wrote:
On Wed, Apr 23, 2008 at 9:46 AM, bob gailer <[EMAIL PROTECTED]> wrote:

    Every time someone recommends Queue I think "oh boy this will
    really help me". Then I go to the Library Reference, read the
    Queue docs and think "oh boy who can help me understand this".
    Even the sample code is confusing.


Can you say what is confusing about it?

"The Queue module implements a multi-producer, multi-consumer FIFO queue."

I understand producer, consumer, FIFO.

I don't understand multi-

That means -- and this is perhaps Queue's raison d'être -- that
several threads can read from and write to it "simultaneously"
(from their perspective) without having to worry about locks
and the usual problems of a memory structure shared between
threads.
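For example, here's a tiny sketch of my own (not from the docs) with two producer threads and two consumer threads sharing a single queue. Notice that nothing locks around put & get themselves; the Queue handles that internally. (The try/except import just covers the later rename of the module to `queue` in Python 3; the `producer`/`consumer` helper names and the plain `results` list are illustrative.)

```python
import threading

try:
    import queue           # Python 3 name
except ImportError:
    import Queue as queue  # Python 2 name

q = queue.Queue()
results = []
results_lock = threading.Lock()  # the plain list is NOT a Queue, so guard it

def producer(items):
    # Several producers can call q.put() concurrently; Queue does the locking.
    for item in items:
        q.put(item)

def consumer(count):
    # Several consumers can call q.get() concurrently as well.
    for _ in range(count):
        item = q.get()  # blocks until an item is available
        with results_lock:
            results.append(item)

threads = [threading.Thread(target=producer, args=([1, 2, 3],)),
           threading.Thread(target=producer, args=([4, 5, 6],)),
           threading.Thread(target=consumer, args=(3,)),
           threading.Thread(target=consumer, args=(3,))]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All six items arrive, in some interleaved order.
```

The arrival order of items depends on thread scheduling, but none are lost or duplicated -- that's the "multi-" part.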

"It is especially useful in threads programming when information must be exchanged safely between multiple threads."

I understand threads. I've written some (to me fairly sophisticated) programs using Threading and conditions.

I understand that threads might want to exchange information.

I guess that queue supports the exchange by receiving and releasing items. Is that true?

It supports it by letting, say, thread tA write to queue qA, which
thread tB reads from. (Using put & get, optionally with timeouts).
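To show what the timeouts look like, here's a small sketch (again mine, not from the docs) using a queue from a single thread. A get with a timeout returns an item if one arrives in time, and raises the module's Empty exception otherwise. (As before, the try/except import only covers the Python 3 rename of the module.)

```python
try:
    import queue           # Python 3 name
except ImportError:
    import Queue as queue  # Python 2 name

q = queue.Queue()
q.put("hello")

# Something is queued, so this returns immediately.
item = q.get(timeout=1.0)

# Nothing is left now: this blocks for ~0.1s and then raises queue.Empty.
timed_out = False
try:
    q.get(timeout=0.1)
except queue.Empty:
    timed_out = True
```

put() accepts a timeout too, which matters when the queue was created with a maxsize and might be full (in which case it raises the module's Full exception).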

I don't know what "safely" means.

If you've done some thread programming, I assume you're familiar with
the potential pitfalls, including race conditions, deadlocks and so
on? "Safely" means "without your having to worry about those things".

"The Queue class in this module implements all the required locking semantics." I have no idea what that means nor does any of the ensuing documentation explain.

Normally, if you want a thread to access some data which *might*
be written to by another thread at, effectively, the same time, each
thread must lock the data first: acquire a flag which grants it
exclusive access until the lock is released. (The stretch of code
protected in this way is sometimes called a critical section.) Without
that lock, it's possible for one thread to be half-way through its
update when it is switched out and the other thread switched in. Now
the data is in an intermediate state (which you don't want).
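To make that concrete, here's a little sketch of the lock-around-a-critical-section pattern described above, using threading.Lock (the names `counter` and `increment` are just illustrative). Four threads each add 1 to a shared counter many times; the lock guarantees the read-modify-write is never interleaved:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # The with-block is the critical section: only one thread at a
        # time may hold the lock, so this read-modify-write can never be
        # interleaved with another thread's.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# counter is exactly 4 * 100000 == 400000
```

Queue does this kind of locking for you internally around put & get -- that's what "implements all the required locking semantics" is getting at.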

Do you have a specific use in mind?

I have an application that uses Threading. It is not a producer/consumer application -- just a bunch of threads that are started at the same time. And they do not communicate with each other, just with the main thread. But that seems to be true of Queue also.

Yes, that's a typical setup for Queue: the main thread creates a
Queue and passes it to each thread it creates. Say you had an example
where you had a pool of worker threads, each one performing a
calculation. A common approach is to have two Queues: request &
response. The code would look something like this (entirely untested):

<code>
import threading
import Queue

class CalcThread(threading.Thread):
    def __init__(self, requests, responses):
        threading.Thread.__init__(self)
        # Daemon threads won't keep the process alive
        # once the main thread finishes.
        self.setDaemon(True)
        self.requests = requests
        self.responses = responses

    def run(self):
        while True:
            # get() blocks until a request is available.
            a, b = self.requests.get()
            self.responses.put(a + b)

if __name__ == '__main__':
    requests = Queue.Queue()
    responses = Queue.Queue()
    # A pool of five workers, all reading from the same request queue.
    for i in range(5):
        CalcThread(requests, responses).start()

    requests.put((1, 5))
    requests.put((2, 6))

    print responses.get()
    print responses.get()

</code>

As it stands this code is fairly stunted, but
I'm trying to keep it simple enough to understand
easily.

TJG
_______________________________________________
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor
