Problem remotely shutting down a Windows computer with Python

2005-01-02 Thread EW
I have a problem when using the Python script found here:

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/360649

It is a script to remotely shut down a Windows computer.  When I use
it, the computer shuts down, but it doesn't power off the way a
regular shutdown does.  It stays on the "Safe to power off" screen and
I have to push the power button to actually power it off.  Does anyone
know why this happens with this script?  Thanks for any help.
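
For context, the recipe is essentially a thin wrapper around pywin32's
shutdown call, something along these lines (a rough sketch from memory,
not the recipe's exact code; the host name is a placeholder):

import win32api
import win32security
import ntsecuritycon

def remote_shutdown(host, message="Going down", timeout=30, force=True, reboot=False):
    # Enable the remote-shutdown privilege in this process's access token.
    flags = ntsecuritycon.TOKEN_ADJUST_PRIVILEGES | ntsecuritycon.TOKEN_QUERY
    token = win32security.OpenProcessToken(win32api.GetCurrentProcess(), flags)
    priv = win32security.LookupPrivilegeValue(host, ntsecuritycon.SE_REMOTE_SHUTDOWN_NAME)
    win32security.AdjustTokenPrivileges(token, 0, [(priv, ntsecuritycon.SE_PRIVILEGE_ENABLED)])
    # Last argument: reboot after shutdown (False means just shut down / halt).
    win32api.InitiateSystemShutdown(host, message, timeout, force, reboot)

remote_shutdown(r"\\somehost")  # "somehost" is a placeholder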

Eric

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Problem remotely shutting down a Windows computer with Python

2005-01-02 Thread EW
I believe that would shut down the computer you were physically at,
but it wouldn't shut down the computer down the hall over the LAN,
which is what this script is meant to do.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Problem remotely shutting down a Windows computer with Python

2005-01-06 Thread EW
This does exactly what I needed!  Thanks!  Not sure what Windows
Management Instrumentation is, but I'll look into it now.

Eric

-- 
http://mail.python.org/mailman/listinfo/python-list


reading Windows event logs

2009-11-25 Thread EW
Hi All,
 I'm looking for some guidance on a better way to read event logs
from Windows servers.  I've written a handy little app that relies on
WMI to pull the logs, and in all my testing it worked great.  When I
deployed it, however, WMI choked on servers with a lot of logs.  I've
tried pulling the logs with much smaller VB scripts as well and they
still failed, so I'm pretty sure I'm facing a WMI problem and not a
Python or system-resources problem.  So I couldn't effectively get
logs off of domain controllers, for example, or file servers that had
auditing turned on.  Sadly, those are exactly the types of servers
whose logs are most interesting.

 So I'm looking for suggestions on a way to grab that data from
remote machines without using WMI.  I know MS has C libraries for
this, but I haven't touched C in 10 years, so I'm hoping there's a
Python equivalent out there somewhere.  Any advice would be
appreciated.
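
To be concrete, what I'm hoping for is something roughly like this
sketch built on pywin32's win32evtlog module, which as I understand it
wraps the native event log API directly (the server name below is just
a placeholder, and error handling is omitted):

import win32evtlog  # part of pywin32

def read_events(server, log_type="Security", max_events=100):
    # Talk to the EventLog service directly - no WMI involved.
    handle = win32evtlog.OpenEventLog(server, log_type)
    flags = win32evtlog.EVENTLOG_BACKWARDS_READ | win32evtlog.EVENTLOG_SEQUENTIAL_READ
    events = []
    try:
        while len(events) < max_events:
            batch = win32evtlog.ReadEventLog(handle, flags, 0)
            if not batch:   # no more records to read
                break
            events.extend(batch)
    finally:
        win32evtlog.CloseEventLog(handle)
    return events[:max_events]

for ev in read_events(r"\\somedc", "Security", 10):  # "somedc" is a placeholder
    print("%s %s %s" % (ev.TimeGenerated, ev.SourceName, ev.EventID & 0xFFFF))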

Thanks in advance for any help,
Eric
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Confused: Newbie Function Calls

2010-08-11 Thread EW
This will work:

sample_string=""

def gen_header(sample_string=""):
HEADER = """
mymultilinestringhere
"""

sample_string+= HEADER
return sample_string

def gen_nia(sample_string=""):

NIA = """
anothermultilinestringhere
"""
sample_string += NIA
return sample_string

sample_string = gen_header(sample_string)
sample_string = gen_nia(sample_string)
print(sample_string)


And this will also work:


sample_string=""

def gen_header(OtherString):
global sample_string
HEADER = """
mymultilinestringhere
"""

sample_string+= HEADER


def gen_nia(OtherString):
global sample_string
NIA = """
anothermultilinestringhere
"""
sample_string += NIA


gen_header(sample_string)
gen_nia(sample_string)
print(sample_string)




The first one is the better of the two in this example, but the
second one shows you how to use global variables if you really need to
use them.

So your problem was that you thought you were working on a global
variable in your functions when you were not.  Since your def lines
listed sample_string as a parameter, that made it a local variable.  So
when you were doing your += statements you were working on a local
variable and not a global variable.  You were returning the value of
the local variable, but you didn't have anything in the main body of
your script catching that return value.  So simply changing these two
lines:

sample_string = gen_header(sample_string)
sample_string = gen_nia(sample_string)

made the global sample_string variable store the returned values.


If you want to use global variables then you just have to do two
things.  First, make sure you don't have any local variables in the
function with the same name; that's why I changed the parameter name to
OtherString in the def line.  Second, you need a global statement at
the start of your function (global sample_string) that tells Python you
really do want to use that global variable.

Global variables can cause you no end of heartache, so Python forces
you to state explicitly that you want to use them.

Hope that helps.
-- 
http://mail.python.org/mailman/listinfo/python-list


Queue cleanup

2010-08-11 Thread EW
Hi

I'm writing a multithreaded app that relies on Queues to move data
between the threads.  I'm trying to write my objects in a general way
so that I can reuse them in the future, which means they can't assume
how many producer threads and how many consumer threads there will be.
I also might have different consumer threads doing different tasks (for
example, one might write to a log and one might write to SQL), so again
I can't plan for a set ratio of consumers to producers.  It's unknown.

So instead of having one Queue that all the producers put to and all
the consumers get from, I actually have one Queue per producer thread,
which the main body hands to the correct type of consumer thread.  So I
could get something like this, where three producer threads write to
three different Queues, all of which get read by one consumer thread:

P1    P2    P3
  \    |    /
   \   |   /
       C1

So producers 1, 2, and 3 all write to individual Queues, and consumer 1
has a list of those Queues and reads them all.  The problem I'm having
is that those producer threads can come and go pretty quickly, and when
one dies I can clean up the thread with join(), but I'm still left with
its Queue.  So I could get something like this:

P1          P3
  \         /
   \       /
       C1

So here the P2 thread has ended and gone away, but I still have its
Queue lingering.
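
To make that shape concrete, here's a stripped-down sketch of the
fan-in (illustrative names only, not my real app; I'm using the Python 3
queue spelling, it's Queue.Queue on Python 2):

import queue

class Producer:
    """Each producer owns its own Queue (the P1/P2/P3 boxes above)."""
    def __init__(self, name):
        self.name = name
        self.out = queue.Queue()

    def emit(self, item):
        self.out.put((self.name, item))

def consume_once(producer_queues):
    """C1: make one pass over every producer Queue it has been handed."""
    for q in producer_queues:
        try:
            name, item = q.get_nowait()
        except queue.Empty:
            continue
        print("C1 got %r from %s" % (item, name))

producers = [Producer("P1"), Producer("P2"), Producer("P3")]
for p in producers:
    p.emit("some data")
consume_once([p.out for p in producers])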

So on a thread I can use is_alive() to check status and join() to
clean up, but I don't see any analogous functionality for Queues.  How
do I get rid of them?  I thought about putting a suicide message on the
Queue for C1 to read before setting its reference to None, but I'm not
sure setting the variable to None actually makes the Queue go away.  It
could just end up sitting in memory unreferenced, and that's not good.
Additionally, I could have any number of consumer threads reading that
Queue, so once the first one gets the suicide note the other consumer
threads never would.

I figure there has to be an elegant way to manage my Queues, but so
far I can't find it.  Any suggestions would be appreciated, and thanks
in advance for any help.


ps Python rocks.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Confused: Newbie Function Calls

2010-08-11 Thread EW
On Aug 11, 12:39 pm, fuglyducky  wrote:
> On Aug 11, 9:31 am, Pinku Surana  wrote:
>
>
>
>
>
> > On Aug 11, 12:07 pm, fuglyducky  wrote:
>
> > > I am a complete newbie to Python (and programming in general) and I
> > > have no idea what I'm missing. Below is a script that I am trying to
> > > work with and I cannot get it to work. When I call the final print
> > > function, nothing prints. However, if I print within the individual
> > > functions, I get the appropriate printout.
>
> > > Am I missing something??? Thanks in advance
>
> > > 
> > > # Global variable
> > > sample_string = ""
>
> > > def gen_header(sample_string):
> > >     HEADER = """
> > >     mymultilinestringhere
> > >     """
>
> > >     sample_string += HEADER
> > >     return sample_string
>
> > > def gen_nia(sample_string):
> > >     NIA = """
> > >     anothermultilinestringhere
> > >     """
>
> > >     sample_string += NIA
> > >     return sample_string
>
> > > gen_header(sample_string)
> > > gen_nia(sample_string)
>
> > > print(sample_string)
>
> > There are 2 problems with your program.
>
> > (1) If you want to use a global variable in a function, you have to
> > add the line "global sample_string" to the beginning of that
> > function.
>
> > (2) Once you do (1), you will get an error because you've got
> > sample_string as a global and a function parameter. Which one do you
> > want to use in the function? You should change the name of the
> > parameter to "sample" to solve that confusion.
>
> > Here's the result, which works for me:
>
> > sample_string = ""
> > def gen_header(sample):
> >     global sample_string
> >     HEADER = """
> >     mymultilinestringhere
> >     """
> >     sample_string = sample + HEADER
> >     return sample_string
> > def gen_nia(sample):
> >     global sample_string
> >     NIA = """
> >     anothermultilinestringhere
> >     """
> >     sample_string = sample + NIA
> >     return sample_string
> > gen_header(sample_string)
> > gen_nia(sample_string)
> > print(sample_string)
>
> Thanks! That did the trick.
>
> I am a bit confused though. I tried to follow a sample in a book
> (which works) where I didn't have to 1) pass the global variable as a
> parameter into the function, 2) did not have to define the global
> variable within the function. I apologize if this is a super stupid
> question but if it is global, why do I have to pass it into the
> function? Shouldn't the global variable be accessible from anywhere???

If it's a global then you don't have to pass it to the function, but
you do have to include the line that says:
global sample_string
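
One wrinkle worth knowing: you only need the global statement when you
assign to the name inside the function; just reading a global works
without it.  A tiny illustration (the names here are made up):

counter = 0

def show():
    print(counter)      # reading a global needs no declaration

def bump():
    global counter      # rebinding it is what needs the declaration
    counter += 1

bump()
show()                  # prints 1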


Now, if you think the example in the book didn't do that and it still
worked, post that sample and I'm sure somebody can tell you why it
worked.  The book example might be doing something different.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue cleanup

2010-08-11 Thread EW
On Aug 11, 12:55 pm, EW  wrote:
> Hi
>
> I'm writing a multithreaded app that relies on Queues to move data
> between the threads.  I'm trying to write my objects in a general way
> so that I can reuse them in the future so I need to write them in such
> a way that I don't know how many producer and how many consumer
> threads I might need.  I also might have different consumer threads do
> different tasks (for example one might write to a log and one might
> write to SQL) so that again means I can't plan for a set ratio of
> consumers to producers.  So it's unknown.
>
> So this means that instead of having 1 Queue that all the producers
> put to and that all the consumers get from I actually have 1 Queue per
> producer thread  that the main body sends to the correct type of
> consumer thread.  So I could get something like this where 3 producer
> threads write to 3 different Queues all of which get read by 1
> consumer thread:
>
> P1    P2   P3
>      \    |   /
>        \  |  /
>         C1
>
> So producers 1, 2, and 3 all write to individual Queues and consumer 1
> had a list of those Queues and reads them all.  The problem I'm having
> is that those producer threads can come and go pretty quickly and when
> they die I can cleanup the thread with join() but I'm still left with
> the Queue.  So I could get something like this:
>
> P1         P3
>      \    |   /
>        \  |  /
>         C1
>
> So here the P2 thread has ended and gone away but I still have his
> Queue lingering.
>
> So on a thread I can use is_alive() to check status and use join() to
> clean up but I don't see any analogous functionality for Queues.  How
> do I kill them?  I thought about putting a suicide message on the
> Queue and then C1 would read it and set the variable to None but i'm
> not sure setting the variable to None actually makes the Queue go
> away.  It could just end up sitting in memory unreferenced - and
> that's not good.  Additionally, I could have any number of consumer
> threads reading that Queue so once the first one get the suicide note
> the other consumer threads never would.
>
> I figure there has to be an elegant way for managing my Queues but so
> far I can't find it.  Any suggestions would be appreciated and thanks
> in advance for any help.
>
> ps Python rocks.

Whoo..the formatting got torn up!  My terrible diagrams are even more
terrible!  Oh well, I think you'll catch my meaning   :)
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue cleanup

2010-08-11 Thread EW
On Aug 11, 1:18 pm, Paul Rubin  wrote:
> EW  writes:
> > I also might have different consumer threads do
> > different tasks (for example one might write to a log and one might
> > write to SQL) so that again means I can't plan for a set ratio of
> > consumers to producers  So it's unknown.
>
> > So this means that instead of having 1 Queue that all the producers
> > put to and that all the consumers get from I actually have 1 Queue per
> > producer thread
>
> That doesn't sound appropriate.  Queues can have many readers and many
> writers.  So use one queue per task (logging, SQL, etc), regardless of
> the number of producer or consumer threads.  Any producer with an SQL
> request sends it to the SQL queue, which can have many listeners.  The
> different SQL consumer threads listen to the SQL queue and pick up
> requests and handle them.

I thought about doing it that way, and I could, but it still seems
like there should be a way to clean up Queues on my own.  If I did it
that way, I guess I'd be relying on garbage collection when the script
ended to clean up the Queues for me.

What if I want to clean up my own Queues?  Regardless of the specifics
of my current design, I'm just generally curious how people manage
cleanup of their Queues when they don't want them anymore.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue cleanup

2010-08-11 Thread EW
On Aug 11, 1:55 pm, MRAB  wrote:
> EW wrote:
>
> [snip]
>
>
>
> > So here the P2 thread has ended and gone away but I still have his
> > Queue lingering.
>
> > So on a thread I can use is_alive() to check status and use join() to
> > clean up but I don't see any analogous functionality for Queues.  How
> > do I kill them?  I thought about putting a suicide message on the
> > Queue and then C1 would read it and set the variable to None but i'm
> > not sure setting the variable to None actually makes the Queue go
> > away.  It could just end up sitting in memory unreferenced - and
> > that's not good.  Additionally, I could have any number of consumer
> > threads reading that Queue so once the first one get the suicide note
> > the other consumer threads never would.
>
> > I figure there has to be an elegant way for managing my Queues but so
> > far I can't find it.  Any suggestions would be appreciated and thanks
> > in advance for any help.
>
> An object will be available for garbage collection when nothing refers
> to it either directly or indirectly. If it's unreferenced then it will
> go away.
>
> As for the suicide note, if a consumer sees it then it can put it back
> into the queue so other consumers will see it and then forget about the
> queue (set the variable which refers to the queue to None, or, if the
> references are in a list, delete it from the list).

Ok, great.  I wasn't sure about the garbage collection part of it.
That's actually pretty easy.

Thanks!
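
For anyone who finds this thread later, the put-it-back pattern MRAB
describes comes out to something like this sketch (simplified to a
single Queue and a single consumer; names are illustrative, Python 3
spelling):

import queue
import threading

SENTINEL = object()   # the shared "suicide note"

def consumer(q, known_queues):
    while True:
        item = q.get()
        if item is SENTINEL:
            q.put(SENTINEL)          # put it back so any other consumer sees it too
            known_queues.remove(q)   # drop our reference; the Queue can now be collected
            break
        print("handled %r" % (item,))

q = queue.Queue()
known_queues = [q]
worker = threading.Thread(target=consumer, args=(q, known_queues))
worker.start()

q.put("some work")
q.put(SENTINEL)   # the producer is done
worker.join()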
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue cleanup

2010-08-11 Thread EW
On Aug 11, 2:16 pm, Paul Rubin  wrote:
> EW  writes:
> > I thought about doing it that way and I could do it that way but it
> > still seems like there should be a way to clean up Queues on my own.
> > If I did it this way then I guess I'd be relying on garbage collection
> > when the script ended to clean up the Queues for me.
>
> Oh, I see.  As long as it's possible to start new producer or consumer
> threads that touch a queue, obviously that queue has to still be around.
> If the program starts all its threads at the beginning, then runs til
> they exit, then does more stuff, then you could do something like:
>
>     # make dictonary of queues, one queue per task type
>     queues = {'sql': Queue(), 'logging': Queue(), ... }
>
>     for i in 
>        threading.Thread(target=your_handler, args=[queues])
>
>     del queues
>
> and then when all the threads exit, there are no remaining references to
> the queues.  But why do you care?  Queues aren't gigantic structures,
> they're just a list (collections.deque) with an rlock.  It's fine to let
> the gc clean them up; that's the whole point of having a gc in the first
> place.

Well, I cared because I thought garbage collection would only happen
when the script ended - the entire script.  Since I plan on running
this as a service, it'll run for months at a time without ending, so I
thought I was going to have heaps of Queues hanging out in memory,
unreferenced and unloved.  It seemed like bad practice, so I wanted to
get out ahead of it.

But the GC doesn't work the way I thought it did, so there's really no
problem, I guess.  I was just confused about garbage collection.
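
In case anyone else shares my confusion, here's a quick way to watch
CPython reclaim a Queue as soon as the last reference to it disappears
(purely illustrative, Python 3 spelling):

import queue
import weakref

q = queue.Queue()
ref = weakref.ref(q)   # observe the Queue without keeping it alive
del q                  # drop the only strong reference
print(ref())           # prints None: the Queue was reclaimed right away, not at exit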
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue cleanup

2010-08-11 Thread EW
On Aug 11, 2:52 pm, Paul Rubin  wrote:
> EW  writes:
> > Well I cared because I thought garbage collection would only happen
> > when the script ended - the entire script.  Since I plan on running
> > this as a service it'll run for months at a time without ending.  So I
> > thought I was going to have heaps of Queues hanging out in memory,
> > unreferenced and unloved.  It seemed like bad practice so I wanted to
> > get out ahead of it.
>
> Even if GC worked that way it wouldn't matter, if you use just one queue
> per type of task.  That number should be a small constant so the memory
> consumption is small.

Well, I can't really explain it, but one Queue per task just doesn't
feel right for what I'm designing.  It feels like it would lack future
flexibility.  I like having one Queue per producer thread object, so
the person instantiating that object can do whatever they want with
that Queue.  I can't prove I'll need that level of flexibility, but I
don't see why it's bad to have.  It's still a small number of Queues;
it's just a small, variable number of Queues.
-- 
http://mail.python.org/mailman/listinfo/python-list