I'm trying to get curl to traverse a remote HTTP directory, get all the page
names, and do some processing on those files with Python.
For starters, I need to get the directory listing into an array from the
domain. Any ideas on how to do this?
Thanks,
JJ
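If it helps, here is a rough sketch of one way to pull the linked names out of an Apache-style index page from Python alone (no curl). The URL is only a placeholder, and the filtering of the sort/parent-directory links is just a guess at what you'd want to drop:

#!/usr/bin/python
# Sketch: fetch an HTML directory index and collect every linked name.
import urllib2
from HTMLParser import HTMLParser

class IndexParser(HTMLParser):
    """Collect the href attribute of every <a> tag on the page."""
    def __init__(self):
        HTMLParser.__init__(self)
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href':
                    self.links.append(value)

url = 'http://example.com/files/'   # placeholder URL
parser = IndexParser()
parser.feed(urllib2.urlopen(url).read())

# Drop the sorting links and the parent-directory entry Apache adds to listings
pages = [l for l in parser.links if not l.startswith('?') and l != '../']
print pages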
Thanks Jay. This helps!
JJ
On Mon, 2008-12-08 at 10:01 -0800, Jay Deiman wrote:
> Jeremiah Jester wrote:
> > Is anyone on here using the python-rrdtool module for graphing and
> > analysis? If so, do you have some sample scripts you could show me.
Is anyone on here using the python-rrdtool module for graphing and
analysis? If so, do you have some sample scripts you could show me.
There don't seem to be many real-world Python examples out there.
Thanks,
JJ
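For what it's worth, a minimal sketch of the python-rrdtool bindings is below. The data-source definition, file names, step, and values are invented purely to show the shape of the create/update/graph calls, which mirror the rrdtool command line:

#!/usr/bin/python
# Minimal python-rrdtool sketch: create a database, push one sample, graph it.
import rrdtool

# One GAUGE data source, 300-second step, 1200 averaged rows (made-up numbers)
rrdtool.create('temps.rrd', '--step', '300',
               'DS:temp:GAUGE:600:U:U',
               'RRA:AVERAGE:0.5:1:1200')

# Record a reading; 'N' means "now"
rrdtool.update('temps.rrd', 'N:21.5')

# Render the last 24 hours as a PNG
rrdtool.graph('temps.png', '--start', '-86400',
              '--title', 'Temperature',
              'DEF:t=temps.rrd:temp:AVERAGE',
              'LINE1:t#FF0000:temp')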
Thanks for clearing this up for me.
On Tue, 2008-12-02 at 13:25 -0800, Steve Willoughby wrote:
> On Tue, Dec 02, 2008 at 01:08:09PM -0800, Jeremiah Jester wrote:
> > Hello,
> >
> > I'm trying to gather a list of files and md5 hash them to do a
> > checksum.
Hello,
I'm trying to gather a list of files and md5 hash them to do a checksum.
I've created a function for each dictionary. However, when I print out
the dictionary I don't get all the items. Any ideas?
Thanks,
JJ
CODE:
#!/usr/bin/python
import os
import md5   # deprecated in newer Pythons; hashlib is the replacement
import sys

# one dictionary per tree: file path -> md5 hex digest
source = {}
target = {}
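The posted script is cut off at this point, so what follows is only a hedged, self-contained guess at how the rest of such a script might look: a single fill() helper (a hypothetical name) used for both dictionaries, walking a tree and recording an md5 digest per file. The directory paths are placeholders, and hashlib is used as the modern replacement for the md5 module:

#!/usr/bin/python
# Hedged reconstruction: walk each tree and record path -> md5 hex digest.
import os
import hashlib

def fill(table, top):
    """Walk 'top' and store an md5 hex digest for every file found."""
    for root, dirs, files in os.walk(top):
        for name in files:
            path = os.path.join(root, name)
            data = open(path, 'rb').read()
            table[path] = hashlib.md5(data).hexdigest()

source = {}
target = {}
fill(source, '/tmp/source')   # placeholder directories
fill(target, '/tmp/target')

# Print every entry to check that nothing is being dropped
for path in sorted(source):
    print path, source[path]
print len(source), 'source entries,', len(target), 'target entries'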