Re: Binary search tree

2007-11-13 Thread Scott Sandeman-Allen
On 11/13/07, Terry Reedy ([EMAIL PROTECTED]) wrote:
>"Scott SA" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]
>| On 11/12/07, Scott SA ([EMAIL PROTECTED]) wrote:
>| I decided to test the speeds of the four methods:
>|
>|    set_example
>|    s = set()
>|    for url in urls

Re: Binary search tree

2007-11-13 Thread Gabriel Genellina
On Mon, 12 Nov 2007 16:21:36 -0300, Scott SA <[EMAIL PROTECTED]> wrote:
> I decided to test the speeds of the four methods:
(but one should always check for correctness before checking speed)
> def dict_example(urls):
>     d = {}
>     for url in urls:
>         if url in d:
>             d[
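The quoted dict_example is cut off at `d[`. A plausible completion of the counting idiom it starts (everything past the truncation point is an assumption, not the poster's actual code):

```python
def dict_example(urls):
    # Count how often each URL occurs; the dict value is the count.
    d = {}
    for url in urls:
        if url in d:
            d[url] += 1
        else:
            d[url] = 1
    return d

counts = dict_example(["a", "b", "a"])
```

With the counts in hand, the URLs seen more than once are simply the keys whose value exceeds 1.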

Re: Binary search tree

2007-11-12 Thread Terry Reedy
"Scott SA" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]
| On 11/12/07, Scott SA ([EMAIL PROTECTED]) wrote:
| I decided to test the speeds of the four methods:
|
|    set_example
|    s = set()
|    for url in urls:
|        if not url in s:
|            s.add(url)
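The quoted set_example ends at `s.add(url)`, so whatever it did with the deduplicated URLs is lost. A minimal self-contained sketch of that idiom, assuming the goal is to keep the first occurrence of each URL in order (the returned list is an assumption):

```python
def set_example(urls):
    # Track URLs already seen in a set; membership tests are O(1) on average.
    s = set()
    result = []
    for url in urls:
        if url not in s:
            s.add(url)
            result.append(url)
    return result

unique = set_example(["a", "b", "a", "c"])
```

If order does not matter, `set(urls)` alone does the same deduplication in one call.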

Re: Binary search tree

2007-11-12 Thread Scott SA
On 11/12/07, Scott SA ([EMAIL PROTECTED]) wrote:
Uhm, sorry, there is a slightly cleaner way of running the second option I presented (sorry for the second post).
>If you would find an index and count useful, you could do something like this:
>
>for idx in range(len(urls)):
>    uniqu
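The quoted loop is cut off at `uniqu`, so the actual body is unknown. A hypothetical sketch of the index-and-count idea it gestures at, using `enumerate` rather than `range(len(urls))` (the function name and return shape are invented for illustration):

```python
def indexed_duplicates(urls):
    # Map each URL to the list of indices where it appears.
    positions = {}
    for idx, url in enumerate(urls):
        positions.setdefault(url, []).append(idx)
    # Keep only URLs that appear more than once, with their positions.
    return {url: idxs for url, idxs in positions.items() if len(idxs) > 1}
```

This gives both the count (`len(idxs)`) and the indices in one pass.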

Re: Binary search tree

2007-11-12 Thread Scott SA
On 11/12/07, Michel Albert ([EMAIL PROTECTED]) wrote:
>On Nov 9, 11:45 pm, Bruno Desthuilliers <[EMAIL PROTECTED]> wrote:
>> [EMAIL PROTECTED] wrote:
>>
>> > Hi,
>>
>> > I have to get list of URLs one by one and to find the URLs that I have
>> > more than one time(can't be more than twice).
>>

Re: Binary search tree

2007-11-12 Thread Martin v. Löwis
> Now, I can see that this method has some superfluous data (the `1`
> that is assigned to the dict). So I suppose this is less memory
> efficient. But is this slower then? As both implementations use hashes
> of the URL to access the data. Just asking out of curiosity ;)

Performance-wise, there i
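The question — set membership versus a dict with a dummy `1` value — is easy to measure directly. A small `timeit` sketch (the URL list and repetition counts are arbitrary choices for illustration); since both containers hash the key the same way, the two timings should come out very close:

```python
import timeit

# Shared setup: a list of URLs with many repeats.
setup = "urls = ['http://example.com/%d' % (i % 100) for i in range(1000)]"

set_time = timeit.timeit(
    "s = set()\n"
    "for u in urls:\n"
    "    if u not in s:\n"
    "        s.add(u)",
    setup=setup, number=50)

dict_time = timeit.timeit(
    "d = {}\n"
    "for u in urls:\n"
    "    if u not in d:\n"
    "        d[u] = 1",
    setup=setup, number=50)

print("set: %.4fs  dict: %.4fs" % (set_time, dict_time))
```

The dict does carry the superfluous value objects, so the set wins slightly on memory; for lookup speed the hash of the key dominates in both cases.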

Re: Binary search tree

2007-11-12 Thread Michel Albert
On Nov 9, 11:45 pm, Bruno Desthuilliers <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
>
> > Hi,
>
> > I have to get list of URLs one by one and to find the URLs that I have
> > more than one time(can't be more than twice).
>
> > I thought to put them into binary search tree, this way the

Re: Binary search tree

2007-11-09 Thread Bruno Desthuilliers
[EMAIL PROTECTED] wrote:
> Hi,
>
> I have to get list of URLs one by one and to find the URLs that I have
> more than one time(can't be more than twice).
>
> I thought to put them into binary search tree, this way they'll be
> sorted and I'll be able to check if the URL already exist.

What ab
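Bruno's reply is cut off, but the thread's consensus answer — use a built-in hash container instead of a hand-rolled binary search tree — can be sketched in a few lines. A one-pass version that reports exactly the URLs seen more than once (function name is illustrative):

```python
def find_duplicates(urls):
    # 'seen' holds every URL met so far; 'dups' those met a second time.
    seen = set()
    dups = set()
    for url in urls:
        if url in seen:
            dups.add(url)
        else:
            seen.add(url)
    return dups

dups = find_duplicates(["a", "b", "a", "c"])
```

A BST gives O(log n) lookups and sorted order; a hash set gives O(1) average lookups, and `sorted(seen)` recovers the ordering afterwards if it is actually needed.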

Re: Binary search tree

2007-11-09 Thread D.Hering
On Nov 9, 4:06 pm, [EMAIL PROTECTED] wrote:
> Hi,
>
> I have to get list of URLs one by one and to find the URLs that I have
> more than one time(can't be more than twice).
>
> I thought to put them into binary search tree, this way they'll be
> sorted and I'll be able to check if the URL already e

Re: Binary search tree

2007-11-09 Thread Jake McKnight
What if someone wants to implement, say, Huffman compression? That requires a binary tree and the ability to traverse the tree. I've been looking for some sort of binary tree library as well, and I haven't had any luck.

On 11/9/07, Larry Bates <[EMAIL PROTECTED]> wrote:
>
> [EMAIL PROTECTED] wro
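No dedicated tree library is needed for this: nested tuples serve as the binary tree, and the standard library's `heapq` provides the priority queue Huffman's algorithm needs. A minimal sketch (not from the thread; a leaf is a symbol, an internal node a `(left, right)` pair, and the integer tiebreaker keeps heap comparisons away from unorderable trees):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # Heap entries: (frequency, tiebreaker, tree).
    heap = [(freq, i, sym)
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    # Repeatedly merge the two least-frequent trees into one node.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    # Traverse the finished tree: left edge = "0", right edge = "1".
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("aaabbc")
```

More frequent symbols end up nearer the root, so they get shorter codes; the traversal is the tree walk the post asks about.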

Re: Binary search tree

2007-11-09 Thread Neil Cerutti
On 2007-11-09, Larry Bates <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
>> I have to get list of URLs
>> that I have more than one time(can't be more than twice).
>>
>> I thought to put them into binary search tree, this way
>> they'll be sorted and I'll be

Re: Binary search tree

2007-11-09 Thread Larry Bates
[EMAIL PROTECTED] wrote:
> Hi,
>
> I have to get list of URLs one by one and to find the URLs that I have
> more than one time(can't be more than twice).
>
> I thought to put them into binary search tree, this way they'll be
> sorted and I'll be able to check if the URL already exist.
>
> Couldn