Re: 2.3 -> 2.4: long int too large to convert to int
Grant Edwards wrote:
> I give up, how do I make this not fail under 2.4?
>
> fcntl.ioctl(self.dev.fileno(),0xc0047a80,struct.pack("HBB",0x1c,0x00,0x00))
>
> I get an OverflowError: long int too large to convert to int
>
> ioctl() is expecting a 32-bit integer value, and 0xc0047a80 has
> the high-order bit set. I'm assuming Python thinks it's a
> signed value. How do I tell Python that 0xc0047a80 is an
> unsigned 32-bit value?
>
You could sort of fake it like this:

import struct

def unsigned(val):
    # reinterpret the unsigned 32-bit value as a signed 32-bit int
    return struct.unpack('i', struct.pack('I', val))[0]

fcntl.ioctl(self.dev.fileno(), unsigned(0xc0047a80), ...)

but good luck writing a docstring explaining why a function called
"unsigned" takes a positive long and returns a negative int... ;)
Chris Perkins
str.count is slow
It seems to me that str.count is awfully slow. Is there some reason
for this?
Evidence:
## str.count time test

import string
import time
import array

s = string.printable * int(1e5)   # 10**7 character string
a = array.array('c', s)
u = unicode(s)

RIGHT_ANSWER = s.count('a')

def main():
    print 'str:', time_call(s.count, 'a')
    print 'array: ', time_call(a.count, 'a')
    print 'unicode:', time_call(u.count, 'a')

def time_call(f, *a):
    start = time.clock()
    assert RIGHT_ANSWER == f(*a)
    return time.clock() - start

if __name__ == '__main__':
    main()

## end
On my machine, the output is:
str: 0.29365715475
array: 0.448095498171
unicode: 0.0243757237303
If a unicode object can count characters so fast, why should an str
object be ten times slower? Just curious, really - it's still fast
enough for me (so far).
This is with Python 2.4.1 on WinXP.
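In case anyone wants to reproduce it, here is a rough cross-check of the same
comparison using the timeit module; this is just a sketch, and the numbers
will of course vary with the machine and Python build:

## timeit cross-check (sketch)
import timeit

setup = "import string; s = string.printable * int(1e5); u = unicode(s)"
for expr in ("s.count('a')", "u.count('a')"):
    t = timeit.Timer(expr, setup).timeit(number=10)
    print '%-12s %f' % (expr, t)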
Chris Perkins
