[issue5171] itertools.product docstring missing 'repeat' argument

2011-01-06 Thread mrjbq7

mrjbq7  added the comment:

I noticed a Reddit post[1] today pointing out that the docstring 
should read:

product(*iterables[, repeat]) --> product object

instead of:

product(*iterables) --> product object
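
For reference, the repeat keyword multiplies the supplied iterables, so 
product(seq, repeat=2) is equivalent to product(seq, seq):

```python
from itertools import product

# product('ab', repeat=2) is equivalent to product('ab', 'ab')
pairs = list(product('ab', repeat=2))
print(pairs)  # [('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]
```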

---

[1] 
http://www.reddit.com/r/Python/comments/ex68j/omission_in_docstring_for_itertoolsproduct/

--
nosy: +mrjbq7

___
Python tracker 
<http://bugs.python.org/issue5171>
___
___
Python-bugs-list mailing list
Unsubscribe: 
http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue18195] error when deep copying module is confusing

2013-06-11 Thread mrjbq7

New submission from mrjbq7:

If you have a simple module (say "foo.py"):

$ cat foo.py
bar = 1

You get a confusing error when trying to deep copy it (which I did by accident, 
not because I was intentionally trying to deep copy a module):

Python 2.7.2:

>>> import foo
>>> import copy
>>> copy.deepcopy(foo)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 190, in deepcopy
y = _reconstruct(x, rv, 1, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 334, in _reconstruct
state = deepcopy(state, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 163, in deepcopy
y = copier(x, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 257, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 163, in deepcopy
y = copier(x, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 257, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 190, in deepcopy
y = _reconstruct(x, rv, 1, memo)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy.py",
 line 329, in _reconstruct
y = callable(*args)
  File 
"/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/copy_reg.py",
 line 93, in __newobj__
return cls.__new__(cls, *args)
TypeError: object.__new__(NotImplementedType) is not safe, use 
NotImplementedType.__new__()

Python 3.3.2:

>>> import foo
>>> import copy
>>> copy.deepcopy(foo)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.3/copy.py", line 174, in deepcopy
y = _reconstruct(x, rv, 1, memo)
  File "/usr/lib/python3.3/copy.py", line 301, in _reconstruct
y.__dict__.update(state)
AttributeError: 'NoneType' object has no attribute 'update'

I'm not expecting to be able to deep copy a module, but it would be really 
great if the error message could say something like "deepcopy doesn't work 
for modules" rather than two different cryptic tracebacks that don't really 
explain the problem...
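
Until then, a workaround in user code is to fail early with a readable 
message; safe_deepcopy below is a hypothetical helper, not part of the 
stdlib:

```python
import copy
import types

def safe_deepcopy(obj):
    # Modules are effectively singletons and can't be meaningfully
    # deep copied; raise a clear error instead of a cryptic traceback.
    if isinstance(obj, types.ModuleType):
        raise TypeError("deepcopy doesn't work for modules: %r" % obj.__name__)
    return copy.deepcopy(obj)
```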

Thanks,

--
messages: 190996
nosy: mrjbq7
priority: normal
severity: normal
status: open
title: error when deep copying module is confusing

___
Python tracker 
<http://bugs.python.org/issue18195>
___



[issue6684] "x / 1" and "x * 1" should return x

2009-08-11 Thread mrjbq7

New submission from mrjbq7 :

There are a couple of arithmetic operations that are idempotent, where the 
returned Python object is the same Python object as the input.

For example, given a number:

>>> x = 12345

The abs() builtin returns the same number object if its argument is already 
positive:

>>> id(x)
17124964
>>> id(abs(x))
17124964

The "multiply by zero" operation returns a single "zero" object:

>>> id(x * 0)
16794004
>>> id(x * 0)
16794004

But "multiply by 1" and "divide by 1" do not:

>>> id(x * 1)
17124928
>>> id(x * 1)
17124880
>>> id(x / 1)
17203652
>>> id(x / 1)
17124952
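
For what it's worth, the "multiply by zero" behavior isn't a special case of 
multiplication: in CPython, small integers (roughly -5 through 256) are 
cached singletons, so any expression producing 0 yields the same cached 
object, while larger results are freshly allocated with no identity 
guarantee. A quick illustration:

```python
x = 12345

# 0 lives in CPython's small-int cache, so every zero is the same object
print((x * 0) is (x * 0))  # True
# larger results are freshly allocated; identity is not guaranteed
print((x * 1) is x)
```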

--
messages: 91479
nosy: mrjbq7
severity: normal
status: open
title: "x / 1" and "x * 1" should return x

___
Python tracker 
<http://bugs.python.org/issue6684>
___



[issue6684] "x / 1" and "x * 1" should return x

2009-08-11 Thread mrjbq7

Changes by mrjbq7 :


--
components: +Interpreter Core
versions: +Python 2.6

___
Python tracker 
<http://bugs.python.org/issue6684>
___



[issue17560] problem using multiprocessing with really big objects?

2013-03-27 Thread mrjbq7

New submission from mrjbq7:

I ran into a problem using multiprocessing to create large data objects (in 
this case numpy float64 arrays with 90,000 columns and 5,000 rows) and return 
them to the original python process.

It breaks in both Python 2.7 and 3.3, using numpy 1.7.0 (but with different 
error messages).

Is it possible the array is too large to be serialized (450 million 64-bit 
numbers exceed a 32-bit limit)?
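
A quick back-of-the-envelope check suggests yes; the payload is well past the 
signed 32-bit boundary that the 'i' struct format in the Python 3.3 error 
below refers to:

```python
# size of one result array: 5,000 rows x 90,000 columns of float64
rows, cols = 5000, 90000
nbytes = rows * cols * 8        # 8 bytes per float64
print(nbytes)                   # 3600000000, about 3.35 GiB
print(nbytes > 2**31 - 1)       # True: larger than a signed 32-bit int
```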


Python 2.7
==

Process PoolWorker-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 99, in worker
put((job, i, result))
  File "/usr/lib/python2.7/multiprocessing/queues.py", line 390, in put
return send(obj)
SystemError: NULL result without error in PyObject_Call


Python 3.3
==

Traceback (most recent call last):
  File "multi.py", line 18, in <module>
results = pool.map_async(make_data, range(5)).get(999)
  File "/usr/lib/python3.3/multiprocessing/pool.py", line 562, in get
raise self._value
multiprocessing.pool.MaybeEncodingError: Error sending result: '[array([[ 
0.74628629,  0.36130663, -0.65984794, ..., -0.70921838,
 0.34389663, -1.7135126 ],
   [ 0.60266867, -0.40652402, -1.31590562, ...,  1.44896246,
-0.3922366 , -0.85012842],
   [ 0.59629641, -0.00623001, -0.12914128, ...,  0.99925511,
-2.30418136,  1.73414009],
   ..., 
   [ 0.24246639,  0.87519509,  0.24109069, ..., -0.48870107,
-0.20910332,  0.11749621],
   [ 0.62108937, -0.86217542, -0.47357384, ...,  1.59872243,
 0.76639995, -0.56711461],
   [ 0.90976471,  1.73566475, -0.18191821, ...,  0.19784432,
-0.29741643, -1.46375835]])]'. Reason: 'error("'i' format requires 
-2147483648 <= number <= 2147483647",)'

--
files: multi.py
messages: 185344
nosy: mrjbq7
priority: normal
severity: normal
status: open
title: problem using multiprocessing with really big objects?
versions: Python 3.3
Added file: http://bugs.python.org/file29595/multi.py

___
Python tracker 
<http://bugs.python.org/issue17560>
___



[issue17560] problem using multiprocessing with really big objects?

2013-03-27 Thread mrjbq7

mrjbq7 added the comment:

On a machine with 256GB of RAM, sending arrays of this size makes more sense 
than it would on, say, a laptop...

--

___
Python tracker 
<http://bugs.python.org/issue17560>
___



[issue17560] problem using multiprocessing with really big objects?

2013-03-27 Thread mrjbq7

mrjbq7 added the comment:

> Richard was saying that you shouldn't serialize such a large array,
> that's just a huge performance bottleneck. The right way would be 
> to use a shared memory.

Gotcha. For clarification, my original use case was to *create* them
in the other process (something which took some time, since they were 
calculated and not just random as in the example) and return them to the 
original process for further computation...
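
A minimal sketch of that shared-memory approach using only the stdlib (the 
fill_shared helper and sizes here are illustrative, not from the original 
report): the worker writes its results directly into a shared buffer, so 
nothing large is pickled on the way back.

```python
import multiprocessing as mp

def fill_shared(shared, n):
    # worker computes results in place; no large pickle on return
    for i in range(n):
        shared[i] = i * 2.0

def compute_in_child(n):
    shared = mp.Array('d', n, lock=False)   # shared buffer of C doubles
    p = mp.Process(target=fill_shared, args=(shared, n))
    p.start()
    p.join()
    return list(shared)

if __name__ == "__main__":
    print(compute_in_child(5))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

With numpy, the same shared buffer can be wrapped with numpy.frombuffer to 
get an ndarray view without copying.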

--

___
Python tracker 
<http://bugs.python.org/issue17560>
___