On 23-2-2013 16:45, Paul Moore wrote:
> I need to transfer some data (nothing fancy, some dictionaries, strings,
> numbers and lists, basically) between 2 Python processes. However, the data
> (string values) is potentially not ASCII, but the transport is (I'm piping
> between 2 processes, but thanks to nasty encoding issues, the only
> characters I can be sure won't be mangled are ASCII).
>
> What's the best ASCII-only protocol to use that's portable between versions
> of Python back to about 2.6/2.7 and in the stdlib, so I don't need external
> modules?
>
> At the moment, I'm using
>
> encoded = json.dumps([ord(c) for c in json.dumps(obj)])
> decoded = json.loads(''.join([chr(n) for n in json.loads(encoded)]))
>
> The double-encoding ensures that non-ASCII characters don't make it into
> the result.
Eww.
>
> This works fine, but is there something simpler (i.e., less of a hack!)
> that I could use? (Base64 and the like don't work because they encode
> bytes->strings, not strings->strings).
For Python < 3.0, strings and bytes are the same type:
>>> import base64
>>> base64.b64encode("hello there")
'aGVsbG8gdGhlcmU='
>>> base64.b64decode(_)
'hello there'
>>>
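
So on 2.x you can base64 the json.dumps output directly. If you want something
that also runs unchanged on 3.x, a small wrapper along these lines should do
(the encode/decode helper names are just for illustration):

import base64
import json

def encode(obj):
    # json.dumps produces text; encode to UTF-8 bytes so b64encode accepts
    # it on Python 3 too (on Python 2 this is effectively a no-op, since the
    # default ensure_ascii=True output is plain ASCII already).
    return base64.b64encode(json.dumps(obj).encode('utf-8')).decode('ascii')

def decode(payload):
    return json.loads(base64.b64decode(payload).decode('utf-8'))

data = {'name': u'Paul', 'note': u'caf\xe9', 'numbers': [1, 2, 3]}
assert decode(encode(data)) == data   # round-trips; the payload is pure ASCII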
Other than that, maybe a simple repr(stuff) / ast.literal_eval(string) might do
the job?
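
One caveat there: on Python 3, repr() no longer escapes non-ASCII characters,
so you'd want ascii() instead. A minimal sketch of the idea (the ascii_repr
name is just mine):

import ast

try:
    ascii_repr = ascii     # Python 3: ascii() escapes non-ASCII characters
except NameError:
    ascii_repr = repr      # Python 2: repr() already escapes them

data = {'greeting': u'caf\xe9', 'counts': [1, 2, 3]}

wire = ascii_repr(data)             # ASCII-only text, safe for the pipe
restored = ast.literal_eval(wire)   # only evaluates literals, unlike eval()
assert restored == data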
Irmen