[Python-Dev] x86_64 Interix - Advice needed on size of long

2007-08-04 Thread Jerker Bäck
Hello all,
I'm in need of advice on how to handle sizeof(long) in Python. I want an
x86_64 build of Python for Interix (that is, the NT POSIX subsystem with the
x86_64 Interix 6 SDK).

My first attempt to build failed because the makefile insisted on linking
shared libraries (which only works on x86 with GNU ld). I tried autoreconf to
get rid of libtool - no luck.
Q1: Is the static build broken?
Q2: Does anyone have a usable Makefile.am?

My second attempt was based on the VS2005 project and the previous Makefile.
Not to tire you with details, but for this to work I need to assign the size
of long explicitly (replace all long types with explicitly sized ones:
int32_t, ssize_t, etc.).

There are two choices: all longs to 64 bits (LP64 model) or all to 32 bits
(LLP64 model). Since Interix uses LP64, the first alternative would be the
logical one, but considering compatibility with the Windows DLL,
performance(?) and whatever, I chose the latter - a choice which would later
get me into trouble.

Here's how I am reasoning:

x64 Windows DLL = LLP64 model => sizeof(long) = 4
x86_64 Interix  = LP64 model  => sizeof(long) = 8

So, since the Windows build works, basically all long types in the code are
32-bit (or at least work if they are 32-bit). 64-bit-dependent values like
pointers have already been taken care of. Right? While it sounds reasonable as
long as one is consistent, it's actually quite difficult to get right (and a
lot of work).
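
A quick check along these lines makes the difference visible (a throwaway C
snippet, nothing more):

#include <stdio.h>

/* Prints the widths of the basic types: an x64 Windows (LLP64) compiler
   should report long=4, an x86_64 Interix (LP64) compiler long=8, while
   pointers are 8 bytes under both models. */
int main(void)
{
    printf("long=%lu  long long=%lu  void*=%lu\n",
           (unsigned long)sizeof(long),
           (unsigned long)sizeof(long long),
           (unsigned long)sizeof(void *));
    return 0;
}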

To be precise, would this be OK?
long PyInt_AsLong(PyObject *);
change to:
int32_t PyInt_AsLong(PyObject *);
or
unsigned long PyOS_strtoul(char*, char**, int);
to:
uint32_t PyOS_strtoul(char*, char**, int);
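
What worries me, roughly, is that a change like this silently truncates on an
LP64 build. A minimal sketch (the helper name is made up, just for
illustration):

#include <stdint.h>
#include <stdio.h>

/* Stand-in for a narrowed API: with an LP64 compiler, long is 64-bit, so
   the cast below silently drops the upper 32 bits. */
static int32_t narrowed_as_long(long v)
{
    return (int32_t)v;
}

int main(void)
{
    long big = 1L << 40;   /* only representable when long is 64-bit (LP64) */
    printf("original=%ld narrowed=%ld\n", big, (long)narrowed_as_long(big));
    return 0;
}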

Thanks, 
Erik






Re: [Python-Dev] x86_64 Interix - Advice needed on size of long

2007-08-05 Thread Jerker Bäck
Hello Martin,
Thanks very much for answering.

> As for the static vs. shared libpython: On Unix, Python is typically
> built as a single executable (only linked shared with the system
> libraries). The challenge is then with extension modules, which are
> shared libraries. In particular, it is a challenge that those want
> to find symbols defined in the executable, without being linked with
> it. So you have three options:
Aha, now it's becoming a bit clearer. As I understand it, I would need an
x86_64 PE GNU ld to get this to work as intended - however, no such thing
exists at the moment.
> 2. Don't use extension modules. Edit Modules/Setup to statically link
>all extension modules into the interpreter binary.
This is the way to go. But how do I do that?

Shell output:
../configure --disable-shared
...
ar cr libpython2.5.a Objects/
ar cr libpython2.5.a Python/
ar cr libpython2.5.a Modules/
ar cr libpython2.5.a Modules/
cc -o python \
Modules/python.o \
libpython2.5.a -lsocket -lm
CC='cc' LDSHARED='ld' OPT='-DNDEBUG -O' ./python -E ../setup.py build;;
Memory fault (core dumped)
make: *** [sharedmods] Error 139

I assume the "Modules/" are the extension modules. To get
them statically linked, the functions must be called somewhere. Statically
linked = "Builtin modules"? You mean I should list all of these in
"Modules/Setup"? FYI I got the "dynload_stub.c" compiled in. BTW, shouldn't
"--disable-shared" take care of this?

> OK in what sense? You making these changes locally? You can make
> whatever changes you please; this is free software. I can't
> see *why* you want to make all these changes, but if you so
> desire...
It's really very simple - I have LP64 libraries (the Interix SDK). To get them
to work with an LLP64 compiler I need explicitly sized types wherever long is
used.
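
Roughly, the problem is that anything laid out in terms of long disagrees
between the two models. A small sketch:

#include <stdint.h>

/* Compiled against the LP64 Interix headers/libraries: a is 8 bytes, b sits
   at offset 8, sizeof(struct rec) == 16.  Compiled by an LLP64 compiler:
   a is 4 bytes, b sits at offset 4, sizeof(struct rec) == 8.  The two sides
   cannot exchange this struct safely. */
struct rec {
    long a;
    long b;
};

/* Pinned to fixed widths, the layout is identical under both models. */
struct rec_fixed {
    int64_t a;
    int64_t b;
};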

FYI: cc is a shell-script wrapper around an x64 PE compiler, which in this
case is the MS x64 compiler v14.00.50727.762 in POSIX mode. It automatically
translates all longs to long long in a 64-bit compile. Thus, cc cannot easily
be used in, e.g., Visual Studio.

> This becoming part of Python? No way. It is intentional that
> PyInt_AsLong returns long (why else would the function be called
> this way?), and it is also intentional that the int type has
> its internal representation as a long.
Oh, it was never my intention to propose a change to the LLP64 model. And
you're right: all exports should follow the LP64 model in a POSIX compile. One
must follow some rules! But you must admit it's tempting with all these:
#if SIZEOF_LONG > 4
< get rid of the 64bit crap >
#endif
In my case the different paradigms are a real pain; I must take them into
account all the time when porting. I can only hope people stop using long in
favour of explicitly sized types or types like size_t, intptr_t, etc. I would
love to see the damn thing obsolete.
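
Something along these lines is what I mean - pick the type for its intent
instead of relying on the width of long:

#include <stddef.h>
#include <stdint.h>

size_t   nbytes;     /* sizes and element counts                      */
intptr_t p_as_int;   /* an integer guaranteed to hold a pointer value */
int64_t  always64;   /* 64-bit under LP64 and LLP64 alike             */
int32_t  always32;   /* 32-bit under both models                      */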

Cheers,
Erik



Re: [Python-Dev] x86_64 Interix - Advice needed on size of long

2007-08-05 Thread Jerker Bäck
Hello Martin,

> > You mean I should list all of these in "Modules/Setup"?
> Exactly so. They are already listed - just uncomment them all
> (with proper command line flags and libraries where necessary).
OK, I will try to get it compiled and tested.
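
If I read Modules/Setup right, the uncommented entries end up looking roughly
like this (module names and flags are only illustrative - the exact list in a
2.5 tree, and what builds on Interix, may differ):

# Modules/Setup (illustrative excerpt)
*static*                     # everything below is linked into the interpreter
posix posixmodule.c          # operating system functions
time timemodule.c            # time operations
math mathmodule.c -lm        # math functions, needs libm
_socket socketmodule.c       # socket module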

Meanwhile, you asked:
> I still don't understand. Are you *certain* that these are LP64
> libraries? Can you kindly refer to some official document that says
> Interix uses LP64 on AMD64?
MS is surprisingly quiet about the POSIX subsystem and the Interix BSD
implementation, so it's hard to find any official info on the net. But here is
one statement from the developers:

In Interix SDK releasenotes.htm (SDK download):
"64-bit compilation supports the LP64 data model."
Interix general:



To find details on how it all really works, one will have to look in the
headers and try different features oneself. (Which actually is pretty fun
because it's really fast and usually works well). The SDK comes with support
for x86, x86_64 (EM64T or AMD64) and IA64.

> And if so, how did Microsoft manage to build them, if their compiler
> does not support LP64? (I see you kind of answer that below - although
> I'm unsure what "translate all longs to long long" means - you mean
> literal text replacement?)
Sure, cc precompiles the source file to a temporary file, flips it, runs a
conversion tool - "l2ll" - so that all longs are converted to long long
(roughly as sketched below), and finally compiles the converted file. The
compile is done via a call from POSIX to the Windows subsystem and the
compiler found in the POSIX path environment. To understand the details, one
has to know that the POSIX environment runs directly on top of the NT kernel
and knows nothing of Windows, Windows paths, etc. This is kind of a compile on
the fly. The libraries are also of two kinds:
1. The core POSIX libraries - part of the OS, built with the DDK tools
2. The Interix SDK - BSD libc and utils, built with cc and Interix gcc (x86 only)
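
Roughly, the l2ll pass amounts to a textual rewrite of the integer types
before the LLP64 compiler ever sees them - something like:

/* As written against the LP64 Interix headers: */
long          file_offset;
unsigned long mask;

/* What the LLP64 MS compiler actually compiles after l2ll (roughly), so the
   variables end up 64-bit as the LP64 headers expect: */
long long          file_offset_converted;
unsigned long long mask_converted;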

The DDK tools are given LP64 support via special defines in the headers. But
there are some unclear issues with functions exported directly from the OS's
native LLP64 libraries (ntdll.dll) - I don't know how this is solved.

Somewhere in here lies the reason why cc is hard to use with Visual Studio and
why the long type is such a nuisance.

I also tried the Intel x64 PE compiler (for better C99 support), but it
produces applications which rely on Windows API functions (e.g. VirtualAlloc,
LoadLibrary) and thus cannot be used under POSIX.
 
Cheers,
Erik
