I'm developing on a Mac running Mac OS X 10.8.2 with the latest Xcode and the stock Python interpreter.
Here's some code I put into fails.cpp:
#include <iostream>
using namespace std;

extern "C" {
    void mysort(long *data, long data2) {
        cout << hex << *data << ' ' << data2 << endl;
    }
}
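For reference, 64-bit OS X uses the LP64 model, so the C long here is 8 bytes. A quick sanity check of the matching size on the Python side (assuming a 64-bit Python build):

import ctypes
print(ctypes.sizeof(ctypes.c_long))  # expect 8, matching the C long above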
Here's some code to call it from Python, which I put in fails.py:
import ctypes
sort_dll = ctypes.CDLL("fails.dylib")
mysort = ctypes.CFUNCTYPE(None, ctypes.POINTER(ctypes.c_long), ctypes.c_long)(sort_dll.mysort)
a = ctypes.c_long(0x987654321)
mysort(ctypes.byref(a), 0x123456789)
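For reference, ctypes also allows declaring the signature by setting argtypes and restype directly on the function object rather than wrapping it in a CFUNCTYPE prototype; a minimal sketch of that alternative idiom (not what fails.py above does):

import ctypes

sort_dll = ctypes.CDLL("fails.dylib")
# Alternative: annotate the foreign function in place instead of wrapping it
sort_dll.mysort.argtypes = [ctypes.POINTER(ctypes.c_long), ctypes.c_long]
sort_dll.mysort.restype = None
sort_dll.mysort(ctypes.byref(ctypes.c_long(0x987654321)), 0x123456789)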
I compile and run with:
c++ -arch x86_64 -o fails.o -c fails.cpp && g++ -o fails.dylib -dynamiclib fails.o && python fails.py
The result is:
987654321 23456789
Why is the 64-bit integer passed by value truncated to 32 bits (0x123456789 comes out as 0x23456789)? Surprisingly, the value read through the pointer to a 64-bit long isn't truncated.
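In case a 32-/64-bit mismatch between the dylib and the interpreter is a suspect, these checks show what's actually built and running:

file fails.dylib
# expect: Mach-O 64-bit dynamically linked shared library x86_64
python -c "import struct; print(struct.calcsize('P') * 8)"
# pointer width of the running interpreter; expect 64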