Take (pixie.uv/uv_hrtime) for instance. It returns a CUInt64.
Then ffi_get_value unpacks the value by casting it to a signed rffi.LONG, which is pixie's Integer type. See: ffi.py#L289
```
user => (uv_hrtime)
-1474318557
user => (type (uv_hrtime))
pixie.stdlib.Integer
```
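The truncation can be illustrated outside pixie. This is a minimal Python sketch (the function name and the sample nanosecond value are hypothetical, not pixie's actual code) of what casting an unsigned 64-bit result through a signed 32-bit long does:

```python
def cast_u64_to_i32(raw):
    # Keep only the low 32 bits of the 64-bit result, then reinterpret
    # them as a signed 32-bit value -- the high bits are simply lost.
    low = raw & 0xFFFFFFFF
    return low - 0x100000000 if low >= 0x80000000 else low

# A plausible uv_hrtime()-style nanosecond reading; the sketch yields a
# negative, truncated number, just like the REPL session above.
print(cast_u64_to_i32(123_456_789_012_345))  # -2045911175
```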
It seems that all numeric types are cast to Integer, which appears to be the equivalent of a signed long and is therefore 32 bits wide on 32-bit systems. long long would be needed to represent 64-bit numbers on 32-bit systems.
Moreover, literals are also affected. For instance:
```
user => 10000000000
1410065408
user => -10000000000
-1410065408
user => (type 10000000000)
pixie.stdlib.Integer
```
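The literal results above are exactly what 32-bit truncation predicts; a minimal sketch (pure arithmetic, no pixie internals):

```python
def truncate_to_i32(n):
    # Keep the low 32 bits, then reinterpret them as a signed 32-bit
    # integer, mimicking a cast to a 32-bit signed long.
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

print(truncate_to_i32(10_000_000_000))   # 1410065408, as in the REPL session
print(truncate_to_i32(-10_000_000_000))  # -1410065408
```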
Not surprisingly, a similar overflow occurs on 64-bit systems, but the literal must be significantly larger.
One might use a BigInteger literal defensively here, but that's unnecessary on 64-bit systems and introduces a performance hit.
The first obvious solution would be to auto-promote wider numeric types to BigInteger in the ffi and the reader on 32-bit systems. I imagine this would increase apparent compatibility between 32-bit and 64-bit systems. On the other hand, it would cause a significant performance hit and might eventually create different compatibility issues, because huge integers on 32-bit systems would become 'wider' than on 64-bit ones.
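A minimal sketch of the auto-promotion idea at read time (read_int is a purely illustrative helper, not pixie's actual reader API): literals that don't fit the host's 32-bit Integer would fall back to BigInteger.

```python
INT32_MIN, INT32_MAX = -(1 << 31), (1 << 31) - 1

def read_int(token):
    # Hypothetical reader step on a 32-bit host: parse the literal, then
    # pick the boxed type based on whether the value fits in 32 bits.
    value = int(token)
    kind = "Integer" if INT32_MIN <= value <= INT32_MAX else "BigInteger"
    return kind, value

print(read_int("42"))           # ('Integer', 42)
print(read_int("10000000000"))  # ('BigInteger', 10000000000)
```

The same check could sit in ffi_get_value, promoting out-of-range ffi results instead of casting them down.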
The other solution would be to implement a LongInteger type that would represent 64-bit numbers on 32-bit machines, and would be the same as Integer on 64-bit machines. Again, I'm not sure what problems are associated with this approach or how it would affect the interpreter.
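A sketch of the LongInteger idea (class and field names are purely illustrative): on a 32-bit host it could box a 64-bit value as two 32-bit machine words, while a 64-bit build would just keep using Integer.

```python
class LongInteger:
    """Hypothetical 64-bit box for 32-bit hosts, stored as two 32-bit words."""

    def __init__(self, value):
        # Split the (two's-complement) 64-bit value into high and low words.
        self.hi = (value >> 32) & 0xFFFFFFFF
        self.lo = value & 0xFFFFFFFF

    def to_int(self):
        # Recombine the words and reinterpret as a signed 64-bit value.
        v = (self.hi << 32) | self.lo
        return v - (1 << 64) if v >= (1 << 63) else v

print(LongInteger(10_000_000_000).to_int())  # 10000000000, no truncation
print(LongInteger(-5).to_int())              # -5
```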
I'm not sure how to proceed and what solution would be the most beneficial for the language.
I'm using pixie on 32-bit armhf.