Is an int a 64-bit integer in 64-bit C#?

Accepted answer
Score: 47

No. The C# specification rigidly defines that int is an alias for System.Int32 with exactly 32 bits. Changing this would be a major breaking change.

Score: 43

The int keyword in C# is defined as an alias for the System.Int32 type and this is (judging by the name) meant to be a 32-bit integer. As to the specification:

CLI specification section 8.2.2 (Built-in value and reference types) has a table with the following:

  • System.Int32 - Signed 32-bit integer

C# specification section 8.2.1 (Predefined types) has a similar table:

  • int - 32-bit signed integral type

This guarantees that both System.Int32 in the CLR and int in C# will always be 32-bit.
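The two guarantees above are easy to observe at runtime. The program below is my own illustration (not from either specification); it prints the same values whether the process runs as 32-bit or 64-bit:

```csharp
using System;

class Program
{
    static void Main()
    {
        // sizeof(int) is a compile-time constant: always 4 bytes (32 bits),
        // regardless of the bitness of the process.
        Console.WriteLine(sizeof(int));                   // 4
        Console.WriteLine(sizeof(long));                  // 8
        Console.WriteLine(int.MaxValue);                  // 2147483647 (2^31 - 1)
        // int really is just an alias for System.Int32:
        Console.WriteLine(typeof(int) == typeof(Int32));  // True
    }
}
```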

Score: 18

Will sizeof(testInt) ever be 8?

No, sizeof(testInt) is an error. testInt is a local variable. The sizeof operator requires a type as its argument. This will never be 8 because it will always be an error.
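To make the distinction concrete, here is a small sketch (the variable name testInt comes from the question; the rest is mine). The commented-out line is the one that fails to compile:

```csharp
using System;

class Program
{
    static void Main()
    {
        int testInt = 42;
        // The operand of sizeof must be a type name, not a variable, so this
        // line is rejected at compile time ("used like a type" error):
        // Console.WriteLine(sizeof(testInt));

        // Naming the variable's type works, and yields a compile-time constant:
        Console.WriteLine(sizeof(int)); // 4
        Console.WriteLine(testInt);     // 42
    }
}
```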

VS2010 compiles a C# managed integer as 4 bytes, even on a 64 bit machine.

Correct. I note that section 18.5.8 of the C# specification defines sizeof(int) as being the compile-time constant 4. That is, when you say sizeof(int) the compiler simply replaces that with 4; it is just as if you'd said "4" in the source code.

Does anyone know if/when the time will come that a standard "int" in C# will be 64 bits?

Never. Section 4.1.4 of the C# specification states that "int" is a synonym for "System.Int32".

If what you want is a "pointer-sized integer" then use IntPtr. An IntPtr changes its size on different architectures.
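A quick sketch of the difference (my own example): IntPtr.Size reports the pointer width of the current process, while sizeof(int) stays fixed at 4.

```csharp
using System;

class Program
{
    static void Main()
    {
        // 4 in a 32-bit process, 8 in a 64-bit process:
        Console.WriteLine(IntPtr.Size);
        // Tells you which kind of process you are actually in:
        Console.WriteLine(Environment.Is64BitProcess);
        // int, by contrast, is 4 bytes either way:
        Console.WriteLine(sizeof(int)); // 4
    }
}
```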

Score: 12

int is always synonymous with Int32 on all platforms.

It's very unlikely that Microsoft will change that in the future, as it would break lots of existing code that assumes int is 32-bits.

Score: 5

I think what you may be confused by is that int is an alias for Int32, so it will always be 4 bytes, but IntPtr is supposed to match the word size of the CPU architecture, so it will be 4 bytes on a 32-bit system and 8 bytes on a 64-bit system.

Score: 4

According to the C# specification ECMA-334, section "11.1.4 Simple Types", the reserved word int will be aliased to System.Int32. Since this is in the specification it is very unlikely to change.

Score: 3

No matter whether you're using the 32-bit version or 64-bit version of the CLR, in C# an int will always mean System.Int32 and long will always mean System.Int64.

Score: 3

The following will always be true in C#:

  • sbyte - signed 8 bits, 1 byte

  • byte - unsigned 8 bits, 1 byte

  • short - signed 16 bits, 2 bytes

  • ushort - unsigned 16 bits, 2 bytes

  • int - signed 32 bits, 4 bytes

  • uint - unsigned 32 bits, 4 bytes

  • long - signed 64 bits, 8 bytes

  • ulong - unsigned 64 bits, 8 bytes
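Every row of that list can be checked mechanically; this little program (my own sketch) prints each size, and each value is a compile-time constant:

```csharp
using System;

class Program
{
    static void Main()
    {
        // sizeof(T) for these built-in types is a compile-time constant and
        // holds on every platform the CLR runs on.
        Console.WriteLine(sizeof(sbyte));  // 1
        Console.WriteLine(sizeof(byte));   // 1
        Console.WriteLine(sizeof(short));  // 2
        Console.WriteLine(sizeof(ushort)); // 2
        Console.WriteLine(sizeof(int));    // 4
        Console.WriteLine(sizeof(uint));   // 4
        Console.WriteLine(sizeof(long));   // 8
        Console.WriteLine(sizeof(ulong));  // 8
    }
}
```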

An integer literal is just a sequence of digits (e.g. 314159) without any of these explicit types. C# assigns it the first type in the sequence (int, uint, long, ulong) in which it fits. This seems to have been slightly muddled in at least one of the responses above.
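The "first type it fits" rule can be seen directly by asking each literal for its type (the specific literals below are my own picks, chosen to land just past each boundary):

```csharp
using System;

class Program
{
    static void Main()
    {
        // An unsuffixed literal gets the first of (int, uint, long, ulong)
        // that can represent it:
        Console.WriteLine(314159.GetType().Name);               // Int32
        Console.WriteLine(3000000000.GetType().Name);           // UInt32 (> int.MaxValue)
        Console.WriteLine(5000000000.GetType().Name);           // Int64  (> uint.MaxValue)
        Console.WriteLine(10000000000000000000.GetType().Name); // UInt64 (> long.MaxValue)
    }
}
```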

Weirdly the unary minus operator (minus sign) showing up before a string of digits does not reduce the choice to (int, long). The literal is always positive; the minus sign really is an operator. So presumably -314159 is exactly the same thing as -((int)314159). Except apparently there's a special case to get -2147483648 straight into an int; otherwise it'd be -((uint)2147483648). Which I presume does something unpleasant.
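The special case is observable (my own sketch below): the bare literal 2147483648 is a uint, yet -2147483648 as a whole is an int. Without the special case, unary minus on a uint would, as I understand the spec, convert the operand to long, so the result would quietly become 64-bit rather than int.MinValue:

```csharp
using System;

class Program
{
    static void Main()
    {
        // The literal on its own overflows int, so it becomes uint:
        Console.WriteLine(2147483648.GetType().Name);    // UInt32
        // But minus-sign-plus-literal is special-cased to int.MinValue:
        Console.WriteLine((-2147483648).GetType().Name); // Int32
        Console.WriteLine(-2147483648 == int.MinValue);  // True
    }
}
```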

Somehow it seems safe to predict that C# (and friends) will never bother with "squishy name" types for >=128 bit integers. We'll get nice support for arbitrarily large integers and super-precise support for UInt128, UInt256, etc. as soon as processors support doing math that wide, and hardly ever use any of it. 64-bit address spaces are really big. If they're ever too small it'll be for some esoteric reason like ASLR or a more efficient MapReduce or something.

Score: 0

Yes, as Jon said, and unlike the 'C/C++ world', Java and C# aren't dependent on the system they're running on. They have strictly defined lengths for byte/short/int/long and single/double precision floats, equal on every system.
