In the following code, assume ints are 16 bits.
int f(void)
{
        int i = 0;
        return i > 0xffff;
}
Because the hexadecimal constant's type is either int (with a value of -1
on a two's-complement machine) or unsigned int (with a value of 65535),
the comparison is true on a non-ANSI C compiler (-Xt mode) and false on
an ANSI C compiler (-Xa and -Xc modes).
An appropriate cast clarifies the code:

i > (int)0xffff
i > (unsigned int)0xffff        /* or */        i > 0xffffU