
I have the following code:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int first = 0xcdab3412;
        int second = 0xabcd1234;

        int result1 = (second >> 16) | (first & 0xFFFF0000);
        int result2 = (first << 16) | (second & 0x0000FFFF);

        printf("Outputs: %x and %x.\n", result1, result2);

        return 0;
    }

result2 turns out as expected and outputs: 34121234

However, result1 outputs ffffabcd. If I just leave it as (first & 0xFFFF0000) it correctly outputs cdab0000.

Why is result1 ffffabcd and not cdababcd?

  • Two suggestions: use unsigned variables when shifting, especially shifting right, and use a fixed-width integer type (uint32_t) when working with magic numbers like 0xFFFF0000. Commented Sep 18, 2013 at 0:26
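As an illustration of that comment (not from the original post), here is a minimal sketch using the fixed-width uint32_t type; <inttypes.h> pulls in <stdint.h> and also provides PRIx32 for portable printing:

    #include <stdio.h>
    #include <inttypes.h>

    int main(void) {
        uint32_t first = 0xcdab3412;
        uint32_t second = 0xabcd1234;

        /* Right shifts of unsigned values always fill with zeros,
           so no sign extension can occur. */
        uint32_t result1 = (second >> 16) | (first & 0xFFFF0000u);
        uint32_t result2 = (first << 16) | (second & 0x0000FFFFu);

        printf("Outputs: %" PRIx32 " and %" PRIx32 ".\n", result1, result2);
        return 0;
    }

This prints "Outputs: cdababcd and 34121234."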

2 Answers


It's called sign extension: when a negative signed value is shifted right, the vacated high bits are filled with copies of the sign bit. Set the types to unsigned int and it should work.
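A minimal sketch of that fix, using the same values and variable names as the question:

    #include <stdio.h>

    int main(void) {
        unsigned int first = 0xcdab3412u;
        unsigned int second = 0xabcd1234u;

        /* second >> 16 now brings in zero bits rather than copies
           of the sign bit, so the high half comes from first. */
        unsigned int result1 = (second >> 16) | (first & 0xFFFF0000u);

        printf("%x\n", result1); /* prints cdababcd */
        return 0;
    }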




second is a signed integer, and 0xabcd1234 has its high bit set, so its value is negative. When you shift it to the right, the vacated leftmost bits are filled with copies of the sign bit, i.e. with 1s; that is where the ffff in ffffabcd comes from.

If you use unsigned ints you'll get the result you expect, since right shifts of unsigned values always shift in zeros.

Signed vs. unsigned has always been a source of confusion, so you need to tread carefully.
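To make the sign extension visible, and as an alternative if the variables must stay signed, you can also mask after the shift. A sketch (note that right-shifting a negative signed int is implementation-defined in C; mainstream compilers perform an arithmetic shift, which is what produces the ffffabcd result):

    #include <stdio.h>

    int main(void) {
        int second = 0xabcd1234; /* top bit set, so negative as a signed int */

        /* An arithmetic shift copies the sign bit into the
           vacated high bits. */
        printf("%x\n", second >> 16);                /* ffffabcd */

        /* Masking keeps only the low 16 bits of the shifted value. */
        printf("%x\n", (second >> 16) & 0x0000FFFF); /* abcd */
        return 0;
    }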

1 Comment

If you want to see this drive you crazy, change the 'c' in first to a '7' and watch it work even with the signed-int code (and then fail spectacularly in production). I've spent more than a trivial amount of time debugging issues like this one. :)
