I am experimenting with both unsigned int data types and parameters to the main function in simple C programs. As an experiment, I wrote a program that takes an int from the command line as an argument to main and sums every integer between 0 and that number.

E.g. the program calculates f(n) = 1 + 2 + 3 + ... + n, valid when n > 0.

#include <stdio.h>
#include <stdlib.h>

const unsigned int MAX_NUM = 92681; //Max input that will avoid int overflow later on

unsigned int sum(unsigned int x); 

int main(int argc, char *argv[]) { 

    unsigned int input = atoi(argv[1]); 

    if (input < 0 || input > MAX_NUM) {
        printf("Invalid input! Input must be less than 92682\n");
        exit(0); //If input > MAX_NUM, quit program
    }

    unsigned int result = sum(input);

    printf("Sum to %d = %d\n", input, result);

    return 0;
}

unsigned int sum(unsigned int x) {
    unsigned int sum = 0;
    unsigned int y;
    for (y = 0; y <= x; y++) {
        sum += y;
        printf("Current sum:\t%u\n",sum);
    }
    return sum;
}

The first thing I began to notice was integer overflow once f(n) exceeded 2147483647 - the maximum value for a signed 32-bit int.

Working it out mathematically by hand, I found the maximum inputs for which the results generated by my program would be valid (i.e. before integer overflow) to be 65535 for signed ints and 92681 for unsigned ints.
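
A quick brute-force cross-check of those two thresholds (not in the original post; it assumes 32-bit int/unsigned int and does the running arithmetic in 64 bits so the check itself cannot overflow):

#include <stdio.h>

int main(void) {
    /* Largest n for which n*(n+1)/2 still fits in a signed 32-bit int;
       the comparison is done in long long so it never overflows. */
    long long n = 0;
    while ((n + 1) * (n + 2) / 2 <= 2147483647LL)   /* INT_MAX */
        n++;
    printf("largest valid n for signed int:   %lld\n", n);   /* 65535 */

    /* Same search against the unsigned 32-bit limit. */
    n = 0;
    while ((n + 1) * (n + 2) / 2 <= 4294967295LL)   /* UINT_MAX */
        n++;
    printf("largest valid n for unsigned int: %lld\n", n);   /* 92681 */
    return 0;
}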

Running the program with signed ints produced the expected results - past an input of 65535, the very large positive sum wrapped around to a very large negative number as the integer overflowed.

I then went through and changed every "int" to "unsigned int". Despite this, integer overflow still occurs exactly as if the ints were signed rather than unsigned.

My questions are: a) Why is this? b) How can I make my answer use the whole range of an unsigned int, i.e. 0 through (2^32) - 1 (as I don't need negative values!)?

Thanks very much!

2 Comments

  • Tip from Gauss: n*(n-1)/2. Commented Oct 14, 2014 at 20:58
  • That's how I calculated the max values, i.e. 2^32 = n*(n-1)/2. Commented Oct 14, 2014 at 21:02

1 Answer

You forgot to change the final printf formats from signed to unsigned.

Change:

printf("Sum to %d = %d\n", input, result);

to:

printf("Sum to %u = %u\n", input, result);
               ^^   ^^
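
To see why the wrong format misbehaves: %d makes printf read its argument as a signed int, so on a typical two's-complement machine a large unsigned value comes out negative (strictly speaking, passing a value outside the signed range through %d is undefined behaviour). A minimal demo, using the sum for n = 92681:

#include <stdio.h>

int main(void) {
    unsigned int big = 4294930221u;   /* sum(92681); fits in unsigned int */

    printf("%u\n", big);   /* prints 4294930221 */
    printf("%d\n", big);   /* typically prints -37075: same bits, read as signed */
    return 0;
}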

Note that enabling compiler warnings (e.g. gcc -Wall ...) would have alerted you to this. Always enable compiler warnings and always take heed of them.
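
As an aside (a sketch, not part of the fix above): atoi() returns a signed int, and once input is unsigned the input < 0 test in your code can never be true, which compilers can also warn about. If you want to accept the whole unsigned range robustly, strtoul() from <stdlib.h> is the usual tool, e.g.:

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <n>\n", argv[0]);
        return 1;
    }

    /* strtoul parses straight into an unsigned type, and endptr lets us
       reject trailing junk. It still accepts a leading '-' (the value
       wraps around), so reject that explicitly. */
    char *end;
    unsigned long n = strtoul(argv[1], &end, 10);
    if (argv[1][0] == '-' || *end != '\0' || n > 92681UL) {
        fprintf(stderr, "Invalid input! Input must be 0..92681\n");
        return 1;
    }

    printf("parsed n = %lu\n", n);
    return 0;
}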

4 Comments

  • Well if this teaches you to always enable compiler warnings then it will have been worth it. ;-)
  • @PaulR Not sure what OP is using but I wish MSVC would finally start emitting warnings for this... /analyze kinda does but is terribly slow and doesn't warn about this particular case (signed/unsigned mismatch). Oh well, at least we'll finally get C99 (sn)printf in the "14" version, sigh... ;)
  • @user2802841: I didn't realise MSVC was still lagging behind on this, although I guess I shouldn't be surprised - most other compilers have been generating warnings for printf et al for many years now of course.
  • @user2802841: For such printf warnings to be helpful, they should take into account how the arguments were produced. If u is of type unsigned char, requiring an (unsigned) cast within printf is mindless pedantry, especially since there's no reason why a non-obtuse implementation wouldn't treat signed and unsigned types interchangeably when dealing with values that are within range of both.
