
I'm trying to convert int to a C-string using this code:

static char* copyString(const char* str)  
{
    char* new_String = malloc(strlen(str) + 1);
    return strcpy(new_String, str);
}
static char* mallocIntToString(const int num)
{
    char s[10]; // *************
    sprintf(s, "%d", num);
    char* copy = copyString(s);

    return copy;
}

I was wondering: what is the biggest number I should use in the line char s[10];?

I used 10 because it's the maximum number of digits for an int. However, this code also works for me when I use char s[9]; to convert the number 2147483647 to a string. How is that possible?

  • Look at <limits.h>. For example, INT_MIN and INT_MAX may be defined as -2147483648 and +2147483647, respectively. Commented Apr 16, 2020 at 17:33
  • Both are wrong, and the code just happens to work. A 10-digit number requires 11 chars, because there must be space for the terminating '\0'. Even so, you will run into trouble with -2147483648, where you need one extra char for the sign. Commented Apr 16, 2020 at 17:33
  • Undefined Behavior Is Undefined. Anything could happen, including "working" today and breaking next week. Commented Apr 16, 2020 at 17:37
  • FYI: there's already strdup, which does exactly what your copyString function does (and with proper error checking). Commented Apr 16, 2020 at 17:38
  • Why bother with copying the resulting string at all? Just allocate an appropriately sized buffer on the heap and use sprintf() to write the result into it. Commented Apr 16, 2020 at 17:42

1 Answer

I was wondering: what is the biggest number I should use in the line char s[10];?

You should size the buffer for the longer of the textual representations of INT_MAX and INT_MIN, both defined in <limits.h>. On a platform with 32-bit int, INT_MIN is -2147483648, which is 11 characters long, so you need 11 + 1 = 12 characters (the additional one is for the string terminator '\0').

A correct implementation would be:

static char* mallocIntToString(const int num)
{
    char *s;

    s = malloc(12); /* enough for "-2147483648" plus '\0' on 32-bit int */
    if (s == NULL)
        return NULL;

    if (sprintf(s, "%d", num) < 0) { /* sprintf signals errors with a negative return */
        free(s);
        return NULL;
    }

    return s;
}

However, this code also works for me when I use char s[9]; to convert the number 2147483647 to a string. How is that possible?

Writing past the end of an array is undefined behavior, and when you invoke undefined behavior anything can happen. The program is not guaranteed to work correctly, but out of pure coincidence it can still seem to work. In your code, the array is defined on the stack, the stack is writable, and writing one or two bytes past the array still lands in valid memory, so nothing visibly breaks. That said, it is still undefined behavior and should be avoided at all costs!

To be 100% clear: you would be invoking undefined behavior even with char s[10]; or char s[11];, because "-2147483648" needs 11 characters plus the terminator, so the last characters would be written out of the array bounds.
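If you want a buffer that is provably large enough at compile time without looking up the exact value of INT_MIN, a common sketch is to allow 3 decimal digits per byte of the type, plus one for the sign and one for the terminator. INT_STR_BOUND and fits_in_bound are names invented here for illustration, not standard identifiers:

```c
#include <stdio.h>
#include <limits.h>

/* Upper bound on the decimal string length of an int:
 * each byte contributes at most 3 decimal digits (log10(256) < 3),
 * plus 1 for a possible '-' and 1 for '\0'.
 * For a 4-byte int this gives 14 >= 12, so it is never too small. */
#define INT_STR_BOUND (3 * sizeof(int) + 2)

/* Returns 1 if num formats into a bound-sized buffer untruncated. */
static int fits_in_bound(int num)
{
    char s[INT_STR_BOUND];
    int len = snprintf(s, sizeof s, "%d", num);
    return len >= 0 && (size_t)len < sizeof s;
}
```

The bound slightly overshoots, but it stays correct for any width of int without hard-coding 12.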


2 Comments

hey Marco, thank you for your very detailed answer, you made it absolutely clear for me :)
@NivBehar you're welcome. If I answered your question you can accept my answer with the checkmark button on the left so that your post can be marked as solved.
