I'm writing a simple utility in C, and I'm writing code to print error messages to stderr.
I have a struct, defined as:
struct arguments
{
    FILE *source;
    int modifier_value;
    int filesize;
};
I have declared a pointer to the above struct, and allocated memory to it:
struct arguments *arg = NULL;
arg = malloc(sizeof(struct arguments));
if (arg == NULL)
{
    /* I know I could use perror() as well, but I like the convention of using
       fprintf() for stdout, stderr, and other file streams, just for symmetry */
    fprintf(stderr, "error: malloc - %s\n", strerror(errno));
    return EXIT_FAILURE;
}
As you can see, I have allocated only enough memory for a single object of type struct arguments, and I am still error-checking the allocation.
The problem is that I have many pointers like this, each pointing to space for a single object, and error-checking every one of them bloats the code and hurts readability.
Is it "good practice" / "is it OK" to skip the error check just because I'm not allocating much memory? (I heard something about paging: I think the system has to find many pages when I request a large amount of memory, so the chance of failure would be higher in that case, but not for a small request of something like 64 bytes.)
One option is a wrapper that takes the predefined identifier __func__, which resolves to a string containing the name of the calling function. For example, declare/define void *my_malloc(size_t size, const char *func); and then call it as arg = my_malloc(sizeof(struct arguments), __func__);.
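A minimal sketch of such a wrapper (the name my_malloc and the exit-on-failure policy are my assumptions; you could instead return NULL and let the caller decide):

```c
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Allocate memory, reporting the calling function's name on failure.
   Exiting here keeps every call site free of error-checking code. */
void *my_malloc(size_t size, const char *func)
{
    void *p = malloc(size);
    if (p == NULL)
    {
        fprintf(stderr, "error: %s: malloc - %s\n", func, strerror(errno));
        exit(EXIT_FAILURE);
    }
    return p;
}
```

With this in place, each allocation collapses to one line, e.g. arg = my_malloc(sizeof(struct arguments), __func__);, and the check lives in exactly one spot.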