I need to read a text file that may contain very long lines of text, and I am trying to decide on the best way to do it. For efficiency, even though I am working in C++, I am inclined to use the C library functions for the I/O.
Because I don't know how long a line might be - potentially really, really long - I don't want to allocate a huge array and then use `fgets` to read a whole line at once. On the other hand, I do need to know where each line ends; one use case is counting the words/characters in each line. I could allocate a small buffer, read into it with `fgets`, and then check whether `\r`, `\n`, or `\r\n` appears in the buffer to tell whether a complete line has been read. But that means a lot of `strstr` calls (for `\r\n` - or is there a better way, for example using the return value of `fgets`?). A sketch of this approach is shown below. I could also call `fgetc` to read one character at a time, but does that function do any buffering?
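To make the chunked-`fgets` idea concrete, here is a minimal sketch under a couple of assumptions: the file name `input.txt` is just an example, the file is opened in text mode (so `\r\n` is translated to `\n` on platforms where that matters), and a bare `\r` line ending is not handled. Rather than calling `strstr`, it only looks at the last character `fgets` stored, since `fgets` stops reading right after a newline.

```cpp
// Sketch: read fixed-size chunks with fgets and detect end of line from the
// last stored character instead of scanning with strstr.
#include <cstdio>
#include <cstring>

int main()
{
    std::FILE *fp = std::fopen("input.txt", "r");   // example file name
    if (!fp)
        return 1;

    char buf[256];                 // small, fixed-size chunk
    long lines = 0;
    long chars_in_line = 0;

    while (std::fgets(buf, sizeof buf, fp) != nullptr)
    {
        std::size_t len = std::strlen(buf);
        bool end_of_line = (len > 0 && buf[len - 1] == '\n');

        chars_in_line += static_cast<long>(len) - (end_of_line ? 1 : 0);

        if (end_of_line)
        {
            ++lines;
            std::printf("line %ld: %ld chars\n", lines, chars_in_line);
            chars_in_line = 0;
        }
        // Otherwise the line continues into the next chunk.
    }

    if (chars_in_line > 0)         // last line had no trailing newline
    {
        ++lines;
        std::printf("line %ld: %ld chars\n", lines, chars_in_line);
    }

    std::fclose(fp);
    return 0;
}
```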
Please compare these approaches, or suggest other ways of doing this task.
`std::string` and `std::getline` - why not? Profile before you claim it's too slow.

`fgets()` will not treat `\r` as an end of line under normal circumstances. Look at POSIX 2008 and `getline()`, but beware of the portability implications of using it. (OTOH, it is not dreadfully hard to provide your own implementation if need be.) Handling every possible line ending is trickier - even POSIX `getline()` only deals with a single delimiter character (as does `getdelim()` on the same page). `fgetc()` does have buffering, as do `getc()` and `getchar()`; most of the input functions are described as behaving "as if by calling `getc()`".
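For comparison, here is a hedged sketch of the `std::string`/`std::getline` approach suggested above. The string grows to whatever length the line turns out to be, so no guess about a maximum line length is needed; the file name and the word-counting logic are only illustrative. `ifstream` is buffered too, so profile before assuming it is slower than the C stdio version.

```cpp
// Sketch: std::getline into a std::string handles arbitrarily long lines
// without a fixed-size buffer.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main()
{
    std::ifstream in("input.txt");          // example file name
    if (!in)
        return 1;

    std::string line;
    long line_no = 0;

    while (std::getline(in, line))          // strips the trailing '\n'
    {
        ++line_no;
        if (!line.empty() && line.back() == '\r')
            line.pop_back();                // tolerate Windows "\r\n" endings

        // Count whitespace-separated words with an istringstream.
        std::istringstream iss(line);
        std::string word;
        long words = 0;
        while (iss >> word)
            ++words;

        std::cout << "line " << line_no << ": "
                  << line.size() << " chars, " << words << " words\n";
    }
    return 0;
}
```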