toton
Hi,
I am reading some large text files and parsing them; a typical file size is 3 MB. Just running std::getline over the whole file (I need newlines handled properly) takes around 20 seconds in a debug build and 8 seconds with optimization on.
This is with Visual Studio 7.1 and its standard library, whereas vim opens the same file in a fraction of a second.
So, is getline reading the file line by line instead of reading a chunk at a time into its internal buffer? Is there any function to set how much the stream reads internally?
I am not very comfortable using read or readsome to load a large buffer myself, because that changes the file position. I need the visible file position to be where I actually am in the file, while "internally" the stream should read ahead some more, maybe a 1 MB chunk at a time. Is that possible?
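For reference, this is roughly the kind of thing I had in mind with pubsetbuf, though I don't know whether VC++ 7.1's filebuf actually honors it (the effect of pubsetbuf is implementation-defined, and the file name here is just an example):

    #include <fstream>
    #include <iostream>
    #include <string>
    #include <vector>

    int main()
    {
        // Buffer declared before the stream so it outlives it.
        std::vector<char> buf(1 << 20);   // 1 MB

        std::ifstream in;

        // Ask the underlying filebuf to use our larger buffer; this must
        // be done before open(), and whether it has any effect at all is
        // implementation-defined.
        in.rdbuf()->pubsetbuf(&buf[0],
                              static_cast<std::streamsize>(buf.size()));

        in.open("large.txt");             // hypothetical file name
        if (!in)
            return 1;

        // The visible position still advances line by line with getline,
        // even though the filebuf may read ahead in larger chunks.
        std::string line;
        std::size_t lines = 0;
        while (std::getline(in, line))
            ++lines;

        std::cout << lines << " lines\n";
        return 0;
    }
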
abir