Mike Copeland
I have the following routine that determines the time required to
copy a large text file to another. I'm comparing this code to a class I
wrote (too large to post here) that does text file I/O with an explicit
buffer size (4096-character buffers). The code below runs 12-13 times
slower, although it does exactly the same work.
I tried to Google "fstream buffer", but came away confused, with little
understanding of what I'd need to change in the code below to use
explicit buffers. I'm hoping to get performance closer to my old
processing.
I'd like to use C++ strings and I/O streams if possible, but I wonder
whether the overhead of these facilities can match older C processing
(fgets, setbuf, etc.) at all. The following seems probable:
1. The overhead to size and allocate/release a string variable for each
logical record will always be slower than using an explicit
char[some_size] for the text being read from the input file.
2. The "<<" operators are probably much slower than "fputs".
Are my assumptions valid? Are there options or function calls I can
use to specify fstream buffers (and if so, how are they used)? Please advise.
TIA
//////////////////// Code ////////////////////
#include <fstream>
#include <iostream>
#include <string>
using namespace std;

string line;
fstream fVar1, fVar2;
fVar1.open("pat12.n00", fstream::in);
if(fVar1.is_open())
{
    fVar2.open("test.txt", fstream::out);  // was "fstream:ut" -- typo
    while(getline(fVar1, line))
    {
        fVar2 << line << endl;  // note: endl flushes the stream on every line
    }
    fVar1.close();
    fVar2.close();
}
else cout << "Unable to open file";
//////////////////////////////////////////////