byte8bits
How does C++ safely open and read very large files? For example, say I
have 1GB of physical memory and I open a 4GB file and attempt to read
it like so:
#include <iostream>
#include <fstream>
#include <string>
using namespace std;

int main () {
    string line;
    ifstream myfile ("example.txt", ios::binary);
    if (myfile.is_open())
    {
        // Loop on getline() itself, so we stop on EOF or error.
        while (getline (myfile, line))
        {
            cout << line << endl;
        }
        myfile.close();
    }
    else cout << "Unable to open file";
    return 0;
}
In particular, what if a line in the file is larger than the amount of
available physical memory? What would happen? It seems getline() would
cause a crash. Is there a better way? Maybe... check the amount of free
memory, then use 10% or so of that amount for the read. So if 1GB of
memory is free, take 100MB for file IO. If only 10MB is free, then read
just 1MB at a time. Repeat until the file has been read completely. Is
something built into standard C++ to handle this? Or is there an
accepted way to do this?
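
Something like this is roughly what I have in mind, reading a
fixed-size chunk at a time instead of whole lines (the 1MB chunk size
here is just a placeholder; choosing it based on free memory is the
part I'm unsure about):

#include <iostream>
#include <fstream>
#include <vector>
using namespace std;

int main () {
    const size_t chunkSize = 1024 * 1024; // placeholder: 1MB per read
    vector<char> buffer (chunkSize);
    ifstream myfile ("example.txt", ios::binary);
    if (myfile.is_open())
    {
        while (myfile)
        {
            // read() fills up to chunkSize bytes; gcount() reports how
            // many were actually read (the last chunk may be short).
            myfile.read (&buffer[0], buffer.size());
            cout.write (&buffer[0], myfile.gcount());
        }
        myfile.close();
    }
    else cout << "Unable to open file";
    return 0;
}

This caps memory use regardless of line length, but it loses the
line-by-line structure, which is why I'm asking whether there is a
standard or accepted approach.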
Thanks,
Brad