random guy
Hi,

I'm writing a program which creates an index of text files. For each file it processes, the program records the start and end positions (as returned by tellg()) of sections of interest, and then some time later uses these positions to read the interesting sections from the file.

When reading the sections, I'm currently using get() to read characters from the file one by one and concatenating them to what has already been read. However, I guess this will be fairly inefficient if the text to extract is long.

Is there a more efficient way to do this, perhaps using an existing library function? I'd imagine that this question has been asked before, but when googling for answers I could only find solutions for reading entire files; I can't do that because the files are too large to store in memory.

My code is below; any advice would be gratefully received!
#include <iostream>
#include <string>
#include <fstream>

std::string get_string(std::ifstream &in,
                       std::ifstream::pos_type start,
                       std::ifstream::pos_type end) {
    in.seekg(start);
    std::string s;
    while (in.tellg() != end) {
        s += in.get(); // Not very efficient?
    }
    return s;
}

int main(void) {
    std::ifstream in("test_file", std::ios_base::binary);
    // Hard-coded positions below; these would normally be returned from tellg()
    std::cout << "\"" << get_string(in, 10, 19) << "\"" << std::endl;
    return 0;
}