Michael Linfield
Recently I've been trying to push a huge CSV into an array with code
that looks something like this:
require 'rubygems'
require 'fastercsv'

# Load the entire file into memory, then collect one column.
csvFile = FasterCSV.read('data.csv', :headers => true)
array = []
csvFile.each do |row|
  array << row['column_name']
end
The problem arises when the CSV file is some 2 million lines or more.
Normally I'd comment on how long it took, but I decided to call it
quits after 9 hours of waiting lol. Any ideas on how to handle columns
in CSV files the same way FasterCSV does?
(And yes, in theory I could split the 80 MB CSV into 20 4 MB files, but
what's the accomplishment in that!)
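
For what it's worth, the one thing I haven't tried yet is streaming the
file a row at a time instead of slurping the whole thing with
FasterCSV.read. A minimal sketch of what I mean, assuming
FasterCSV.foreach accepts the same :headers option that read does:

require 'rubygems'
require 'fastercsv'

# Stream one row at a time so only the current row (plus the collected
# column values) is held in memory, not the whole 2-million-line table.
array = []
FasterCSV.foreach('data.csv', :headers => true) do |row|
  array << row['column_name']
end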
Thanks,
- Mac