liming
Hi all,
I have to parse two text files on a weekly basis. Each ranges from 300 KB to
1 MB in size. Each text file has 5 columns (name, id, dollar, startdate,
enddate). Every time:
1) I need to parse each row and extract each column;
2) check whether the data already exists in the database between startdate
and enddate;
3) if not, insert it into the database; otherwise, update the record with
the new data (rough sketch of this row-by-row version right after this list).
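For context, the row-by-row approach looks roughly like this; a minimal
sketch, where the connection string, table name, and column names are all
placeholders:

using System.Data.SqlClient;
using System.IO;

class RowByRowImport
{
    static void Main()
    {
        using (var conn = new SqlConnection("...")) // placeholder connection string
        {
            conn.Open();
            foreach (var line in File.ReadLines("weekly.txt"))
            {
                // Columns: name, id, dollar, startdate, enddate
                var cols = line.Split(',');

                // Round trip #1: does a record already exist for this id/date range?
                using (var check = new SqlCommand(
                    "SELECT COUNT(*) FROM MyTable WHERE id = @id " +
                    "AND startdate = @start AND enddate = @end", conn))
                {
                    check.Parameters.AddWithValue("@id", cols[1]);
                    check.Parameters.AddWithValue("@start", cols[3]);
                    check.Parameters.AddWithValue("@end", cols[4]);
                    bool exists = (int)check.ExecuteScalar() > 0;

                    // Round trip #2: INSERT or UPDATE depending on `exists`
                    // (omitted here). Two round trips per row adds up fast.
                }
            }
        }
    }
}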
As you can imagine, at around 1 MB per file, doing this row by row is
neither fast nor efficient, since it hits the database so many times
(around 8,000-10,000 rows).
I'm wondering what a faster, more efficient way to do this would be. I'm
thinking of a solution, but would love to get some input. Read everything
into a DataTable first and then modify the RowState? SqlBulkCopy? Is there
a way to parse a large text file into memory fast?
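Concretely, what I'm imagining is something like this: parse the whole file
into a DataTable in memory, bulk-copy it into a staging table with
SqlBulkCopy, and then do the insert-or-update in one set-based statement. A
minimal sketch, assuming SQL Server 2008+ (for MERGE) and with made-up table
names (MyTable, MyTable_Staging, which I'd have to create and truncate
beforehand):

using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkImportSketch
{
    static void Main()
    {
        // Parse the whole file into a DataTable first (all columns as strings).
        var table = new DataTable();
        table.Columns.Add("name");
        table.Columns.Add("id");
        table.Columns.Add("dollar");
        table.Columns.Add("startdate");
        table.Columns.Add("enddate");
        foreach (var line in File.ReadLines("weekly.txt"))
            table.Rows.Add(line.Split(','));

        using (var conn = new SqlConnection("...")) // placeholder connection string
        {
            conn.Open();

            // One bulk copy into a staging table instead of thousands of round trips.
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "MyTable_Staging" })
                bulk.WriteToServer(table);

            // One MERGE to insert new rows and update existing ones.
            using (var merge = new SqlCommand(@"
                MERGE MyTable AS target
                USING MyTable_Staging AS source
                ON target.id = source.id
                   AND target.startdate = source.startdate
                   AND target.enddate = source.enddate
                WHEN MATCHED THEN
                    UPDATE SET name = source.name, dollar = source.dollar
                WHEN NOT MATCHED THEN
                    INSERT (name, id, dollar, startdate, enddate)
                    VALUES (source.name, source.id, source.dollar,
                            source.startdate, source.enddate);", conn))
                merge.ExecuteNonQuery();
        }
    }
}

That would be two round trips total instead of two per row, but I'm not sure
it's the right approach.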
Thanks. Any suggestion is greatly appreciated.