Jon Spivey
ASP.NET 3.5 / SQL Server 2008
I've got a large (1.1m row) CSV file which needs parsing and inserting into SQL Server. The job needs doing every day, and if anything the CSV will get larger over time. At present I'm using a TextFieldParser to parse the CSV line by line and add each row to the database. This fails about two times in three; when it does fall over, it's usually at around 200,000 lines. I'm looking for suggestions on how to do this robustly on a shared server which doesn't allow bulk insert. It's fair to assume the server is a factor in the failures, but I can't upgrade just yet.
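
For reference, this is a simplified sketch of what I'm doing at the moment - the table name, column names and connection string here are just placeholders, not the real ones:

Imports Microsoft.VisualBasic.FileIO
Imports System.Data.SqlClient

Module CsvImport
    Sub ImportCsv(path As String, connString As String)
        ' Parse the CSV line by line and insert each row individually.
        Using parser As New TextFieldParser(path)
            parser.TextFieldType = FieldType.Delimited
            parser.SetDelimiters(",")
            Using conn As New SqlConnection(connString)
                conn.Open()
                While Not parser.EndOfData
                    Dim fields As String() = parser.ReadFields()
                    ' One INSERT per row - "MyTable", "Col1", "Col2" are placeholders.
                    Using cmd As New SqlCommand("INSERT INTO MyTable (Col1, Col2) VALUES (@c1, @c2)", conn)
                        cmd.Parameters.AddWithValue("@c1", fields(0))
                        cmd.Parameters.AddWithValue("@c2", fields(1))
                        cmd.ExecuteNonQuery()
                    End Using
                End While
            End Using
        End Using
    End Sub
End Module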
Would I be better off breaking the CSV into, say, 5 separate files and processing each one individually, or processing it in chunks, e.g.

If TextFieldParser.LineNumber < 200000 Then
    ' process first chunk
End If
If TextFieldParser.LineNumber >= 200000 AndAlso TextFieldParser.LineNumber < 400000 Then
    ' process next chunk
End If
etc.
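
To make the chunking idea concrete, something along these lines is what I had in mind - committing in batches rather than splitting the file. The 10,000 batch size and the InsertRow helper are just placeholders, nothing I've tested:

Imports Microsoft.VisualBasic.FileIO
Imports System.Data.SqlClient

Module ChunkedImport
    Sub ImportInBatches(path As String, connString As String)
        Const BatchSize As Integer = 10000   ' placeholder batch size
        Using parser As New TextFieldParser(path)
            parser.TextFieldType = FieldType.Delimited
            parser.SetDelimiters(",")
            Using conn As New SqlConnection(connString)
                conn.Open()
                While Not parser.EndOfData
                    ' Wrap each batch of rows in its own transaction.
                    Dim tran As SqlTransaction = conn.BeginTransaction()
                    Dim rowsInBatch As Integer = 0
                    While Not parser.EndOfData AndAlso rowsInBatch < BatchSize
                        Dim fields As String() = parser.ReadFields()
                        InsertRow(fields, conn, tran)
                        rowsInBatch += 1
                    End While
                    ' Commit per chunk so a failure only loses the current batch.
                    tran.Commit()
                End While
            End Using
        End Using
    End Sub

    Sub InsertRow(fields As String(), conn As SqlConnection, tran As SqlTransaction)
        ' Placeholder insert - real table and column names would go here.
        Using cmd As New SqlCommand("INSERT INTO MyTable (Col1, Col2) VALUES (@c1, @c2)", conn, tran)
            cmd.Parameters.AddWithValue("@c1", fields(0))
            cmd.Parameters.AddWithValue("@c2", fields(1))
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module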
Or something else entirely?
Cheers,
Jon