Insert/update data to SQL Server DB

mrajanikrishna

Hi,

I am importing external data (Oracle) into a local SQL Server DB through a
linked server. This is an exact replica of the external data (not
normalized). I need to run this import every 2 minutes, and it's working
well.

Now I need to split this data into child tables to normalize it. For that,
I have created a C# class which gets all records (newly added/modified) and
passes the values to a stored procedure as parameters. The SP checks
whether the record exists: if it does, it updates; otherwise it inserts
into my local tables.
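
The pattern looks roughly like this (the procedure, table, and column
names here are simplified placeholders, not my real ones):

using System.Data;
using System.Data.SqlClient;

class RowByRowImport
{
    // One SP call per source row: the SP decides internally whether to
    // INSERT or UPDATE. With 1500 rows, that is 1500 network round trips.
    static void Import(DataTable sourceRows, string connStr)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("dbo.UpsertRecord", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            foreach (DataRow row in sourceRows.Rows)
            {
                cmd.Parameters.Clear();
                cmd.Parameters.AddWithValue("@Id", row["Id"]);
                cmd.Parameters.AddWithValue("@Name", row["Name"]);
                cmd.ExecuteNonQuery();   // one round trip per record
            }
        }
    }
}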

I made a DLL, and I need to run it every 2 minutes (this is my client's
requirement). If I have 1500 records, the import consumes 10 minutes
(establishing the connection, retrieving the data, and calling the SP).
That is too long; it's not acceptable.

How can I reduce the time? Is what I am doing the right way, or is some
other way possible?

Please suggest an idea.

Thanks in advance
 
Kevin Spencer

In essence, you're saying that it takes more time to process more records,
and that is a fact. If it takes more time than your client wants it to, I
hope your client is willing to spend some serious money, because the only
solution is to use multiple SQL Server instances on multiple machines and
distribute the load among them, using asynchronous operations to process
one group of records while another is being processed. With SQL Server
2005, you can use SQL Server Service Broker to synchronize the server farm.
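
Very roughly, fanning the work out looks something like this (the
connection strings and the dbo.ImportPartition proc are made up for
illustration):

using System.Data;
using System.Data.SqlClient;
using System.Threading;

class ScaleOutSketch
{
    // Hypothetical: each server owns half of the incoming records.
    static readonly string[] Servers =
    {
        "Data Source=server1;Initial Catalog=Import;Integrated Security=SSPI",
        "Data Source=server2;Initial Catalog=Import;Integrated Security=SSPI"
    };

    static void Main()
    {
        ManualResetEvent[] done = { new ManualResetEvent(false),
                                    new ManualResetEvent(false) };
        for (int i = 0; i < Servers.Length; i++)
        {
            int partition = i; // capture the loop variable for the closure
            ThreadPool.QueueUserWorkItem(delegate
            {
                Process(Servers[partition], partition);
                done[partition].Set();
            });
        }
        WaitHandle.WaitAll(done); // both halves run concurrently
    }

    // Each worker asks its server to import one partition of the data.
    static void Process(string connStr, int partition)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("dbo.ImportPartition", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Partition", partition);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}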

--
HTH,

Kevin Spencer
Chicken Salad Surgeon
Microsoft MVP
 
sloan

//
For that, I have created a C# class which gets all records (newly
added/modified) and passes the values to a stored procedure as parameters.
The SP checks whether the record exists: if it does, it updates; otherwise
it inserts into my local tables.
//

An approach I take goes something like this:

First, I use XML (a strongly typed dataset, actually) to carry a batch of X
records. Let's say 100, but I make this number configurable. I use a
DataReader on the source data, loop over the source records, and add them
to the strong dataset one by one. At 100 (dataset) records, I ship that XML
off to a SQL Server stored procedure. At that end, I use OPENXML to either
insert or update the records.
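
A minimal sketch of that loop, using a plain DataTable in place of the
strong dataset; the table, column, and proc names (dbo.StagingOrders,
dbo.UpsertOrdersXml) are made up:

using System.Data;
using System.Data.SqlClient;

class BatchShipper
{
    const int BatchSize = 100; // make this configurable

    static void ShipInBatches(string sourceConnStr, string destConnStr)
    {
        DataTable batch = new DataTable("Order"); // row element name in the XML
        batch.Columns.Add("Id", typeof(int));
        batch.Columns.Add("Name", typeof(string));

        using (SqlConnection src = new SqlConnection(sourceConnStr))
        using (SqlCommand cmd = new SqlCommand(
                   "SELECT Id, Name FROM dbo.StagingOrders", src))
        {
            src.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    batch.Rows.Add(rdr.GetInt32(0), rdr.GetString(1));
                    if (batch.Rows.Count == BatchSize)
                    {
                        SendBatch(batch, destConnStr);
                        batch.Clear();
                    }
                }
            }
        }
        if (batch.Rows.Count > 0)
            SendBatch(batch, destConnStr); // don't lose the final partial batch
    }

    static void SendBatch(DataTable batch, string destConnStr)
    {
        // GetXml() emits element-centric XML: <NewDataSet><Order><Id>...
        DataSet ds = new DataSet("NewDataSet");
        ds.Tables.Add(batch.Copy());
        using (SqlConnection conn = new SqlConnection(destConnStr))
        using (SqlCommand cmd = new SqlCommand("dbo.UpsertOrdersXml", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@xml", SqlDbType.NText).Value = ds.GetXml();
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}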


Doing a db call one by one, per record... I think that is a killer.

I'll give you some links.

http://www.sqlservercentral.com/articles/Stored+Procedures/thezerotonparameterproblem/2283/
Don't concentrate on the "variable report variables" part. Notice how I am
passing XML into a stored procedure.
Once you have the XML data in a #temp or @variable table, you can do
anything you want with it.

Write one stored procedure that handles both inserts and updates. You
basically do a match on the primary key (or perhaps on a unique constraint
on the "name" of the entity; for dbo.Dept, you could check
dbo.Dept.DepartmentName, if you have a unique constraint on DepartmentName).
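
Here is roughly what that looks like, run as an ad-hoc batch from C# for
brevity (in practice you would wrap the SQL in that one stored procedure).
The dbo.Orders table and its columns are invented; OPENXML flag 2 reads
the element-centric XML that DataSet.GetXml() produces:

using System.Data;
using System.Data.SqlClient;

class XmlUpsert
{
    const string UpsertBatchSql = @"
        DECLARE @h int;
        EXEC sp_xml_preparedocument @h OUTPUT, @xml;

        -- Shred the batch into a temp table; from here you can do anything.
        SELECT Id, Name
        INTO   #incoming
        FROM   OPENXML(@h, '/NewDataSet/Order', 2)
               WITH (Id int, Name varchar(100));

        EXEC sp_xml_removedocument @h;

        -- Update the rows that already exist (matching on the primary key)...
        UPDATE t
           SET t.Name = i.Name
          FROM dbo.Orders t
          JOIN #incoming i ON i.Id = t.Id;

        -- ...then insert the ones that don't.
        INSERT INTO dbo.Orders (Id, Name)
        SELECT i.Id, i.Name
          FROM #incoming i
         WHERE NOT EXISTS (SELECT 1 FROM dbo.Orders t WHERE t.Id = i.Id);";

    static void UpsertBatch(string connStr, string batchXml)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(UpsertBatchSql, conn))
        {
            cmd.Parameters.Add("@xml", SqlDbType.NText).Value = batchXml;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}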

You can write DTS packages as well, but when I need to throw some business
logic into the equation, I find the above approach useful.
bcp is another option.

Basically, the approach above is based on an MS KB article I read a long
time ago, and I have used it over the years with good performance and
success.

...

Keep this in mind: when I update/insert 1000 records in a single statement,
the statement-level overhead (the round trip, plan lookup, log flush, and
index housekeeping) is paid once for the whole batch.
Your approach pays it again for every single record. Ouch!

Here is a Google search that'll help you find some clues as well.
http://www.google.com/search?source=ig&hl=en&rlz=&q=OPENXML+"Insert+into"+site:microsoft.com


I taught a guy who had a boat how to do this. He left an old laptop on the
boat and only had port 80 open.
We wrote a mini web service that took a strong dataset, and he was sending
over 5000 records, over the internet, over port 80, updating a db at his
provider across the country.
He was stunned by how fast it ended up being. His laptop was basically
adding records to a strong dataset and sending them over every hour. He was
monitoring things like "Battery Level" and "WaterHeight" (at the bottom of
the boat) via sensors and USB devices. It was a nice little setup.

That was a simple situation, with just log records, but it worked.


Your situation may be a little different if you have two tables and FK
relationships (you would insert the parent rows first), but you can do it.


Not directly related, but there are tidbits here and there on my blog about
strong datasets and layered applications as well.
http://sholliday.spaces.live.com/blog/cns!A68482B9628A842A!140.entry


Here are some other links to posts I've made in the past:
http://groups.google.com/groups/search?q=OPENXML+sloan&start=0&scoring=d&hl=en&
 
Rad [Visual C# MVP]

mrajanikrishna wrote:

> I am importing external data (Oracle) into a local SQL Server DB
> through a linked server... If I have 1500 records, this import
> consumes 10 minutes (establishing the connection, retrieving the
> data, and calling the SP). That is too long; it's not acceptable.

10 minutes? Something is very wrong.

Have you tried an alternative design? What version of SQL Server are you
using? If it's SQL Server 2000 or even 7, take a look at DTS. You can pump
data to and fro at high speed and manipulate it using VBScript.

If you're using SQL Server 2005, SQL Server Integration Services is even
better... you can write some pretty advanced logic to do this sort of
thing.
 
