App Architecture: Typed Datasets, EntLib, Performance

Karch

I have a web application that I have partitioned into 4 assemblies: web
application (presentation), business entities (typed datasets), business
objects (business logic), and data access (using DAAB). The two areas where
I have read conflicting information are the use of typed datasets and data
access using the Enterprise Library.

Typed datasets:

There seems to be agreement that the use of typed datasets is "good".
However, there is some disagreement about _where_ they should be filled and
used. Some say that they should not be filled by the data access layer, but
rather should exist only in the business layer, which would merge the
untyped dataset returned from the data layer and pass the typed dataset to
the web application.

1. Is it more efficient to use untyped datasets in the data access layer and
merge the results into a typed dataset at the business layer level? (A rough
sketch of what I mean is below.)
2. What are the recommendations or guidelines for using typed datasets in
this type of architecture?

Data access:

I see the value in using DAAB if I want to be completely provider agnostic,
but what if I know that I will always be using SQL Server?

1. Am I better off in this case, from a performance perspective, to just use
data adapters and sqlclient?
2. Am I just buying myself fewer lines of code at the expense of performance
by using DAAB?
3. In cases where I expect only one row from a stored procedure, should I be
using out parameters in my stored procedure and assigning the values to a
dataset or should I just fill the dataset with a recordset?
 
Cowboy (Gregory A. Beamer)

Before getting too deep into this, I have one comment. DO NOT code for
performance alone.

You can write an application that blows the socks off others if you move as
much of your application to C code and put a service layer on top for the
.NET bits. In the process, you sacrifice maintainability and extensibility,
and possibly scalability as well. But, damn, it performs.

To your question, efficiency is not the only thing you should be concerned
with. It is possible you will save a few cycles using untyped DataSets on
the back end. I am not convinced this is the case, but it is certainly
possible. In the process, however, you opt for two disparate systems with
different sets of rules. This leads to a split team, where you cannot easily
move people between the middle tier and the back end. If this is how your
organization is set up, it is not an issue.

To your questions (INLINE)

Karch said:
I have a web application that I have partitioned into 4 assemblies: web
application (presentation), business entities (typed datasets), business
objects (business logic), and data access (using DAAB). The two areas where
I have read conflicting information are the use of typed datasets and data
access using the Enterprise Library.

Typed datasets:

There seems to be agreement that the use of typed datasets is "good".
However, there is some disagreement about _where_ they should be filled
and used. Some say that they should not be filled by the data access
layer, but rather should exist only in the business layer, which would
merge the untyped dataset returned from the data layer and pass the typed
dataset to the web application.

1. Is it more efficient to use untyped datasets in the data access layer
and merge the results into a typed dataset at the business layer level?

Not sure.
2. What are the recommendations or guidelines for using typed datasets in
this type of architecture?

I would recommend choosing your architecture, objects or datasets, up front
and sticking with the same architecture throughout. If you find that you
need more cycles due to a perf problem, you can then adjust the application
to try other options like the one you suggest.

Overall, I am more fond of true business objects, but I use code generators
to avoid having to type a lot (lazy programmers rule ;->). I am currently
using CodeSmith with .NET Tiers, but I actually prefer LLBLGen Pro, as it is
much easier to define complex relationships. Both create objects that
perform nicely.

A word of caution: If you go with an O/R mapping layer, as I have, be careful,
as there are some really crappy open source implementations out there (there
are some good ones, but I ran through quite a few before deciding I was not
really "getting something for free" and purchased a commercial O/R Mapper).
As an example, because it is on the top of my head, there is a project called
Typed Data Objects that does not use the same name for its dynamic
parameters. The negative here is you cannot reuse query plans. I am sure
some of the other projects out there suffer from similar issues, so do your
research if you go the O/R route.
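
To illustrate that point: SQL Server matches cached plans on the exact
command text, so keeping the command text and parameter names identical from
call to call is what lets the plan be reused. A rough SqlClient sketch (the
table, column, and parameter names are made up):

using System.Data;
using System.Data.SqlClient;

public static class PlanFriendlyQueries
{
    // Same command text and parameter name on every call, so the cached
    // query plan is reused. A mapper that generates a new parameter name
    // per call (e.g. @p_<guid>) forces a fresh compilation each time.
    public static DataTable GetCustomerById(string connString, int customerId)
    {
        const string sql =
            "SELECT CustomerId, Name FROM Customers WHERE CustomerId = @CustomerId";

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlDataAdapter da = new SqlDataAdapter(sql, conn))
        {
            da.SelectCommand.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;
            DataTable table = new DataTable("Customers");
            da.Fill(table);
            return table;
        }
    }
}
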
Data access:

I see the value in using DAAB if I want to be completely provider
agnostic, but what if I know that I will always be using SQL Server?

1. Am I better off in this case, from a performance perspective, to just
use data adapters and sqlclient?

The DAAB (and Enterprise Library) offer more than simple data access. For
example, you have someone else responsible for the data layer code, which is
a nice thing. Less for you to maintain.

It does add a bit of weight, but I am not sure it is significant enough to
make a decision on the weight alone. I would aim for the EntLib instead of
just the DAAB, as you get some other bits and it is a better model than the
DAAB model.

Sticking with standard SqlClient, however? Sure, if you need to roll your own
data layer, it is an option.
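
Roughly the same call both ways, for comparison. The stored procedure and
parameter are made up, and the EntLib snippet assumes the Database/
DatabaseFactory model with a default connection string in config:

using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using Microsoft.Practices.EnterpriseLibrary.Data;

public static class CustomerDataAccess
{
    // EntLib Data Access block: provider and connection string come from
    // configuration, and parameter plumbing is handled for you.
    public static DataSet GetCustomersEntLib(string region)
    {
        Database db = DatabaseFactory.CreateDatabase();
        DbCommand cmd = db.GetStoredProcCommand("GetCustomers");
        db.AddInParameter(cmd, "@Region", DbType.String, region);
        return db.ExecuteDataSet(cmd);
    }

    // Raw SqlClient equivalent: a few more lines, tied to SQL Server,
    // but nothing between you and the provider.
    public static DataSet GetCustomersSqlClient(string connString, string region)
    {
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("GetCustomers", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Region", SqlDbType.NVarChar, 50).Value = region;
            DataSet ds = new DataSet();
            new SqlDataAdapter(cmd).Fill(ds);
            return ds;
        }
    }
}
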
2. Am I just buying myself fewer lines of code at the expense of
performance by using DAAB?

Possibly, but I would state that you should not focus solely on performance,
unless you have or foresee a performance problem. Too many people are
focused solely on performance and forget other aspects. Using a third party
library (like EntLib or DAAB) saves you time to focus on other things. If
you find out, during modeling, that you have a perf problem, you can switch
out to a layer that mimics the routines of the DAAB but is lighter in perf
weight. In fact, if you completely abstract your data access layer, you can
switch to whatever you want.
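
One sketch of that kind of abstraction, with made-up names; the point is only
that callers never see which library sits underneath:

using System.Data;

// Hypothetical abstraction: the business layer codes against this interface,
// so an EntLib-backed implementation and a hand-rolled SqlClient one are
// interchangeable without touching the callers.
public interface ICustomerData
{
    DataSet GetCustomers(string region);
}

public class CustomerService
{
    private readonly ICustomerData _data;

    // The concrete data access class (EntLib, raw SqlClient, whatever)
    // is handed in rather than hard-coded here.
    public CustomerService(ICustomerData data)
    {
        _data = data;
    }

    public DataSet GetActiveCustomers(string region)
    {
        DataSet ds = _data.GetCustomers(region);
        // ...apply business rules here...
        return ds;
    }
}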

I do not, however, think the DAAB is going to be the biggest issue in your
list of potential perf issues.
3. In cases where I expect only one row from a stored procedure, should I
be using out parameters in my stored procedure and assigning the values to
a dataset or should I just fill the dataset with a recordset?

If you are getting back a single value, out parameters might gain you
something. If you are grabbing a full row, I am not sure you are going to
buy anything for the complexity you are adding.
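
For what it is worth, the two options look roughly like this with SqlClient;
the procedure names, parameters, and types are placeholders:

using System.Data;
using System.Data.SqlClient;

public static class OrderData
{
    // Option A: output parameters - no result set, just scalar values back.
    public static void GetOrderHeader(string connString, int orderId,
                                      out string customerName, out decimal total)
    {
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("GetOrderHeader", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;
            cmd.Parameters.Add("@CustomerName", SqlDbType.NVarChar, 100)
               .Direction = ParameterDirection.Output;
            cmd.Parameters.Add("@Total", SqlDbType.Money)
               .Direction = ParameterDirection.Output;

            conn.Open();
            cmd.ExecuteNonQuery();

            customerName = (string)cmd.Parameters["@CustomerName"].Value;
            total = (decimal)cmd.Parameters["@Total"].Value;
        }
    }

    // Option B: the procedure SELECTs the single row and the adapter fills a table.
    public static DataTable GetOrderHeaderRow(string connString, int orderId)
    {
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("GetOrderHeaderRow", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;

            DataTable table = new DataTable("OrderHeader");
            new SqlDataAdapter(cmd).Fill(table);
            return table;
        }
    }
}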

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

*************************************************
Think outside of the box!
*************************************************
 
Karch

Thanks for the reply. Yes, I understand all these issues and that there are
other factors when making these decisions, but in order to determine the
tradeoffs, I would like to know what the impact on performance is. Clearly,
I am implementing the architecture for the reasons you mention. I just need
to know the degree to which it is providing benefit vs. just doing it for
the sake of doing it. I'd like to hear what others have done, specifically
where performance was an issue, because at some point it will be for me.
 
sloan

See
http://sholliday.spaces.live.com/
In my 5/24/2006 blog entry, I reference this MS article, which is worth
reading from start to finish for a bird's-eye view:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnbda/html/BOAGag.asp

Find the heading "Deploying Business Entities" in the MS article and read its
three points, several times each. It takes a few reads to sink in.

Then I'd get the code from my blog, the 5/24/2006 entry.
I've matched the MS article and have typed datasets as the common library
living outside of the tiers.


The enterprise blocks are barely going to affect performance.
The only place they hurt, just a tad, is the reflection used to instantiate
the concrete classes.
However, compared to the overhead of db communication, reflection is about 3
drops in a gallon of water.


Cowboy is right on; performance is not the be-all and end-all.
You can get great performance with application blocks.


You might read this article also:
http://www.codeproject.com/gen/design/DudeWheresMyBusinessLogic.asp


Finally: The MS article talks about the pros/cons of (typed) datasets vs
custom objects (business entities).
I choose custom objects most of the time.
For reports, I use strongly typed datasets.

That's my rule of thumb.
 
Karch

Thanks! This is exactly what I was looking for.


 
Cowboy (Gregory A. Beamer)

The app blocks have some overhead. As they cache parameters, they also
relieve some overhead with subsequent calls. They also give a certain amount
of safety (blow up on loading params rather than in an expensive trip to the
database, for example) which can help performance.

Are they the number one perf option? Not only no, but hell no. If you want
pure perf, switch your data layer to streaming output (DataReader) and blast
data through your layers. You then apply the opposite approach to inserting
and use larger chunks, like XML or updategrams, when you fire more than one
record back. Boom, you have perf, but you have to really trust your
developers at this point, as you have removed all safety nets. Next year,
when you hire Joe Junior Prog, you risk a meltdown. :)
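
A rough sketch of what the streaming side looks like; the names are made up,
and CommandBehavior.CloseConnection is what lets the open reader travel up
through the layers safely:

using System.Data;
using System.Data.SqlClient;

public static class CustomerStreamingData
{
    // Returns an open reader; CommandBehavior.CloseConnection ties the
    // connection's lifetime to the reader, so the layer above only has to
    // dispose the reader when it is done streaming.
    public static IDataReader GetCustomersReader(string connString, string region)
    {
        SqlConnection conn = new SqlConnection(connString);
        SqlCommand cmd = new SqlCommand("GetCustomers", conn);
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@Region", SqlDbType.NVarChar, 50).Value = region;

        conn.Open();
        return cmd.ExecuteReader(CommandBehavior.CloseConnection);
    }
}

// Caller (binding a grid in the page, or shaping objects in the middle tier):
// using (IDataReader reader = CustomerStreamingData.GetCustomersReader(connString, "West"))
// {
//     while (reader.Read()) { /* stream each row straight through */ }
// }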

Overall, typed datasets add a perf gain in some respects, as you get type
checking on the dataset. If you have carefully ensured types elsewhere, such
as when pulling from the database, the gain may be small, but strongly typed
datasets give you better perf on the client side. The trade-off is that you
have to migrate from a standard DataSet to a strongly typed one, which costs
something.

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

*************************************************
Think outside of the box!
*************************************************
 
