Monday, February 16, 2009

High Performance Data Access Layer Architecture Part 1

Why write data access code when we have ORMs?

Remember data access patterns?  It seems like there is a huge focus these days placed on LINQ, the ADO.NET Entity Framework, and other ORM-like tools that are intended to make programmers more efficient by doing all of that time-consuming data access for you.  There definitely are productivity improvements to be had using these tools, but how does the resulting application perform? In most cases, where we have a standard line-of-business application whose primary concerns are functionality and workflow, the answer is "good enough".  These applications are a great fit for tools like this. However, if you have a high-volume application where performance is the primary concern, these tools may not be the right choice.  Both LINQ and the ADO.NET Entity Framework are significantly slower than well-written ADO.NET code.  According to a post on the ADO.NET Team Blog titled ADO.NET Entity Framework Performance Comparison, the ADO.NET Entity Framework can be 50%-300% slower than ADO.NET using ordinals and SqlDataReaders.

So, my opinion is that ORMs are great for most applications, but when performance is a factor it’s best to roll your own DAL.  This post will demonstrate some of the patterns that I use for the Data Access Layer that allow for rapid development and lightning fast performance.

Use DTOs not DataSets or DataTables

First, what container are we going to use to pass data from our DAL to the other layers of our application?  The usual answers I get are either DataTables/DataSets or full business objects.  I don't like either of these. DataSets and DataTables come with significant overhead and they don't contain strongly typed data.  Business objects do contain strongly typed data, but they typically contain a lot of extra business logic that I don't need, and they may even contain persistence logic.  I really don't want any of that.  I want the lightest weight, simplest possible container that will give me strongly typed data, and that container is a Data Transfer Object (DTO). DTOs are simple classes that contain only properties.  They have no real methods, just mutators and accessors for their data. Below is a class diagram of a PersonDTO as well as the DTOBase and CommonBase classes that are in its inheritance chain. PersonDTO contains all of the data needed for a Person entity in my application.

  [Class diagram: PersonDTO inheriting from DTOBase, which inherits from CommonBase]

Here’s how I typically construct a DTO.  First, DTOs are designed to move between layers of the application.  So, they don’t belong in the DAL.  I put them in a separate project/assembly named “Common”.  Then I create a reference to Common in my DAL, BAL, WebUI, and in any other project in my application.

Now we can start creating classes in Common. The first class we need to create is CommonBase. CommonBase's only purpose is to contain static fields that define null values.  Our DTOs are going to contain both value type and reference type data, and since value types always have a value and are never null, this can make null checking a challenge in higher layers of the application. To further complicate things, some developers will use String.Empty or "" to represent a null value for a string. Others will use null (string is a reference type after all).  To avoid all this confusion, I like to define actual null values for each type in my Common assembly.  That way we have a predefined value that we can use for null checking and null setting throughout the application.  Here is the code for CommonBase.

    public class CommonBase
    {
        // Let's setup standard null values
        public static DateTime DateTime_NullValue = DateTime.MinValue;
        public static Guid Guid_NullValue = Guid.Empty;
        public static int Int_NullValue = int.MinValue;
        public static float Float_NullValue = float.MinValue;
        public static decimal Decimal_NullValue = decimal.MinValue;
        public static string String_NullValue = null;
    }
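
As a quick illustration (this snippet is not from the original post, and SendWelcomeEmail is a made-up method), a higher layer working with the PersonDTO shown later might use these shared definitions like this:

    // Because every DTO is initialized to these null values, consuming code
    // can test for "no value" the same way for every data type.
    if (person.Email != CommonBase.String_NullValue &&
        person.UtcCreated != CommonBase.DateTime_NullValue)
    {
        SendWelcomeEmail(person.Email);   // SendWelcomeEmail is hypothetical
    }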

The next class is DTOBase. This base class encapsulates any common functionality for my DTOs.  Right now, the only thing I’m putting in DTOBase is an IsNew flag that can be used to indicate if a DTO contains newly created data (as opposed to data that was pulled from the database).

    public abstract class DTOBase : CommonBase
    {
        public bool IsNew { get; set; }
    }
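
For illustration only (not code from the post), this is how the flag is meant to be read, using the PersonDTO defined next:

    PersonDTO person = new PersonDTO();
    // person.IsNew is true here because the PersonDTO constructor (shown below) sets it.
    // When the DAL fills a DTO from the database it can set IsNew = false, so higher
    // layers can tell brand-new data from persisted data.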

Now I can create my PersonDTO class.  PersonDTO is just a bunch of properties that represent the data for a person record, and a constructor that initializes each property to the null value for its type.

    public class PersonDTO : DTOBase
    {
        public Guid PersonGuid { get; set; }
        public int PersonId { get; set; }
        public DateTime UtcCreated { get; set; }
        public DateTime UtcModified { get; set; }
        public string Password { get; set; }
        public string Name { get; set; }
        public string Nickname { get; set; }
        public string PhoneMobile { get; set; }
        public string PhoneHome { get; set; }
        public string Email { get; set; }
        public string ImAddress { get; set; }
        public int ImType { get; set; }
        public int TimeZoneId { get; set; }
        public int LanguageId { get; set; }
        public string City { get; set; }
        public string State { get; set; }
        public int ZipCode { get; set; }

        // Constructor
        // No parameters and all types are initialized to their
        // null values as defined in CommonBase.
        public PersonDTO()
        {
            PersonGuid = Guid_NullValue;
            PersonId = Int_NullValue;
            UtcCreated = DateTime_NullValue;
            UtcModified = DateTime_NullValue;
            Password = String_NullValue;
            Name = String_NullValue;
            Nickname = String_NullValue;
            PhoneMobile = String_NullValue;
            PhoneHome = String_NullValue;
            Email = String_NullValue;
            ImAddress = String_NullValue;
            ImType = Int_NullValue;
            TimeZoneId = Int_NullValue;
            LanguageId = Int_NullValue;
            City = String_NullValue;
            State = String_NullValue;
            ZipCode = Int_NullValue;
            IsNew = true;
        }
    }

How should the DAL send data to other layers?

When you're building framework code, or plumbing as I like to call it, it's a good idea to stop periodically and really think about how the code you're writing is going to be used.  The most useful technique that I use is to stop and visualize what I want the consuming code to look like. We've already decided that we're using DTOs to contain data. Let's take a moment to think about how we want our BAL code to look and what functionality it will require from our DAL. In my BAL, I'm probably going to have a PersonRepository, and in that repository I'm going to have methods that will want to get individual PersonDTOs and generic lists of PersonDTOs from the DAL, and I will probably want a single DAL object that provides methods for getting those DTOs.  So I want to create a PersonDb class in my DAL that will allow me to write BAL code that looks like this:

    PersonDb db = new DAL.PersonDb();
    PersonDTO dto = db.GetPersonByPersonGuid(personGuid);
    dto = db.GetPersonByEmail(email);
    List<PersonDTO> people = db.GetPersonList();
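
Just for context, a BAL repository along the lines described above might end up wrapping those calls like this. This is my own illustration, not code from the post; the PersonRepository class and its method names are assumptions:

    public class PersonRepository
    {
        private DAL.PersonDb db = new DAL.PersonDb();

        // The repository just forwards to the DAL and hands the DTOs up
        // to whatever layer asked for them.
        public PersonDTO GetByEmail(string email)
        {
            return db.GetPersonByEmail(email);
        }

        public List<PersonDTO> GetAll()
        {
            return db.GetPersonList();
        }
    }

(The finished PersonDb later in this post exposes its methods as statics, so the repository could just as easily call PersonDb.GetPersonByEmail(email) without holding an instance.)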

With that target in mind I’m going to create a PersonDb class in my DAL.  PersonDb is going to provide methods that will either return a single PersonDTO or a List<PersonDTO>.

The DAL Architecture

We're going to have a DALBase that encapsulates all of our repeated logic for doing things like creating connections, T-SQL commands, sproc commands, and parameters.  The DALBase will also contain methods for getting our two main return types, DTO and List<DTO>, from an SqlDataReader. To act as the one-stop shop for all of our data access methods that get and set person data, we will create a PersonDb class.  PersonDb will inherit from DALBase and will contain all of our methods that return or save person data, like GetPersonByEmail(), GetPersonById(), and SavePerson().

We will also need to find a place to put the logic for reading our person data out of an open SqlDataReader and putting it into a PersonDTO.  This involves finding the ordinal for a data field, checking to see if it is null, and if it isn't, storing the data value in the DTO.  This is really parsing logic, so we'll put it in a separate DTOParser_Person class.  Right now we're only looking at the classes for PersonDTO, but we will need to have a different parser for each DTO type that the DAL can return (PersonDTO, CompanyDTO, UserDTO, etc.).  We'll use an abstract DTOParser class to define the interface for all DTOParsers and to encapsulate any repeated functionality. Lastly, we'll create a static DTOParserFactory class that will return an instance of the appropriate DTOParser for any DTO type that we pass into it. So if we need to parse a PersonDTO out of a reader we just call

DTOParser parser = DTOParserFactory.GetParser(typeof(PersonDTO))

and we’ll get an instance of the DTOParser_Person class.  Here’s what our DAL classes will look like.

[Class diagram: DALBase, PersonDb, DTOParser, DTOParser_Person, and DTOParserFactory]
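
As a rough sketch only (not the actual implementation, which comes in Part 2), the parser pieces described above could hang together something like this. The PopulateOrdinals and ParseData member names are my assumptions, and the usual System, System.Data, and System.Data.SqlClient usings are assumed:

    public abstract class DTOParser
    {
        // Look up column ordinals once per result set...
        public abstract void PopulateOrdinals(SqlDataReader reader);

        // ...then parse each row into a strongly typed DTO.
        public abstract DTOBase ParseData(SqlDataReader reader);
    }

    public static class DTOParserFactory
    {
        // Hands back the parser that knows how to read the requested DTO type.
        public static DTOParser GetParser(Type dtoType)
        {
            if (dtoType == typeof(PersonDTO)) { return new DTOParser_Person(); }
            // ... one case per DTO type (CompanyDTO, UserDTO, etc.)
            throw new ArgumentException("Unknown DTO type: " + dtoType.Name);
        }
    }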

PersonDb

Once again employing the principle of thinking first about how we want to consume our code, and then writing code to that target, we're going to write our PersonDb first, then we'll write our DALBase.  The PersonDb class will need to use DALBase methods for things like creating SqlCommand objects and getting a list of PersonDTOs. Once we see how we want to use these things in PersonDb, we'll have a better idea of how we want DALBase to work.

First let’s write the GetPersonByPersonGuid() method. Pulling data from a database, then populating a DTO with that data and returning it takes quite a bit of code. But if we think about it, most of that code is duplicated for each data access method that we write.  If we extract out only the things that change for each method we get the following list:

  • We’re going to use sprocs on the SQL Server side so the first thing we need to do is get an SqlCommand object for the named sproc.
  • Next we’ll need to add any parameters and set their values. 
  • The last thing we need to do is run the command and get back the desired return type (either a DTO or List<DTO>) populated with the data. 

These are the only things that really change.  What sproc we’re calling, what parameters we need to add, and what the return type is.  So we’re going to write DALBase helper methods that will enable us to do each one of these tasks with a single line of code.  The resulting GetPersonByPersonGuid() code will look like this:

    SqlCommand command = GetDbSprocCommand("Person_GetByPersonGuid");
    command.Parameters.Add(CreateParameter("@PersonGuid", PersonGuid));
    return GetSingleDTO<PersonDTO>(ref command);

If we need a GetPersonByEmail() method, we can use the above code with minor modifications.  The things that change are just the sproc name and the parameter.  The modified code looks like:

    SqlCommand command = GetDbSprocCommand("Person_GetByEmail");
    command.Parameters.Add(CreateParameter("@Email", email, 100));
    return GetSingleDTO<PersonDTO>(ref command);

Then if we need a GetAll() method that returns all person records, we can do that easily too.  This time the sproc name, the parameters (this time there aren’t any), and the return type all change.

    SqlCommand command = GetDbSprocCommand("Person_GetAll");
    return GetDTOList<PersonDTO>(ref command);

So with a few helper methods we can put together a simple and easy-to-maintain PersonDb class.  If you were watching closely, you noticed a couple of requirements for DALBase that emerged while writing the PersonDb code.  First, we want to use GetSingleDTO() and GetDTOList() methods, but we need to be able to tell them to return specific types of DTOs, like PersonDTO.  Therefore these will need to be generic methods that take the DTO as the type parameter, such as GetSingleDTO<PersonDTO>().

Second, we used the same CreateParameter() method to create a string parameter and a Guid parameter.  So we'll do a little method overloading and write a CreateParameter() overload for each type of parameter that we want to create.
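
To make those requirements concrete, here is a minimal sketch of what DALBase's helpers might look like. This is only my reading of the calls above, not the actual implementation (that's coming in the next post); the connection-string handling is an assumption, and the usual System, System.Data, System.Data.SqlClient, System.Configuration, and System.Collections.Generic usings are assumed.

    public abstract class DALBase
    {
        // Where the connection string comes from is an assumption.
        private static readonly string connectionString =
            ConfigurationManager.ConnectionStrings["Default"].ConnectionString;

        protected static SqlCommand GetDbSprocCommand(string sprocName)
        {
            SqlCommand command = new SqlCommand(sprocName, new SqlConnection(connectionString));
            command.CommandType = CommandType.StoredProcedure;
            return command;
        }

        // One overload per parameter type, so the null-value translation
        // (CommonBase null value -> DBNull) lives in exactly one place.
        protected static SqlParameter CreateParameter(string name, Guid value)
        {
            SqlParameter parameter = new SqlParameter(name, SqlDbType.UniqueIdentifier);
            parameter.Value = value.Equals(CommonBase.Guid_NullValue) ? (object)DBNull.Value : (object)value;
            return parameter;
        }

        protected static SqlParameter CreateParameter(string name, string value, int size)
        {
            SqlParameter parameter = new SqlParameter(name, SqlDbType.NVarChar, size);
            parameter.Value = (value == CommonBase.String_NullValue) ? (object)DBNull.Value : (object)value;
            return parameter;
        }

        // ... plus overloads for int, DateTime, etc., following the same pattern.

        protected static SqlParameter CreateOutputParameter(string name, SqlDbType type)
        {
            SqlParameter parameter = new SqlParameter(name, type);
            parameter.Direction = ParameterDirection.Output;
            return parameter;
        }

        protected static T GetSingleDTO<T>(ref SqlCommand command) where T : DTOBase
        {
            T dto = null;
            DTOParser parser = DTOParserFactory.GetParser(typeof(T));
            command.Connection.Open();
            using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SingleRow))
            {
                if (reader.Read())
                {
                    parser.PopulateOrdinals(reader);
                    dto = (T)parser.ParseData(reader);
                }
            }
            command.Connection.Close();
            return dto;
        }

        protected static List<T> GetDTOList<T>(ref SqlCommand command) where T : DTOBase
        {
            List<T> list = new List<T>();
            DTOParser parser = DTOParserFactory.GetParser(typeof(T));
            command.Connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                if (reader.Read())
                {
                    // Ordinals only need to be looked up once per result set.
                    parser.PopulateOrdinals(reader);
                    do { list.Add((T)parser.ParseData(reader)); } while (reader.Read());
                }
            }
            command.Connection.Close();
            return list;
        }
    }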

We'll get into the details next time when we finish up our DAL by coding up the DALBase, the DTOParser classes, and the DTOParserFactory.  BTW, the next post is when we'll get into the real performance-oriented code. For data access we'll use ordinals to pull data from the reader in the most efficient way possible, and then use the SqlDataReader's strongly typed Get methods to do a null check and write the data values to our DTO, all without casting the value to Object. For now, here's the full PersonDb class, complete with a SavePerson() method that takes a PersonDTO as its only parameter.

    public class PersonDb : DALBase
    {
        // GetPersonByPersonGuid
        public static PersonDTO GetPersonByPersonGuid(Guid PersonGuid)
        {
            SqlCommand command = GetDbSprocCommand("Person_GetByPersonGuid");
            command.Parameters.Add(CreateParameter("@PersonGuid", PersonGuid));
            return GetSingleDTO<PersonDTO>(ref command);
        }

        // GetPersonByEmail
        public static PersonDTO GetPersonByEmail(string email)
        {
            SqlCommand command = GetDbSprocCommand("Person_GetByEmail");
            command.Parameters.Add(CreateParameter("@Email", email, 100));
            return GetSingleDTO<PersonDTO>(ref command);
        }

        // GetAll
        public static List<PersonDTO> GetAll()
        {
            SqlCommand command = GetDbSprocCommand("Person_GetAll");
            return GetDTOList<PersonDTO>(ref command);
        }

        // SavePerson
        public static void SavePerson(ref PersonDTO person)
        {
            // The sproc will handle both inserts and updates.  We
            // just need to return the appropriate person guid.  If
            // this is a new person then we return the NewPersonGuid.
            // If this is an update we just return the PersonGuid.
            bool isNewRecord = false;
            if (person.PersonGuid.Equals(Common.DTOBase.Guid_NullValue)) { isNewRecord = true; }

            // Create the command and parameters. When creating parameters
            // we don't need to check for null values. The CreateParameter
            // method will handle that for us and will create null parameters
            // for any DTO members that match the DTOBase.NullValue for
            // that member's data type.
            SqlCommand command = GetDbSprocCommand("Person_Save");
            command.Parameters.Add(CreateParameter("@PersonGuid", person.PersonGuid));
            command.Parameters.Add(CreateParameter("@Password", person.Password, 20));
            command.Parameters.Add(CreateParameter("@Name", person.Name, 100));
            command.Parameters.Add(CreateParameter("@Nickname", person.Nickname, 50));
            command.Parameters.Add(CreateParameter("@PhoneMobile", person.PhoneMobile, 25));
            command.Parameters.Add(CreateParameter("@PhoneHome", person.PhoneHome, 25));
            command.Parameters.Add(CreateParameter("@Email", person.Email, 100));
            command.Parameters.Add(CreateParameter("@ImAddress", person.ImAddress, 50));
            command.Parameters.Add(CreateParameter("@ImType", person.ImType));
            command.Parameters.Add(CreateParameter("@TimeZoneId", person.TimeZoneId));
            command.Parameters.Add(CreateParameter("@LanguageId", person.LanguageId));
            SqlParameter paramIsDuplicateEmail = CreateOutputParameter("@IsDuplicateEmail", SqlDbType.Bit);
            command.Parameters.Add(paramIsDuplicateEmail);
            SqlParameter paramNewPersonGuid = CreateOutputParameter("@NewPersonGuid", SqlDbType.UniqueIdentifier);
            command.Parameters.Add(paramNewPersonGuid);

            // Run the command.
            command.Connection.Open();
            command.ExecuteNonQuery();
            command.Connection.Close();

            // Check for duplicate email.
            if ((bool)paramIsDuplicateEmail.Value) { throw new Common.Exceptions.DuplicateEmailException(); }

            // If this is a new record, let's set the Guid so the object
            // will have it.
            if (isNewRecord) { person.PersonGuid = (Guid)paramNewPersonGuid.Value; }
        }
    }
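
To give a flavor of the ordinal-based reading described above, here is a rough sketch of how a DTOParser_Person might fill a PersonDTO. This is my illustration only; the field list is abbreviated, the PopulateOrdinals/ParseData member names are assumptions, and the real parser code comes in the next post.

    public class DTOParser_Person : DTOParser
    {
        // Ordinals are looked up once per result set, not once per row.
        private int ordinalPersonGuid, ordinalPersonId, ordinalName, ordinalEmail;

        public override void PopulateOrdinals(SqlDataReader reader)
        {
            ordinalPersonGuid = reader.GetOrdinal("PersonGuid");
            ordinalPersonId = reader.GetOrdinal("PersonId");
            ordinalName = reader.GetOrdinal("Name");
            ordinalEmail = reader.GetOrdinal("Email");
            // ... remaining fields follow the same pattern.
        }

        public override DTOBase ParseData(SqlDataReader reader)
        {
            PersonDTO person = new PersonDTO();

            // The typed Get methods avoid casting through Object; the constructor
            // already set the CommonBase null values, so null columns can simply
            // be skipped.
            person.PersonGuid = reader.GetGuid(ordinalPersonGuid);
            person.PersonId = reader.GetInt32(ordinalPersonId);
            if (!reader.IsDBNull(ordinalName)) { person.Name = reader.GetString(ordinalName); }
            if (!reader.IsDBNull(ordinalEmail)) { person.Email = reader.GetString(ordinalEmail); }
            // ... remaining fields follow the same pattern.

            person.IsNew = false;
            return person;
        }
    }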

43 comments:

  1. Nice writeup. I completely agree with your approach; over the past 20 years I've done similar things with DAO patterns. But in my case it wasn't only, or even usually, primarily for performance reasons. DAOs, factories, and stored procedures are the basis for the loosest possible coupling between the app and the DBMS. When multiple apps are sharing common DBs or even tables, it provides more flexibility in application testing and the DB maintenance cycle, and fewer interactions between the DBA and developers (highly to be desired).

  2. Very nice. We've been using a similar approach for a while. We've also noticed that custom serialization for DTOs (especially for collections!) makes a huge difference when sending over the wire as binary data.

  3. All good, nice post. I fully agree with it and have been using a similar approach for the last 6 years or so.

    This sort of DB oriented development also means that coders have to understand the DB and how to interact with it correctly (access late, release early etc), which I feel is a prerequisite for "good" coders.

  4. Agreed, I think the best approach to data access is having an organized DAL so you have full control. I've followed a similar pattern found here http://www.c-sharpcorner.com/UploadFile/rmcochran/elegant_dal05212006130957PM/elegant_dal.aspx

  5. The DAL is important, but the light objects (what you call DTOs) can be hectic to manage with your hierarchy. Any derived object (for example, a full Customer object) would then have to derive from your DTO class, which can be problematic when applying this approach across the entire system, as some objects have to derive from elsewhere (say an EntityBase class or a base object of their own).

    Replies
    1. Mapping might eliminate that issue. In our distributed system, DTOs act as our communication between layers. Domain objects map to DTOs, which transmit data to and from our various presentation layers. Data received from those layers is then mapped back to our DAL. Various frameworks exist to assist with mapping code, such as AutoMapper, which can make this trivial and separate a number of concerns.

  6. DTOs to abstract out data from a DAL are great, and if you want real objects with behaviour, those objects can embed the DTOs and delegate getter/setter behaviour to the internal DTO.

    The danger with exposing DTOs all around the application is that you then have no control over what the programming clients are going to do with those DTOs. There is no place to centralise business logic or validation. For read-only lists etc., they are fine.

  7. Thank you all for the comments. Lev and Phillip, DTOs can be difficult to manage, but one design that I use to make it easier is to use the DTO as the data container in my full business objects. So in my BAL I have a Person object that contains validation, business logic, and a single property called "data" which is of type PersonDTO. So when my BAL needs to create a Person object, it gets a PersonDTO from the DAL and sets person.data = dto. I've used this design a few times and so far I really like it. It's very clean, with no accessors or mutators cluttering up my Person class, and I like the fact that I'm using composition instead of inheritance. The only time it really gets messy is when I have a business object whose data is a composite of multiple DTOs.

  8. "stored procedures are the basis for the loosest possible coupling between the app and the DBMS"

    How so? Stored procedures depend entirely on (and actually exist within) a DBMS.

    What happens when you want to switch from MS SQL Server to Oracle or to MySQL?

    If you have a BLL/DAL, your application logic only knows/cares about business objects, and the DAL can be (re)generated (LLBLGen, SqlMetal, etc.) as needed.

    That seems like a loose coupling between the DB and the app...

  9. I've always struggled with this argument - "What if I throw away all of our legacy data, systems, code, and who knows what else and change database systems? Then what? WHATCHA GONNA DO THEN? YOUR SYSTEM WON'T WORK FOR THAT NOW WILL IT!!! AH HA!"

    It's just not going to happen realistically. I've developed on three dbms platforms my whole 20 year career, actually four. MSSQL, IBM DB2, ORACLE and MYSQL.

    On DB2/ORACLE/MYSQL I've nine times out of ten used Java, but sometimes I've used .NET on MySQL and Oracle (never on DB2).

    Ten times out of ten, once the DB vendor has been chosen, a switch away from that vendor has never transpired, or if it has, it has affected so many systems and was so fundamentally different across the board that it didn't matter - all apps would need to be rewritten in some way or another, as everything is commingled or linked in one way or another these days.

    In my opinion, screw loose coupling with the database. Draw a line in the sand, pick your vendor for your solution, and then build the highest-performance application you can with the best mixture of maintainability and performance.

    Just my 2 cents.

    Replies
    1. I agree that switching database vendors is usually unrealistic in production. But separation of concerns (data access, data transfer, ...) and making code loosely coupled for testability are still good ideas. We just shouldn't abuse the idea with too much boilerplate copy/paste code. I prefer modules over layers. An app is a combination of small modules that each do only one thing. A module may need to access the database to fulfill its job, but it should be free to do that however it needs to, as long as it passes its black-box tests.

  10. I think I agree with Julian. It seems like we're adding more and more complexity to our applications these days in the name of "loose coupling". I think that some level of loose coupling is a good thing, especially between the UI and the BLL. I definitely want to be able to put a web app, a windows app, or a reporting app in front of my Business Layer and have them all work. However, I'm starting to think that the layers of complex object indirection and the prohibitions against using platform specific features in the DAL and BAL are not appropriate for the vast majority of applications. As far as concerns about being able to switch out the RDBMS go, I've never done it and I've never worked anywhere that's ever done it. Unless you're shipping your app to clients to run in their environment, I don't think this issue comes up much.

  11. Defining null values is needless. Use nullable types instead.

  12. There is a good article about using the command pattern for the data access layer. It is a long series of articles, but worth reading. See http://www.designpatternsfor.net/default.aspx?pid=79

    What I miss in most implementations, is the 'batched statements' for updates and in some scenarios for retrieving the data. If you have a complex entity, like PurchaseOrder which has PurchaseOrderItems, most implementations I have seen go multiple times to the database; there should be implementations that go only once to the database, and then call the mappers (data mapper pattern).

  13. Very interesting stuff! I am just getting my feet wet in ASP.net and I'm trying to figure out a good DAL to use. Unfortunately I don't really know C#, so would you see any problem with implementing this method in VB? Do you know anyone who has ported your examples to VB?

  14. In response to my post above, here is a cool site which will convert C# to VB.net: http://rlacovara.blogspot.com/2009/02/high-performance-data-access-layer.html

  15. Hi Jay, VB is not a problem. The language is really irrelevant. The design should work fine in C#, VB, or even Java. Really, despite the programmer snobbery, VB.Net and C# have virtually identical capabilities. I would strongly suggest learning C# though. Even though the languages have pretty much converged, there is a big difference in culture/background between a C# developer and a VB developer. Most code samples regarding real architecture questions are going to be in C# because C# developers are more likely to spend time thinking about architecture patterns.

  16. Can you please make the full source code for the article available for download?

  17. Hi Rudy,
    I am totally convinced by your approach. It gives me greater control over my code and enables loose coupling. Can you please send me a sample copy of the architecture you have implemented? I want to present a demo to my team here.

  18. Very nice! Finally someone takes a similar approach to the one I have independently taken! I like your approach because it is generic and easy to use. I also wanted to add that I totally agree with you that people who write frameworks should spend some time thinking about how their framework is going to be used, and whether that is pleasant or not. Recently I have written my own framework for several things I find common in my projects, and I build my DAL and BAL on top of that framework. Please read my post Writing Data Access Layer (DAL) and Business Access Layer (BAL) with CsharpGears Framework and tell me what you think of my framework. I have to mention it fully utilizes generics and flexibility; it can easily be switched to other data providers because full separation exists between the layers and the communication between the layers is standardized via common objects. I hope you like it; it may inspire you to improve your own framework.

  19. One more thing.. What service do you use to have the C# code so nicely formatted and highlighted inside Blogger?

  20. Rudy,

    Just wanted to stop by and say thanks. This is a very impressive article and helped me clarify a few things.
    I've been tearing my hair out for the last couple of weeks but now I realise that I was on the right track all along!

    Thanks again,
    Jason

  21. Rudy,
    Certainly an impressive article for those in pursuit of a lightweight DAL for their data access endeavours. I'm planning to use this approach in my recent project for a Mobile Recovery system on Windows Mobile 6, utilizing the ADO.NET CE and full-scale versions for the client and web service respectively. One thing I noticed: if I use the same approach for the various modules of a full-scale banking system, then the DTOs in Common will create problems. Since different programmers work on different modules and should be able to edit their own part of the DTOs and DB/persistence classes, what workaround would you suggest for this? Also, could the parser logic be merged into the DB classes or into the DTO?

    Thanks,
    Khalid

  22. I'm thinking of writing an application for Windows Mobile and I was going to use NHibernate, but I found that it is not supported by the Compact Framework. But I still want to write my app in a DDD manner. Your post showed that I could write an effective DDD DAL even without an ORM. Thanks.

  23. There is such a thing as strongly typed DataSets. For web apps, you should never instantiate a DataSet, but the DataTables and TableAdapters provide a good means for binding data through ObjectDataSources. Your article is very good. Thanks!

  24. You have conveniently ignored the UPDATE case.

  25. To be a truly high performance DAL, you need to implement high performance updates as well, which means only updating fields in the database that have changed. Can you show us how your framework updates a Person if, say, only the email has changed?

  26. Totally agree with this post.

    I've analyzed, examined, and worked with many of the most common ORM platforms including Entity Framework, EF CTP4 Code-First, OpenAccess, NHibernate, LightSpeed, DataObjects, etc, and to me at the end of the day, what's most important is speed, control, and predictability.

    Although many of these ORMs make development easier in the beginning, the reality is more times than not the focus turns from creating code rapidly to tuning the application for performance, migration, and maintainability - especially after launch.

    Sure, with an ORM you can flip a switch and work with another DB, but who cares - if that DB is running 3-10x slower than native DAL code like ADO.NET, you're eventually going to be faced with tweaking those poor performing areas, whether they be queries, indexes, batching, connection pooling, or the myriad of other factors impacting all the layers between your POCO and your queries to the database.

    Not to mention, most of these ORMs tightly bind you one way or another to your model and schema. Try performing a staged, rolling upgrade while relying entirely on ORM-based platforms, and you're in for a painful ride. You will spend all your time trying to get the ORM to allow itself to work with a schema change.

    For large scalable web projects, no doubt it's wonderful to have a rapid, iterative development process where you can change a property and get a new schema instantly, but not very practical when it comes to production releases and you're wasting valuable (and expensive) CPU resources relying on an ORM.

    I love EF Code-First conceptually and in practice (for prototyping), but it's never going to be able to compete with the sheer performance gained from leveraging native ADO.NET. It's impossible.

    I will say, BLToolkit is something to look at - it's about the closest I've found to providing an "ORM-like" wrapper around native ADO.NET functions without giving up a lot of performance in the process.

    Maybe it's a middle ground worth considering, or at least deploying in concert with an ORM. Use an ORM for your back-end internal applications, and native DAL on the front-end (web) for high performance?

    It's always a trade-off, and every project is different, but it seems we're too quick to sacrifice performance for a slightly better coding experience.

    Is it really worth it? Time will tell.

  27. I really like this approach, although I always run into fetching "composed" data. For example, I need data for a GridView containing data which is in fact a join of several tables. Now you either fetch the parent object and execute some more queries for all its children in order to fetch the entire data set, or you create a "special" data object containing all the fields for the GridView data. Another option is to return DataSets (either strongly or weakly typed) instead of creating those "special" objects or performing multiple queries. What do you think about this? What's the ultimate solution when you need consolidated data that does not map 1-to-1 with a data object?

  28. I prefer the special data object approach. I've gone pretty far in that direction with my newer architectures (check out AAPL).

    Regarding strongly typed datasets, I never say never, and even the craziest architecture has some situation that it's the right solution for. However, all that goes out the window when it comes to strongly typed datasets. They are evil. They are friction. They are a pox sent from hell to punish developers. They are never a good idea. I hate them. Keep in mind I'm not talking about datasets. Datasets and DataTables are handy data containers, and while I prefer not to use them I've worked on lots of good apps that did use them. Datasets and DataTables are fine. The code-generated monstrosity that is a strongly typed dataset is not fine and should be avoided if you value your sanity.

  29. Is there code for your GetSingleDTO, GetDTOList and your stored procedures? I'm really interested in how you would accomplish this. The overhead of the current ORM tools is way too great to even consider using them for business applications. Writing your own straightforward DAL and BL could really benefit performance.

    Thanks!

  30. @Ontani, to see the code for GetSingleDTO() take a look at http://rlacovara.blogspot.com/2010/04/aapl-part-5-how-to-write-data-access.html and the datamapper post that comes after it.

  31. @Rudy Lacovara - In the example above you have suggested using DTOs that contain only the properties of our objects, and we'll fill our DTOs using ADO.NET. That sounds pretty cool.

    But what if we have a complex object graph, i.e., maybe a PersonDTO object contains a list of OrderDTOs? Is there any recommended approach or best practice for filling the dependent objects (the OrderDTO list) as well when querying the parent object (PersonDTO) from the database?

    Similarly, when saving a parent PersonDTO object, how can we detect changes in the dependent OrderDTOs and save them?

    ORM tools like EF, NHibernate, etc. provide such functionality. I was wondering if you could suggest how we'd accomplish this using your prescribed architecture?

  32. Great article. I was wondering if you have thought about using extension methods to turn your DTOs into business objects? This would negate the need to have a "Data" property on your business objects and would be more intuitive for the consuming code. I am considering this for a project I am working on, so I would be interested in your $0.02!

  33. This seems to be an anti-pattern when looking at it from a domain-driven perspective. You are stuck with a data access layer and no real domain model to house logic. A repository pattern that handles domain objects would be much more scalable.

  34. Hi,

    Nice framework, but is there any link from where I can download the complete working solution?

    Thanks!

  35. The 2nd part has already been published, and lots of people have used this approach for high-performance data access. I'm just using the diagram of the DAL design to aid understanding.
