Friday, September 15, 2006

ADO.Net vNext

I've tried as best I could to keep technology out of this blog and stick to espousing my architectural philosophy, but after my trip to VSLive in New York, I feel I've got to throw out my opinions. I do this because of my brief discussions with Rocky Lhotka and Pablo Castro.

I was nearly knocked out of my socks by Pablo's discussion of the Entity Framework. I finally feel validated in my current framework, although my soon-to-be-published book will probably need some revisions. OK, MANY revisions. That's not a bad thing, since I haven't started writing it yet. Ah well.

So, from what I know, here's the deal with the Entity Framework. (I've got a lot more to read to see what the real deal is, but for now, I think I get it.)

You can do lots of cool things with business objects, but their usefulness wanes to about 1% when they don't work with business data. Traditionally, you write a stored procedure, pull the data, put it in properties on your object, mung it, send it back, and write more stored procedures to perform DML operations. When something changes, you have to touch everything. In the words of my 3-year-old daughter, that's icky.
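To make the pain concrete, here's a sketch of that hand-rolled plumbing. The stored procedure names, connection string, and Customer class are all hypothetical, but the shape will be familiar; every schema change ripples through every layer of it.

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical business object whose fields mirror the columns we care about.
public class Customer
{
    public int Id;
    public string Name;
}

public static class CustomerData
{
    // Assumed connection string; yours will differ.
    const string ConnString = "Server=.;Database=Shop;Integrated Security=true";

    // Pull: one stored procedure, one hand-written mapping into properties.
    public static Customer Load(int id)
    {
        using (SqlConnection conn = new SqlConnection(ConnString))
        using (SqlCommand cmd = new SqlCommand("usp_GetCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Id", id);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (!reader.Read()) return null;
                Customer c = new Customer();
                c.Id = reader.GetInt32(0);   // ordinal mapping breaks when the proc changes
                c.Name = reader.GetString(1);
                return c;
            }
        }
    }

    // Send it back: a second stored procedure for the DML, more hand-written mapping.
    public static void Save(Customer c)
    {
        using (SqlConnection conn = new SqlConnection(ConnString))
        using (SqlCommand cmd = new SqlCommand("usp_UpdateCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Id", c.Id);
            cmd.Parameters.AddWithValue("@Name", c.Name);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```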

Enter the Entity Framework, whose goal is to DEFINE THE DATA you need to manipulate, not to define HOW TO GET AND SET THE DATA. In my architecture, a logical table is a portion of an entity's data. It can span more or less than a physical table, and it can cover more or less than a single record in that data. Once defined, the framework keeps a cache of its own decision tree specific to that entity. You can do anything you want to that data and it will find its way back into the store, business rules approving, of course.
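Here's a rough sketch of what that looks like in my own terms, not the Entity Framework's actual API; the attributes, the CustomerSummary entity, and the IEntitySession interface are all hypothetical. The point is that the entity declares WHAT data it is, and something else owns HOW it moves.

```csharp
using System;

// Hypothetical mapping attributes: the entity declares WHAT data it is made of.
[AttributeUsage(AttributeTargets.Class)]
public class EntityAttribute : Attribute
{
    public string LogicalTable;
    public EntityAttribute(string logicalTable) { LogicalTable = logicalTable; }
}

[AttributeUsage(AttributeTargets.Property)]
public class MapAttribute : Attribute
{
    public string Source;   // e.g. "Customer.Name" or "Address.City"
    public MapAttribute(string source) { Source = source; }
}

// A "logical table": part of Customer, part of Address --
// more or less than one physical table, more or less than one record.
[Entity("CustomerSummary")]
public class CustomerSummary
{
    [Map("Customer.Id")]   public int    Id   { get; set; }
    [Map("Customer.Name")] public string Name { get; set; }
    [Map("Address.City")]  public string City { get; set; }
}

// The framework, not the developer, owns HOW the data moves; callers only say what they want.
public interface IEntitySession
{
    T Get<T>(object key);    // framework builds the retrieval plan for T's logical table
    void Save<T>(T entity);  // framework works the changes back into the store
}
```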

In talking with Pablo, I learned that the Entity Framework currently generates all of its SQL on the fly. We tried that too. It's good for snap and sizzle. Not so much when 500 users ping your customer list at one time. Our answer is to build it when you need it, validate it internally (in another thread), and cache it until it's no longer valid. For my system, this involves a listener that checks for schema changes (Hello, SQL 2005 DDL triggers!!!) and reevaluates logical tables that have become suspect.
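Here's a minimal sketch of the cache-until-invalid idea, again in my own hypothetical terms and with the background validation pass left out for brevity: generated SQL is cached per logical table, and the schema-change listener (fed by a SQL 2005 DDL trigger) calls Invalidate so the next request rebuilds it.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical cache of generated SQL, keyed by logical table name.
public class SqlPlanCache
{
    private readonly Dictionary<string, string> _plans = new Dictionary<string, string>();
    private readonly object _lock = new object();

    // Build on first use, then reuse until the schema listener says otherwise.
    public string GetSql(string logicalTable, Func<string, string> buildSql)
    {
        lock (_lock)
        {
            string sql;
            if (_plans.TryGetValue(logicalTable, out sql))
                return sql;

            sql = buildSql(logicalTable);   // the expensive on-the-fly generation, done once
            _plans[logicalTable] = sql;
            return sql;
        }
    }

    // Called by the schema-change listener (e.g. fed by a SQL 2005 DDL trigger)
    // for every logical table the changed object touches.
    public void Invalidate(string logicalTable)
    {
        lock (_lock)
        {
            _plans.Remove(logicalTable);
        }
    }
}
```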

Even at this level, though, the Microsoft team is going to have performance issues. There are indeed two more facets of this problem that seem to be unaddressed. I'm interested in whether or not anyone else out there sees them.

Jim
