Rockford Lhotka's Blog


 Tuesday, November 27, 2007

Every now and again I blog about the fact that Magenic needs good developers (.NET, SQL, SharePoint, BizTalk, CSLA .NET) in San Francisco, Minneapolis, Chicago, Atlanta and Boston. This continues to be true, and if you are at all interested, read on!

My colleague, Brant Estes, put together a little web site to try and seduce you into applying for work at Magenic. But more fun, he put together a broad-reaching tech quiz to illustrate the kind of knowledge we expect people to have before we hire them.

I'll be brutally honest, and say that I scored 84%, which entitles me to keep working at Magenic (whew!). I figure that's not too bad, given that the quiz covers areas (like HTML) that I try to avoid like the plague (and I didn't cheat and use Google while taking the test :) ). Brant tells me that passing is 70%.

Whether you are interested in working at Magenic or not, the quiz is a fun challenge, so feel free to take a look (just give yourself around 10-15 minutes to go through it).

Tuesday, November 27, 2007 3:10:21 PM (Central Standard Time, UTC-06:00)
 Friday, November 16, 2007

Next week, on 21 November, I'm speaking at a user group in Barcelona. I'll be giving an overview of CSLA .NET 3.0, with a little glimpse of 3.5 if I have time.

The organizers have also arranged for this to be broadcast as a Live Meeting. The link for registering for and attending the Live Meeting session is: http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032360496&Culture=es-ES

Friday, November 16, 2007 1:25:15 PM (Central Standard Time, UTC-06:00)
 Thursday, November 15, 2007

I'm busy getting my work and home life in order, as I head off to Barcelona with my oldest son for a week. I'm speaking at the Barcelona .NET user group on Wednesday, 21 November, so if you are in the area stop by and say hello! Otherwise, we'll be around the Barcelona area, enjoying Spain - I really love Spain!

Part of what I'm trying to get organized is the set of upcoming changes to CSLA .NET in version 3.5. Thematically, of course, the focus is on support for LINQ, primarily LINQ to Objects (which covers CSLA objects) and LINQ to SQL. As usual, however, I'm also tackling some items off the wish list and making other changes that I think will make things better/easier/more productive - and some that are just fun for me :)

When talking about LINQ to Objects, there are a couple of things going on. First, Aaron Erickson, a fellow Magenicon, is incorporating his i4o concepts into CSLA .NET to give CSLA collections the ability to support indexed queries. Second, there is some basic plumbing that needs to happen for LINQ to play well with CSLA objects, most notably ensuring that a query against a CSLA collection (other than a projection) returns a live, updatable view of the original collection.

By default a LINQ query just returns a new IEnumerable(Of T) containing the selected items. That can be horribly unproductive, because a user might add a new item to, or remove an existing item from, that result - which would have no impact at all on the original list. Aaron has devised an approach by which a query against a BusinessListBase or ReadOnlyListBase will produce a richer result that is still connected to the original - much like SortedBindingList and FilteredBindingList are today. This way, when an item is removed from the view, it is also removed from the real list.
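To make the default behavior concrete, here is a minimal LINQ to Objects sketch (a plain List(Of String), nothing CSLA-specific) showing the disconnected result that the BusinessListBase/ReadOnlyListBase work is meant to avoid:

Imports System
Imports System.Collections.Generic
Imports System.Linq

Module LinqSnapshotDemo
  Sub Main()
    ' A plain .NET list standing in for any collection
    Dim people As New List(Of String)(New String() {"Ann", "Bob", "Cal"})

    ' The query result is a new, disconnected sequence
    Dim shortNames = (From p In people _
                      Where p.Length = 3 _
                      Select p).ToList()

    shortNames.Remove("Bob")          ' changes only the query result
    Console.WriteLine(people.Count)   ' prints 3 - the original list is untouched
  End Sub
End Module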

Back to the i4o stuff, allowing indexing of properties on child objects contained in a list can make for much faster queries. If you only do a single query on a list it might not be worth building an index. But if you do multiple queries to get different views of a list (projections or not), having an index on one or more child object properties can make a very big performance difference!

When talking about LINQ to SQL, the focus is really on data access. LINQ to SQL, in the CSLA world-view, is merely a replacement for ADO.NET. The same is true for the ADO.NET Entity Framework. Architecturally, both of these technologies simply replace the use of a raw DataReader or a DataSet/DataTable like you might use today. Alternatively, you can say that they compete with existing tools like NHibernate or LLBLGen Pro - tools that people already use instead of ADO.NET.

In short, this means that your DataPortal_XYZ methods may make LINQ queries instead of using an ADO.NET SqlCommand. They may load the business object's fields from a resulting LINQ object rather than from a DataReader. In my testing thus far, I find that some of my DataPortal_Fetch() methods shrink by nearly 50% by using LINQ. That's pretty cool!
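As a rough sketch of what that can look like (the Criteria class, CustomerDataContext and its column names are assumptions for illustration, not CSLA APIs):

Private Sub DataPortal_Fetch(ByVal criteria As Criteria)
  ' LINQ to SQL replaces the SqlCommand/DataReader plumbing
  Using ctx As New CustomerDataContext()
    Dim data = (From c In ctx.Customers _
                Where c.Id = criteria.Id _
                Select c).Single()
    ' Load the object's fields from the LINQ to SQL entity
    _id = data.Id
    _name = data.Name
  End Using
End Sub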

Of course nothing is faster than using a raw DataReader. LINQ loads its objects using a DataReader too - so using LINQ does incur overhead that you don't suffer when using raw ADO.NET. However, many applications aren't performance-bound as much as they are maintenance-bound. That is to say that it is often more important to improve maintainability and reduce coding than it is to eke out that last bit of performance.

To that end, I'm working on enhancing and extending DataMapper to be more powerful and more efficient. Since NHibernate, LINQ and the ADO.NET Entity Framework all return data transfer objects or entity objects, it becomes important to be able to easily copy the data from those objects into the fields of your business objects. And that has to happen without incurring too much overhead. You can do the copies by hand with code that copies each DTO property into a field, which is fast, but it is a lot of code to write (and debug, and test). If DataMapper can do all that in a single line of code that's awesome, but the trick is to avoid incurring too much overhead from reflection or other dynamic mapping schemes.

One cool side effect of this DataMapper work is that it will apply not only to LINQ and EF, but also to XML services and anything else where data comes back in some sort of DTO/entity object construct.
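Today's Csla.Data.DataMapper already gives a feel for that one-line copy; a quick sketch (CustomerDto, the service call and the property names are hypothetical):

' Fetch a DTO/entity from some data source (names assumed for illustration)
Dim dto As CustomerDto = CustomerService.GetCustomer(42)
Dim cust As Customer = Customer.NewCustomer()

' Copy matching public properties from the DTO into the business object,
' skipping any property names passed in the ignore list
Csla.Data.DataMapper.Map(dto, cust, "Id")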

There are also some issues with LINQ to SQL and context. Unfortunately, LINQ's context object wasn't designed to support n-tier deployments, and doesn't work well with any n-tier model, including CSLA. This means that LINQ doesn't/can't keep track of things like whether an object is new/old or marked for deletion. On the upside, CSLA already does all that. On the downside, it makes doing inserts/updates/deletes with LINQ to SQL more complex than if you used LINQ in the simpler 2-tier world for which it was designed. I hope to come up with some ways to bridge this gap a little so as to preserve some of the simpler LINQ syntax for database updates, but I'm not overly confident...

I'm also doing a bunch of work to minimize and standardize the coding required to build a business object. Each release of CSLA has tended to standardize the code within a business object more and more. Sometimes this also shrinks the code, though not always. At the moment I'm incorporating some methods and techniques to shrink and standardize how properties are implemented, and how child objects (collections or single objects) are managed by the parent.

By borrowing some ideas from Microsoft's DependencyProperty concept, I've been able to shrink the typical property implementation by about 35% - from 14 lines to 9:

Private Shared NameProperty As PropertyInfo(Of String) = RegisterProperty(Of String, Customer)("Name")
Private _name As String = NameProperty.DefaultValue
Public Property Name() As String
  Get
    Return GetProperty(Of String)(NameProperty, _name)
  End Get
  Set(ByVal value As String)
    SetProperty(Of String)(NameProperty, _name, value)
  End Set
End Property

Unlike a DependencyProperty, this technique continues to use a strongly typed local field, which I think is a better approach (faster, though slightly less abstract).

Child objects are similar:

Private Shared LineItemsProperty As PropertyInfo(Of LineItems) = RegisterProperty(Of LineItems, Order)("LineItems")
Public ReadOnly Property LineItems() As LineItems
  Get
    ' Lazily create the child list on first access; the base class
    ' stores and manages it from that point on
    If Not ChildExists(LineItemsProperty) Then
      SetChild(Of LineItems)(LineItemsProperty, LineItems.NewList())
    End If
    Return GetChild(Of LineItems)(LineItemsProperty)
  End Get
End Property

In this case the implementation is more similar to a DependencyProperty, in that BusinessBase really does contain and manage the child object. There's value to that though, because it means that BusinessBase now automatically handles IsValid and IsDirty so you don't need to override them. The next step is to make BusinessBase automatically handle PropertyChanged/ListChanged/CollectionChanged events from the child objects and echo those as a PropertyChanged event from the parent. Along with that I'll also make sure that the child's Parent property is automatically set. This should eliminate virtually all the easy-to-forget code related to child objects - resulting in a lot less code and much more maintainable parent objects.

As I noted earlier, there are a lot of things on the wish list, and I'll hit some of those as time permits.

One item that I've done in VB and need to port to C# is changing authorization to call a pluggable IsInRole() provider method, allowing people to alter the algorithm used to determine if the current user is in a specific role. The default continues to do what it has always done by calling principal.IsInRole(), but for some advanced scenarios this extensibility point is very valuable.
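As a hypothetical sketch of the kind of provider method involved (the name, signature and registration mechanism are assumptions, since this extensibility point is still being finalized):

Public Class CustomRoleChecker
  ' Called instead of relying solely on principal.IsInRole()
  Public Shared Function IsInRole( _
    ByVal principal As System.Security.Principal.IPrincipal, _
    ByVal role As String) As Boolean

    ' Default behavior, plus a lookup against a custom role store (assumed type)
    Return principal.IsInRole(role) OrElse _
      CustomRoleStore.IsInRole(principal.Identity.Name, role)
  End Function
End Class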

Beyond that I'll hit items on the wish list given time and energy.

However, I need to have all major work done on 3.5 by the end of January 2008, because I'll start working on Expert 2008 Business Objects, the third edition of the Business Objects book for .NET, in January. The tentative plan is to have the book out around the end of June, which will be a tall order given that it will probably expand by 300 pages or so by the time I'm done covering .NET 3.0 and 3.5...

Now I must do some packing, and get into the mindset required to survive 11 hours in the air - they just don't make airplane seats for people who are 6'5" tall.

Thursday, November 15, 2007 9:34:59 PM (Central Standard Time, UTC-06:00)
 Tuesday, November 13, 2007

I always encourage people to go with 2-tier deployments if possible, because more tiers increases complexity and cost.

But it is important to recognize when the value of 3-tier outweighs that complexity/cost. The three primary motivators are:

  1. Security – using an app server allows you to shift the database credentials from clients to the server, adding security because a would-be hacker (or savvy end-user) would need to hack into the app server to get those credentials.
  2. Network infrastructure limitations – sometimes a 3-tier solution can help offload processing from CPU-bound or memory-bound machines, or it can help minimize network traffic over slow or congested links. Direct database communication is a chatty dialog, and using an app server allows a single blob of data to go across the slow link, and the chatty dialog to occur across a fast link, so 3-tier can result in a net win for performance if the client-to-server link is slow or congested.
  3. Scalability – using an app server provides for database connection pooling. The trick here is that if you don’t need it, you’ll harm performance by having an app server, so this is only useful if database connections are taxing your database server. Given the state of modern database hardware/software, scalability isn’t a big deal for the vast majority of applications anymore.

The CSLA .NET data portal is of great value here, because it allows you to switch from 2-tier to 3-tier with a configuration change - no code changes required*. You can deploy in the cheaper and simpler 2-tier model to start with, and if one of these motivators later justifies the complexity of moving to 3-tier, you can make that move with relative ease.

* disclaimer: if you plan to make such a move, I strongly suggest testing in 3-tier mode during development. While the data portal makes this 2-tier to 3-tier switch seamless, it only works if the code in the business objects follows all the rules around serialization and layer separation. Testing in a 3-tier mode helps ensure that none of those rules get accidentally broken during development.
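For reference, the switch itself is just the data portal proxy settings in the client's config file; something like this (the remoting proxy and URL shown are examples - the exact settings depend on the channel you use):

<!-- 2-tier: run the data portal locally (the default if no proxy is configured) -->
<appSettings>
  <add key="CslaDataPortalProxy" value="Local" />
</appSettings>

<!-- 3-tier: route data portal calls to a remote host -->
<appSettings>
  <add key="CslaDataPortalProxy"
       value="Csla.DataPortalClient.RemotingProxy, Csla" />
  <add key="CslaDataPortalUrl"
       value="http://appserver/DataPortalHost/RemotingPortal.rem" />
</appSettings>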

Tuesday, November 13, 2007 4:37:50 PM (Central Standard Time, UTC-06:00)

Magenic is producing a series of webcasts on various IT related topics. One of them is CSLA .NET, and I'll be the speaker for that webcast. Click here for details on the date/time and how to register.

Tuesday, November 13, 2007 2:49:57 PM (Central Standard Time, UTC-06:00)

I was recently asked whether CSLA incorporates thread synchronization code to marshal calls or events back onto the UI thread when necessary. The short answer is no, but the question deserves a bit more discussion to understand why it isn't the business layer's job to handle such details.

The responsibility for cross-thread issues resides with the layer that started the multi-threading.

So if the business layer internally utilizes multi-threading, then that layer must abstract the concept from other layers.

But if the UI layer utilizes multi-threading (which is more common), then it is the UI layer's job to abstract that concept.

It is unrealistic to build a reusable business layer around one type of multi-threading model and expect it to work in other scenarios. Were you to use Windows Forms components for thread synchronization, you'd be out of luck in the ASP.NET world, for example.

So CSLA does nothing in this regard, at the business object level anyway. Nor does a DataSet, or the entity objects from ADO.NET EF, etc. It isn't the business/entity layer's problem.
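For example, a UI layer that starts its own background work can capture the UI thread's SynchronizationContext and marshal results back itself; a minimal sketch (the Customer factory and class names are illustrative):

Imports System.Threading

Public Class CustomerLoader
  Private _uiContext As SynchronizationContext
  Private _customerId As Integer

  Public Sub BeginLoad(ByVal id As Integer)
    _uiContext = SynchronizationContext.Current   ' captured on the UI thread
    _customerId = id
    ThreadPool.QueueUserWorkItem(AddressOf DoLoad)
  End Sub

  Private Sub DoLoad(ByVal state As Object)
    Dim cust As Customer = Customer.GetCustomer(_customerId)   ' background thread
    _uiContext.Post(AddressOf OnLoaded, cust)                  ' marshal back to the UI thread
  End Sub

  Private Sub OnLoaded(ByVal state As Object)
    Dim cust As Customer = DirectCast(state, Customer)
    ' back on the UI thread - safe to update controls and data binding here
  End Sub
End Class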

CSLA does do some multi-threading, specifically in the Csla.Wpf.CslaDataProvider control, because it supports the IsAsynchronous property. But even there, WPF data binding does the abstraction, so there are no threading issues in either the business or UI layers.

Tuesday, November 13, 2007 11:09:54 AM (Central Standard Time, UTC-06:00)

As long as I'm doing simple blog links, here's another tool worth looking at, if you need to skin a SharePoint or other web site: 

SharePoint Skinner Overview - eLumenotion Blog

Tuesday, November 13, 2007 10:53:57 AM (Central Standard Time, UTC-06:00)

I haven't tried this yet, but it looks like a very nice tool for WPF developers:

Woodstock for WPF - The Code Project - Windows Presentation Foundation

Tuesday, November 13, 2007 10:48:41 AM (Central Standard Time, UTC-06:00)
 Monday, November 05, 2007

I was recently asked how to get the JSON serializer used for AJAX programming to serialize a CSLA .NET business object.

From an architectural perspective, it is better to view an AJAX (or Silverlight) client as a totally separate application. That code is running in an untrusted and terribly hackable location (the user’s browser – where they could have created a custom browser to do anything to your client-side code – my guess is that there’s a whole class of hacks coming that most people haven’t even thought about yet…)

So you can make a strong argument that objects from the business layer should never be directly serialized to/from an untrusted client (code in the browser). Instead, you should treat this as a service boundary – a semantic trust boundary between your application on the server, and the untrusted application running in the browser.

What that means in terms of coding is that you don’t want to do some sort of field-level serialization of your objects from the server. That would be terrible in an untrusted setting. To put it another way - the BinaryFormatter, NetDataContractSerializer or any other field-level serializer shouldn't be used for this purpose.

If anything, you want to do property-level serialization (like the XmlSerializer or DataContractSerializer or JSON serializer). But really you don’t want to do this either, because you don’t want to couple your service interface to your internal implementation that tightly. This is, fundamentally, the same issue I discuss in Chapter 11 of Expert 2005 Business Objects as I talk about SOA.

Instead, what you really want to do (from an architectural purity perspective anyway) is define a service contract for use by your AJAX/Silverlight client. A service contract has two parts: operations and data. The data part is typically defined using formal data transfer objects (DTOs). You can then design your client app to work against this service contract.

In your server app’s UI layer (the aspx pages and webmethods) you translate between the external contract representation of data and your internal usage of that data in business objects. In other words, you map from the business objects in/out of the DTOs that flow to/from the client app. The JSON serializer (or other property-level serializers) can then be used to do the serialization of these DTOs - which is what those serializers are designed to do.
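A small sketch of that boundary, assuming an ASP.NET AJAX script service (CustomerDto, the property names and the Customer factory method are illustrative):

Imports System.Web.Services
Imports System.Web.Script.Services

' The DTO is the external contract - only these fields cross to the browser
Public Class CustomerDto
  Public Id As Integer
  Public Name As String
  Public CreditLimit As Decimal
End Class

<ScriptService()> _
Public Class CustomerService
  Inherits System.Web.Services.WebService

  <WebMethod()> _
  Public Function GetCustomer(ByVal id As Integer) As CustomerDto
    ' Use the business object on the server side only
    Dim cust As Customer = Customer.GetCustomer(id)

    ' Map into the DTO; the AJAX runtime JSON-serializes the DTO, not the business object
    Dim dto As New CustomerDto()
    dto.Id = cust.Id
    dto.Name = cust.Name
    dto.CreditLimit = cust.CreditLimit
    Return dto
  End Function
End Class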

This is a long-winded way of saying that you really don’t want to do JSON serialization directly against your objects. You want to JSON serialize some DTOs, and copy data in/out of those DTOs from your business objects – much the way the SOA stuff works in Chapter 11.

Monday, November 05, 2007 11:53:58 PM (Central Standard Time, UTC-06:00)