Rockford Lhotka

 Monday, January 12, 2009

The Microsoft Patterns and Practices group has a new Application Architecture Guide available.

Architecture, in my view, is primarily about restricting options.

Or to put it another way, it is about making a set of high level, often difficult, choices up front. The result of those choices is to restrict the options available for the design and construction of a system, because the choices place a set of constraints and restrictions around what is allowed.

When it comes to working in a platform like Microsoft .NET, architecture is critical. This is because the platform provides many ways to design and implement nearly anything you’d like to do. There are around 9 ways to talk to a database – from Microsoft, not counting the myriad 3rd party options. The number of ways to build web apps continues to grow, etc. The point I’m making is that if you just throw the entire .NET framework at a dev group you’ll get a largely random result that may or may not actually meet the short, medium and long-term needs of your business.

Developing an architecture first allows you to rationally evaluate the various options, discard those that don’t fit the business and application requirements, and allow only those that do.

An interesting side-effect of this process is that your developers may disagree. They may only see short-term issues, or purely technical concerns, and may not understand some of the medium/long term issues or broader business concerns. And that’s OK. You can either say “buck up and do what you are told”, or you can try to educate them on the business issues (recognizing that not all devs are particularly business-savvy). But in the end, you do need some level of buy-in from the devs or they’ll work against the architecture, often to the detriment of the overall system.

Another interesting side-effect of this process is that an ill-informed or disconnected architect might create an architecture that is really quite impractical. In other words, the devs are right to be up in arms. This can also lead to disaster. I’ve heard it said that architects shouldn’t code, or can’t code. If your application architect can’t code, they are in trouble, and your application probably is too. On the other hand, if they don’t know every nuance of the C# compiler, that’s probably good too! A good architect can’t afford to be that deep into any given tool, because they need more breadth than a hard-core coder can achieve.

Architects live in the intersection between business and technology.

As such they need to be able to code, and to have productive meetings with business stakeholders – often both in the same day. Worse, they need to have some understanding of all the technology options available from their platform – and the Microsoft platform is massive and complex.

Which brings me back to the Application Architecture Guide. This guide won’t solve all the challenges I’m discussing. But it is an invaluable tool in any .NET application architect’s arsenal. If you are, or would like to be, an architect or application designer, you really must read this book!

Monday, January 12, 2009 11:29:13 AM (Central Standard Time, UTC-06:00)
 Tuesday, January 6, 2009

VS Live San Francisco is coming up soon – February 23-27.

As one of the co-chairs for this conference, I’m really pleased with the speaker and topic line-up. In my view, this is one of the best sets of content and speakers VS Live has ever put forward.

Even better, this VS Live includes the MSDN Developer Conference (MDC) content. So you can get a recap of the Microsoft PDC, with all its forward-looking content, and then enjoy the core of VS Live with its pragmatic and independent information about how you can be productive using Microsoft’s technologies today, and into the future.

Perhaps best of all are the keynotes. Day one starts with a keynote containing secret content. Seriously – we can’t talk about it, and Microsoft isn’t saying – but it is going to be really cool. The kind of thing you’ll want to hear in person so you can say “I was there when...”. And Day two contains some of the best WPF/XAML apps on the planet. We’re talking about apps that not only show off the technology, but show how the technology can be used in ways that will literally change the world – no exaggeration! Truly inspirational stuff, on both a personal and professional level!

I know travel budgets are tight, and the economy is rough. All I can say is that you should seriously consider VS Live SF if you have the option. I think you’ll thank me later.

Tuesday, January 6, 2009 11:43:42 AM (Central Standard Time, UTC-06:00)
 Saturday, January 3, 2009

I am not a link blogger, but I can’t help but post this link that shows so clearly why the Single Responsibility Principle is important.

I got this link from another blog I just discovered, The Morning Brew. I’m not sure why I only found out about the Brew now, because it appears to be the perfect resource, for me at least.

I mostly quit reading blogs about 6 months ago – I discovered that reading blogs had largely supplanted doing real work, and that could only lead to really bad results!!

Interestingly, I also mostly quit listening to any new music about 6 months ago. Browsing through the Zune.net catalog for interesting music had largely supplanted being entertained by music (there’s a lot of crap music out there, and I got tired of listening to it).

Is there a parallel here? I think so.

In October (or so), Zune.net introduced a Pandora-like feature where the Zune service creates a “virtual radio station” based on the music you listen to most. This is exactly what I want! I want a DJ (virtual or otherwise) who filters out the crap and provides me with good (and often new) music. The whole Channels idea in Zune.net kept me from canceling my subscription, and the more recent addition of 10 track purchases per month for free clinched the deal.

And I want the same thing for blogs. I don’t want to filter through tons of blog posts about good restaurants, pictures of kids or whatever else. (I’m sure those are interesting to some people, but I read blogs for technical content, and only read friends’ blogs for non-technical content). And this is where The Morning Brew comes in. It is like a “DJ” for .NET blogs. Perfect!!

I should point out that on twitter I also subscribe to Silverlight News, for the same reason: it is a pre-filtered set of quality links. I think I may start following it via RSS instead of twitter though, because for this particular type of information I prefer the RSS reading experience in Outlook.

In any case, I hope everyone reading this post has had a wonderful last couple weeks.

For the many of you who had a religious holiday in here, I hope it was fulfilling and you felt the full meaning of the holiday, and that it brought you closer to your god/goddess/gods/ultimate truth. If you had a secular holiday in here, I hope it was fulfilling and meaningful, preferably filled with family and fun and a warm sense of community.

Happy New Year, and welcome to 2009!

Saturday, January 3, 2009 10:26:25 AM (Central Standard Time, UTC-06:00)
 Wednesday, December 17, 2008

I have released version 3.6 of CSLA .NET for Windows and CSLA .NET for Silverlight.

This is a major update to CSLA .NET, not only because it supports Silverlight, but also because there are substantial enhancements to the Windows/.NET version.

To my knowledge, this is also the first release of a major development framework targeting the Silverlight platform.

Here are some highlights:

  • Share more than 90% of your business object code between Windows and Silverlight
  • Powerful new UI controls for WPF, Silverlight and Windows Forms
  • Asynchronous data portal, to enable object persistence on a background thread (required in Silverlight, optional in Windows)
  • Asynchronous validation rules
  • Enhanced indexing in LINQ to CSLA
  • Numerous performance enhancements

This version of CSLA .NET for Windows is covered in my new Expert C# 2008 Business Objects book (Expert VB 2008 Business Objects is planned for February 2009).

At this time there is no book covering CSLA .NET for Silverlight. However, most business object code (other than data access) is the same for Windows and Silverlight, so you will find the Expert C# 2008 Business Objects book very valuable for Silverlight as well. I am in the process of creating a series of training videos around CSLA .NET for Silverlight; watch for those in early 2009.

Version 3.6 marks the first time major development was done by a team of people. My employer, Magenic, has been a patron of CSLA .NET since the year 2000, but with this version they provided me with a development team, and that is what made it possible to create such a major version in a relatively short period of time. You can see a full list of contributors, but I specifically want to thank Sergey Barskiy, Justin Chase and Nermin Dibek because they are the primary contributors to 3.6. I seriously couldn't have asked for a better team!

This is also the first version where the framework code is only in C#. The framework can be used from any .NET language, including VB, C# and others, but the actual framework code is in C#.

There is a community effort underway to bring the VB codebase in sync with version 3.6, but at this time that code will not build. In any case, the official version of the framework is C#, and that is the version I recommend using for any production work.

In some ways version 3.6 is one of the largest releases of the CSLA .NET framework ever. If you are a Windows/.NET developer this is a natural progression from previous versions of the framework, providing better features and capabilities. If you are a Silverlight developer the value of CSLA .NET is even greater, because it provides an incredible array of features and productivity you won't find anywhere else today.

As always, if you are a new or existing CSLA .NET user, please join in the discussion on the CSLA .NET online forum.

Wednesday, December 17, 2008 4:13:21 PM (Central Standard Time, UTC-06:00)
 Friday, December 12, 2008

My previous blog post (Some thoughts on Windows Azure) has generated some good comments. I was going to answer with a comment, but I wrote enough that I thought I’d just do a follow-up post.

Jamie says: “Sounds akin to Architecture: 'convention over configuration'. I'm wholly in favour of that.”

Yes, this is very much a convention over configuration story, just at an arguably larger scope than we’ve seen thus far. Or at least showing a glimpse of that larger scope. Things like Rails are interesting, but I don't think they are as broad as the potential we're seeing here.

It is my view that the primary job of an architect is to eliminate choices. To create artificial restrictions within which applications are created. Basically to pre-make most of the hard choices so each application designer/developer doesn’t need to.

What I’m saying with Azure is that we’re seeing this effect at a platform level. A platform that has already eliminated most choices, and that has created restrictions on how applications are created – thus ensuring a high level of consistency, and potential portability.

Jason is confused by my previous post, in that I say Azure has a "lock-in problem", but that the restricted architecture is a good thing.

I understand your confusion, as I probably could have been more clear. I do think Windows Azure has a lock-in problem - it doesn't scale down into my data center, or onto my desktop. But the concept of a restricted runtime is a good one, and I suspect that concept may outlive this first run at Azure. A restricted architecture (and runtime to support it) doesn't have to cause lock-in at the hosting level. Perhaps at the vendor level - but those of us who live in the Microsoft space put that concern behind us many years ago.

Roger suggests that most organizations may not have the technical savvy to host the Azure platform in-house.

That may be a valid point – the current Azure implementation might be too complex for most organizations to administer. Microsoft didn't build it as a server product, so they undoubtedly made implementation choices that create complexity for hosting. This doesn’t mean the idea of a restricted runtime is bad. Nor does it mean that someone (possibly Microsoft) couldn't create such a restricted runtime that could be deployed within an organization, or in the cloud. Consider that there is a version of Azure that runs on a developer's workstation already - so it isn't hard to imagine a version of Azure that I could run in my data center.

Remember that we’re talking about a restricted runtime, with a restricted architecture and API. Basically a controlled subset of .NET. We’re already seeing this work – in the form of Silverlight. Silverlight is largely compatible with .NET, even though they are totally separate implementations. And Moonlight demonstrates that the abstraction can carry to yet another implementation.

Silverlight has demonstrated that most business applications only use 5 of the 197 megabytes in the .NET 3.5 framework download to build the client. Just how much is really required to build the server parts? A different 5 megabytes? 10? Maybe 20 tops?

If someone had a defined runtime for server code, like Silverlight is for the client, I think it becomes equally possible to have numerous physical implementations of the same runtime. One for my laptop, one for my enterprise servers and one for Microsoft to host in the cloud. Now I can write my app in this .NET subset, and I can not only scale out, but I can scale up or down too.

That’s where I suspect this will all end up, and spurring this type of thinking is (to me) the real value of Azure in the short term.

Finally, Justin rightly suggests that we can use our own abstraction layer to be portable to/from Azure even today.

That's absolutely true. What I'm saying is that I think Azure could light the way to a platform that already does that abstraction.

Many, many years ago I worked on a project to port some software from a Prime to a VAX. It was only possible because the original developers had (and I am not exaggerating) abstracted every point of interaction with the OS. Everything that wasn't part of the FORTRAN-77 spec was in an abstraction layer. I shudder to think of the expense of doing that today - of abstracting everything outside the C# language spec - basically all of .NET - so you could be portable.
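
To give a feel for what that kind of abstraction layer might look like in .NET today, here is a minimal sketch. The interface and class names are mine, purely for illustration; this is not an Azure or CSLA .NET API.

using System.IO;

// A hypothetical abstraction over simple blob/file storage. An application
// written against this interface doesn't care whether the bytes live on a
// laptop, an enterprise file server, or a cloud storage service.
public interface IBlobStore
{
    void Save(string key, byte[] data);
    byte[] Load(string key);
}

// One possible implementation: plain files on the local machine.
public class FileSystemBlobStore : IBlobStore
{
    private readonly string _root;

    public FileSystemBlobStore(string root)
    {
        _root = root;
    }

    public void Save(string key, byte[] data)
    {
        File.WriteAllBytes(Path.Combine(_root, key), data);
    }

    public byte[] Load(string key)
    {
        return File.ReadAllBytes(Path.Combine(_root, key));
    }
}

// A cloud-hosted implementation (for example, one backed by Azure storage)
// would implement the same interface, so the application code never changes.

The expense comes from needing one of these seams at every point of contact with the platform, which is exactly what a restricted runtime would give you for free.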

So what we need, I think, is this server equivalent to Silverlight. Azure is not that - not today - but I think it may start us down that path, and that'd be cool!

Friday, December 12, 2008 11:33:03 AM (Central Standard Time, UTC-06:00)
 Thursday, December 11, 2008

At PDC, Microsoft announced Windows Azure - their Windows platform for cloud computing. There's a lot we don't know about Azure, including the proper way to pronounce the word. But that doesn't stop me from being both impressed and skeptical about what I do know.

In the rest of this post, as I talk about Azure, I'm talking about the "real Azure". One part of Azure is the idea that Microsoft would host my virtual machine in their cloud - but to me that's not overly interesting, because that's already a commodity market (my local ISP does this, Amazon does this - who doesn't do this??). What I do think is interesting is the more abstract Azure platform, where my code runs in a "role" and has access to a limited set of pre-defined resources. This is the Azure that forces the use of a scale-out architecture.

I was impressed by just how much Microsoft was able to show and demo at the PDC. For a fledgling technology that is only partially defined, it was quite amazing to see end-to-end applications built on stage, from the UI to business logic to data storage. That was unexpected and fun to watch.

I am skeptical as to whether anyone will actually care. I think whether people care will be determined primarily by the price point Microsoft sets. But I also think it will be determined by how Microsoft addresses the "lock-in question".

I expect the pricing to end up in one of three basic scenarios:

  1. Azure is priced for the Fortune 500 (or 100) - in which case the vast majority of us won't care about it
  2. Azure is priced for the small to mid-sized company space (SMB) - in which case quite a lot of us might be interested
  3. Azure is priced to be accessible for hosting your blog, or my blog, or www.lhotka.net or other small/personal web sites - in which case the vast majority of us may care a lot about it

These aren't entirely mutually exclusive. I think 2 and 3 could both happen, but I think 1 is by itself. Big enterprises have such different needs from individuals or SMBs, and do business in such different ways, that I suspect we'll see Microsoft either pursue 1 (which I think would be sad) or 2 and maybe 3.

But there's also the lock-in question. If I build my application for Azure, Microsoft has made it very clear that I will not be able to run my app on my servers. If I need to downsize, or scale back, I really can't. Once you go to Azure, you are there permanently. I suspect this will be a major sticking point for many organizations. I've seen quotes by Microsoft people suggesting that we should all factor our applications into "the part we host" and "the part they host". But even assuming we're all willing to go to that work, and introduce that complexity, this still means that part of my app can never run anywhere but on Microsoft's servers.

I suspect this lock-in issue will be the biggest single roadblock to adoption for most organizations (assuming reasonable pricing - which I think is a given).

But I must say that even if Microsoft doesn't back down on this, and even if that does block the success of "Windows Azure" as we see it now, that's probably OK.

Why?

Because I think the biggest benefit to Azure is one important concept: an abstract runtime.

If you or I write a .NET web app today, what are the odds that we can just throw it on a random Windows Server 200x box and have it work? Pretty darn small. The same is true for apps written for Unix, Linux or any other platform.

The reason apps can't just "run anywhere" is that they are built for a very broad and ill-defined platform, with no consistently defined architecture. Sure you might have an architecture. And I might have one too. But they are probably not the same. The resulting applications probably use different parts of the platform, in different ways, with different dependencies, different configuration requirements and so forth. The end result is that a high level of expertise is required to deploy any one of our apps.

This is one reason the "host a virtual machine" model is becoming so popular. I can build my app however I choose. I can get it running on a VM. Then I can deploy the whole darn VM, preconfigured, to some host. This is one solution to the problem, but it is not very elegant, and it is certainly not very efficient.

I think Azure (whether it succeeds or fails) illustrates a different solution to the problem. Azure defines a limited architecture for applications. It isn't a new architecture, and it isn't the simplest architecture. But it is an architecture that is known to scale. And Azure basically says "if you want to play, you play my way". None of this random use of platform features. There are few platform features, and they all fit within this narrow architecture that is known to work.

To me, this idea is the real future of the cloud. We need to quit pretending that every application needs a "unique architecture" and realize that there are just a very few architectures that are known to work. And if we pick a good one for most or all apps, we might suffer a little in the short term (to get onto that architecture), but we win in the long run because our apps really can run anywhere. At least anywhere that can host that architecture.

Now in reality, it is not just the architecture. It is a runtime and platform and compilers and tools that support that architecture. Which brings us back to Windows Azure as we know it. But even if Azure fails, I suspect we'll see the rise of similar "restricted runtimes". Runtimes that may leverage parts of .NET (or Java or whatever), but disallow the use of large regions of functionality, and disallow the use of native platform features (from Windows, Linux, etc). Runtimes that force a specific architecture, and thus ensure that the resulting apps are far more portable than the typical app is today.

Thursday, December 11, 2008 6:31:31 PM (Central Standard Time, UTC-06:00)
 Friday, December 5, 2008

CSLA .NET for Windows and CSLA .NET for Silverlight version 3.6 RC1 are now available for download.

The Expert C# 2008 Business Objects book went to the printer on November 24 and should be available very soon.

I'm working to have a final release of 3.6 as quickly as possible. The RC1 release includes a small number of bug fixes based on issues reported through the CSLA .NET forum. Barring any show-stopping bug reports over the next few days, I expect to release version 3.6 on December 15.

If you intend to use version 3.6, please download this release candidate and let me know if you encounter any issues. Thank you!

Friday, December 5, 2008 10:30:44 AM (Central Standard Time, UTC-06:00)
 Wednesday, December 3, 2008

One of the topic areas I get asked about frequently is authorization. Specifically role-based authorization as supported by .NET, and how to make that work in the "real world".

I get asked about this because CSLA .NET (for Windows and Silverlight) follows the standard role-based .NET model. In fact, CSLA .NET rests directly on the existing .NET infrastructure.

So what's the problem? Why doesn't the role-based model work in the real world?

First off, it is important to realize that it does work for some scenarios. It isn't bad for coarse-grained models where users are authorized at the page or form level. ASP.NET directly uses this model for its authorization, and many people are happy with that.

But it doesn't match the requirements of a lot of organizations in my experience. Many organizations have a slightly more complex structure that provides better administrative control and manageability.

Whether a user can get to a page/form, or can view a property or edit a property is often controlled by a permission, not a role. In other words, users are in roles, and a role is essentially a permission set: a list of permissions the role has (or doesn't have).

This doesn't map very well onto the .NET IPrincipal interface, which only exposes an IsInRole() method. Finding out if the user is in a role isn't particularly useful, because the application really needs to call some sort of HasPermission() method.

In my view the answer is relatively simple.

The first step is understanding that there are two concerns here: the administrative issues, and the runtime issues.

At administration time the concepts of "user", "role" and "permission" are all important. Admins will associate permissions with roles, and roles with users. This gives them the kind of control and manageability they require.

At runtime, when the user is actually using the application, the roles are entirely meaningless. However, if you consider that IsInRole() can be thought of as "HasPermission()", then there's a solution. When you load the .NET principal with a list of "roles", you really load it with a list of permissions. So when your application asks "IsInRole()", it does it like this:

bool result = currentPrincipal.IsInRole(requiredPermission);

Notice that I am "misusing" the IsInRole() method by passing in the name of a permission, not the name of a role. But that's ok, assuming that I've loaded my principal object with a list of permissions instead of a list of roles. Remember, the IsInRole() method typically does nothing more than determine whether the string parameter value is in a list of known values. It doesn't really matter if that list of values are "roles" or "permissions".

And since, at runtime, no one cares about roles at all, there's no sense loading them into memory. This means the list of "roles" can instead be a list of "permissions".
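
Here is a minimal sketch of that idea using plain .NET types (this is not CSLA .NET code, and the user name and permission names are made up for illustration). The principal is loaded with permissions, and a small extension method lets calling code read as HasPermission() even though it is just IsInRole() underneath:

using System.Security.Principal;
using System.Threading;

public static class PermissionSecurity
{
    // Build a principal whose "role" list is really a permission list,
    // and make it the current principal for this thread.
    public static void Login(string username, string[] permissions)
    {
        var identity = new GenericIdentity(username);
        Thread.CurrentPrincipal = new GenericPrincipal(identity, permissions);
    }

    // Reads like HasPermission(), but is just IsInRole() underneath.
    public static bool HasPermission(this IPrincipal principal, string permission)
    {
        return principal.IsInRole(permission);
    }
}

// Usage (hypothetical permission names):
//   PermissionSecurity.Login("rocky", new[] { "EditCustomer", "ViewInvoice" });
//   bool canEdit = Thread.CurrentPrincipal.HasPermission("EditCustomer");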

The great thing is that many people store their users, roles and permissions in some sort of relational store (like SQL Server). In that case it is a simple JOIN statement to retrieve all permissions for a user, merging all the user's roles together to get that list, and not returning the actual role values at all (because they are only useful at admin time).
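
As a rough sketch of that query side (the table and column names here are hypothetical, and your schema will certainly differ), the load might look something like this:

using System.Collections.Generic;
using System.Data.SqlClient;

public static class PermissionRepository
{
    // Retrieve the merged, distinct set of permissions for a user by joining
    // users to roles to permissions. The role names themselves never leave
    // the database, because at runtime only the permissions matter.
    public static List<string> GetPermissions(string connectionString, string username)
    {
        const string sql =
            @"SELECT DISTINCT p.PermissionName
                FROM Users u
                JOIN UserRoles ur ON ur.UserId = u.UserId
                JOIN RolePermissions rp ON rp.RoleId = ur.RoleId
                JOIN Permissions p ON p.PermissionId = rp.PermissionId
               WHERE u.Username = @username";

        var result = new List<string>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@username", username);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    result.Add(reader.GetString(0));
            }
        }
        return result;
    }
}

That permission list is what gets loaded into the principal at login, and it is all the application ever needs at runtime.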

Wednesday, December 3, 2008 11:19:01 AM (Central Standard Time, UTC-06:00)
 Wednesday, November 19, 2008

As the Expert C# 2008 Business Objects book becomes available (now in alpha ebook form, and in paper by the end of this year), I'm getting more questions about what book to buy for different versions of CSLA .NET, and for different .NET technologies.

(Expert VB 2008 Business Objects should be out in February 2009. The one unknown with this effort is how quickly the team of volunteers can get the VB port of CSLA .NET 3.6 complete, as I don't think we can release the book until the code is also available for download.)

I do have a summary of book editions that is often helpful in understanding what book(s) cover which version of CSLA .NET. But I thought I'd slice and dice the information a little differently to help answer some of the current questions.

First, by version of CSLA .NET:

  • CSLA .NET 3.6 – Expert 2008 Business Objects
  • CSLA .NET 3.5 – <no book>
  • CSLA .NET 3.0 – Expert 2005 Business Objects, CSLA .NET Version 2.1 Handbook, and Using CSLA .NET 3.0
  • CSLA .NET 2.1 – Expert 2005 Business Objects and CSLA .NET Version 2.1 Handbook
  • CSLA .NET 2.0 – Expert 2005 Business Objects

Next, by .NET technology:

  • WPF – Expert 2008 Business Objects
  • WCF – Expert 2008 Business Objects
  • Silverlight – <no book yet>; Expert 2008 Business Objects gets you 80% there though
  • WF – Expert 2008 Business Objects and Using CSLA .NET 3.0 (for the WF example)
  • ASP.NET MVC – <no book yet>
  • ASP.NET Web Forms – Expert 2008 Business Objects
  • Windows Forms – Expert 2008 Business Objects and Using CSLA .NET 3.0 (for important data binding info)
  • LINQ to CSLA – Expert 2008 Business Objects
  • LINQ to SQL – Expert 2008 Business Objects
  • ADO.NET Entity Framework – Expert 2008 Business Objects (limited coverage)
  • .NET Remoting – Expert 2005 Business Objects
  • asmx Web Services – Expert 2005 Business Objects

As always, here are the locations to download CSLA .NET for Windows and to download CSLA .NET for Silverlight.

Wednesday, November 19, 2008 9:17:58 AM (Central Standard Time, UTC-06:00)