Rockford Lhotka's Blog


 Monday, October 07, 2013

The short answer to the question of whether the Microsoft .NET Framework (and its related tools and technologies) has a future is: of course it does, don’t be silly.

The reality is that successful technologies take years, usually decades, perhaps longer, to fade away. Most people would be shocked at how much of the world runs on RPG, COBOL, FORTRAN, C, and C++ – all languages that became obsolete decades ago. Software written in these languages runs on mainframes and minicomputers (also obsolete decades ago) as well as more modern hardware in some cases. Of course in reality mainframes and minicomputers are still manufactured, so perhaps they aren’t technically “obsolete” except in our minds.

It is reasonable to assume that .NET (and Java) along with their primary platforms (Windows and Unix/Linux) will follow those older languages into the misty twilight of time. And that such a thing will take years, most likely decades, perhaps longer, to occur.

I think it is critical to understand that point, because if you’ve built and bet your career on .NET or Java it is good to know that nothing is really forcing you to give them up. Although your chosen technology is already losing (or has lost) its trendiness, and will eventually become extremely obscure, it is a pretty safe bet that you’ll always have work. Even better, odds are good that your skills will become sharply more valuable over time as knowledgeable .NET/Java resources become more rare.

Alternatively, you may choose something trendier; the only seemingly viable candidate is JavaScript or its offspring, such as CoffeeScript or TypeScript.

How will this fading of .NET/Java technology relevance occur?

To answer I’ll subdivide the software world into two parts: client devices and servers.

Client Devices

On client devices (PCs, laptops, ultrabooks, tablets, phones, etc.) I feel the need to further split into two parts: consumer apps and business apps. Yes, I know that people seem to think there’s no difference, but as I’ve said before, I think there’s an important economic distinction between consumer and business apps.

Consumer Apps

Consumer apps are driven by a set of economic factors that make it well worth the investment to build native apps for every platform. In this environment Objective C, Java, and .NET (along with C++) all have a bright future.

Perhaps JavaScript will become a contender here, but that presupposes Apple, Google, and Microsoft work to make that possible by undermining their existing proprietary development tooling. There are some strong economic reasons why none of them would want every app on the planet to run equally on every vendor’s device, so this seems unlikely. That said, for reasons I can’t fathom, Microsoft is doing their best to make sure JavaScript really does work well on Windows 8, so perhaps Apple will follow suit and encourage their developers to abandon Objective C in favor of cross-platform JavaScript?

Google already loves the idea of JavaScript and would clearly prefer if we all just wrote every app in JavaScript for Chrome on Android, iOS, and Windows. The only question in my mind is how they will work advertising into all of our Chrome apps in the future.

My interest doesn’t really lie in the consumer app space, as I think relatively few people are going to get rich building casual games, fart apps, metro transit mapping apps, and so forth. From a commercial perspective there is some money to be made building apps for corporations, such as banking apps, brochure-ware apps, travel apps, etc. But even that is a niche market compared to the business app space.

Business Apps

Business apps (apps for use by a business’s employees) are driven by an important economic factor called a natural monopoly. Businesses want software that is built and maintained as cheaply as possible. Rewriting the same app several times to get a “native experience” on numerous operating systems has never been viable, and I can’t see where IT budgets will be expanding to enable such waste in the near future. In other words, businesses are almost certain to continue to build business apps in a single language for a single client platform. For a couple decades this has been Windows, with only a small number of language/tool combinations considered viable (VB, PowerBuilder, .NET).

But today businesses are confronted with pressure to write apps that work on the iPad as well as Windows (and outside the US on Android). The only two options available are to write the app 3+ times or to find some cross-platform technology, such as JavaScript.

The natural monopoly concept creates some tension here.

A business might insist on supporting just one platform, probably Windows. A couple years ago I thought Microsoft’s Windows 8 strategy was to make it realistic for businesses to choose Windows and .NET as this single platform. Sadly they’ve created a side loading cost model that basically blocks WinRT business app deployment, making Windows far less interesting in terms of being the single platform. The only thing Windows has going for it is Microsoft’s legacy monopoly, which will carry them for years, but (barring business-friendly changes to WinRT licensing) is doomed to erode.

You can probably tell I think Microsoft has royally screwed themselves over with their current Windows 8 business app “strategy”. I’ve been one of the loudest and most consistent voices on this issue for the past couple years, but Microsoft appears oblivious to the problem and has shown no signs of even recognizing the problem much less looking at solutions. I’ve come to the conclusion that they expect .NET on the client to fade away, and for Windows to compete as just one of several platforms that can run JavaScript apps. In other words I’ve come to the conclusion that Microsoft is willingly giving up on any sort of technology lock-in or differentiation of the Windows client in terms of business app development. They want us to write cross-platform JavaScript apps, and they simply hope that businesses and end users will choose Windows for other reasons than because the apps only run on Windows.

Perhaps a business would settle on iOS or Android as the “one client platform”, but that poses serious challenges given that virtually all businesses have massive legacies of Windows apps. The only realistic way to switch clients to iOS or Android is to run all those Windows apps on Citrix servers (or equivalent), and to ensure that the client devices have keyboards and mice so users can actually interact with the legacy Windows apps for the next several years/decades. Android probably has a leg up here because most Android devices have USB ports for keyboards/mice, but really neither iOS nor Android have the peripheral or multi-monitor support necessary to truly replace legacy Windows (Win32/.NET).

This leaves us with the idea that businesses won’t choose one platform in the traditional sense, but rather will choose a more abstract runtime: namely JavaScript running in a browser DOM (real or simulated). Today this is pretty hard because of differences between browsers and between browsers on different platforms. JavaScript libraries such as jQuery, Angular, and many others seek to abstract away those differences, but there’s no doubt that building a JavaScript client app costs more today than building the same app in .NET or some other more mature/consistent technology.

At the same time, only JavaScript really offers any hope of building a client app codebase that can run on iOS, Android, and Windows tablets, ultrabooks, laptops, and PCs. So though it may be more expensive than just writing a .NET app for Windows, JavaScript might be cheaper than rewriting the app 3+ times for iOS, Android, and Windows. And there’s always hope that JavaScript (or its offspring like CoffeeScript or TypeScript) will rapidly mature enough to make this “platform” more cost-effective.

I look at JavaScript today much like Visual Basic 3 in the early 1990s (or C in the late 1980s). It is typeless and primitive compared to modern C#/VB or Java. To overcome this it relies on tons of external components (VB had its component model, JavaScript has myriad open source libraries). These third party components change rapidly and with little or no cross-coordination, meaning that you are lucky if you have a stable development target for a few weeks (as opposed to .NET or Java where you could have a stable target for months or years). As a result a lot of the development practices we’ve learned and mastered over the past 20 years are no longer relevant, and new practices must be devised, refined, and taught.

Also we must recognize that JavaScript apps never go into a pure maintenance mode. Browsers and underlying operating systems, along with the numerous open source libraries you must use, are constantly versioning and changing, so you can never stop updating your app codebase to accommodate this changing landscape. If you do stop, you’ll end up where so many businesses are today: trapped on IE6 and Windows XP because nothing they wrote for IE6 can run on any modern browser. We know that is a doomed strategy, so we therefore know that JavaScript apps will require continual rewrites to keep them functional over time.

What I’m getting at here is that businesses have an extremely ugly choice on the client:

  1. Rewrite and maintain every app 3+ times to be native on Windows, iOS, and Android
  2. Absorb the up-front and ongoing cost of building and maintaining apps in cross-platform JavaScript
  3. Select one platform (almost certainly Windows) on which to write all client apps, and require users to use that platform

I think I’ve listed those in order from most to least expensive, though numbers 1 and 2 could be reversed in some cases. I think in all cases it is far cheaper for businesses to do what Delta recently did and just issue Windows devices to their employees, thus allowing them to write, maintain, and support apps on a single, predictable platform.

The thing is that businesses are run by humans, and humans are often highly irrational. People are foolishly enamored of BYOD (bring your own device), which might feel good, but is ultimately expensive and highly problematic. And executives are often the drivers for alternate platforms because they like their cool new gadgets, oblivious to the reality that supporting their latest tech fad (iPad, Android, whatever) might cost the business many thousands (often easily hundreds of thousands) of dollars each year in software development, maintenance, and support costs.

Of course I work for a software development consulting company. Like all such companies we effectively charge by the hour. So from my perspective I’d really prefer if everyone did decide to write all their apps 3+ times, or write them in cross-platform JavaScript. That’s just more work for us, even if objectively it is pretty damn stupid from the perspective of our customers’ software budgets.

Server Software

Servers are a bit simpler than client devices.

The primary technologies used today on servers are .NET and Java. Though as I pointed out at the start of this post, you shouldn’t discount the amount of COBOL, RPG, FORTRAN, and other legacy languages/tools/platforms that make our world function.

Although JavaScript has a nascent presence on the server via tools like node.js, I don’t think any responsible business decision maker is looking at moving away from existing server platform tools in the foreseeable future.

In other words the current 60/40 split (or 50/50, depending on whose numbers you believe) between .NET and Java on the server isn’t likely to change any time soon.

Personally I am loath to give up the idea of a common technology platform between client and server – something provided by VB in the 1990s and .NET over the past 13 years. So if we really do end up writing all our client software in JavaScript I’ll be a strong advocate for things like node.js on the server.

In the mid-1990s it was pretty common to write “middle tier” software in C++ and “client tier” software in PowerBuilder or VB. Having observed such projects and the attendant complexity of having a middle tier dev team who theoretically coordinated with the client dev team, I can say that this isn’t a desirable model. I can’t support the idea of a middle tier in .NET and a client tier in JavaScript, because I can’t see how team dynamics and inter-personal communication capabilities have changed enough (or at all) over the past 15 years such that we should expect any better outcome now than we got back then.

So from a server software perspective I think .NET and Java have a perfectly fine future, because the server-side JavaScript concept is even less mature than client-side JavaScript.

At the same time, I really hope that (if we move to JavaScript on the client) JavaScript matures rapidly on both client and server, eliminating the need for .NET/Java on the server as well as the client.

Conclusion

In the early 1990s I was a VB expert. In fact, I was one of the world’s leading VB champions through the 1990s. So if we are going to select JavaScript as the “one technology to rule them all” I guess I’m OK with going back to something like that world.

I’m not totally OK with it, because I rather enjoy modern C#/VB and .NET. And yes, I could easily ride out the rest of my career on .NET, there’s no doubt in my mind. But I have never in my career been a legacy platform developer, and I can’t imagine working in a stagnant and increasingly irrelevant technology, so I doubt I’ll make that choice – stable though it might be.

Fwiw, I do still think Microsoft has a chance to make Windows 8, WinRT, and .NET a viable business app development target into the future. But their time is running out, and as I said earlier they seem oblivious to the danger (or are perhaps embracing the loss of Windows as the primary app dev target on the client). I would like to see Microsoft wake up and get a clue, resulting in WinRT and .NET being a viable future for business app dev.

Failing that however, we all need to start putting increasing pressure on vendors (commercial and open source) to mature JavaScript, its related libraries and tools, and its offspring such as TypeScript – on both client and server. The status of JavaScript today is too primitive to replace .NET/Java, and if we’re to go down this road a lot of money and effort needs to be expended rapidly to create a stable, predictable, and productive enterprise-level JavaScript app dev platform.

Monday, October 07, 2013 9:24:20 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [39]  | 
 Thursday, February 14, 2013

In a recent email thread I ended up writing a lengthy bit of content summarizing some of my thoughts around the idea of automatically projecting js code into an HTML 5 (h5js) browser app.

Another participant in the thread mentioned that he’s a strong proponent of separation of concerns, and in particular keeping the “model” separate from data access. In his context the “model” is basically a set of data container or DTO objects. My response:

-----------------------------

I agree about separation of concerns at the lower levels.

I am a firm believer in domain-focused business objects though: in the use of “real” OOD, which largely eliminates the need for add-on hacks like a viewmodel.

In other words, apps should have clearly defined logical layers. I use this model:

Interface
Interface control
Business
Data access
Data storage

This model works for pretty much everything: web apps, smart client apps, service apps, workflow tasks (apps), etc.

The key is that the business layer consists of honest-to-god real life business domain objects. These are designed using OOD so they reflect the requirements of the user scenario, not the database design.

If you have data-centric objects, they’ll live in the Data access layer. And that’s pretty common when using any ORM or something like EF, where the tools help you create data-centric types. That’s very useful – then all you need to do is use object:object mapping (OOM) to get the data from the data-centric objects into the more meaningful business domain objects.
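
To make the object:object mapping idea concrete, here is a minimal sketch (the ProjectData and Project types are hypothetical, and a real CSLA-style domain object would typically load its values through protected loaders rather than public setters):

using System;

// Hypothetical data-centric type produced by an ORM such as EF; it lives in
// the Data access layer and carries no business behavior.
public class ProjectData
{
    public int Id { get; set; }
    public string Name { get; set; }
    public DateTime Started { get; set; }
}

// Hypothetical domain object; in a real app this is where validation,
// authorization and other business rules would live.
public class Project
{
    public string Name { get; set; }
    public DateTime Started { get; set; }
}

// Object:object mapping (OOM): copy values from the data-centric object into
// the domain object, keeping the domain design independent of the database.
internal static class ProjectMapper
{
    public static void Map(ProjectData source, Project target)
    {
        target.Name = source.Name;
        target.Started = source.Started;
    }
}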

At no point should any layer talk to the database other than the Data access layer. And at no point should the Interface/Interface control layers interact with anything except the Business layer.

Given all that, the question with smart client web apps (as I’ve taken to calling these weird h5js/.NET hybrids) is whether you are using a service-oriented architecture or an n-tier architecture. This choice must be made _first_ because it impacts every other decision.

The service-oriented approach says you are creating a system composed of multiple apps. In our discussion this would be the smart client h5js app and the server-side service app. SOA mandates that these apps don’t trust each other, and that they communicate through loosely coupled and clearly defined interface contracts. That allows the apps to version independently. And the lack of trust means that data flowing from the consuming app (h5js) to the service app isn’t trusted – which makes sense given how easy it is to hack anything running in the browser. In this world each app should (imo) consist of a series of layers such as those I mentioned earlier.

The n-tier approach says you are creating one app with multiple layers, and those layers might be deployed on different physical tiers. Because this is one app, the layers can and should have reasonable levels of trust between them. As a result you shouldn’t feel the need to re-run business logic just because the data flowed from one layer/tier to another (completely different from SOA).

N-tier can be challenging because you typically have to decide where to physically put the business layer: on the client to give the user a rich and interactive experience, or on the server for more control and easier maintenance. In the case of my CSLA .NET framework I embraced the concept of _mobile objects_ where the business layer literally runs on the client AND on the server, allowing you to easily run business logic where most appropriate. Sadly this requires that the same code can actually run on the client and server, which isn’t the case when the client and server are disparate platforms (e.g. h5js and .NET).

This idea of projecting server-side business domain objects into the client fits naturally into the n-tier world. This has been an area of deep discussion for months within the CSLA dev team – how to make it practical to translate the rich domain business behaviors into js without imposing a major burden of writing js alongside C#.

CSLA objects have a very rich set of rules and behaviors that ideally would be automatically projected into a js business layer for use by the smart client h5js Interface and Interface control layers. I love this idea – but the trick is to make it possible such that there’s not a major new burden for developers.

This idea of projecting server-side business domain objects into the client is a less natural fit for a service-oriented system, because there’s a clear and obvious level of coupling between the service app and the h5js app (given that parts of the h5js app literally generate based on the service app). I’m not sure this is a total roadblock, but you have to go into this recognizing that such an approach compromises the primary purpose of SOA, which is loose coupling between the apps in the system…

Thursday, February 14, 2013 10:39:23 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [8]  | 
 Wednesday, November 09, 2011

Joe writes a nice summary of why a web app is not a web page – it is something different, and in important ways.

http://www.misfitgeek.com/2011/11/html5-app-versus-html5-page/

This distinction becomes very important when considering building H5/js apps on WinRT in Windows 8, or if you believe (in general) that H5/js will replace existing dev platforms like Java and .NET. For that to happen, we have to stop thinking about HTML and js as web technologies – they must be thought of as general purpose technologies that sometimes happen to be used on the web too.

Web | WinRT
Wednesday, November 09, 2011 10:10:01 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Tuesday, June 07, 2011

The final draft of the Using CSLA 4: ASP.NET MVC ebook is now online, and available for purchase.

If you own the Using CSLA 4 ebook series, you already own this new ebook and can download it from http://download.lhotka.net/Default.aspx?t=UsingCsla4.

If you are buying the ebooks individually, the new book is now available for purchase from http://store.lhotka.net.

This ebook also includes a code download, containing the latest ProjectTracker codebase. This includes the Mvc3UI project: an ASP.NET MVC 3 application built using the CSLA .NET business layer.

Books | CSLA .NET | Web
Tuesday, June 07, 2011 3:50:00 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Thursday, April 28, 2011

Today I spent hours trying to make one line of code work. Fortunately Miguel Castro was ultimately able to help me troubleshoot the issue thanks to his deep understanding of all things ASP.NET.

The line of code seems so simple:

var data = Roles.GetRolesForUser("rocky");

In this case Roles is the RoleProvider API from System.Web.Security, and the method returns a string array with the roles for the user.

This line of code works great in my aspx page. No problem at all.

But in the same web project it throws a NullReferenceException when called from a WCF service (in an .svc).

No stack trace. No detail. No nothing. I Binged, and Googled, and created a whole new solution (figuring my first solution was somehow corrupted). But nothing helped.

To cut to the answer, this works in a WCF service:

var data = Roles.Provider.GetRolesForUser("rocky");

See the difference? Instead of relying on the static helper method on the Roles type, this code explicitly calls the method on the default provider instance.
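
For context, here is a minimal sketch of that working pattern inside a WCF service hosted in the same web project (the service and contract names are made up for illustration):

using System.ServiceModel;
using System.Web.Security;

[ServiceContract]
public interface ISecurityService
{
    [OperationContract]
    string[] GetRoles(string userName);
}

public class SecurityService : ISecurityService
{
    public string[] GetRoles(string userName)
    {
        // Going through the default provider instance works here, where the
        // static Roles.GetRolesForUser helper threw a NullReferenceException.
        return Roles.Provider.GetRolesForUser(userName);
    }
}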

(fwiw, I think this behavior is relatively new, because that code used to work. I just don’t remember how long ago it was that it worked – .NET 3 or 3.5 – so perhaps something broke in .NET 4?)

In any case, problem solved – thanks Miguel!!

WCF | Web
Thursday, April 28, 2011 5:46:57 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [1]  | 
 Wednesday, January 12, 2011

Anyone who expected HTML5 to be more standard or consistent than previous HTML standards hasn’t been paying attention to the web over the past 13+ years.

Google’s decision to drop H.264 may be surprising as a specific item, but the idea of supporting and not supporting various parts of HTML5 (or what vendors hope might be in HTML5 as it becomes a standard over the next several years) is something everyone should expect.

I have held out exactly zero hope for a consistent HTML5 implementation across the various browser vendors. Why? It is simple economics.

If IE, Chrome, FF and other browsers on Windows were completely standards compliant in all respects there would be no reason for all those browsers. The only browser that would exist is IE because it ships with Windows.

The same is true on the Mac and any other platform. If the browsers implement exactly the same behaviors, they can’t differentiate, so nobody would use anything except the browser shipping with the OS.

But the browser vendors have their own agendas and goals, which have nothing to do with standards compliance or consistency. Every browser vendor is motivated by economics – whether that’s mindshare for other products or web properties, data mining, advertising, or other factors. These vendors want you to use their browser more than other browsers.

So in a “standards-based world” how do you convince people to use your product when it is (in theory) identical to every other product?

When products have a price you could try to compete on that price. That’s a losing game though, because when you undercut the average price you have less money to innovate or even keep up. So even in the 1990s, with the Unix standards, vendors didn’t play the pricing game – it is just a slow death.

But browsers are free, so even if you wanted the slow death of the price-war strategy you can’t play it in the browser world. So you are left with “embrace and extend” or “deviate from the standard” as your options. And every browser vendor must play this game or just give up and die.

This happened to Unix in the 1990s too – the various vendors didn’t want to play on price, so they added features beyond any standard with the intent of doing two things:

  1. Lure users to their version of Unix because it is “better”
  2. Lock users into that version of Unix, because as soon as they use the vendor-specific features they are stuck on that version

The same is true with browsers and HTML5. All browser vendors are jockeying now (before the standard is set) to differentiate and shape the standard. But even once there is a “standard” they’ll continue to support the various features they’ve added that didn’t make it into the standard.

This is necessary, because it is the only way any given browser can lure users away from the other browsers, thereby meeting the vendors’ long term goals.

People often say they don’t want a homogenous computing world. That they want a lot of variation and variety in the development platforms used across the industry. Over the past 13+ years HTML has proven that variation is absolutely possible, and that it is very expensive and frustrating.

Working for consulting companies during all this time, I can say that HTML is a great thing. Consultants charge by the hour, and any scenario where the same app must be rebuilt, or at least tweaked, for every browser and every new version of every browser is just a way to generate more revenue.

Is HTML a drag on the world economy overall? Sure it is. Anything that automatically increases the cost of software development like HTML is an economic drag by definition. The inefficiency is built-in.

The only place (today) with more inefficiency is the mobile space, where every mobile platform has a unique development environment, tools, languages and technologies. I love the mobile space – any app you want to build must be created for 2-5 different platforms, all paid for at an hourly rate. (this is sarcasm btw – I really dislike this sort of blatant inefficiency and waste)

But I digress. Ultimately what I’m saying is that expecting HTML5 to provide more consistency than previous versions of HTML is unrealistic, and moves like Google just took are something we should all expect to happen on a continuing basis.

Wednesday, January 12, 2011 11:49:44 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [10]  | 
 Friday, October 22, 2010

I am pleased to announce the first beta release of CSLA 4 version 4.1 with support for Windows Phone 7, and continuing support for .NET 4 and Silverlight 4.

Download CSLA 4 version 4.1.0 here

With this release, it is now possible to write a single business layer composed of business domain objects that run unchanged on a WP7 device, in Silverlight, on a Windows client, on Windows Server and on Windows Azure.

The samples download includes Samples\Net\cs\SimpleNTier, which implements the following interfaces over one common business layer:

  • Windows Phone
  • WPF
  • Silverlight
  • ASP.NET MVC

Of course this is just the first beta release, so there’s more work to be done. At the same time, we have completed the vast majority of the effort, and it is quite possible to build WP7 applications using this beta.

As with all CSLA releases, this one does include some bug fixes and enhancements to other parts of the framework. Please see the change log for a list of all changes. Enhancement highlights include:

  • Add ability to get a consolidated list of broken rules for an object graph
  • New BackgroundWorker component that automatically initializes background threads with the current principal and culture from the UI thread
  • TriggerAction provides better debugging information, following the lead of many Microsoft XAML controls
  • and much more…

In related news, UnitDriven has also been updated to support WP7, and provides a pretty comprehensive unit test runner and framework for WP7 code. CSLA uses UnitDriven for its automated testing, but UnitDriven can be used for any application on .NET, Silverlight or WP7.

Similarly, Bxf (Basic XAML Framework) has been updated to support WP7, thereby providing a common MVVM framework for WPF, Silverlight and WP7 UI development efforts. Some CSLA sample apps use Bxf, but Bxf can be used for any application, including those that don’t involve CSLA at all.

Bxf | CSLA .NET | Silverlight | Web | Windows Azure | WP7 | WPF
Friday, October 22, 2010 1:35:18 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Friday, June 25, 2010

CSLA .NET 3.8.4 is now available as a beta download. This is mostly a bug fix release to address a few issues from 3.8.3, plus some ASP.NET MVC work.

Version 3.8.4 targets .NET 3.5 and Silverlight 3 (though with a little effort it works with Silverlight 4 as well).

See the change log for a list of changes. There aren’t many, but if they affect you then they are important.

The only feature change in 3.8.4 is that most of the new ASP.NET MVC 2 support from CSLA 4 has been back-ported to 3.8. This means that the Csla.Web.Mvc project now targets and supports ASP.NET MVC 2, and provides more features and functionality than were available in 3.8.3.

This is a stable beta, given the small number of changes (other than the MVC support). So if you are affected by any of the issues listed in the change log I strongly recommend moving from 3.8.3 to 3.8.4 beta to test and utilize the changes.

Friday, June 25, 2010 11:19:03 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Wednesday, January 13, 2010

I’m often asked whether CSLA .NET supports the MVC, MVP, MVVM or other “M” patterns.

The answer is an absolute YES! Of course it does, because CSLA .NET is all about the “M”.

The “M” in all these patterns stands for Model – the business layer that is supposed to encapsulate all your business logic and the data required for that logic to work.

Since its inception 13+ years ago, CSLA has been primarily focused on helping people build a powerful business layer composed of business domain objects. This implies strong separation of concerns where UI issues are (as much as possible) kept out of the business layer, as are data access concerns.

The various “M” patterns also support separation of concerns. Their perspective is from the UI level, as they are all UI patterns. But at the core of each of them is the idea that the “M” should encapsulate all the business logic, business processing and data required to perform that processing.

Whether you want a Controller-View or a Presenter-View or a View-ViewModel to organize your UI code, in every case you need a clearly separate Model against which the UI will work.

CSLA .NET 3.8 specifically added a bunch of features to simplify the use of MVVM in Silverlight and WPF, but that’s just simplification. MVVM worked fine without those enhancements, it is just a little easier now. Of course you still need an MVVM UI framework, because CSLA isn’t about the UI, it is about the Model.

CSLA .NET 3.8.2 includes a CslaModelBinder for ASP.NET MVC. Again, this simplifies one aspect of using ASP.NET MVC for your UI framework, but it isn’t strictly necessary – it just makes life easier. The same rule applies, CSLA isn’t a UI framework and so does as little as possible at the “V” and “C” level because its focus is all about the “M”.
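
For illustration, wiring up the binder is a one-time registration at application start. This is a hedged sketch assuming the standard ASP.NET MVC ModelBinders registration point; check the CSLA documentation for the recommended usage:

using System.Web.Mvc;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Route registration etc. omitted; the relevant line is the binder.
        // Let the CSLA-aware binder handle posted values for business objects
        // instead of the default MVC model binder (illustrative wiring).
        ModelBinders.Binders.DefaultBinder = new Csla.Web.Mvc.CslaModelBinder();
    }
}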

Wednesday, January 13, 2010 12:39:46 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Wednesday, November 04, 2009

Of course I’m referring to Windows Forms, which is about 8 years old. Even in dog years that’s not old. But in software years it is pretty old I’m afraid…

I’m writing this post because here and in other venues I’ve recently referred to Windows Forms as “legacy”, along with asmx and even possibly Web Forms. This has caused a certain amount of alarm, but I’m not here to apologize or mollify.

Technologies come and go. That’s just life in our industry. I was a DEC VAX guy for many years (I hear Ted Neward laughing now, he loves these stories), but I could see the end coming years before it faded away, so I switched to the woefully immature Windows platform (Windows 3.0 – what a step backward from the VAX!). I know many FoxPro people who transitioned, albeit painfully, to VB or other tools/languages. The same with Clipper/dBase/etc. Most PowerBuilder people transitioned to Java or .NET (though much to my surprise I recently learned that PowerBuilder still actually exists – like you can still buy it!!).

All through my career I’ve been lucky or observant enough to jump ship before any technology came down on my head. I switched to Windows before the VAX collapsed, and switched to .NET before VB6 collapsed, etc. And honestly I can’t think of a case where I didn’t feel like I was stepping back in time to use the “new technology” because it was so immature compared to the old stuff. But every single time it was worth the effort, because I avoided being trapped on a slowly fading platform/technology with my skills becoming less relevant every day.

But what is “legacy”? I once heard a consultant say “legacy is anything you’ve put in production”. Which might be good for a laugh, but isn’t terribly useful in any practical sense.

I think “legacy” refers to a technology or platform that is no longer an area of focus or investment by the creator/maintainer. In our world that mostly means Microsoft, and so the question is where is Microsoft focused, where are they spending their money and what are they enhancing?

The answers are pretty clear:

  • Azure
  • Silverlight
  • ASP.NET MVC
  • WPF (to a lesser degree)
  • ADO.NET EF
  • WCF

These are the areas where the research, development, marketing and general energy are all focused. Ask a Microsoft guy what’s cool or hot and you’ll hear about Azure or Silverlight, maybe ADO.NET EF or ASP.NET MVC and possibly WPF or WCF. But you won’t hear Windows Forms, Web Forms, asmx web services, Enterprise Services, Remoting, LINQ to SQL, DataSet/TableAdapter/DataTable or numerous other technologies.

Some of those other technologies aren’t legacy – they aren’t going away, they just aren’t sexy. Raw ADO.NET, for example. Nobody talks about that, but ADO.NET EF can’t exist without it, so it is safe. But in theory ADO.NET EF competes with the DataSet (poorly, but still) and so the DataSet is a strong candidate for the “legacy” label.

Silverlight and WPF both compete with Windows Forms. Poor Windows Forms is getting no love, no meaningful enhancements or new features. It is just there. At the same time, Silverlight gets a new release on a cycle of less than 12 months, and WPF gets all sorts of amazingly cool new features for Windows 7. You tell me whether Windows Forms is legacy. But whatever you decide, I’m surely spending zero cycles of my time on it.

asmx is obvious legacy too. Has been ever since WCF showed up, though WCF’s configuration issues have been a plague on its existence. I rather suspect .NET 4.0 will address those shortcomings though, making WCF as easy to use as asmx and driving the final nail in the asmx coffin.

Web Forms isn’t so clear to me. All the buzz is on ASP.NET MVC. That’s the technology all the cool kids are using, and it really is some nice technology – I like it as much as I’ll probably ever like a web technology. But if you look at .NET 4.0, Microsoft has done some really nice things in Web Forms. So while it isn’t getting the hype of MVC, it is still getting some very real love from the Microsoft development group that owns the technology. So I don’t think Web Forms is legacy now or in .NET 4.0, but beyond that it is hard to say. I strongly suspect the fate of Web Forms lies mostly in its user base and whether they fight for it, whether they make Microsoft believe it continues to be worth serious investment and improvement into the .NET 5.0 timeframe.

For my part, I can tell you that it is amazingly (impossibly?) time-consuming to be an expert on 7-9 different interface technologies (UI, service, workflow, etc). Sure CSLA .NET supports all of them, but there are increasing tensions between the stagnant technologies (most notably Windows Forms) and the vibrant technologies like Silverlight and WPF. It is no longer possible, for example, to create a collection object that works with all the interface technologies – you just can’t do it. And the time needed to deeply understand the different binding models and subtle differences grows with each release of .NET.

CSLA .NET 4.0 will absolutely still support all the interface technologies. But it would be foolish to cut off the future to protect the past – that way lies doom. So in CSLA .NET 4.0 you should expect to see support for Windows Forms still there, but probably moved into another namespace (Csla.Windows or something), while the main Csla namespace provides support for modern interface technologies like WPF, ASP.NET MVC, Silverlight, etc.

I am absolutely committed to providing a window of time where Windows Forms users can migrate their apps to WPF or Silverlight while still enjoying the value of CSLA .NET. And I really hope to make that reasonably smooth – ideally you’ll just have to change your base class types for your business objects when you switch the UI for the object from Windows Forms to XAML – though I suspect other minor tweaks may be necessary as well in some edge cases.

But let’s face it, at some point CSLA .NET does have to drop legacy technologies. I’m just one guy, and even with Magenic being such a great patron it isn’t realistic to support every technology ever invented for .NET :)  I don’t think the time to drop Windows Forms is in 4.0, because there are way too many people who need to migrate to WPF over the next 2-3 years.

On the other hand, if you and your organization aren’t developing a strategy to move off Windows Forms in the next few years I suspect you’ll eventually wake up one day and realize you are in a bad spot. One of those spots where you can’t hire anyone because no one else has done your technology for years, and nobody really remembers how it works (or at least won’t admit they do unless you offer them huge sums of money).

I don’t see this as bad. People who want stability shouldn’t be in computing. They should be in something like accounts receivable or accounts payable – parts of business that haven’t changed substantially for decades, or perhaps centuries.

Wednesday, November 04, 2009 9:20:16 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Tuesday, February 03, 2009

I recently installed the latest version of IE8, bowing to pressure from some of my Microsoft colleagues :)

I don’t regret the decision! IE8 is easily as fast as FF3, and has a lot of really nice, often subtle, features that make it far more useful than IE7.

I was rather excited when I first went to my home page (www.lhotka.net), because it rendered correctly. This page renders nicely in FF2, FF3 and now IE8. But it never quite worked right in IE7. I was never able to justify the (probably many) hours it would take to troubleshoot the css. Just getting my site working as well as it does was enough to turn my antipathy toward web UI work into tangible dislike.

(I know some people enjoy web UI work. I’ve only found it to be one of the most frustrating experiences in all the years I’ve been programming… Seriously – could we have invented anything more arcane??)

Anyway, my home page renders nicely in IE8, so I’m excited.

Until I try to navigate using one of the drop-down menus. All that appears is a white box! How can IE8 not render an ASP.NET Menu control?!?!?!?!?!

Well, it turns out that this is a known issue that the IE8 and ASP.NET teams are working on. It also turns out that it is apparently a z-order issue with the Menu control and the way it renders, so it can be argued that IE8 is actually doing the “right thing” (even though other browsers, including FF3, render the way I’d expect).

In talking to Joshua, Brad and Giorgio at Microsoft a workaround came to light, and Giorgio blogged the ASP.NET Menu control workaround.

I just had time to try this on my site. I’ll save you the suspense and say that it now does render correctly :)

In my case, I’m using themes and skins, and so I am using a css style to fix the z-order.

To do this, I added the workaround to my site-wide css style sheet:

.IE8Fix
{
    z-index: 1000;
}

And then I edited my skin file to apply this style to all my Menu controls:

<asp:Menu runat="server" BackColor="#83B8E4"
...
  <DynamicMenuStyle BackColor="#B5C7DE" CssClass="IE8Fix" />
...
</asp:Menu>

This was easy for me, because I already had the skin set up to apply numerous other properties and styles to the elements of the Menu control. I simply added the CssClass property to the existing DynamicMenuStyle element.

I published the project to my web server and just like that my Menu control is displaying correctly in IE8.

Web
Tuesday, February 03, 2009 9:54:59 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Sunday, March 09, 2008

I'm just back from the MIX 08 conference. This was the first conference I've attended in many years (around 10 I think) where I wasn't speaking or somehow working. I'd forgotten just how fun and inspiring it can be to simply attend sessions and network with people in the open areas. No wonder people come to conferences!! :)

Not that it was all fun and games. I did have meetings with some key Microsoft people and Scott Hanselman interviewed me for an upcoming episode of Hanselminutes (discussing the various data access and ORM technologies and how they relate to CSLA .NET 3.5).

The Day 1 keynote was everything I'd hoped for.

Well, nearly. The first part of the keynote was Ray Ozzie trying to convey how Microsoft and the web got to where it is now. The goal was to show the vision they are pursuing now and into the future, but I thought the whole segment was rather flat.

But then Scott Guthrie came on stage and that was everything you could hope for. Scott is a great guy, and his dedication and openness seem unparalleled within Microsoft. I remember first meeting him when ASP.NET was being unveiled. At that time he seemed so young and enthusiastic, and he was basically just this kick-ass dev who'd created the core of something that ultimately changed the Microsoft world. Today he seems nearly as young and easily as enthusiastic, and he's overseeing most of the cool technologies that continue to change the Microsoft world. Awesome!

So ScottGu gets on stage and orchestrates a keynote that really illustrates the future of the web. Silverlight (which makes me SOOoooo happy!), IE8, new data access technologies (like we needed more, but they are still cool!) and things like ASP.NET MVC and more.

As I expected, they released a whole lot of beta code. You can get a full list with links from Tim Sneath's blog. He also has links to some getting started materials.

The real reason for keynotes though, is to inspire. And this keynote didn't disappoint. The demos of Silverlight and related technologies were awesome! There was some funny and cute banter with the casting director from Cirque du Soleil as she demonstrated using a cool disconnected WPF app. There was a fellow RD, Scott Stanfield, showing integration of SeaDragon into Silverlight so we can look (in exquisite detail) at the memorabilia owned by the Hard Rock Cafe company, some thought-provoking demos of Silverlight on mobile devices and more.

Now to be honest, I've never been a fan of the web development model. Having done terminal-based programming for many years before coming to Windows, I find it hard to get excited about returning to that ancient programming model. Well, a worse one actually, because at least the mainframe/minicomputer world had decent state management...

AJAX helps, but the browser makes for a pretty lame programming platform. It is more comparable perhaps to an Apple II or a Commodore 64 than to a modern environment, and that's before you get into the inconsistencies across browsers and that whole mess. Yuck!

Which is why Silverlight is so darn cool! Silverlight 2.0 is really a way to do smart client development with a true web deployment model. Much of the power of .NET and WPF/XAML, with the transparent deployment and cross-platform capabilities of the browser world. THIS is impressive stuff. To me Silverlight represents the real future of the web.

It should come as no surprise then, that I spent my time in Silverlight 2.0 sessions after the keynote. Sure, I've been working (on and off) with Silverlight 1.1/2.0 for the past several months, but it was a lot of fun to see presentations by great speakers like Joe Stegman (a Microsoft PM) and various other people.

One of the best sessions was on game development with Silverlight. I dabble in game development whenever I have spare time (not nearly as much as I'd like), and so the talk was interesting from that perspective. But many of the concepts and techniques they used in their games are things designers and developers will likely use in many other types of application: background loading of assemblies and content while the app is running, and some clever animation techniques using pure XAML-based concepts (as opposed to some other animation techniques I saw that use custom controls written in C#/VB - which isn't bad, but it was fun to see the pure-XAML approaches).

Many people have asked about "CSLA Light", my planned version of CSLA .NET for Silverlight. Now that we have a Beta 1 of Silverlight I'll be working on a public release of CSLA Light, based on CSLA .NET 3.5. Microsoft has put a lot more functionality into Silverlight 2.0 than they'd originally planned - things like data binding, reflection and other key concepts are directly supported. This means that the majority of CSLA can be ported (with some work) into Silverlight. The data portal is the one big sticking point, and I'm sure that'll be the topic of future blog posts.

My goal is to support the CSLA .NET 3.5 syntax for property declaration and other coding constructs such that with little or no change you can take a business class from full CSLA and have it work in CSLA Light. This goal excludes the DataPortal_XYZ implementations - those will almost always be different, though if you plan ahead and use a DTO-based data access model even that code may be the same. Of course time will tell how closely I'll meet this goal - but given my work with pre-beta Silverlight 2.0 code I think it is pretty realistic.
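
As a rough illustration of the property declaration syntax in question (the exact RegisterProperty overloads vary between CSLA versions, so treat this as a sketch rather than the definitive form):

using System;
using Csla;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    // Managed property declaration; the goal is that this same code compiles
    // for both full CSLA .NET and CSLA Light (registration overloads vary by
    // CSLA version, so this is illustrative only).
    private static readonly PropertyInfo<string> NameProperty =
        RegisterProperty<string>(new PropertyInfo<string>("Name"));

    public string Name
    {
        get { return GetProperty(NameProperty); }
        set { SetProperty(NameProperty, value); }
    }
}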

Scott Guthrie indicated that Silverlight 2.0 Beta 1 has a non-commercial go-live license - right now. And that Beta 2 would be in Q2 (I'm guessing June) and would have a commercial go-live license, meaning it can be used for real work in any context.

The future of the web is Silverlight, and Beta 1 is the start of that future. 2008 is going to be a great year!

Sunday, March 09, 2008 5:18:27 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Thursday, January 24, 2008

A reader recently sent me an email asking why the PTWebService project in the CSLA .NET ProjectTracker reference app has support for using the data portal to talk to an application server. His understanding was that web services were end points, and that they should just talk to the database directly. Here's my answer:

A web service is just like any regular web application. The only meaningful difference is that it exposes XML instead of HTML.

 

Web applications often need to talk to application servers. While you are correct – it is ideal to build web apps in a 2-tier model, there are valid reasons (mostly around security) why organizations choose to build them using a 3-tier model.

 

The most common scenario (probably used by 40% of all organizations) is to put a second firewall between the web server and any internal servers. The web server is then never allowed to talk to the database directly (for security reasons), and instead is required to talk through that second firewall to an app server, which talks to the database.

 

The data portal in CSLA .NET helps address this issue, by allowing a web application to talk to an app server using several possible technologies. Since different organizations allow different technologies to penetrate that second firewall this flexibility is important.

 

Additionally, the data portal exists for other scenarios, like Windows Forms or WPF applications. It would be impractical for me to create different versions of CSLA for different types of application interface. In fact that would defeat the whole point of the framework – which is to provide a consistent and unified environment for building business layers that support all valid interface types (Windows, Web, WPF, Workflow, web service, WCF service, console, etc).

Thursday, January 24, 2008 8:01:56 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Tuesday, November 13, 2007

As long as I'm doing simple blog links, here's another tool worth looking at, if you need to skin a SharePoint or other web site: 

SharePoint Skinner Overview - eLumenotion Blog

Web
Tuesday, November 13, 2007 10:53:57 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Tuesday, January 17, 2006

In a recent entry there was a comment pointing to this article, which discusses the way ASP.NET uses threads – a fact that has serious ramifications on software design.

 

It made me do some serious thinking/research, because CSLA .NET uses various elements of the Thread object, including the CurrentPrincipal, thread-local storage and (in 2.0) the culture properties. My fear was that if the thread can switch “randomly” during page processing, then how can you count on any of these elements from the Thread object?

 

Scott Guthrie from Microsoft clarified this for me on a couple key points.

 

First, ASP.NET doesn’t change your thread at “random” or “without warning” (phrases I’d used in describing my worries). It changes it only in a clearly defined scenario. Specifically, when your thread performs an async IO operation. Scott points out that this is perhaps most common in a page that takes advantage of the new async page capability, or within a HttpModule that uses any async IO.

 

In other words, normal web pages really aren’t subject to this issue at all. It only occurs in relatively advanced scenarios. And you, as the developer, should know darn well when you invoke an async operation; thus you know when you open yourself up for thread switching.

 

Still, I wanted to make sure the Business Objects book didn’t go down a bad path with the Thread object. But Scott clarified further.

 

ASP.NET ensures that, even if they switch your thread, the CurrentPrincipal and culture properties from the original thread carry forward to the new thread. This is automatic, and you don’t need to worry about losing those values. Whew!

 

However, thread-local storage doesn’t carry forward. If you use that feature of the Thread object in ASP.NET code you could be in trouble. This did cause me to rework some code in the book and in CSLA .NET. Specifically CSLA .NET 2.0 now detects whether it is running in ASP.NET or not and uses either thread-local storage or HttpContext.Current.Items to maintain its context data.

 

I also went the rest of the way and created a Csla.ApplicationContext.User property that gets and sets the user’s principal on either the Thread or HttpContext based on whether the code is running in ASP.NET or not. This allows you to write code in your UI and objects and other libraries without worrying about whether it might run inside or outside ASP.NET at some point.
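
A simplified sketch of the idea (not the actual CSLA .NET source): check for an active HttpContext and use it when present, otherwise fall back to the current thread.

using System.Security.Principal;
using System.Threading;
using System.Web;

public static class ApplicationContextSketch
{
    public static IPrincipal User
    {
        get
        {
            // Under ASP.NET the HttpContext is the reliable per-request store;
            // outside ASP.NET the thread's principal is used instead.
            if (HttpContext.Current != null)
                return HttpContext.Current.User;
            return Thread.CurrentPrincipal;
        }
        set
        {
            if (HttpContext.Current != null)
                HttpContext.Current.User = value;
            Thread.CurrentPrincipal = value;
        }
    }
}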

 

Certainly for mobile business objects this is very important! They can (and often do) run in both a smart client and ASP.NET environment within the same application.

Tuesday, January 17, 2006 2:48:29 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Tuesday, November 15, 2005

The ADO Guy expresses his frustration that the ObjectDataSource doesn't work with typed DataSets in ADO.NET 2.0. In an earlier post I expressed my frustration about using it with business objects.

In the case of CSLA .NET 2.0 I'm writing my own CslaDataSource that understands how to work nicely with CSLA .NET business objects. In conferring with the ASP.NET team it seems like this is the best option. It turns out that writing a DataSource control isn't overly difficult - it is writing the designer support (for VS 2005) that is the hard part...

Tuesday, November 15, 2005 9:55:10 AM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 
 Thursday, October 13, 2005

ASP.NET 2.0 has bi-directional data binding, which is a big step forward. This means you can bind the UI to a data source (DataTable, object, etc.) and not only display the data, but allow the user to do “in-place” editing that updates the data source when the page is posted back.

 

The end result is very cool, since it radically reduces the amount of code required for many common data-oriented web pages.

 

Unfortunately the data binding implementation isn’t very flexible when it comes to objects. The base assumption is that “objects” aren’t intelligent. In fact, the assumption is that you are binding to “data objects” – commonly known as Data Transfer Objects or DTOs. They have properties, but no logic.

 

In my case I’m working with CSLA .NET 2.0 objects – and they do have logic. Lots of it: validation, authorization and so forth. They are “real” objects in that they are behavior-based, not data-based. And this causes a bit of an issue.

 

There are two key constraints that are problematic.

 

First, ASP.NET insists on creating instances of your objects at will – using a default constructor. CSLA .NET follows a factory method model where objects are always created through a factory method, providing for more control, abstraction and flexibility in object design.

 

Second, when updating data (the user has edited existing data and clicked Update), ASP.NET creates an instance of your object, sets its properties and calls an Update method. CSLA .NET objects are aware of whether they are new or old – whether they contain a primary key value that matches a value in the database or not. This means that the object knows whether to do an INSERT or UPDATE automatically. But when a CSLA .NET object is created out of thin air it obviously thinks it is new – yet in the ASP.NET case the object is actually old, but has no way of knowing that.

 

The easiest way to overcome the first problem is to make business objects have a public default constructor. Then they play nicely with ASP.NET data binding. The drawback to this is that anyone can then bypass the factory methods and incorrectly create the objects with the New keyword. That is very sad, since it means your object design can’t count on being created the correct way, and a developer consuming the object might use the New keyword rather than the factory method and really mess things up. Yet at present this is how I’m solving issue number one.
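
Here is a sketch of that compromise, using a hypothetical Project class: the public default constructor exists only to satisfy ASP.NET data binding, while the factory method remains the intended creation path.

using System;
using Csla;

[Serializable]
public class Project : BusinessBase<Project>
{
    // Required so ASP.NET 2.0 data binding can instantiate the object;
    // callers should use the factory method below instead.
    public Project()
    { }

    // Intended creation path: the factory keeps control over how instances
    // are created and initialized.
    public static Project NewProject()
    {
        return DataPortal.Create<Project>();
    }
}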

 

I am currently solving issue number two through an overloaded Save method in BusinessBase: Save(forceUpdate) where forceUpdate is a Boolean. Set it to True and the business object forces its IsNew flag to False, thus ensuring that an update operation occurs as desired. This solution works wonderfully for the web scenario, but again opens up the door to abuse in other settings. Yet again a consumer of the object could introduce bugs into their app by calling this overload when it isn’t appropriate.
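
Roughly, such an overload behaves like this (a sketch of the behavior described above, not the actual framework source; MarkOld and MarkDirty are the protected state helpers that flip the object from “new” to “old” and flag it for saving):

// Inside a BusinessBase<T>-style class:
public T Save(bool forceUpdate)
{
    if (forceUpdate && IsNew)
    {
        // The object was created out of thin air by ASP.NET data binding, but
        // its data already exists in the database, so flip it to "old" and
        // dirty it; the normal Save() then performs an UPDATE, not an INSERT.
        MarkOld();
        MarkDirty();
    }
    return Save();
}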

 

The only way out of this mess that I can see is to create my own ASP.NET data source control that understands how CSLA .NET objects work. I haven’t really researched that yet, so I don’t know how complex it is to write such a control. I did try to subclass the current ObjectDataSource control, but it doesn’t provide the extensibility points I need, so that doesn’t work. The only answer is probably to subclass the base data source control class itself…

Thursday, October 13, 2005 6:17:12 PM (Central Standard Time, UTC-06:00)  #    Disclaimer  |  Comments [0]  | 