Rockford Lhotka

 Thursday, August 10, 2006
Yesterday I recorded two more DNR TV shows on CSLA .NET with Carl Franklin, so watch for those to go online in the next few weeks. Carl does a nice job of editing the recordings and cleaning up the audio, so it takes some time between recording and "airing", but it is well worth it!

Thursday, August 10, 2006 1:57:00 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Saturday, August 5, 2006

People often ask me how business is going, or how Magenic is doing. I answer “good”. But that’s not really true.


It turns out that it is better than good. I’ve been in consulting, one way or another, for nearly 12 years now, and there are a couple of times each year when things always slow down: late summer and the end-of-year holidays.


Well, this summer, for the first time in my recollection, things have gotten busier, not slower!


So I’m posting this blog entry in an attempt to address Magenic’s primary issue right now: finding good .NET development resources. Yup, we just can’t find enough people with good .NET experience. This includes regular Web and Windows .NET development, middle-tier work and SQL Server. It also includes BizTalk Server and SharePoint Server. Any peripheral technologies like content management are always welcome too.


I’ve been with Magenic for 6 years as of this month, and I’ve never regretted my choice to work for the company. Greg and Paul (the owners) have steered the company through the dot-com bubble and its subsequent crash with admirable skill. And while no one could say 2002-4 were fun, I can honestly say that I felt more secure at Magenic than my friends at other companies were feeling at the time.


Sure, consulting is consulting. If you want to come work for Magenic you need to realize right up front the realities of being a consultant. But given that, I find it hard to imagine a better consulting company to work for, especially if your focus is around Microsoft and .NET.


One of Magenic’s core tenets is to try to find “cool work”. Obviously that’s not always possible, because each individual defines “cool” differently, and there are business realities around consulting that simply can’t be ignored. But just the fact that this is one of the company’s core tenets says a lot!


So here’s the deal. If you live (or would like to live) in the Boston, Atlanta, Chicago, Minneapolis or San Francisco areas, and if you’ve got .NET skills, and if you want to escape the politics of IT in exchange for the life of a consultant, then this is an excellent time to see if Magenic is the right place for you.


If you are interested, just send me an email and I’ll forward it on to one of our recruiters (kind of a fast-track offer :) ). Or you can contact recruiting directly through the web site.

Saturday, August 5, 2006 2:17:13 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Wednesday, August 2, 2006

In a previous post I made a reference to the EU anti-trust case. Stefan asked me to clarify my statement:


Could you please explain more detailed what you mean by "will likely do serious harm to European consumers"? As I know the European antitrust suit against Microsoft is to strengthen consumer rights. The charge against Microsoft is that they use their monopoly in OS and undocumented functions in their OS to push their own products. Why should it do harm to European consumers if Microsoft has to offer an OS without Internet Explorer and without Mediaplayer, or to open undocumented functions?


Stefan, I realize that the purported purpose of the anti-trust action is to strengthen consumer rights. Yet for the most part, the real result is to protect the "rights" of Microsoft's competitors, not consumers. And perhaps you can't do one without the other - that could be the case.


But consider this. In the US anti-trust findings, Judge Jackson was very clear that computer operating systems are a "natural monopoly". That market forces exist that have, and will, create a monopoly in this space. You can read the findings yourself, just search for the term “monopoly” and you’ll get close to the relevant section.


The reason for the natural monopoly is sound: it is very expensive for organizations to support varied platforms. Thus they standardize on as few installation sets as possible - preferably one client and one server configuration. So an organization will standardize on Windows XP, or on Windows XP without IE or Media Player, but not both.


But then the effect spreads, and this is where consumers get hurt. ISVs build software for the broadest market segment, because that is the most cost-effective approach. So they look at the market, see that the majority of systems are real Windows XP, and build for that. Only if there's enough outcry from users of XPN (the EU-mandated edition of Windows XP without Media Player) will they port their software to run on it. The XPN user base needs to be able to fund the cost of porting, plus the normal profit the ISV would have made.


This causes a feedback loop. Users of XPN always feel like second-class citizens, because they always get software later than real XP users. So when they purchase their next OS, they are more likely to buy real XP rather than the version that gets less ISV support.


People don't buy computers for the OS after all - they buy computers for the apps they want to run, and those apps come from ISVs.


It is this same effect that causes comparatively slow adoption of the Mac or of Linux as a desktop. Compared to Windows, there’s a dearth of real applications for these other platforms, and the reason is economic.


So “getting rid of” the monopoly isn’t a viable goal. The only realistic approach is to manage the monopoly, because destroying it would merely result in some other company rising to take its place – purely due to these economic factors.


With that background, I’ll explain why I think the EU’s actions are likely to hurt consumers – just like some of the proposed US actions from 6 years ago would almost certainly have hurt consumers.


Restricting the inclusion of the IE and Media Player user interfaces is, I think, immaterial. But the EU is restricting Microsoft from including the underlying components on which they are built. These underlying components are also available to ISVs, and most of us consider them simply part of the Windows operating system.


So Windows XP N, and future crippled versions of Windows mandated by the EU, merely cause an artificial split in the operating system. One which any sane organization would ignore, because it is very clearly out of the mainstream (see the natural monopoly discussion above).


Of course the reality is that only a very few applications rely on the API exposed by Media Player’s libraries, so that isn’t such a big deal today. More applications rely on IE libraries and that can be more problematic.


The bigger issue is the precedent, because further bifurcation of the OS over time will merely make Windows Vista N (or whatever) more marginal.


But still, where’s the harm to consumers?


  1. There’s harm to the uneducated people who are naïve enough to buy the government-mandated, crippled versions of the OS. They’ll find themselves unable to run certain apps and won’t know why.
  2. There’s harm to all of us, because it costs Microsoft money to build these OS variants – and to test them, and to test all their other products against them (like Money, Visual Studio, SQL Server Express, and on and on – they all must be tested against this crippled version of the OS along with the real OS). Someone must pay for this development and testing.

    Microsoft has been criticized for selling XPN at the same price as XP. Really they should charge more for it, because it is merely a cost to Microsoft. The EU prevents that (as they need to), so it is sold at the same price. If Microsoft were to sell XPN at a lower price, they’d need to raise the price of regular XP to subsidize XPN in order to maintain the same profit margin they make today.

    Software is fundamentally different from physical goods like a car – having fewer options is not cheaper, it is more expensive…
  3. There’s harm to consumers of 3rd party apps that use any features missing from the crippled versions of the OS. They must spend extra time and money compensating for the lack of those features – either through installer UI support to get the user to reinstall the missing code, or by rewriting their app to use a non-OS equivalent API that they can install themselves. This extra cost to the ISV ultimately gets passed along to consumers as a higher price.

    Of course this presupposes that the XPN market is big enough for the ISV to actually care. If not, we’re back to the economic factors causing the natural monopoly in the first place. And even if the XPN market is big enough, XPN users will likely get their software later, and with less convenience during install and so they’ll be less likely to buy XPN in the future – again reinforcing the natural monopoly.


As a totally separate issue, the EU is also following the US courts in mandating that Microsoft release better documentation of its APIs. I don’t necessarily believe that this harms consumers (with one exception, so read on).


I also am not entirely convinced that forcing the release of documentation helps consumers either – at least not nearly as much as it helps Microsoft’s competitors. Only time will tell on this point.


Since browsers and media players are free, there can be no help for consumers in these spaces; at least from a cost-of-acquisition perspective. It may be that some competitor will create software that is more capable, easier to use, cheaper to maintain/support or something like that – in which case I’ll happily acknowledge a benefit to consumers.


But to date, it is very hard to find third party software that integrates as well as Microsoft’s set of software. Just try to copy-paste from Word into a rich text editor in Firefox as compared to IE to see what I mean… Both work, but IE tends to work substantially better.


Of course you can’t blame third parties for limiting their coupling to Microsoft’s APIs. And this is where things (to me) get puzzling. I work very hard to limit my exposure to Microsoft technologies, and I try to only use their APIs at the surface level. Why? Because Microsoft owns those APIs and changes them over time – and I don’t want to absorb the cost of compensating every time they change something.


This is why CSLA .NET, for instance, abstracts things like Enterprise Services, System.Transactions, Remoting and Web Services. These are all fluid APIs, and tightly coupling your business application to them is costly. CSLA .NET offers a buffer between those APIs and the business code we all create and maintain.
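To illustrate the idea (this is a simplified sketch of the buffering pattern, not actual CSLA .NET code — the class name here is my invention), business logic can call a small wrapper instead of referencing System.Transactions directly:

```csharp
// Simplified sketch (not actual CSLA .NET code) of buffering business
// logic from a fluid Microsoft API. Business code calls Execute() and
// never references System.Transactions itself.
using System;
using System.Transactions;

public static class TransactionalGateway
{
    // If the underlying transaction API changes in a future .NET
    // release, only this one method needs updating - not every
    // business object in the application.
    public static void Execute(Action businessMethod)
    {
        using (TransactionScope scope = new TransactionScope())
        {
            businessMethod();
            scope.Complete(); // commit; an exception above rolls back instead
        }
    }
}
```

A business object would then call something like `TransactionalGateway.Execute(SaveCustomer)` rather than creating its own `TransactionScope`, keeping the coupling to the fluid API in exactly one place.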


This, I think, is also why Firefox has limited copy-paste capabilities when compared to IE. It is quite likely that they’d need to take a deeper dependency on a Windows API to have comparable functionality, and that would make them vulnerable to Microsoft changing things over time.


And so here we arrive at the one possible cost to consumers of Microsoft providing more open documentation of its deeper APIs. Documentation doesn’t equate to stability. Just because they are documented doesn’t mean they stop changing. Any software that does accept a dependency on these APIs is even more subject to change than normal software – and we all know the pain of API changes to normal software.


So yes, third party apps may become more integrated than before – but they’ll almost certainly cost more, because that integration (coupling) has a high cost to the ISV creating the software, and they’ll need to pass that along to consumers.


My guess is that very few ISVs will actually use these deeper APIs. These companies are out to make a profit too, and so they’ll weigh the cost of coupling/integration against the benefit of increased sales and determine, case by case, whether they can increase sales (or prices) enough to justify the costs of tighter coupling and its attendant vulnerability to change.


Yes, this is a weaker argument – and personally I think the more open Microsoft is in documenting its APIs the better. My real point here is that it isn’t just all goodness and light. There is a serious tradeoff to actually using the APIs that can’t be ignored – and which, I think, will ultimately negate any “benefit to consumers”.


So if you’ve stuck with me this far, I’ll summarize: I think the EU went too far by forcing the creation of a crippled OS. I think both the US and EU courts are doing a fine thing by forcing Microsoft to document more of its APIs, but I doubt that will actually help consumers as much as a few ISVs and some competitors.


I haven’t even touched the OEM market, and what the courts have changed there – generally for the betterment of consumers. With luck Microsoft is really learning to live within these boundaries (they have a 12 step program after all :) ), and consumers will continue to see benefits in this area going forward.


So to close, my primary criticism of the EU is around mandating the creation and support of a crippled OS. There is no upside for consumers there, just extra cost and pain.

Wednesday, August 2, 2006 10:52:43 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Tuesday, August 1, 2006

Maintaining CSLA .NET, along with the ProjectTracker sample app, in both VB and C# is challenging. I've blogged before about how tedious it is converting C# to VB or vice versa, and how it slows down progress on the framework.

For CSLA .NET 2.0 I was fortunate enough to have the help of Brant Estes from Magenic, who put in amazing time and effort to keep the C# code in sync with the VB code through the development and writing process. There was no way I'd have completed the books and framework on schedule without his help!

But recently I've been playing with a set of tools from Tangible Software Solutions: Instant VB and Instant C#. By way of disclaimer, I got free licenses to these tools thanks to Tangible, and that's why I've been using them. But having used them, I would suggest that they are certainly worth the cost for anyone who needs to convert code one way or the other.

I have been very skeptical of language conversion tools. By and large the results don't look "right". Convert VB to C# and you get working code, but not the kind of code you'd write by hand. It is even worse when converting C# to VB - the results look like C# without the semicolons, and hardly resemble real VB code at all...

Now I'm not ready to say that Instant VB/C# are perfect, but they are pretty darn good. Better still, I have been providing feedback to Tangible about what I wish was different, and the speed at which they have responded by improving the tools is excellent! Seriously, I think they've given me 2-3 new builds just in the past couple of weeks - each one making my life easier.

I'm using the tools in per-file mode, translating individual CSLA .NET source files from C# to VB and from VB to C#. The VB to C# conversions are quite good, and there are only a couple of things I'm having to do to the resulting code (clean up the using/Imports statements, and rename instance variables, because I use an 'm' prefix in VB and a '_' prefix in C#).

The C# to VB isn't quite as smooth yet, but it is rapidly getting better. Tangible is working on an enhancement to translate several common string and type conversion patterns into VB equivalents (Len(), CInt(), and so on). I believe that'll be an option, so if you prefer the C# approach even in your VB code you don't need to worry. Once that is working, I expect to be pretty happy. Even so, I'll need to change instance variables because of my choice of different prefixes - but that's really my issue :)
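To give a sense of what that means (illustrative snippets of my own, not Tangible's actual output), here are the kinds of idiom mismatches involved. A literal conversion leaves the C# forms in the VB code, which is legal VB but doesn't read like VB:

```csharp
// Illustrative only - not actual converter output. Each line shows a
// C# idiom; the comment gives the equivalent idiomatic VB call that a
// good converter should emit instead of a literal translation.
using System;

class ConversionIdioms
{
    static void Main()
    {
        string name = "CSLA";

        int length = name.Length;       // idiomatic VB: Len(name)
        int value = int.Parse("42");    // idiomatic VB: CInt("42")
        string upper = name.ToUpper();  // idiomatic VB: UCase(name)

        Console.WriteLine($"{length} {value} {upper}");
    }
}
```

A naive converter would emit `name.Length` into the VB file - working code, but it looks like translated C#, which is exactly the "doesn't look right" problem described above.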

The thing is, the time savings for me is tremendous. But what makes me write this blog entry is the fact that the resulting code really does look like the kind of code you'd write by hand - and for me that's the true test.

Tuesday, August 1, 2006 1:43:49 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Monday, July 31, 2006

I submit that this is a good move by Microsoft: making the MSDN Library available for free download.

Some may argue that this devalues the MSDN subscription - but frankly that's silly. The vast majority of the Library is available online anyway; all Microsoft has done here is provide a more convenient way to access the data. It isn't like they decided to give away the software for free! Personally I haven't installed the Library on my machine for well over a year, because I find the web access more convenient.

Dollar per bit, an MSDN subscription is an unbeatable deal for a developer. The ability to get almost every OS, server and development tool for development purposes, at just over the cost of Visual Studio alone, is really quite amazing when you think about it.

Other people will likely argue that this is in response to government actions (the EU in particular). If so, then so be it. I think the EU is out of control and will likely do serious harm to European consumers, and maybe to Microsoft. But the upside for me is that I work for a consulting company, and the more variations on the OS there are, the more time it takes us to build even simple software. Since we charge by the hour, it merely means that software for use in the EU will make us more money than software for use in the US or elsewhere. So perhaps I should be rooting for the EU, because in some perverse way they're likely to make me more money?

Regardless, even if Microsoft is releasing the Library free to help mitigate some "openness" issues in the EU, that is only good news for developers who (for some reason) find it hard to get the content over the Internet.

My view is this: I've worked with IBM software, and the lack of an MSDN-equivalent is devastating to productivity. And I've worked with (and continue to work with) open source software, where the lack of decent documentation and organized support materials is infamous. The investment Microsoft has always made around supporting developer productivity through documentation and MSDN is one of its key success factors - at least in the development world. To me, this is just another small step in Microsoft's continuing support for developers on their platform.

Monday, July 31, 2006 7:31:25 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Wednesday, July 26, 2006

Some exciting news! Dunn Training is building a formal training class around CSLA .NET, with plans for the class to be ready in September. I often get requests for CSLA .NET training, and now there'll be a great answer.

Of course Magenic remains the premier source for consulting and mentoring around CSLA .NET. Training is important, but you can't overestimate the value of longer term mentoring!

Given the combination of my books, a formal CSLA .NET class and longer term mentoring and consulting from Magenic, a full array of CSLA .NET resources is coming into being.

And while I'm plugging Dunn Training, I should mention that they have an excellent BizTalk Server 2006 class - just tell them that Rocky sent you :)

Update: Here is a link to the information page on the CSLA .NET class.

Wednesday, July 26, 2006 11:56:44 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Friday, July 21, 2006

Paul Sheriff has extended a discount for Paul Sheriff's Inner Circle to people who read my blog:

Anyone who signs up between now and Sept. 1, 2006 will receive a discount. Have them enter the PROMO code: ROCKY01. If they purchase a Yearly membership, they will receive 1 additional month for free.

Friday, July 21, 2006 8:55:03 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Wednesday, July 19, 2006
Wednesday, July 19, 2006 10:29:40 PM (Central Standard Time, UTC-06:00)  #    Disclaimer

Yesterday I posted about Paul Sheriff’s new subscription-based online venture. It is an experiment on Paul’s part, and it is something he’s put a huge amount of time and effort into building.


Interestingly, there’s been a bit of pushback – at least in the comments on my blog – to Paul charging for his site. Of course this is an experiment, and so only time will tell if Paul’s investment in time and money putting it together, and his ongoing investment in building content will actually pay off.


But I hope it does work, and this is why.


It has been clear for a while now that the world is undergoing some major changes. While the Internet didn't transform the world like all the dot-com nuts thought it would, it really is having a non-trivial (if ponderous) impact as time goes by.


(For a thought-provoking view of a possible future, check out Epic 2014.)


A few of us, Paul and myself included, are trying to figure out how to adapt to this new world. With book sales radically down, magazine subscriptions failing and technical conferences struggling, it is becoming less and less practical for a professional author/speaker to make a living.


Now it might be the case that free content will have the same quality as professionally created, reviewed and edited content. But I doubt it. Some people can generate quality content without reviewers and editors, but most can’t. And in any case there’s no substitute for experience. As with anything, experience has tremendous value. If you look at any professional author’s work you’ll see a progression as they get better and better at explaining their ideas over time.


Not that there isn't some great free content out there, but wading through all the random content to find it is very expensive. There’s no doubt that some people invest their time and effort in improving their writing skills for free, but over time it is hard to commit to that level of focus without some level of compensation.


I specifically avoided saying that some people do this as a hobby. Because I think that is very rare. People write to get compensation. In many cases it is financial – either directly, because they get paid to write, or indirectly, because they expect to get a raise, or to more easily job-hop into a raise.


Coming back to that sifting through the web thing though… Time isn't free. In fact I'm of the opinion that time is far more valuable than money for most of the people in our profession. Wasting hours sifting through random outdated, or just plain poor, content to find that one gem on someone's blog is really costly.


For some people it is worth that time, for others it is not. There's no way to pass a global value judgment on this, because different people have different jobs and priorities. If I can spend a couple hours writing code, I'm much happier than if I spent a couple hours reading random web content. Other people love reading and sifting through random web content and don't begrudge that time in the slightest.


One thing that I always keep in mind though, is that we (in the US and Europe anyway) cost 4-7 times more than people in India or China. That means we need to be 4-7 times more productive to justify our existence. So that time spent sifting through the web needs to result in some pretty impressive productivity or it was just a very high cost.


I sift through the web at least as much as the next guy, don’t get me wrong. But not really by choice. If some web-sifter out there started a subscription-based index into content that is actually up to date and valid I’d pay for it. Google is great, but just think if there was a Google that only searched meaningful content!?! I don’t care about the vast majority of what people put on the web, there are just a few gems I’m looking for.


Unfortunately, thus far the idea of a paid index for content hasn’t proven to be a viable business model. And the web is undermining traditional forms of providing content. So the world is changing.


But I don’t believe for a minute that the value of professional content is lower than in the past, I just think the delivery of that content is in flux.


So the question then, is how to deliver professional content in this new world? And in a way where the producers, reviewers and editors of the content are compensated for their effort. Time isn’t free, not for you as the reader, nor for those of us engaged in professionally producing that content.


We’ll all find out whether Paul’s experiment works or not over time. But he’s not alone in looking for ways to adapt to this new world, and you can expect to see some experiments from other people as well – including me – in the relatively near future.

Wednesday, July 19, 2006 5:31:00 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Tuesday, July 18, 2006

My good friend Paul Sheriff is trying something new - a subscription-based web site where you can tap into his expertise on various topics of tremendous interest to developers. The site is Paul Sheriff's Inner Circle, and it is worth taking a look.

Tuesday, July 18, 2006 7:04:57 PM (Central Standard Time, UTC-06:00)  #    Disclaimer