Rockford Lhotka

 Wednesday, January 19, 2005

When I first got involved with Microsoft it was around 1990 or ’91. The world of the time was dominated by IBM, with DEC a close second. If you wanted to network PCs you used Novell or Banyan. All PC programs were DOS programs. “Windows” was just one of many graphical libraries being used by software to handle drawing on the screen. And if you wanted to do real work you used a mainframe or minicomputer.


At the time, Microsoft was an underdog.


Personally I wouldn’t have anything to do with them. Novell was the obvious PC networking choice, and PCs in general were so incredibly limited compared to my beautiful VAX computers that I couldn’t understand why anyone would bother to use them. If you were going to use a PC-like computer, the Amiga was the way to go. It was far closer to a real computer than the PC!


Of course the open source community of the time hated the VAX, Novell and anything commercial. At the time the open source world was tied into Unix, even though it wasn’t open source either. But they hated us VAX people who could use the “search” command to search rather than using “grep”. (if you don’t believe me, use google or the wayback machine to look at the newsgroup flame wars of the time)


In any case, the manufacturing company I worked for kept bringing in more and more PCs to do more and more things. And we needed a way to make the PC software accessible to our end users. On the VAX I had a very nice bit of menu software that did personalization, authorization and so forth. We investigated the lame-ass equivalents that existed for DOS, but the PC world had no concept of centralized management and so we adopted none of them.


Finally, out of a busy field of competitors, Microsoft’s Windows product showed up. It was pretty lame too, but at least we were able to use it as a meaningful launch platform for software. This allowed our end users to actually find and use the software we had installed on our Novell server. It also forced us to hire another helpdesk/PC support person… Windows was not cheap, but it was obvious that the world was moving in that direction and so we did too.


I should point out that none of the Windows competitors were cheap either. All of them used the highly cantankerous PC as a platform and anything we chose would have required that we hire at least one more PC support person… Compared to our VAXen (with their 99.995% uptime) the PC was just plain terrible. This was true of Windows, our Novell server – all of it.


Personally, I tried to keep at arm’s length from the whole debacle. My boss, however, felt that my career would be enhanced if I got involved, so I ended up as the Novell administrator. That included making Windows work with Novell, and eventually with the VAX (because we installed software on the VAX to make it look like a Novell server to the PCs).


The only part of this that was fun was the VAX-as-Novell-server bit. Using that technology we were able to use gawk (open source awk, a text processing tool) to transform our massive manufacturing reports into tab-delimited “spreadsheet” files that could be used in Lotus 123, Quattro Pro or that lame excuse for a spreadsheet, Excel.


Even with Windows, Microsoft was still the underdog. Heck, this was the time that OS/2 was supposed to take over the world. Unfortunately IBM forgot that people (end users) don’t give a rip about operating systems. They only care about the programs they use. OS/2 didn’t have any software (compared to DOS, and thus Windows – which was still just a graphical shell on DOS). This was the problem with the Amiga too. A superior piece of work in almost every way, but there was no software for it. Not on the level of the stuff for DOS (Windows) or the Mac.


Then in the early 1990’s Microsoft came out with Windows NT. The core of NT is the same as that of OpenVMS (the VAX operating system). That was enough for me to start looking seriously at the PC. I even bought one so I could be part of the Windows NT beta program. It sat next to my Amiga and limped along as best it could.


It is worth noting that the VAX operating system was originally named VMS. DEC changed the name to OpenVMS in response to the open source/unix community’s unceasing assaults and criticisms. Today’s attacks on Microsoft are just the latest version of a decades-old pattern of abusive behavior on the part of the open source community. Before DEC it was IBM and before that there were no computers to speak of.


I know that the open source “community” can’t be represented by any one person or one group any more than Christians can be represented by Evangelicals or Catholics. But after listening to the same tired rants from what must be clones of the same people for 20 years it does seem like there’s a cohesive voice of the open source community… At the very least there’s a consistent dogma.


I started to find out about Windows programming at this time. I had never programmed for Windows itself (or DOS), but I figured that since NT was, at its core, like OpenVMS, I could program it in comparable fashion. Not so. The most trivial program took pages of code. There was more code involved to interact with the OS than to do actual business work. So we just kept working on the VAX where we could get real work done.


Then came Visual Basic 1.0. It had a whole host of competitors, and wasn’t necessarily the best of the bunch. But it provided the most direct access to Windows programming of all the options. It allowed us to write Windows programs where most of our code was actual business code. Where we didn’t need pages of code just to draw a window or other silliness.


With the advent of Windows NT 4 Microsoft finally also provided an option to Novell. It was realistic to use NT 4 as a production server, and so we switched from Novell to NT somewhere in there. That was a powerful change, because now our server was programmable.


Yes, I know it was technically possible to program a Novell server. But you have to be kidding. That was nothing like a productive programming experience – especially for someone coming from a VAX! You need to understand that I had never had to deal with all the memory segmentation issues. The VAX, the Amiga and Windows NT all provide a flat memory model. To this day I fail to understand why anyone voluntarily chose to program on a PC in DOS or Windows…


Around this time it was clear that DEC was dying, and Microsoft was no longer an underdog. Windows NT and Visual Basic propelled Microsoft into a space where they were accepted as a primary option. For small and mid-size businesses, in fact, they were often the primary option.


It also helped that through the early 1990’s the “spreadsheet wars” between Lotus, Quattro Pro and Excel eventually settled out. As we all know, the victor was Excel. But that was not a foregone conclusion. During that whole time, Microsoft was the underdog. Quattro Pro was easiest to use and Lotus was the most powerful (and had the backing of existing users). But Lotus never got easy to use, and Quattro never got powerful. Excel got easier to use and more powerful until it was easy enough and powerful enough to displace the other two.


There are lessons to be learned here. Lotus could have made themselves easier to use and they would have won hands down. Quattro had the harder job, but they could have made themselves more powerful and maybe could have won. But the odd man out, the underdog Microsoft was the one who managed to strike the balance that won the war.


I imagine there are similar stories around Word and WordPerfect, but I never witnessed that conflict.


By the mid-1990’s the dust had settled and Microsoft was a major player in corporate and home computing, and I left the manufacturing company for consulting. Why? So I could focus all my energies on programming PCs and not be distracted by the dying VAX.


Quite a turnaround for someone who wouldn’t touch a PC with a 10 foot pole :-)

Wednesday, January 19, 2005 10:29:50 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Thursday, January 13, 2005

A while ago an article was published presenting Jeff Richter’s thoughts on the future of .NET assembly versioning.


If you read the article you’ll find that the Longhorn OS will seriously change the way that .NET itself is versioned. In fact, it turns out that to a serious degree the whole idea of installing “side-by-side” versions of .NET itself will go away when Longhorn shows up.


Oh sure, they have plans for a complex scheme by which assemblies can be categorized into different dependency levels. Some levels can be versioned more easily, while others can only be versioned with the OS itself.


What this really means is that .NET is losing its independence from the OS. In the end, we’ll only get new versions of .NET when we get new versions of the OS – and we all know how often that happens…


I’d say that this was inevitable, but frankly it was not. Java hasn’t fallen into this trap, and .NET doesn’t need to either. Not that it will be easy to avoid, but the end result of the current train of thought portrayed by Richter is devastating.


Fortunately there’s the mono project. As .NET becomes more brittle and stagnant due to its binding with Longhorn, we might find that mono on Windows becomes a very compelling story. mono will be able to innovate, change and adapt much faster than the .NET that inspired it. Better still, mono will remain unbound from the underlying OS (like .NET was originally) and thus will be able to run side-by-side in cases where .NET becomes crippled.


Hopefully Microsoft will realize what they are doing to themselves before all this comes to pass. Otherwise, I foresee a bright future for mono on Windows.

Thursday, January 13, 2005 12:59:26 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Monday, January 10, 2005

On the PeerCast website there is an FAQ in which they recommend turning off the Windows XP Firewall:

I got everything right with my broadcast, but no matter what I do, my stream can't get through. Should I get lost?
You are probably behind a firewall, if it is a personal firewall installed on your local PC, try turning it off. (Windows XP Pro for example..)

No wonder open-source is “more secure”, when they are actively running around telling people to disable the primary safety mechanism provided on Windows.

(to be fair, their blanket statement would apply to Linux firewalls too, but their example is Windows, which leads one to believe they prefer having lots of unprotected Windows machines on the Internet or something...)

Sabotage or just stupid advice?

If open-source people are as smart as they claim, they should be telling people to open only those ports required for the software, not to turn off all defenses and let anything through!!

Monday, January 10, 2005 11:50:27 AM (Central Standard Time, UTC-06:00)  #    Disclaimer

Google has released a 20 year timeline of usenet newsgroup history, highlighting notable events along the way.

I find it very interesting to see the history of the “world's largest BBS”.

Personally I got involved in usenet in 1989 or 1990. I was working for a bio-medical manufacturing company, and managed to convince a local defense contractor to allow us to tap a usenet feed off them. The feed came through our 1200 baud modem, with us dialing into their system to transfer the data. Later they also allowed us to route email through their system over our much faster 2400 baud modems. Ahh, those were the days! :-)

Why'd they do this, you ask? Because we were both DEC VAX shops, and there was a hacker-mentality brotherhood amongst VAX admins. (that's hacker in the good sense, not the dark sense) I don't think much of that mentality still exists in our industry, and that is a sad loss. I suppose it is the price of “progress”.

Prior to getting the usenet feed, I was a Citadel user. That was a Commodore 64 BBS that worked in a manner similar to usenet. It might have run on the Amiga too, but I don't recall for certain. Anyway, Citadel was a store-and-forward relay system like usenet (at the time), and so I was able to interact with people from all across North America via a local phone call. Someone in the area obviously made long-distance calls as a gateway, and my thanks go to that unknown benefactor!

To bring it back to the usenet timeline though, I think this timeline is interesting by itself. But it also should help us remember that the Web is just one thread in the much richer tapestry that is the Internet proper.

Monday, January 10, 2005 9:45:59 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Friday, January 7, 2005
  1. Less choice leads to better results.
  2. Higher level languages and frameworks restrict choice.
  3. Thus, higher level languages and frameworks should lead to better results.


The “less choice” statement flows from this article entitled Choice Cuts. Ignore the politics and focus on the research beneath the statement. That’s what is valuable in this context.


It is obvious that higher level languages and frameworks restrict choice, so I’m not going to cite a bunch of references for that. While many of these languages and frameworks provide a way to drop out to a lower level, in and of themselves they restrict choice. Just look at Visual Basic versions 1-6. Very productive, but often restrictive.


Whether higher level languages and frameworks actually do lead to better results is unknown.


Certainly there are examples where better results have been gained. This train of thought is lurking somewhere behind the software factories movement, and does seem quite compelling. Certainly many people using CSLA .NET with code generators have found radically increased productivity – and the combination of the two does restrict choice.


Yet there are countless examples (especially from the era of CASE tools) where the results were totally counter-productive. Where restriction of choice forced intricate workarounds that decreased productivity and made things worse.


Charles Simonyi contributed to an article on The Edge, essentially noting that there still is a software crisis and that low-level languages aren’t solving the problem. He argues, in fact, that low-level languages like C#, VB and Java are a dead end. That we must move to higher-level language concepts in order to adequately represent the real world through software.


This is the view of the software factories people and the domain-specific language movement. And it is rather hard to argue the point. There are many days when I feel like I’m writing the same code I wrote 15 years ago – except now it’s in a GUI instead of on a VT100 terminal. But the business code just hasn’t changed all that terribly much…


To look at it a different way, a software architect’s job is to restrict choice. Our job in this role is to look at the wide array of choices and narrow them down to a specific set that our organization can use. Why? Because having each developer or project lead do all that filtering would seriously cut into productivity.


Companies employ architects specifically to limit choice. To set standards and policies that narrow the technology focus to a limited set of options. The rest of the organization then lives within those artificial boundaries. There are many reasons for this, including licensing costs, training and so forth. I doubt that many people have considered the direct (presumably positive) impact of the reduction of choice however.


I’ve been working on an idea I’ve dubbed an Entity Description Language (EDL). My original motivation was to reduce the amount of plumbing code we write. Look at a typical Java, C# or VB class and you’ll find that less than 4% of the actual lines in the class perform tangible business functions. Most of the lines are just language syntax or pure plumbing activities like opening a database connection.
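As a contrived illustration of that ratio, here is a sketch in Java (one of the languages mentioned above). Every name in it is hypothetical; out of roughly a dozen lines, exactly one performs a tangible business function:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

// Contrived sketch: nearly every line below is language ceremony or
// plumbing; the single business rule (total the order amounts) is one
// line. All names here are hypothetical.
class OrderTotaler {
    static double totalOrders(Reader source) throws IOException {   // plumbing
        double total = 0;                                           // plumbing
        try (BufferedReader reader = new BufferedReader(source)) {  // plumbing
            String line;                                            // plumbing
            while ((line = reader.readLine()) != null) {            // plumbing
                String[] fields = line.split(",");                  // plumbing
                total += Double.parseDouble(fields[1]);             // the business rule
            }
        }                                                           // plumbing
        return total;                                               // plumbing
    }

    public static void main(String[] args) throws IOException {
        // Two "order" records; only the amount column matters to the rule.
        System.out.println(totalOrders(new StringReader("a,1.5\nb,2.5")));
    }
}
```

The proportion is exaggerated for brevity, but the shape – ceremony wrapping a single business rule – is what something like an EDL would aim to invert.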


In working on EDL though, I’ve discovered that it is virtually impossible – and perhaps undesirable – to replicate the capabilities of C# or VB in their entirety. There are reasons why these languages use such a verbose syntax. They require verbosity in order to provide flexibility, or choice. Virtually unlimited choice.


If we now consider that limited choice is better, then perhaps the fact that something like EDL restricts choice is only good…

Friday, January 7, 2005 10:23:13 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Thursday, January 6, 2005

OK, I figured it out (I think)



    // This field is excluded from serialization, so nonserializable
    // listeners are dropped when the object is serialized.
    [NonSerialized]
    EventHandler _nonSerializableHandlers;
    EventHandler _serializableHandlers;

    /// <summary>
    /// Declares a serialization-safe IsDirtyChanged event.
    /// </summary>
    public event EventHandler IsDirtyChanged
    {
      add
      {
        if (value.Target.GetType().IsSerializable)
          _serializableHandlers =
            (EventHandler)Delegate.Combine(_serializableHandlers, value);
        else
          _nonSerializableHandlers =
            (EventHandler)Delegate.Combine(_nonSerializableHandlers, value);
      }
      remove
      {
        if (value.Target.GetType().IsSerializable)
          _serializableHandlers =
            (EventHandler)Delegate.Remove(_serializableHandlers, value);
        else
          _nonSerializableHandlers =
            (EventHandler)Delegate.Remove(_nonSerializableHandlers, value);
      }
    }

    /// <summary>
    /// Call this method to raise the IsDirtyChanged event.
    /// </summary>
    virtual protected void OnIsDirtyChanged()
    {
      if (_nonSerializableHandlers != null)
        _nonSerializableHandlers(this, EventArgs.Empty);
      if (_serializableHandlers != null)
        _serializableHandlers(this, EventArgs.Empty);
    }

I temporarily forgot that C# makes you invoke the delegate directly anyway, so having a separate clause in the manual event declaration isn’t required. I still think it makes the code easier to read, but functionality is king in the end.

Thursday, January 6, 2005 8:58:17 PM (Central Standard Time, UTC-06:00)  #    Disclaimer

Several months ago I posted an entry about the new VB syntax for manual declaration of events – specifically showing how it can be used to solve today’s issue with attempted serialization of nonserializable event listeners.


Subsequent to that post, I posted a follow-up with a better version of the code, thanks to input from a reader.


At the moment I am working through some eventing issues in both VB and C#, and I’ve found what appears to be a troubling limitation in C#.


In the new VB manual declaration scheme we have the ability to manually raise the event using the backing field. This allows me to have two backing fields - one serialized and one not as shown in my follow-up post. This is really nice, because it means we can retain serializable delegate references, while dropping nonserializable references.


Unfortunately, it doesn’t appear that the C# syntax allows us to control how the backing field is invoked. It appears that only one backing field is possible, and it is invoked automatically such that we don't have control over the invocation.


If we can’t control the invocation, then we can’t invoke both the serialized and nonserialized set of delegates. This will force the C# code to treat all the delegates as nonserialized, even those that could be serialized.


Of course I’m researching this for CSLA .NET 2.0. Wouldn’t it be a joke if this time the C# framework had to have some VB code (since last time it was the other way around)?


Anyone have any insight into a solution on the C# side?

Thursday, January 6, 2005 8:42:22 PM (Central Standard Time, UTC-06:00)  #    Disclaimer

Want to hear about SOA and/or Indigo? Eric Rudder, Don Box, Doug Purdy and Rich Turner are all speaking at VS Live (as is yours truly).

Even if you, like me, are rather skeptical about SOA, the fact is that Indigo is coming.

Don't let the SOA hype fool you. Indigo will impact you if you use .NET.

Personally I look at Indigo much more as a replacement for remoting and DCOM, along with integrating the WSE stuff into Web services. Because of this, Indigo is a very important thing to me - and to anyone building client/server or n-tier distributed systems in .NET.

Indigo alters the way objects are serialized, the way data is marshalled across networks and more. It is pretty extensive, and is going to be harder to abstract away than either asmx or remoting have been. This means we, as consumers of the technology, will need to understand more of it than we have needed to with existing technologies.

Since VS Live has a whole day on Indigo, this is a chance to get a good look at what's coming and assess what it is going to do to you.

And of course while you are at VS Live, you can attend my distributed object-oriented workshop :-)

Thursday, January 6, 2005 4:48:49 PM (Central Standard Time, UTC-06:00)  #    Disclaimer

Dan Miser blogs that there's a new version of Trillian out there. Off to download we go :-) - and I suggest you buy it! It is well worth the money!

Thursday, January 6, 2005 3:07:56 PM (Central Standard Time, UTC-06:00)  #    Disclaimer

You can now get details about, and download the beta of, Microsoft's new AntiSpyware software at this location.

Thursday, January 6, 2005 10:57:09 AM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Wednesday, January 5, 2005

I was going to post this on my personal blog, but it occurred to me that it is technical enough in nature to fit here. 

EPIC is a flash video (which has been out for a while now) that portrays the future of media. It is scary, but interesting. Moreover it is thought provoking.

Certainly the web/blog/wiki thing has driven down the value of professional writing. Magazines (and vendors such as Microsoft) are paying less and less for content. Why should they pay when tons of people will produce content for free? Why should advertisers go with established publishers when some blogs outstrip them in readership?

Book sales are down. Even factoring in the dot-bust and the Bush-era recession of the past few years, book sales are down from where they should be. Who needs to buy a book when you can get so much online for free through a quick google search.

It is easy to look at Epic from a technology perspective, but it is the larger social perspective that I find interesting and troubling. And the social effects are real, and actively debated.

Take the encyclopedia controversy, for example - the encyclopedia business is under siege by whom? Us. Anyone with a fact can share it with the world, without going through a formal company or process. Without any opportunity for anyone to make money on it. On one hand this is good, but on the other it is bad. Who’s going to fund archeological research? Who’s going to do the hard parts of finding facts? For free?

In the technology space, many of us blog things – including me. In many cases these are things that might have been paid articles prior to blogging, but now we gleefully put them on the web for free. That's fun for a while, and is good for notoriety, but in the long run it isn't sustainable.

Several people (friends or acquaintances of mine in both the Microsoft and Java spaces) recently have indicated that they are done writing - books, articles - they are done. This is troublesome. Is Atlas shrugging? Will the content of the future consist merely of the myriad voices of mundane souls?

Epic portrays at least one alternative, where it is at least possible for an author to get paid for their craft. Whether that is a realistic model doesn't matter as much as the fact that some model must be found.

Because we're not talking about just technical authors. We're talking about fiction. We're talking about music, and eventually movies. How will content creators get paid to do their work when random people do it for free? Will true artists bother? Do we care? Perhaps the people doing it for free are as good or better?

Perhaps they are just good enough, which is even scarier. That, after all, is the primary sin people ascribe to Microsoft. That they aren't the best, but rather are just good enough – leaving us stuck living in a “good enough” world rather than a really kick ass world.

I don't necessarily buy that viewpoint on Microsoft. Having used various flavors of Linux I don’t see that as the “kick-ass world” I’m personally looking for anyway. But it is easy to look at reality TV and see where everything could sink to that level.

Epic raises serious questions that only time will answer...

Wednesday, January 5, 2005 5:27:48 PM (Central Standard Time, UTC-06:00)  #    Disclaimer
 Monday, January 3, 2005

In my previous entry, Randy H notes that Microsoft has a different approach to marketing:


MS has some incredibly talented marketers. The Technical Product Manager role is essentially a marketer that helps to determine what features go into products and how things should work. To me, that kind of marketing has a lot of value. I wouldn't dismiss the role of marketing in our greatest technology companies. Wasn't .NET a whole lot of marketing as well?


While it is true that Microsoft has a unique approach to marketing, they really aren't much different from anyone else. While .NET was as much marketing as anything else (since the ".NET" got slapped on _everything_ for a while), it has been successful because of its technical merits.


Notice that the ".NET" label is fading already - Visual Studio 2005, Visual Basic 2005, etc. No .NET left in the product names at all. My guess is that ".NET" the term will fade away into the same marketing hole that swallowed up Remote OLE Automation, MTS and soon SOA.


I have always found it amazing when Microsoft is said to have this "great marketing machine". In many ways they are the worst marketers out there. Certainly far, far worse than Apple or IBM for instance.


Apple has the trendy thing going, and has for a very long time. Microsoft has never been trendy or fashionable or cool or hip. But Apple sure is hip, and it shows in their iPod sales. For some reason though, having powerful marketing in the "cool space" doesn't translate to widespread use.


IBM has those really kick-ass commercials that juxtapose business situations with strange solutions. And prior to that they had the cool commercials showing non-tech scenarios that were just metaphors for IT issues. Very cool and very smart stuff. Very effective too, as IBM’s global consulting arm has become large and influential due to that kind of marketing.


Microsoft has never had anything remotely similar to “real” marketing like that. Microsoft’s marketing has always been more subtle and focused on technologists. In reality, Microsoft’s marketing has always been more grass-roots, much like the open-source world.


And there’s some humor for you. The open-source world has apparently decided that it too needs marketing. Even if you make no money off your work, you certainly want the fame/notoriety – and to get fame you need people to use your stuff rather than your competitors’ stuff (regardless of whether they are commercial or OSS).


At the same time, Microsoft really wants to move into the enterprise space, and so they have been trying to figure out how to do “actual” marketing along the lines of IBM. And they want to sell consumer items like the Media PC, so they’ve been struggling to figure out how to be hip like Apple. Hopefully as they do this, they’ll manage to continue the MSDN and TechNet-style marketing to the technical community. We’ve been the bread-and-butter for them over the past 12 or so years after all.

Monday, January 3, 2005 8:52:53 AM (Central Standard Time, UTC-06:00)  #    Disclaimer