Rockford Lhotka

 Sunday, March 11, 2007

As I’ve mentioned before, my mother is battling cancer. Due to this, I’ve spent more time in hospitals over the past few months than I have in my life before now. And in so doing, I’ve drawn some scary conclusions about where our industry may be headed.

When I got into the industry 20 years ago (yes, it’s true…), it was quite realistic to think that a hot-shot programmer could be an expert in everything they touched.

In my case, I got hired to write software on a DEC VAX using VAX Basic. Now, I was a dyed-in-the-wool Pascal fanatic, and the idea of using “Basic” was abhorrent to me. But not as abhorrent as not having a job (this was during the Reagan-era recession, after all), and besides, I loved the VAX and this was a VAX job.

What I learned was that VAX Basic wasn’t “Basic” at all. It was a hybrid between FORTRAN (the native VAX language) and Pascal. A couple of months later, I was an expert not only in the VMS operating system, but in this new language as well. It was my sixth programming language, and after a while learning a new language just isn’t that hard.

My point is this: as a junior (though dedicated) programmer, I knew pretty much everything about my operating system and programming language. For that matter, I was versed in the assembly language for the platform too, though I didn’t need to use it.

Fast-forward 8 years to 1995: the beginning of the end. In 1995 it was still possible for a dedicated programmer to know virtually everything about their platform and language. I’d fortunately anticipated the fall of OpenVMS some years earlier, and had hitched my wagon to Windows NT and Visual Basic. Windows NT was, and is, at its core, the same as OpenVMS (same threading, same virtual memory, etc.) and so it wasn’t as big a shift as it could have been. The bigger shift was from linear/procedural programming to event-driven/procedural programming…

But, in 1995 it was quite realistic to know the ins and outs of VB 3.0, and to fully understand the workings of Windows NT (especially if you read Helen Custer’s excellent book). The world hadn’t changed in any substantive way, at least on the surface, from 1987.

Beneath the surface the changes were happening however. Remote OLE Automation arrived with VB 4.0, rapidly followed by real DCOM. SQL Server appeared on the scene, offering an affordable RDBMS and rapidly spreading the use of that technology.

(Note that I’d already worked with Oracle at this point, but its adoption was restrained by its cost. Regardless of the relative technical merits of SQL Server, its price point was an agent for change.)

And, of course, the HTTP/WAIS/Gopher battle was resolved, with HTTP the only protocol left standing. Not that anyone really cared a lot in 1995, but the seeds were there.

Now step forward just 3 years, to 1998. Already, in 1998 it was becoming difficult for even a dedicated developer to be an expert in everything they used. Relational databases, distributed programming technologies, multiple programming languages per application, the rapidly changing and unreliable HTML and related technologies. And at the same time, it isn’t like Windows GUI development went away or even stood still either.

Also at this time we started to see the rise of frameworks. COM was the primary instigator – at least in the Microsoft space. The fact that there was a common convention for interaction meant that it was possible for people to create frameworks – for development, or to provide pre-built application functionality at a higher level. And those frameworks could be used by many languages and in many settings.

I smile at this point, because OpenVMS had a standardized integration scheme in 1987 – a concept that was lost for a few years until it was reborn in another form through COM. What goes around, comes around.

The thing is, these frameworks were beneficial. At the same time, they were yet another thing to learn. The surface area of technology a developer was expected to know now included everything about their platform and programming tools, and one or more frameworks that they might be using.

Still, if you were willing to forego having friends, family or a real life, it was technically possible to be an expert in all these things. Most of us started selecting subsets though, focusing our expertise on certain platforms, tools and technologies and struggling to balance even that against something resembling a “normal life”.

Now come forward to today. What began in 1995 has continued through to today, and we're on the cusp of some new changes that add even more complexity.

Every single piece of our world has grown. The Vista operating system is now so complex that it isn’t realistic to understand the entire platform – especially when we’re expected to also know Windows XP and Server 2003 and probably still Windows 2000.

For many of us, the .NET Framework has replaced the operating system as a point of focus. Who cares about Windows when I’ve got .NET? But the .NET Framework is now well over 10,000 classes and some totally insane number of methods and properties. It is impractical to be an expert on all of .NET.

Below the operating system and .NET, the hardware is undergoing the first meaningful change in 20 years: from single processor to multiple processors and/or cores. Yes, I know multiprocessor boxes have been around forever – our VAX was dual CPU in 1989. But we are now looking at desktop computers with dual cores as standard, and quad cores within a couple of years.

(I know most people went from 16 to 32 bits – but the VAX was 32 bit when I started, and 64 bit when I moved to Windows, so I can’t get too excited over how many bits are in a C language int. After you've gone back and forth on the bit count a couple times it doesn't seem so important.)

But this dual/quad processor hardware isn’t uniform. Dual processor means separate L1/L2 caches. Dual core means separate L1, but sometimes combined L2 caches. AMD is working on a CPU with a shared L3 cache. And this actually matters in terms of how we write software! Write your software wrong, and it will run slower rather than faster, thanks to cache and/or memory contention.
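To make that concrete, here’s a minimal C# sketch of the “write it wrong and it runs slower” case. The class name and iteration counts are just illustrative, but the pattern is the classic one: two threads pounding on counters that sit next to each other in memory, so every write invalidates the other core’s cached copy of the same cache line.

    using System;
    using System.Diagnostics;
    using System.Threading;

    // Two counters in adjacent array slots almost certainly share a cache line.
    // When two threads hammer on them at once, each write invalidates the other
    // core's cached copy ("false sharing"), and the "parallel" version can end
    // up slower than doing the same work on a single thread.
    class CacheContentionDemo
    {
        static long[] counters = new long[2];

        static void Work(object index)
        {
            int i = (int)index;
            for (long n = 0; n < 100000000; n++)
            {
                counters[i]++;   // constant cache-line ping-pong between cores
            }
        }

        static void Main()
        {
            Stopwatch sw = Stopwatch.StartNew();

            Thread t0 = new Thread(Work);
            Thread t1 = new Thread(Work);
            t0.Start(0);
            t1.Start(1);
            t0.Join();
            t1.Join();

            sw.Stop();
            Console.WriteLine("Two threads, shared cache line: " +
                sw.ElapsedMilliseconds + "ms");

            // Giving each thread its own well-separated counter (or a plain
            // local variable, combined at the end) avoids the contention.
        }
    }

Give each thread its own well-separated counter and the same work suddenly scales the way the marketing slides promise.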

(This sort of thing, btw, is why understanding the actual OS is so important too. The paging model used by OpenVMS and Windows can have a tremendous positive or negative impact on your application’s performance if you are looping through arrays or collections in certain ways. Of course Vista changes this somewhat, so yet again we see expansion in what we need to know to be effective…)
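Here’s an equally small sketch of that looping point. The array size and names are made up, but the idea holds for any large rectangular array: .NET lays it out row by row, so walking it row-first is sequential memory access, while walking it column-first strides across cache lines and pages and makes the paging system work much harder.

    using System;
    using System.Diagnostics;

    // The same work done in two different orders. Row-major traversal touches
    // memory sequentially; column-major traversal jumps around, defeating both
    // the CPU cache and the operating system's paging behavior.
    class TraversalOrderDemo
    {
        const int Size = 5000;
        static int[,] data = new int[Size, Size];

        static void Main()
        {
            long sum = 0;

            Stopwatch sw = Stopwatch.StartNew();
            for (int row = 0; row < Size; row++)
                for (int col = 0; col < Size; col++)
                    sum += data[row, col];          // sequential access
            Console.WriteLine("Row-major:    " + sw.ElapsedMilliseconds + "ms");

            sw = Stopwatch.StartNew();
            for (int col = 0; col < Size; col++)
                for (int row = 0; row < Size; row++)
                    sum += data[row, col];          // strided access, poor locality
            Console.WriteLine("Column-major: " + sw.ElapsedMilliseconds + "ms");

            Console.WriteLine(sum);   // keep the loops from being optimized away
        }
    }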

At least now we only have one real programming language: VB/C# (plus or minus semicolons), though that language keeps expanding. In .NET 2.0 we got generics; in 3.5 there’s a whole set of functional language additions to support LINQ. And I’m not even mentioning all the little tweaky additions that have been added here and there – like custom events in VB or anonymous delegates in C#.
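For the curious, here’s a tiny snippet (names invented) that packs two of those additions into a few lines: a generic list from .NET 2.0, and an anonymous delegate passed as a predicate instead of a separately declared named method.

    using System;
    using System.Collections.Generic;

    class LanguageGrowthDemo
    {
        static void Main()
        {
            // Generics: a strongly typed list, no boxing, no casts.
            List<int> orderTotals = new List<int>();
            orderTotals.Add(120);
            orderTotals.Add(45);
            orderTotals.Add(300);

            // Anonymous delegate: the filter logic is written inline.
            List<int> bigOrders = orderTotals.FindAll(
                delegate(int total) { return total > 100; });

            foreach (int total in bigOrders)
                Console.WriteLine(total);   // 120, 300
        }
    }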

And how many of us know the .NET “assembly language”: CIL? I could code in the VMS macro assembly language, but personally I struggle to read anything beyond the simplest CIL…

I could belabor the point, but I won’t. Technology staples like SQL Server have grown immensely. Numerous widely used frameworks and tools have come and gone and morphed and changed (Commerce Server, SharePoint, Biztalk, etc).

The point is that today it is impossible for a developer to be an expert in everything they need to use when building many applications.

As I mentioned at the beginning, I’ve been spending a lot of time in hospitals. And so I’ve interacted with a lot of nurses and doctors. And it is scary. Very, very scary.

Why?

Because they are all so specialized that they can’t actually care for their patients. As a patient, if you don’t keep track of what all the specialists say and try to do to you, you can die. An oncologist may prescribe treatment A, while the gastro-intestinal specialist prescribes treatment B. And they may conflict.

Now in that simple case, the specialists might (might!) have collaborated. But if you are seriously ill, you can easily have 4-8 specialists prescribing treatments for various subsystems of your body. And the odds of conflict are very high!

In short, the consumer (patient) is forced to become their own general physician or risk serious injury or death at the hands of the well-meaning, but incredibly-specialized physicians surrounding them.

And I think this is where our industry is headed.

I know people who are building their entire career by specializing on TFS, or on SharePoint Server, or on SQL Server BI technologies, or Biztalk. And I don’t blame them for a second, because there’s an increasing market demand for people who have real understanding of each of these technologies. And if you want to be a real expert, you need to give up any pretense of expertise in other areas.

The same is true with the Windows/Web bifurcation. And now WPF is coming on the scene, so I suppose that’s a “trifurcation”? (apparently that’s a real word, because the spell checker didn’t barf!)

What amazes me is that this insane explosion in complexity has occurred, and yet most of my customers still want basically the same thing: to have an application that collects and processes data, generates some reports and doesn’t crash.

But I don’t think this trend is reversible.

I do wonder what the original OO people think. You know, the ones who coined the “software crisis” phrase 20 or so years ago? Back then there was no real crisis – at least not compared to today…

So what does this mean for us and our industry?

It is an incredible shift for an industry that is entirely built on generalists. Companies are used to treating developers like interchangeable cogs. We know that’s bad, but soon they will know it too! They’ll know it because they’ll need a bigger staff, and one that has more idle time per person, to accomplish things a single developer might have done in the past.

Consulting companies are built on utilization models that pre-suppose a consultant can fill many roles, and can handle many technology situations. More specialization means lower utilization (and more travel), which can only result in higher hourly rates.

Envision a computer industry that works like the medical industry. “Developers” employed in corporate settings become general practitioners: people who know a little about a lot, and pretty much nothing about anything. Their role is primarily to take a guess about the issue and refer the customer to a specialist.

These specialists are almost always consultants. But those consultants have comparatively low utilization. They focus on some subset of technology, allowing them to have expertise, but they’re largely ignorant of the surrounding technologies. Because their utilization is low, they are hopping from client to client, never focused entirely on any one – this reduces their efficiency. Yet they are experts, and they are in demand, so they command a high hourly rate – gotta balance the lower hours somehow to get the same annual income…

Many projects require multiple specialists, and the consumer is the only one who knows what all is going on. Hopefully that “general practitioner” role can help – but that’s not the case in the medical profession. So we can extrapolate that in many cases it is the end consumer, the real customer, that must become tech-savvy enough to coordinate all these specialists as they accidentally conflict with each other.

Maybe, just maybe, we can head this off somewhat. Maybe, unlike in the medical industry, we can develop generalists as a formal role. Not that they can directly contribute much to the effort, but at least the people in this role can effectively coordinate the specialists that do the actual work.

Just think. If each cancer patient had a dedicated general practitioner focused on coordinating the efforts of the various specialists, how much better would the results be? Of course, there’s that whole issue of paying for such a dedicated coordinator – someone whose continuing education and skills would have to be extraordinary…

Then again, maybe I’m just being overly gloomy and doomy. Maybe we’ll rebel against the current ridiculous increases in complexity. Maybe we’ll wake up one day and say “Enough! Take this complexity and shove it!”
