Rockford Lhotka

 Monday, May 3, 2010

At the Chicago codecamp this past weekend I was sitting next to a younger guy (probably mid-20’s) who was angsting about what to learn and not learn.

“I’ll never learn C#, it is Microsoft only”

“Should I learn Python? I don’t have a use for it, but people talk about it a lot”

And much, much more.

Eventually, for better or worse (and I suspect worse), I suggested that you can’t go wrong. That every programming language has something to teach, as does every platform and every architectural model.

For my part, a few years ago I listed many of the languages I’d learned up to that point – and I don’t regret learning any of them. OK, I also learned RPG and COBOL, and they weren’t fun – but even those languages taught me something about how a language shouldn’t work :)  But I must say that PostScript was a fun language – programming printers is kind of an interesting thing to do for a while.

Platform-wise, I’ve used Apple II, C64, AmigaOS, OpenVMS, PC DOS, Novell, Windows 3.x, Windows NT, and modern Windows, as well as Unix, Linux and a tiny bit of some JCL-oriented mainframe via batch processing.

Every platform has something to teach, just like every programming language.

I also suggested that people who’ve only ever done web development, or only ever done smart-client or terminal-based development – well – they’re crippled. They lack perspective, and it almost always limits their career. If you’ve never written a real distributed n-tier app, it is really hard to understand the potential of leveraging all those resources. If you’ve never written a server-only app (like with a VT100 terminal) it is really hard to understand the simplicity of a completely self-contained app model. If you’ve never written a web app it is really hard to understand the intricacies of managing state in a world where state is so fluid.

Sadly (in my view at least), the angsting 20-something wasn’t interested in hearing that there’s value in learning all these things. Perhaps he was secretly enjoying being paralyzed by the fear that he’d “waste” some time learning something that wasn’t immediately useful. Who knows…

It is true that things aren’t always immediately useful. Most of the classes I took in college weren’t immediately useful. But even something as non-useful as Minnesota history turns out to be useful later in life – it is a lot more fun to travel the state knowing all the interesting things that happened in various places and times. (I pick that topic because when I took that class I thought it was a total waste of time – I needed credits and it seemed easy enough – but now I’m glad I took it!)

(And some classes were just fun. I took a “Foundations of Math” class – my favorite class ever. We learned how to prove that 1+1=2 – why is that actually how it works? And why x+y=y+x, but x-z<>z-x. Useful? No. Never in my career have I directly used any of that. But fun? Oh yeah! :) )
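(If you’re curious what that kind of exercise looks like today, here’s a rough sketch in a modern proof assistant – Lean 4, using its standard Nat.add_comm lemma and decide tactic, which are my choices for illustration, not anything from that old class:

    -- 1 + 1 = 2 holds by simple computation
    example : 1 + 1 = 2 := rfl

    -- addition commutes: x + y = y + x
    example (x y : Nat) : x + y = y + x := Nat.add_comm x y

    -- but subtraction does not: 1 - 2 is not 2 - 1 over the integers
    example : (1 : Int) - 2 ≠ 2 - 1 := by decide

Still not something I’ve ever needed on the job – but still fun.)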

A colleague of mine, some years ago, was on an ASP classic gig. This was in 2004 or 2005. He was very frustrated to be working on obsolete technology, right as .NET 2.0 was about to come out. His fear was that he’d be left behind as ASP.NET moved forward, and he had 3-4 more months of working on 1997 technologies. My point to him was that even if the technology wasn’t cutting edge, there were things to be learned about the business, the management style and the process being used by the client.

A career isn’t about a given technology, or even a technology wave. It is a long-term 20-30 year endeavor that requires accumulation of knowledge around technology, architecture, software design, best practices, business concepts, management concepts, software process, business process and more.

People who fixate on technology simply can’t go as far in their career as people who also pay attention to the business, management and inter-personal aspects of the world. It is the combination of broad sweeping technical knowledge and business knowledge that really makes a person valuable over the long haul.

“Wasting” 6-8 months on some old technology, or learning a technology that never becomes mainstream (I learned a language called Envelope of all things – and it was in the cool-but-useless category) is not that big a deal. That’s maybe 2% of your career. And if, during that time, you didn’t also learn something interesting about business, management and process – as well as deepen your appreciation for some aspects of technology – then I suggest you just weren’t paying attention…

All that said, spending several years working on obsolete technology really can harm your career – assuming you want to remain a technologist. I’m not saying you should blissfully remain somewhere like that.

But I am suggesting that, as a general rule, spending a few weeks or even months learning and using anything is useful. Nothing is a waste.