Rockford Lhotka's Blog


 Tuesday, 10 October 2017

The CSLA framework is now officially 20 years old. Seriously, some version of CSLA has been out in the world for people to freely use since 1997 when my Professional VB5 Business Objects book came out with the code on a disc attached inside the back cover.

I didn't technically call it "open source" back then; the more common terms were "freeware" and "shareware", and I considered CSLA to be "freeware". Those terms were eventually subsumed by what we now know as Open Source Software (OSS), and of course that's the term I use today.

I'm proud of the fact that I've been contributing to the OSS world for 20 years. Not too many open source projects last this long.

And more importantly, I'm humbled and amazed by the emails and feedback I've received over the years - both in terms of people using CSLA, and also people who've benefited from reading my books. And of course by the amazing contributions by colleagues and community members, several of whom I'll call out later in this article.

The following is a pretty long post, but I thought that it would be good to record the highlights of the past 20 years.

Why Open Source?

Lots of people have opinions and views and motivations about why they use and/or contribute to open source.

Personally, my motivation to contribute flows from being an early user of the concept. I very much believe that my career wouldn't have gone where it did, except for me having access to freeware, shareware, and open source software.

Back in 1989 (give or take) I was making my living being a DEC VAX programmer. And 1-2 times each year we'd get a couple big magnetic tapes (like in those very old movies) that contained tons and tons of freeware, shareware, and early open source (GNU and copyleft) stuff. I'd typically come in on a Saturday, download the index from the tapes and then selectively download and play with various tools.

Things like awk (really gawk), which I used to transform how our company processed reporting data. Or like uupc, a UUCP implementation for the VAX that gave us store-and-forward email service to the world over our 1200 baud modems.

I looked like a frickin' rockstar thanks to those tools!

Similarly, at the time, I was an Amiga fanatic. And the Fred Fish collection of software was like a lifeline to all that was cool and fun in the world of ... everything!

Yes, like everyone, I've needed to make a living, support my family, and save for the kids' college and my retirement. But, and I know this sounds sappy, having built my career on the shoulders of some impressive people's contributions, I was on the lookout for a way to do the same for others.

CSLA COM or "Classic"

CSLA started out as the "Component-based, Scalable, Logical Architecture" in the world of COM. And I didn't originally intend for it to be a standalone framework. My original thought was that this code was an example of what people could do, and it fit into the bigger narrative of my book.

By the time I wrote the Professional VB6 Business Objects and Professional VB6 Distributed Objects books though, it was already becoming clear that I was mistaken. I phrase it that way on purpose, because by that time I was receiving emails with bug reports, feature suggestions, and other comments - where people were clearly taking the code and using it as a framework, not as an example.

Want to see that old code (and a picture of me from 20 years ago)? I put the old VB6 CSLA code in a GitHub repo for safe keeping.

Kevin Ford and I worked together on a project in 1996-97, and the two of us argued extensively about how things should work. Without his input, CSLA would not have been what it was, and Kevin has been providing input and contributions from then through today.


Toward the end of the 1990s I was seriously considering switching to the shiny new Java world, primarily because I'd become increasingly frustrated with COM and DCOM, and yearned for the more object-oriented aspects of Java. Fortunately for me, I became aware of things like a new "COOL" language and "COM+3" - technologies that ultimately became what we know as the .NET Framework, VB.NET, and C#. So I stuck with the Microsoft world through that time.

Working with the earliest pre-releases of .NET I tried directly porting the COM code to .NET. That sort of worked, but clearly wasn't what I wanted, because it didn't leverage the new OO features available to me. I believe I wrote and rewrote the core of CSLA three different times into .NET, each time learning more and more about how .NET worked and what I could do with it.

The end result was CSLA .NET 1.0. It was still very much a bare-bones framework, one I personally considered as much an example of what to do as something someone could use out of the box. But by that point I had reconciled myself to the reality that people would be using it as-is, so I used XML comments and a short-lived, but very good, OSS tool called NDoc to create a help file for the framework.

And of course CSLA .NET came out concurrent with my Professional VB.NET Business Objects book. That was republished not long after under the title Expert One-On-One Visual Basic .NET Business Objects because my books were purchased from Wrox Press by APress. Not that this affected CSLA .NET, but it obviously affected me and my books - in a positive way, as APress was great to work with!

CSLA .NET 1.1 and C#

In fact, I wrote the Expert C# Business Objects book with APress. The port of CSLA .NET from VB.NET to C# was almost entirely done by a colleague of mine: Brant Estes. And I think he did it in just a few days - amazing work by an amazing developer!

Thus started the era of maintaining the "same codebase" twice in two different languages. You can see it in the v1-5-x branch - two versions of everything.

This is also about the time that the term "CSLA" stopped being an acronym and became just an unpronounceable word. The acronym no longer applied really, because .NET is object-oriented, not component-based. What I should have done is given it a cool name instead of some letters. Saying "see es el ay" is tedious. People were saying "the Lhotka framework", or "the coleslaw framework", or the "cuz law framework". But I didn't have that foresight, and after 20 years I doubt I'll change the branding now.

So now we're in 2002-2003, give or take, and the Internet as we know it today was really coming together, including the idea that OSS projects should have public source repos. So I set up a Subversion (svn) server and a web site with downloadable zip archives containing the code and samples. I still figured people should leverage the code, not the framework as a "product".


That changed in 2005 with CSLA .NET 2.0, which primarily supported .NET 2.0 and generics. And of course the new Expert C# 2005 Business Objects and Expert VB 2005 Business Objects books. But somewhere around this time is when I started releasing not just the code, but also pre-compiled binaries for people to leverage.


In 2007 Microsoft released .NET 3.0, including WPF, WCF, and WF. Supporting WCF meant decoupling the data portal from .NET Remoting, so people could use Remoting, asmx, or WCF as a transport protocol. Supporting WPF meant lots of alterations to data binding code, because WPF interacts with the data binding interfaces differently from Windows Forms. Very messy and hard to accommodate, so this was the point in the evolution of CSLA where I realized the primary benefit was how it mitigates all the differences in data binding, not the data portal.

Around this time was my first foray into self-publishing content via the web, with my CSLA .NET 2.1 Handbook and Using CSLA .NET 3.0 ebooks. Both published in VB and C# editions.

CSLA .NET 3.7 and Silverlight

2008 brought Silverlight 2, the first time .NET fragmented into two totally different platforms. Getting CSLA decoupled from full .NET, such that it could also support the Silverlight flavor of .NET, was a major undertaking. Thankfully CSLA had Magenic as a patron, and Magenic committed several full time developers for a period of months to work through all the issues necessary for CSLA to run on both full .NET and Silverlight.

2008 is also the year I published my last paper books: Expert C# 2008 Business Objects and Expert VB 2008 Business Objects. By this time I'd switched entirely to C# myself, and a colleague named Joe Fallon was responsible for getting the VB code and book out there.

Subsequent to 2008 the VB source code for the framework itself was dropped, and only the C# framework code continued forward. Obviously the framework supports all .NET languages, so it can be used by VB or C#, but the framework code itself is C# only from 2008 forward.

Justin Chase was instrumental in the port to Silverlight. In fact, there were no good ways to run unit tests in Silverlight, especially if you wanted the same tests to run in .NET. So Justin created UnitDriven to solve that problem.

Marimer LLC

In 2009 I created Marimer LLC as an official company to own CSLA .NET and all related code, content, and business operations. Prior to that point I'd been operating as what's called a Sole Proprietor, and having an LLC provides a level of legal protection. Not that I've ever needed that, thankfully.

The trigger for this was a very large company who wanted to use CSLA .NET, but even though it was open source they didn't want to use a product that didn't have a company behind it. Other people were covering the legal costs to create the LLC and ultimately it was to my benefit, so I saw no downside.


A couple years later, in 2010, came CSLA .NET 4 - some of the biggest changes to the framework since the addition of generics.

This is also the point where I switched completely to self-published books and videos via the web with the Using CSLA 4 ebook.

That content has been online for seven years now. Wow!

The massive work to decouple from .NET and support Silverlight was far more work than what we did in .NET 4, but it was largely transparent to users of CSLA .NET (which is kind of the point after all). But the changes we made to CSLA .NET 4 were breaking changes for users of the framework.

I say we, because by this point in time CSLA had accumulated quite a number of contributors. A list of contributors since we switched to GitHub is on the CSLA .NET contributors page. I'm truly sorry I have no way to list all the people who contributed to the svn codebase, though the CSLA .NET Open Hub page has captured quite a lot of historical data and says 61 people have contributed.

But back to CSLA .NET 4 and the big changes. What we did in this version was add the basis for a robust rules engine. This was based on lots of community feedback from our early attempts at a rules engine starting in version 3.8, and the influence of David West through his Object Thinking book.

Building on this rules engine, Jonny Bekkum has evolved the engine substantially over the past several years, and today it is able to handle extremely complex scenarios. Best of all, the engine retains its relative simplicity for less complex scenarios and rules.

Beyond that, Jonny has been a major contributor to other parts of the framework, one of my most trusted advisors, and a stalwart member of the broader community over many, many years.

I also want to call out Tiago Freitas Leal, who's run a parallel OSS project called CslaCodeGenFork, has worked closely with Jonny to manage the community-based CslaContrib project, and has been an active contributor to the CSLA framework and the community. Tiago has been involved for almost as many years as Jonny.

This is also about the time that Jaans helped create the initial NuGet definitions necessary to get CSLA into NuGet. Most of his nuspec files remain intact, though perhaps evolved slightly over time.

Enter mono, mono for Android, and Windows Phone

In 2011 the open source mono port of .NET became much more relevant to me than it had been prior to that point. Primarily because the mono project embraced Android and iOS as well as Linux.

Jonny Bekkum leveraged the decoupling of CSLA .NET from full .NET to get a version running on mono, and based on that I tried to get it working on Android and iOS. Android was successful, but technical limitations of mono on iOS blocked our progress at that time.

We also leveraged the decoupling from full .NET in order to get a version of CSLA running on the nascent Windows Phone platform.

Turns out that all that work back in 2007 to support Silverlight was incredibly valuable going forward, even though Silverlight itself ultimately had no future. Because CSLA .NET had been largely decoupled from full .NET though, we've been able to get CSLA .NET running on over 11 different flavors/versions of .NET simultaneously - all from a single codebase!


In 2012 Microsoft came out with the ill-fated Windows 8, and what I personally describe as the evolution of Silverlight: WinRT. Porting CSLA .NET to WinRT took some effort, mostly because Microsoft messed around with the way the .NET reflection API worked. Why they changed one of the fundamental building blocks of .NET itself is beyond me, and it caused a great deal of pain. But we worked through it.

The other big impact was that the XAML supported by WinRT was comparable to that from Silverlight 2. Which is to say it was pretty limited compared to WPF or Silverlight 5. Fortunately I was able to dust off a lot of old Silverlight 2 XAML code and use it to support WinRT. This goes to show the value of having good source control history over a period of years!

I think this is also about the time that we switched the source code repository from svn to git, and the server from my personal server to GitHub. Bringing all the svn history forward was not trivial. At all. It was my colleague Nermin Dibek who convinced me to switch to git, and Nick Schonning suffered through the pains necessary to convert the svn history into git so it was (mostly) preserved.

Enter Xamarin

By 2013 that mono for Android and iOS initiative had been canceled by its patron/owner Novell. Fortunately the project remained alive via a new company named Xamarin. Better still, Xamarin overcame some of the earlier iOS limitations, and so CSLA .NET was able to provide full support for Android and iOS via Xamarin.

From a historical perspective, it is also worth noting that this is about the time that, thanks to the confusion around the iPad and Windows 8, enterprises generally decided to stop building smart client apps at all, choosing instead to focus on HTML5 web apps and the newly coined "single page app" concept.

So supporting Xamarin was important from my viewpoint, because it felt like people might never again write a .NET client-side app otherwise. Today I'm not so pessimistic, primarily because today Xamarin is owned by Microsoft and supports cross-platform development across Windows 7, 8, 10, iOS, Android, Linux, MacOS, and more. Damned impressive!

Windows 10

In 2015 Microsoft released Windows 10 to replace Windows 8. Now Windows 8 was largely a commercial failure, but I believe it did one very important thing: it broke everyone's conception about what Windows looked like and how it worked.

Windows 10 brought back many of the best-loved features of Windows 7, while keeping and refining the bits of Windows 8 that were actually good. Personally I really enjoy Windows 10, more than I did Windows 7 (which I thought at the time was the best Windows ever).

Windows 10 brought with it an evolution of WinRT called UWP, or the Universal Windows Platform. UWP had an interesting quirk, in that it ran on a super-early implementation of yet another flavor of .NET: .NET Core. Getting CSLA .NET to run on this new UWP flavor of .NET wasn't too hard, primarily because it really is just an evolution of WinRT, which we already supported.

Support for UWP means that CSLA .NET supports native Windows 10 development, and also works for Xbox and Hololens development. I think that's pretty cool!

.NET Analyzers

Jason Bock has been a colleague and friend for many years, and his passion for compilers and low-level implementation details around .NET is hard to match. In 2015 Microsoft introduced the idea of .NET analyzers, which allow us to write components that run within the .NET compilers and analyze our code as we write and compile it.

Jason wrote a bunch of analyzers for CSLA .NET that help enforce the guidelines people should follow when using the framework. Prior to this point those guidelines were documented in my books and code samples, but there was nothing to help remind people what to do as they were coding. Jason's analyzers changed that entirely, providing real-time feedback and code refactorings directly within Visual Studio. Awesome!

MIT License

From the inception of CSLA until 2016 I'd been using a custom license based largely on the Apache license. That never really caused any issues, but it did occasionally result in an email exchange with an organization's lawyer as they approved the license.

Switching to the MIT license is something, in retrospect, that I should have done much earlier. Now I don't get any such email exchanges, because everyone knows and understands the MIT license.

That's perhaps one of the lessons to learn here. I started doing all this back before "open source" was as formal or defined as it is today. And "little" things like licensing rarely get updated over time, because the focus is always on the code and other fun stuff. It is probably a good idea to review things like licensing more frequently over the lifetime of a project, just to make sure those elements remain up to date and relevant as times change.

.NET Core and NetStandard

The most recent changes to the .NET world have largely centered around the new .NET Core implementation of .NET, a version of .NET that is truly cross-platform. And something called netstandard, which is an API definition against which we can write code, and have that same compiled code work on multiple flavors of .NET.

CSLA .NET 4.7 supports netstandard 2.0, and that means a single Csla.dll works on all the platforms that conform to netstandard 2.0, including:

  • Full .NET 4.6.1+
  • Xamarin Android, iOS, and more
  • UWP starting in October 2017
  • .NET Core 2.0

Basically all the modern versions/flavors of .NET now support netstandard 2.0. And so does CSLA .NET 4.7.
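That single-assembly targeting boils down to one project-file setting. A minimal sketch of what a netstandard 2.0 class library project looks like (the project shown here is illustrative, not the actual CSLA project file):

```xml
<!-- A library built this way produces one DLL usable on full .NET 4.6.1+,
     Xamarin, UWP, and .NET Core 2.0, per the netstandard 2.0 support matrix. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>
```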

So where we've been compiling the CSLA source code 11+ times - once for each flavor of .NET that's running out there - the future looks bright, in that hopefully, eventually, we'll be able to build it just one time.

Of course it'll be a while before that happens. Today CSLA still supports people as far back as .NET 4.0. But the future is bright, and today's messy reality of various versions/flavors of .NET and PCLs and everything else is clearly going to be replaced by netstandard.

Thank you

It has been a pretty incredible 20 years. I've met people (virtually and IRL) from all corners of the globe, and I've become good friends with many folks.

I've been able to continually improve and maintain the CSLA framework through the revenue generated by my books, training videos, and by having Magenic as a patron. Even with all that though, none of this would have been possible without the fantastic support from many thousands of people who've read my books and used CSLA, and the hundreds of people who've participated in the CSLA forums over the years, and the scores of people who've directly contributed to the CSLA framework and related projects.

As I wrote this article, I called out the names of a few people who've helped shape and create CSLA and the community. But there are other people in our small community that were and are also part of the narrative. Please know that I value each and every one of you as a colleague and friend.

Closer to home, the steadfast support from my wife and kids and close friends - really I don't have the words to express my gratitude and love for all of you. Much of my early work on CSLA and the books was done with my sons bouncing on my knees, sleeping in my lap as I typed, or making sure I got outside occasionally for some sunshine. And believe it or not, my wife and I are about to celebrate our 30th anniversary. So even with me disappearing to write for extended periods of time, we're still together and enjoying life.

CSLA .NET Futures

Where to from here?

I'm more optimistic about the future of .NET than I have been for many years.

.NET on the server is gaining new life and enthusiasm thanks to .NET Core and the ability to run your code on Windows, Linux, MacOS, in Docker containers, and I'm sure more to come.

.NET on the client might be on the rebound. I think as more people discover just how horrific Electron is as a development platform for desktops, and they are already backing off from Cordova on mobile, this will open the door for Xamarin and .NET to be viewed as a desirable alternative. You know, a technology that doesn't suck down memory and CPU for no good reason, and a technology that truly allows developers to leverage each device and platform when needed.

And then there's the great hope, which is web assembly (wasm). If this project continues to progress, and I think it will, browsers will no longer be a runtime for JavaScript alone. They'll be much closer to a true operating system or runtime, supporting many languages, including C# and .NET. Just imagine going to a "web page" and having it run .NET code instead of JavaScript!

And where .NET goes, CSLA .NET goes too.

Tuesday, 10 October 2017 12:36:47 (Central Standard Time, UTC-06:00)
 Friday, 08 September 2017

ASP.NET Core only works with ClaimsPrincipal. Specifically, the HttpContext in ASP.NET Core only accepts a ClaimsPrincipal instance, not the more general IPrincipal type.

Confusingly, the rest of the .NET world (full .NET, Xamarin, .NET Core, and netstandard2.0) still supports IPrincipal. This makes ASP.NET Core an outlier, but an important one.

As a library author, I’m wondering if the consensus is that IPrincipal is dead, and that all principal objects should now subclass ClaimsPrincipal? Is this a new universal truth?
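One thing that softens the dilemma: ClaimsPrincipal itself implements IPrincipal, so a custom principal that subclasses ClaimsPrincipal still works with all the IPrincipal-based APIs on the other platforms. A minimal sketch of that approach (the CustomPrincipal type here is hypothetical, not CSLA's actual implementation):

```csharp
using System.Security.Claims;
using System.Security.Principal;

// Hypothetical custom principal: subclassing ClaimsPrincipal satisfies
// ASP.NET Core's requirement, while remaining usable anywhere an
// IPrincipal is expected, because ClaimsPrincipal implements IPrincipal.
public class CustomPrincipal : ClaimsPrincipal
{
    public CustomPrincipal(IIdentity identity)
        : base(identity) { }
}

public static class Demo
{
    public static void Main()
    {
        // Build a claims-based identity with a name claim.
        var identity = new ClaimsIdentity(
            new[] { new Claim(ClaimTypes.Name, "rocky") },
            authenticationType: "Custom");

        // The same object can be handed to IPrincipal-based code...
        IPrincipal principal = new CustomPrincipal(identity);

        // ...and the name flows through the claims machinery.
        System.Console.WriteLine(principal.Identity.Name);
    }
}
```

The cost, of course, is that every platform inherits the ClaimsPrincipal dependency, which is exactly the trade-off in question.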

Specifically, should I run through all of CSLA .NET and in the netstandard2.0 version only support ClaimsPrincipal?

This would ultimately affect people building for Xamarin, full .NET, UWP, .NET Core, mono, as well as ASP.NET Core.

This would be a major breaking change for anyone trying to get existing .NET code (using CSLA) to run in any netstandard2.0 environment. The thing is, if you want to use ASP.NET Core you are kind of forced into that major breaking change anyway right?

My first reaction is NO – I shouldn’t make such a big change, because all the platforms not running in ASP.NET Core shouldn’t be forced to accept this burden just because ASP.NET decided to make a low-level breaking change by not supporting the IPrincipal type.

But I’m interested in hearing other people’s thoughts on this. What is the right answer?

Friday, 08 September 2017 15:16:28 (Central Standard Time, UTC-06:00)

I was just reading this article about how to migrate from LastPass to 1Password.

I can't argue with what the author says in terms of LastPass having had some security issues. So I quickly checked to see if 1Password supported Windows 10.

It does not. No app in the store, no plug-in for the Edge browser.

Conversely, LastPass has an Edge browser plug-in and a (clunky-but-functional) app in the store.

Having a password vault and actually using a password vault aren't the same thing, and I'm pretty sure I wouldn't actually use one if it was a pain.

So at the moment LastPass wins, because they've put in the work to make it easy to use on my Win10 and iOS devices.

Their store app could be a lot better, but even a clunky app is infinitely better than no app at all.

Friday, 08 September 2017 14:55:01 (Central Standard Time, UTC-06:00)
 Thursday, 07 September 2017

One of Magenic's largest business areas is QAT, and we have a serious focus on automated testing (including our open source MAQS testing framework).

The Software Test Professionals Conference (STPCon) is the leading event where test leadership, management and strategy converge.

I'm extremely pleased that this year's STPCon keynote speaker is Paul Grizzaffi from Magenic.

Join us as Paul Grizzaffi explains responsible ways to approach automation, some of the knowledge we’ll need in order to be responsible, and shares insights about automation responsibility from his own career. Let’s allow history to remember our automation initiatives fondly instead of as Pyrrhic forays into irresponsibility.

Paul is also hosting a round table discussion on automation challenges.

Please join us in a round table discussion of attendee-provided automation challenges where we can share our thoughts and potential solutions to these challenges.

Troy Walsh, Magenic's practice lead for QAT, is also presenting at the conference. He'll talk about WinAppDriver vs Winium.

In this session, we will go hands on with WinAppDriver and Winium. We will dig into code and see how each tool works. We will also compare and contrast each tool's features, usage, and shortcomings.

Finally, Paul and Troy will team up to provide a demo of the open source MAQS framework.

We will demonstrate the Magenic Automation Quick Start framework, and its integration with CI/CD/CT workflows. MAQS is an open source package designed so that you can be running automated tests in minutes.

We are proud to be involved in STPCon, and hope you'll join us at the event!

Thursday, 07 September 2017 08:14:22 (Central Standard Time, UTC-06:00)
 Wednesday, 30 August 2017

Groove Music

Long ago I switched from a Windows 10 phone to an iPhone. But I remained a Groove Music user, because I really like the service and its features.

  1. Offline sync of playlists so I can listen on the airplane
  2. Available on iPhone, Android, Windows, Xbox
  3. Plays music videos when available
  4. "Radio" (used to be Smart DJ or something) to help find new music

The only big issue I've struggled with on the iPhone is that the app doesn't have an easy way to sync all my music from OneDrive onto the phone for offline play. This was a simple option on Windows, but there's no support in the iPhone app.

I've got a few thousand songs in my library, all on OneDrive. They auto-sync to my Windows 10 devices for offline play, but the most important device for offline play is my phone, which I use when on airplanes and driving through northern Minnesota.

I tried syncing the music to the phone using iTunes, but that only makes the music available via the Apple Music app, not Groove, and I want my music in a single app across all my devices.

After trading some emails with a gentleman named Bob Spiker (I think the original email exchange was in response to a twitter rant of mine), it turns out there's a hack that sort of works.

These are the steps:

  1. Open the Groove app on Windows and create a set of playlists that contain only your OneDrive music
    1. Playlists can only have 1000 songs, so you may need to create several playlists
    2. It is probably easiest to add artists to the playlists, as you'll have fewer of them than albums or songs
    3. You can do this on your phone too, but the UI is tedious, so it is far easier on Win10
  2. Open the Groove app on your iPhone and you'll see the playlists; open each playlist and mark it available for offline use
  3. Make sure your phone has enough storage to handle your music; the sync will stall if the phone runs out of storage
  4. Groove will only download songs when the app is open and the phone's screen isn't locked
    1. Go to the iPhone settings, search for "Lock" and set the phone to never lock
    2. Plug in the phone so it is on AC power
    3. Leave the Groove app open as the active app
    4. Make sure your phone is on high speed wifi
    5. Wait patiently until the music has all synced onto the phone (which might take a long time)
  5. Go back into your phone's settings and set the auto-lock back to its original setting

This isn't a perfect solution. It does get all the music onto the phone, but I'm finding that artist/album indexing isn't always working as expected against the music synced from OneDrive. In other words, the music is there and you can see it in the playlists, but you can't always find it from the artist or album views (though it is usually there).

Maybe the "right" answer is to switch to another music service. But I'll only do that if I get the same cross-device support (including xbox), offline sync for my phone, and music videos.

Wednesday, 30 August 2017 13:03:12 (Central Standard Time, UTC-06:00)
 Friday, 28 July 2017

People might wonder why I'm personally so pro-diversity when it comes to STEM (and pretty much everything else for that matter). Some perhaps assume I'm just a blind SJW gamma male or whatever.

My motivation certainly does flow, in part, from a broad sense of fairness and inclusion. But there are two key points that really drove me toward being active in this space.

First, diverse perspectives and ways of thinking through a problem are, frankly, a lot of what makes America great.

There are other countries out there with much larger populations, and rapidly expanding middle classes and educational systems. China, for example, has more children in their gifted and talented school programs than we have children total here in the US.

We have a substantial cultural advantage, at least in terms of the western style corporate world, because our culture is non-conformist. Americans generally feel comfortable expressing their viewpoints and "sticking their necks out" with their ideas. The result is that we often seem to come up with, and implement, new ideas at a comparatively fast pace.

That diversity of ideas, and the willingness to take that risk, is really key. This is backed up by research btw, here's an article from Scientific American for example.

Diversity of thought comes from diverse backgrounds, cultures (regional or global), educational experiences, and overall life experiences. The best way to get that diversity of thought is to have a diverse workforce in terms of gender, race, background, etc.

Or to put it another way: diversity is good for America.

Second, and perhaps somewhat related, it seems entirely unreasonable to me that mankind can be successful in the long term if we are only willing to accept contributions from a minority of humans - most notably straight white males.

It is true, I'm a SWM. But my life is full of non-white and/or non-male and/or non-straight people who are amazingly smart, talented, educated, and driven. Several of these folks have contributed directly or indirectly to my personal experience/success/whatever over the years - in work and life in general. Hopefully the reverse is true as well.

I guess my point is that, specifically from a US-centric perspective, we can't afford to treat any smart, educated, driven people as second class or unworthy if we are to compete on the global stage. Our absolute population is too small, and we need all our people to remain competitive.

Half the population is female. It is crazy to think we'd ignore half the brainpower in the world. Self-defeating actually.

Similarly, 37% of the US population is non-white. And that number is rapidly growing. Again, it would be self-defeating to ignore well over a third of our country's brainpower.

So yes, some of my motivation comes from my view that all people are created equal. A view that seems like an obvious part of being American.

Add to that the hard reality that to be against diversity in STEM is to intentionally shut out a majority of the brainpower in the US, much less the world at large. That's obviously ridiculous and irresponsible - as a professional and as a human.

Friday, 28 July 2017 11:13:48 (Central Standard Time, UTC-06:00)  #    Disclaimer
 Sunday, 04 June 2017

I read this thread on reddit thanks to terrajobst - with some amazement, and empathy.

Perhaps most people haven't made a major mistake in their careers, but I've made more than one. And my mistakes were probably more directly my fault than the mistake this poor person made - a mistake that was clearly caused by poor practices by the employer, not by the fresh-out-of-college employee.

I'll summarize what I believe are the two worst mistakes I've made.

Major mistake one was about eight months into my first real job out of university. This predated the concepts of source control like we know it now, so we all worked on a common directory structure that contained the source code for everything. And I deleted it all.

Yup, thought I was deleting something else, but did a recursive delete of the entire source code directory structure, instantly bringing all our developers to a full stop and losing a day's work (or more if the backups weren't good).

Fortunately the backups were good, thanks to a competent system administrator who not only performed the backups, but also regularly tested them. Yeah, "having a backup" means nothing unless your regular IT process includes testing the restore process.

My mistake essentially cost all developers at least two days of time. The day lost when I deleted their work, some hours for the restore, and then another day for everyone to redo their work.

Still, I didn't get fired, though I did get a lot of crap from my colleagues and was on management's sh*t list for a while. And rightly so in my view.

This is probably a bit arrogant, but I strongly suspect I got to keep my job because I was a young hot-shot with no kids, and really no life to speak of, so my productivity as a developer was the best on our entire team. Except probably for my boss, who was an amazing developer!

Major mistake number two was about three years into my career (working at my second real job after university). I worked in IT and had (temporarily) left software development to become the system administrator and manager of the help desk.

I thought our security policies were too lax, and I'd been researching how to tighten up the rules around who had which kind of network and system permissions. Unfortunately what I didn't know was that changing these policies would invalidate everyone's password. Nor did I have the wisdom to do this over a weekend or anything - so I made the change midday.

Next thing you know, a few hundred people lost access to the entire computer system, basically bringing the entire bio-medical manufacturing company to a halt.

Sweating profusely, with basically every manager in the company breathing down my neck, I wrote a script to reset all passwords to a known value so it was possible to get everyone back online.

Basically I cost the company a half day's work, and I'm sure people had to work overtime to catch up and meet deadlines for products to be delivered to customers on time.

Yet again, my f*ckup to be sure. Fortunately I'd been there for quite some time and had built up non-trivial personal capital - all of which was probably spent in that one brief moment when I pressed enter on the line that accidentally locked everyone out of the system.

I read through that reddit post from the poor junior dev, apparently just following flawed onboarding instructions. I suppose the end result of that mistake is comparable to mine, and they had no personal capital to spend (this being day one on the job).

Regardless, from the poster's account it is so clear that the mistake was absolutely the responsibility of the employer - flawed onboarding instructions, extremely shoddy separation between dev and prod environments, apparently no regular testing of backups to make sure they could restore. The sort of environment I experienced back in the 1980's - and wouldn't expect to see anywhere today!

In my view the poster on reddit shouldn't have lost their job like that. They probably should have gotten a lot of crap from coworkers, and perhaps gone down in company history as the person who accidentally instigated better processes for development and IT. But not job loss.

On the other hand, perhaps this is for the best - a place run so poorly perhaps isn't a great start to anyone's career. Just think of all the bad habits a new employee might pick up working in a place like that.

Sunday, 04 June 2017 13:39:38 (Central Standard Time, UTC-06:00)  #    Disclaimer
 Tuesday, 02 May 2017

There seems to be some confusion around what Microsoft announced today around Windows 10 and the Surface Laptop.

These are two separate things.

Windows 10 S

This is a new flavor of the Windows 10 operating system. It has nothing to do with hardware. Numerous hardware vendors announced Intel-based devices that'll run this flavor of Windows - including Microsoft.

This version of Windows 10 is restricted to running apps deployed from the Windows store. That includes WinRT/UWP apps, and it includes Win32 apps. For some time now it has been possible for software developers to deploy Win32/.NET apps via the store - Slack is a good example.

Microsoft has said that they'll soon have the full Win32 version of Office in the store. Which makes sense, since they'll want Windows 10 S users to also use Office.

It is also the case that Windows 10 S is more locked down than standard Windows 10, both from a security and battery life perspective. Lower-level features/tools used by developers aren't available, improving security and battery life by eliminating things you don't want students (or most users) to do anyway.

Can a flavor of Windows survive if it only runs apps deployed from the store? I don't know, but given that it is pretty easy for software developers to deploy their existing apps via the store, plus there's quite a lot of nice UWP apps there too, I think it might have a shot.

Personally I wish more software vendors deployed via the store, as that radically reduces the chance that people will get a virus from some random website deploy.

Surface Laptop

This is a new member of the Surface hardware family. It is a laptop, not a tablet or convertible like the Surface Pro or Surface Book.

This is a nice looking and pretty high end laptop. It has Intel Core i5 or i7 chips, a beautiful touch screen like all the other Surface devices, works with the stylus, and comes with as much as 16 GB of memory and a 1 TB SSD. Microsoft is claiming up to 12 hours of battery life.

Personally I really enjoy my Surface Pro 4, and use it as a tablet quite often, so I'm not planning to switch to a laptop. So I'm holding out for a Surface Pro 5 😃

But I understand that a lot of people really like the laptop form factor, and this is like a super-powered MacBook Air with a touch screen and (imo) a better operating system.

Speaking of which, the Surface Laptop will ship with Windows 10 S, and can be upgraded to Windows 10 Pro. So for a lot of "regular users" they'll be able to use it as-is, and for power users or developers we can upgrade to Pro to unlock all the power (though Microsoft warns that this will reduce battery life, because Windows 10 S does a better job in that regard).


Hopefully this helps with some of the confusion. This is not another Surface RT sort of thing. Nor is it a return to ARM-based hardware.

It is a new flavor of Windows 10 focused on regular computer users, thus providing enhanced security and battery life, with a consistent way of deploying apps.

And it is a new member of the Surface hardware family. A high-end laptop for Windows 10 S or Windows 10 Pro.

Tuesday, 02 May 2017 14:45:50 (Central Standard Time, UTC-06:00)  #    Disclaimer
 Friday, 07 April 2017

Trying to figure out the core "meat" behind blockchain is really difficult. I'm going to try to strip away all the hype, and all the references to specific use cases, to get down to the actual technology at play here.

(This is my second blockchain post; in my previous post I compared blockchain today to XML in the year 2000.)

There are, of course, tons of articles about blockchain out there. Nearly all of them talk about the technology only in the context of specific use cases. Most commonly bitcoin and distributed ledgers.

But blockchain the technology is neither a currency nor a distributed ledger: it is a tool used to implement those two types of application or use case.

There's precious little content out there about the technology involved, at least at the level of the content you can find for things like SQL Server, MongoDb, and other types of data store. And the thing is, blockchain is basically just a specialized type of data store that offers a defined set of behaviors. Different in the specifics from the behaviors of a RDBMS or document database, but comparable at a conceptual level.

I suspect the lack of "consumable" technical information is because blockchain is very immature at the moment. Blockchain seems to be where RDBMS technology was around 1990. It exists, and uber-geeks are playing with it, and lots of business people see $$$ in their future with it. But it will take years for the technology to settle down and become tractable for mainstream DBA or app dev people.

Today what you can find are discussions about extremely low-level mathematics, cryptography, and computer science. Which is fine, that's also what you find if you dig deep enough into Oracle's database, SQL Server, and lots of other building-block technologies on top of which nearly everything we do is created.

In other words, only hard-core database geeks really care about how blockchain is implemented - just like only hard-core geeks really care about how an RDBMS is implemented. Obviously we need a small number of people to live and breathe that deep technology, so the rest of us can build cool stuff using that technology.

So what is blockchain? From what I can gather, it is this: a distributed, immutable, persistent, append-only linked list.

Breaking that down a bit:

  1. A linked list where each node contains data
  2. Immutable
    1. Each new node is cryptographically linked to the previous node
    2. The list and the data in each node are therefore immutable; tampering breaks the cryptography
  3. Append-only
    1. New nodes can be added to the list, though existing nodes can't be altered
  4. Persistent
    1. Hence it is a data store - the list and nodes of data are persisted
  5. Distributed
    1. Copies of the list exist on many physical devices/servers
    2. Failure of 1+ physical devices has no impact on the integrity of the data
    3. The physical devices form a type of networked cluster and work together
    4. New nodes are only appended to the list if some quorum of physical devices agree with the cryptography and validity of the node via consistent algorithms running on all devices
      1. This is why blockchain is often described as a "trusted third party", because the cluster is self-policing

Terminology-wise, where I say "node" you can read "block". And where I say "data" a lot of the literature uses the term "transaction" or "block of transactions". But from what I've been able to discover, the underlying technology at play here doesn't really care if each block contains "transactions" or other arbitrary blobs of data.
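To make the append-only, hash-linked structure concrete, here's a minimal sketch in Python. The `Chain` and `block_hash` names are my own illustration, not from any real blockchain implementation, and this deliberately omits the distributed/quorum parts - it only shows how cryptographic linking makes the list tamper-evident:

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents (its data plus the
    link to the previous block)."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

class Chain:
    """A minimal append-only, hash-linked list of blocks."""

    def __init__(self):
        # A genesis block anchors the chain; its prev_hash is a sentinel.
        self.blocks = [{"data": "genesis", "prev_hash": "0" * 64}]

    def append(self, data):
        # Each new block embeds the hash of its predecessor, which is
        # what cryptographically links the list together.
        self.blocks.append(
            {"data": data, "prev_hash": block_hash(self.blocks[-1])}
        )

    def is_valid(self):
        # Recompute each link: altering any block's data changes its
        # hash, breaking the prev_hash stored in the following block.
        for prev, cur in zip(self.blocks, self.blocks[1:]):
            if cur["prev_hash"] != block_hash(prev):
                return False
        return True
```

Tampering with any block changes its hash, which breaks the `prev_hash` link in the next block - so rewriting history means re-linking every later block, which is exactly what the quorum of devices in a real distributed blockchain would refuse to accept.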

What we build on top of this technology then becomes the question. Thus far what we've seen are distributed ledgers for cryptocurrencies (e.g. bitcoin) and proof of concept ledgers for banking or other financial scenarios.

Maybe that's all this is good for - and if so it is clearly still very valuable. But I strongly suspect that, as a low level foundational technology, blockchain will ultimately be used for other things as well.

I'm also convinced that blockchain is almost at the top of the hype cycle and is about to take a big plunge as people figure out what it can and can't actually do.

Finally, I believe that blockchain, assuming there's money to be made in the technology, will become part of established platforms such as Azure, AWS, and GCP. And there might be some other niche players left, but the majority of the many, many blockchain tech providers out there today will ultimately be purchased by the big players or will just vanish.

Friday, 07 April 2017 10:02:32 (Central Standard Time, UTC-06:00)  #    Disclaimer
Powered by: newtelligence dasBlog 2.0.7226.0

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2019, Marimer LLC
