Rockford Lhotka's Blog


 Wednesday, 05 April 2017

It seems to me that blockchain today is where XML was at its beginning: a low-level building block on which people are constructing massive hopes and dreams, while mentally bypassing the enormous amount of work necessary to get from here to those goals. What I mean is perhaps best illustrated by a work environment I was in just before XML came on the scene.

The business was in the bio-chemical agriculture sector, so they dealt with all the major chemical manufacturers and providers in the world. They'd been part of an industry working group, composed of those manufacturers and various competitors, for many years at that point. The purpose of the working group was to develop a standard way of describing the "products", components, parts, and other aspects of the various "products" being manufactured, purchased, resold, and applied to farm fields.

You'll note that I used the word "product" twice, and put it in quotes. This is because, after all those years, the working group never did figure out a common definition for the word "product".

One more relevant detail: everyone had agreed to transfer data via COBOL-defined file structures. I suppose that dated back to when they traded reels of magnetic tape, but it carried forward to transferring files via FTP, and subsequently the web.

Along comes XML, offering (to some) a miracle solution. Of course, XML only solved the part of the problem these people had already solved: how to devise a common data transfer language. Was XML better than COBOL headers? Probably. Did it solve the actual problem of what the word "product" meant? Not at all.

I think blockchain is in the same position today. It is a distributed, append-only, reliable database. It doesn't define what goes into the database, just that whatever you put in there can't be altered or removed. So in that regard it is a lot like XML, which defined an encoding structure for data but didn't define any semantics around that data.
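
To make the append-only idea concrete, here is a toy sketch (in C#) of a hash-chained log. This is purely illustrative – my own sketch, not how any real blockchain is implemented, since a real one adds distributed consensus and much more – but it shows why entries can't be altered or removed once written: each entry's hash incorporates the hash of the entry before it, so tampering anywhere breaks the chain everywhere after that point.

    using System;
    using System.Collections.Generic;
    using System.Security.Cryptography;
    using System.Text;

    // Toy append-only ledger: each entry chains to the previous entry's hash.
    public class ToyLedger
    {
        // Each tuple holds (data, hash of prevHash + data).
        private readonly List<Tuple<string, string>> _entries =
            new List<Tuple<string, string>>();

        public void Append(string data)
        {
            string prevHash = _entries.Count == 0
                ? string.Empty
                : _entries[_entries.Count - 1].Item2;
            _entries.Add(Tuple.Create(data, Hash(prevHash + data)));
        }

        // Recompute the whole chain; any tampering shows up as a mismatch.
        public bool Verify()
        {
            string prevHash = string.Empty;
            foreach (var entry in _entries)
            {
                if (entry.Item2 != Hash(prevHash + entry.Item1))
                    return false;
                prevHash = entry.Item2;
            }
            return true;
        }

        private static string Hash(string input)
        {
            using (var sha = SHA256.Create())
            {
                byte[] bytes = sha.ComputeHash(Encoding.UTF8.GetBytes(input));
                return BitConverter.ToString(bytes).Replace("-", "");
            }
        }
    }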

The concept of XML then, and blockchain today, is enough to inspire people's imaginations in some amazing ways.

Given amazing amounts of work (mostly not technical work, but work at the business level) over many, many years, XML became something useful via various XML languages (e.g. XAML). A lot of the low-level technical benefits of XML have since been superseded by JSON, but that's largely irrelevant, since all the hard work of devising standardized data definitions applies to JSON just as well as to XML.

I won't be at all surprised if blockchain follows the same general path. We're at the start of years and years of hard, mostly non-technical work to devise ways to use the concept of a distributed, append-only, reliable database. Along the way the underlying technology will become standardized and will merge into existing platforms like .NET, Java, AWS, Azure, etc. Nor will I be surprised if some better technical solution comes along (as JSON did), but that better solution probably won't matter much, because the hard work is in figuring out the business-level models and data structures necessary to make use of this underlying database concept.

Wednesday, 05 April 2017 11:40:44 (Central Standard Time, UTC-06:00)
 Thursday, 30 March 2017

I really like the new VS 2017 tooling.

However, it has some real problems – it is far from stable for netstandard.

I have a netstandard class library project. Here are issues I’m facing.

  1. Every time I edit the compilation constants on the Build tab in project properties it adds YET ANOTHER copy of the RELEASE;NETSTANDARD1_6 constants – those duplicates add up fast! (See the csproj sketch after this list.)
  2. The output path acts really odd – it always insists on appending something like \netstandard1.5\ to the end of the output path, even if the path already ends with \netstandard1.5\ – in NO case can I get it to use the path I actually want!! This should act like normal projects, IMO – not arbitrarily appending crap to my path!
  3. I have one netstandard class library referencing another via a project reference, and this doesn't seem to be working at all – none of the types from the first class library project are available in the second.
  4. The Add References dialog doesn't show existing references to Shared Projects – the only way to know a reference is already there is to look at the csproj file in a text editor.
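
To illustrate items 1 and 2, here is roughly what ends up in the csproj file (a hand-written illustration of the symptom, not an exact dump from my project): each pass through the Build tab appends another copy of the same constants, and the output path already carries a target framework suffix that the build then appends yet again.

    <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|AnyCPU'">
      <DefineConstants>RELEASE;NETSTANDARD1_6;RELEASE;NETSTANDARD1_6</DefineConstants>
      <OutputPath>bin\Release\netstandard1.5\</OutputPath>
    </PropertyGroup>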

We’re going in what I think is a good direction with the tooling, but right now it is hard (or impossible) to integrate netstandard projects into a normal workflow because the tooling is pretty buggy.

Thursday, 30 March 2017 16:39:30 (Central Standard Time, UTC-06:00)
 Monday, 20 February 2017

According to this article, the smartphone boom is over, and the "next big thing" isn't really here yet. I would argue that's good. We need a breather to catch up with all the changes from the past several years.

In a sense, there have been two periods in my career that were really fun from the perspective of solving business problems (as opposed to other periods that were equally fun from the perspective of learning new tech).

One was the couple of years before and after 1990, when the minicomputer ecosystem was generally stable (HP 3000, Unix, and VAX were common options). The other was the six years when VB6 was dominant: .NET was still nascent, VB had matured, and Windows was the de facto target for all client software.

In both those cases there was a 5-6 year window when the platforms were slow-changing, the dev tools were mature, and disruption was around the fringes, not in the mainstream. From a "learn new tech" perspective those were probably pretty boring periods of time. But from a "solve big business problems" perspective they were amazing periods of time, because everyone felt pretty comfortable using the platforms/tools at hand to actually do something useful for end users.

The iPad turned the world on its ear, and we're just now back to a point where it is clear that the platform is .NET/Java on the server and Angular on the client (regardless of the client OS). The server tooling has been fine for years, but I think we can see real stability for client development in the near future - whew!

So if the chaos we've been suffering through for the past several years (decade?) is coming to an end, and there's no clear "next big thing", then with any luck we'll find ourselves in a nice period of actual productivity for a little while. And I think that'd be refreshing.

Monday, 20 February 2017 13:29:39 (Central Standard Time, UTC-06:00)
 Wednesday, 08 February 2017

The concept of identity with Microsoft services is a mess, something I probably don't have to tell any Microsoft developer.

Some services can only be attached to a personal Microsoft Account (MSA), and other services can only be used with an Active Directory (AD) account. For example, MSDN can only be directly associated with an MSA, while Office 365 can only be associated with an AD account. Some things, like VSTS, can be associated with either one, depending on the scenario.

I used to have the following:

  • r___y@lhotka.net - MSA with my MSDN associated
  • r___y@magenic.com - Magenic AD account
  • r___y@magenic.com - MSA with nothing attached (I created this long ago and forgot about it)

That was a total pain when I started using O365 and an AD-linked VSTS site with my r___y@magenic.com AD account, because Microsoft couldn't automatically distinguish between my AD and MSA accounts – both were named r___y@magenic.com. As a result, every time I tried to log into one of these sites I'd be asked whether this was a personal or work/school account.

Fortunately you can rename an MSA to a different email address. I renamed my (essentially unused) r___y@magenic.com MSA to a dummy email address, so now I really do have just two identities:

  • r___y@lhotka.net - MSA with my MSDN associated
  • r___y@magenic.com - Magenic AD account

This way Microsoft automatically knows that my AD login is a work/school account, and I don't have to mess with that confusion.

There's still the issue of having MSDN attached to an MSA while also needing some connection from my AD account to my MSDN subscription. This is required because we have VSTS sites associated with Magenic's AD, so I need to log in with my AD account, but VSTS still needs to know I'm a valid MSDN user.

Here's info on how to link your work account to your MSDN/MSA account.

At the end of the day, if I'd never created that r___y@magenic.com MSA (many years ago), my life would have been much simpler from the start. Fortunately the solution to that problem is to rename the MSA email to something else, removing the confusion between the AD and MSA accounts.

The MSDN linking makes sense, given that you need an MSA for MSDN, and many of us need corporate AD credentials for all our work sites.

Wednesday, 08 February 2017 12:51:54 (Central Standard Time, UTC-06:00)
 Thursday, 05 January 2017

I was reading an HBR article about Why Being Unpredictable Is a Bad Strategy and all I could think about was the Windows 8 debacle.

Leading up to the development and release of Windows 8, Microsoft switched from an open and predictable model to a very closed and secretive one. Sure, they'd waffled back and forth in the years prior to Windows 8, but it wasn't until that point in their history that they went "entirely dark" about something as important as Windows itself.

Personally I think they were copying Apple, because at that point in time Apple was ascendant with the iPad and Microsoft was worried. The thing is, a secretive model works for Apple because nobody relies on their long-term vision for stability. Their target is consumers, who like fun stuff and care little if things break every couple of years.

Microsoft's primary customer base is small, medium, and large enterprises that spend millions or billions on IT. They don't like fun; they like predictable roadmaps that minimize cost and risk. The last thing a business wants is a version of Windows that comes out of the blue and breaks all their software, or requires complete retraining of their entire user base.

Worse yet, with Windows 8 Microsoft not only increased risk for all of its business customers, it totally cut off all avenues for feedback and improvement of the product until after it was released – after it was too late to address the numerous major issues with the new OS.

Fortunately Windows 10 has been a whole other story. Microsoft not only returned to their original open communication model, they've actually become more open than at any point in their history. And it shows, in that the business world now has a predictable roadmap, and Windows has never been so closely shaped by real-world customer feedback.

The result is that Windows 10 adoption is proceeding at a rapid pace, and Microsoft is (ever so slowly) rebuilding trust with its customers.

Thursday, 05 January 2017 16:04:55 (Central Standard Time, UTC-06:00)
 Tuesday, 03 January 2017

We're having a conversation on Magenic's internal forum about the current JavaScript community's reaction to all these frameworks. Some people in the industry are looking at the chaos of frameworks and libraries and deciding to just write everything in vanilla JS, eschewing the use of external dependencies.

And I get that, I really do. However, I'm also an advocate of using frameworks - which shouldn't be surprising coming from the author of CSLA .NET.

Many years ago I spoke at a Java conference (they were trying to expand into the .NET space too).

At lunch I listened to a conversation between some other folks at the table; they were discussing the use of Spring (which was fairly new at the time).

Their conclusion was that although Spring did a ton of useful and powerful things, it was too big and complex, so they'd rather not use it and instead solve all those problems (the problems solved by Spring) themselves.

I see the same thing all the time with CSLA .NET. People look at it and see something that is big and complex, and think "those problems can't be that hard to solve", so they end up rewriting (usually poorly) large parts of CSLA.

I say "usually poorly" because their job isn't to create a well-tested and reusable framework. Their job is to solve some business problem. So they solve some subset of each problem that Spring or CSLA solves in-depth, and then wonder why their resulting app is unreliable, or performs badly, or whatever.

As the author of a widely used OSS framework, my job is to create a framework that solves and abstracts away key problems that business developers would otherwise encounter. Because of this, I'm able to solve those problems in a broader and deeper way than a business developer, whose goal is to put as little effort as possible into solving the lower-level problems, because they are just a distraction from solving the actual business problem.

So yeah, I do understand that some of these frameworks, like Angular, Spring, CSLA .NET, etc., are complex and have their own learning curves. But they exist because they solve a bunch of lower-level, non-business problems that you will otherwise have to solve yourself. And the time you spend solving those problems provides zero business value, and ultimately adds to the long-term maintenance cost of your resulting business software.

There's no perfect answer here, to be sure. But for my part, I like to think that the massive amounts of time and energy spent by framework authors to truly understand and solve those hard non-business problems is time well spent, allowing business developers to focus on solving the problems they are actually paid to address.

Tuesday, 03 January 2017 10:53:41 (Central Standard Time, UTC-06:00)
 Wednesday, 28 December 2016

In a previous blog post I related a coding standards horror story from early in my career. A couple commenters asked for part 2 of the story, mostly to see how my boss, Mark, dealt with the chaos we found in the company who acquired us.

There are two fortunate things that relate to the story.

First, they bought our company because they wanted our software, and because they wanted Mark. It is quite possible that nobody else in the world combined Mark's understanding of the vertical industry with his software dev chops, so he had a lot of personal value.

Second, before the acquisition I'd been tasked with writing tooling to enable globalization support for our software. Keep in mind that this was VT-terminal-style software, and all the human-readable text shown anywhere on the screen came from text literals or strings generated by our code. The concept of a resx file like we have in Windows didn't (to our knowledge) exist, and certainly wasn't used in our code. Coming right out of university, lexical parsing and compiler construction were fresh in my mind, so my solution was to write a relatively simplistic parser that found all the text in the code and externalized it into what today we'd call a resource file.
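
For the curious, the core idea looked something like the following sketch. To be clear, this is a modern C# illustration of the concept, not the original tool (which was written decades ago, for a different language, and handled far more edge cases), and the GetText helper it emits is hypothetical – it stands in for whatever runtime lookup the real system used.

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text.RegularExpressions;

    public static class TextExternalizer
    {
        // Naive match for double-quoted literals; a real lexer must also
        // handle escape sequences, comments, and multi-line constructs.
        private static readonly Regex Literal = new Regex("\"([^\"\\\\]*)\"");

        public static void Main(string[] args)
        {
            string source = File.ReadAllText(args[0]);
            var resources = new Dictionary<string, string>();
            int nextId = 1;

            // Swap each literal for a resource key, recording the original text.
            string rewritten = Literal.Replace(source, match =>
            {
                string key = string.Format("TXT{0:D4}", nextId++);
                resources[key] = match.Groups[1].Value;
                return string.Format("GetText(\"{0}\")", key); // hypothetical lookup
            });

            File.WriteAllText(args[0] + ".externalized", rewritten);

            // One key=value line per extracted literal.
            var lines = new List<string>();
            foreach (var pair in resources)
                lines.Add(pair.Key + "=" + pair.Value);
            File.WriteAllLines(args[0] + ".resources.txt", lines);
        }
    }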

That project was probably one of the most fun things I've ever done. And it was one of the few times in my career when a university compiler design course directly applied to the real world.

Because Mark was so well regarded by the new company, he ended up in charge of the entire new development team. As such, he had the authority to impose his coding standards on the whole group, including the team of chaos-embracing developers. Not that they were happy about it – this was the late 1980s and jobs weren't plentiful – and my recollection is that they grumbled both about the change and about the fact that it was a "damn Yankee" imposing his will on the righteous people of The South. But they went along with it.

However, that still left massive amounts of pre-existing code that was essentially unreadable and unmaintainable. To resolve that, Mark took my parser as a base and wrote tools (I don't remember if it was one tool, or one per coding "style") that automatically reformatted all that existing code into Mark's style. That was followed by a ton of manual work to fix edge cases and retest the code to make sure it still worked. In total I think the process took around two months, certainly not longer.

I wasn't directly involved in any of that fix-up process, as I had been assigned the biggest project yet in my young career: building support for an entire new product line in a vertical related to our original software's focus. Talk about a rush for someone just a year out of university!

Wednesday, 28 December 2016 14:05:51 (Central Standard Time, UTC-06:00)
 Tuesday, 27 December 2016

I've always been a fan of speculative fiction, and in particular the sub-genre of cyberpunk and what is often now called dark space opera (which usually has cyberpunk aspects).

As most people have become aware over the past few decades, good science fiction explores possible futures that come about due to technological advancement. The focus is usually on the changes to society or mankind, with the technology being just a driver for the change. If you weren't aware that this is the core of good SF, then I'm happy to let you know that you should be reading this sort of fiction, because it will help you be more prepared for changes as they occur.

Among the key themes inherent in most of these speculative futures is the idea of automation: that computers, robots, and machines will automate away some (or nearly all) of the jobs humans have done in the past, or do today. A couple of decades ago this was pure fiction; today we can see that it is an almost unavoidable future.

Personally I find this interesting because my entire career has been in the software industry, and most software is all about automating away people's jobs. Not that we usually frame it that way, but the reality is that corporations used to have massive numbers of accountants; now they have a small handful, because computers do the work of those many, many thousands of accountants from the past. And software drives robotics, and machines, and all kinds of automation. My career has been all about driving toward a future of automation, so I tend to think about what that means for society.

For example, I was just reading that driverless cars will eliminate over 200 categories of jobs. We already know that nearly any factory work can be automated; it is just a matter of whether the automation is cheaper than offshore labor. There's essentially no way US labor can be cheap enough to keep the work from being automated, so bringing jobs back from offshore is entirely unrealistic.

This article from a Nobel economist sums up how robotics threatens jobs rather nicely. And explains why worries about outsourcing jobs, or thoughts of trying to "bring them back" are not really important.

Capitalism and the free market drive companies to find the lowest cost way to provide the minimum viable product that makes the most profit. That's brutal, but it is true. Current US and European trends toward right-wing thinking tend to focus a lot on removing barriers so corporations can better pursue capitalistic and free market policies.

So companies will either find super-cheap labor somewhere in the world, or if that's too hard or expensive then they'll automate those tasks so they don't require large numbers of humans at all. Whatever costs less in the long run will win, and that will not involve human labor.

Assuming we're going to stick with capitalism, corporatism, and free market concepts (and I think that's a safe bet), the question isn't whether most people on the planet will become unemployed. The question is how humanity and society will deal with most people being unemployed.

One common trope in speculative fiction, and in reality, is the idea of a basic income provided for unemployable people. This article against universal basic income (make sure to click through to the author's original article with the details) makes some good points about the risks of a basic income. Sadly, even after you read the author's original (and often good) points, I think it is clear that he maintains unrealistic hopes about keeping most people employed in some manner.

I don't have the answer. I don't know what society looks like when factories that once required thousands of workers need only a few hundred technicians to keep the machines running.

Can we retrain those thousands of unemployed piece workers for another factory? And what stops that other factory from becoming automated? When that happens, do we retrain those people for jobs that can't be automated? What jobs are those?

We already see that fast food, driving vehicles, factory work, warehouse work - these are all on the chopping block. That's hundreds of millions of jobs headed for automation. What other economic segment has demand for a few hundred million generally unskilled workers?

In India, I'm told, construction projects intentionally avoid the use of big machines, preferring instead to bring in hundreds of people with shovels and picks. I don't know if this saves money, but it keeps people busy and helps preserve social order (note that I have nothing to back this up – I'm going on second-hand knowledge here).

During and after the US Great Depression the government came up with a lot of busywork projects to keep people employed. Not bad projects, either: in my youth there were a lot of highway rest stops and other small projects that had been built by the CCC. Those have mostly been replaced now by more modern facilities, created with far less labor-intensive technologies and techniques. But perhaps we need to return to building public works using rustic hand-laid stonework?

My point is this: whether we go with something like a basic minimum income, or use legal structures to try to block the free market from optimizing away our jobs, or something else, we absolutely need to come up with some societal answer for what we'll do when the majority of humanity is unemployed.

Tuesday, 27 December 2016 12:50:33 (Central Standard Time, UTC-06:00)

Disclaimer
The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2017, Marimer LLC
