Rockford Lhotka's Blog


 Monday, July 31, 2006

I submit that this is a good move by Microsoft: making the MSDN Library available for free download.

Some may argue that this devalues the MSDN subscription - but frankly that's silly. The vast majority of the Library is available online anyway; all Microsoft has done here is provide a more convenient way to access the data. It isn't like they decided to give away the software for free! Personally, I haven't installed the Library on my machine for well over a year, because I find the web access more convenient.

Dollar per bit, an MSDN subscription is an unbeatable deal for a developer. The ability to get almost every OS, server and development tool for the purposes of development at just over the cost of Visual Studio alone is really quite amazing when you think about it.

Other people will likely argue that this is in response to government actions (the EU in particular). If so, then so be it. I think the EU is out of control and will likely do serious harm to European consumers, and maybe to Microsoft. But the upside for me is that I work for a consulting company, and the more variations on the OS the more time it takes us to build even simple software. Since we charge by the hour, it merely means that software for use in the EU will make us more money than software for use in the US or elsewhere. So perhaps I should be rooting for the EU, because in some perverse way they're likely to make me more money?

Regardless, even if Microsoft is releasing the Library free to help mitigate some "openness" issues in the EU, that is only good news for developers who (for some reason) find it hard to get the content over the Internet.

My view is this: I've worked with IBM software, and the lack of an MSDN-equivalent is devastating to productivity. And I've worked with (and continue to work with) open source software, where the lack of decent documentation and organized support materials is infamous. The investment Microsoft has always made around supporting developer productivity through documentation and MSDN is one of its key success factors - at least in the development world. To me, this is just another small step in Microsoft's continuing support for developers on their platform.

Monday, July 31, 2006 7:31:25 PM (Central Standard Time, UTC-06:00)
 Wednesday, July 26, 2006

Some exciting news! Dunn Training is building a formal training class around CSLA .NET, with plans for the class to be ready in September. I often get requests for CSLA .NET training, and now there'll be a great answer.

Of course Magenic remains the premier source for consulting and mentoring around CSLA .NET. Training is important, but you can't overestimate the value of longer term mentoring!

Given the combination of my books, a formal CSLA .NET class and longer term mentoring and consulting from Magenic, a full array of CSLA .NET resources is coming into being.

And while I'm plugging Dunn Training, I should mention that they have an excellent BizTalk Server 2006 class - just tell them that Rocky sent you :)

Update: Here is a link to the information page on the CSLA .NET class.

Wednesday, July 26, 2006 11:56:44 AM (Central Standard Time, UTC-06:00)
 Friday, July 21, 2006

Paul Sheriff has extended a discount for Paul Sheriff's Inner Circle to people who read my blog:

Anyone who signs up between now and Sept. 1, 2006 will receive a discount.

Have them enter the PROMO code: ROCKY01

If they purchase a Yearly membership, they will receive 1 additional month for free.

Friday, July 21, 2006 8:55:03 AM (Central Standard Time, UTC-06:00)
 Wednesday, July 19, 2006

Yesterday I posted about Paul Sheriff’s new subscription-based online venture. It is an experiment on Paul’s part, and it is something he’s put a huge amount of time and effort into building.

Interestingly, there’s been a bit of pushback – at least in the comments on my blog – to Paul charging for his site. Of course this is an experiment, and only time will tell whether Paul’s investment of time and money in putting it together, and his ongoing investment in building content, will actually pay off.

But I hope it does work, and this is why.

It has been clear for a while now that the world is undergoing some major changes. While the Internet didn't transform the world like all the dot-com nuts thought it would, it really is having a non-trivial (if ponderous) impact as time goes by.

(For a thought-provoking view of a possible future, check out Epic 2014.)

A few of us, Paul and myself included, are trying to figure out how to adapt to this new world. With book sales radically down, magazine subscriptions failing, and technical conferences struggling, it is becoming less and less practical for a professional author/speaker to make a living.

Now it might be the case that free content will have the same quality as professionally created, reviewed and edited content. But I doubt it. Some people can generate quality content without reviewers and editors, but most can’t. And in any case there’s no substitute for experience. As with anything, experience has tremendous value. If you look at any professional author’s work you’ll see a progression as they get better and better at explaining their ideas over time.

Not that there isn't some great free content out there, but wading through all the random content to find it is very expensive. There’s no doubt that some people invest their time and effort in improving their writing skills for free, but over time it is hard to commit to that level of focus without some level of compensation.

I specifically avoided saying that some people do this as a hobby, because I think that is very rare. People write to get compensation. In many cases it is financial – either directly, because they get paid to write, or indirectly, because they expect to get a raise, or to more easily job-hop into a raise.

Coming back to that sifting through the web thing though… Time isn't free. In fact I'm of the opinion that time is far more valuable than money for most of the people in our profession. Wasting hours sifting through random outdated, or just plain poor, content to find that one gem on someone's blog is really costly.

For some people it is worth that time, for others it is not. There's no way to pass a global value judgment on this, because different people have different jobs and priorities. If I can spend a couple hours writing code, I'm much happier than if I spent a couple hours reading random web content. Other people love reading and sifting through random web content and don't begrudge that time in the slightest.

One thing that I always keep in mind though, is that we (in the US and Europe anyway) cost 4-7 times more than people in India or China. That means we need to be 4-7 times more productive to justify our existence. So that time spent sifting through the web needs to result in some pretty impressive productivity or it was just a very high cost.

I sift through the web at least as much as the next guy, don’t get me wrong. But not really by choice. If some web-sifter out there started a subscription-based index into content that is actually up to date and valid I’d pay for it. Google is great, but just think if there was a Google that only searched meaningful content!?! I don’t care about the vast majority of what people put on the web, there are just a few gems I’m looking for.

Unfortunately, thus far the idea of a paid index for content hasn’t proven to be a viable business model. And the web is undermining traditional forms of providing content. So the world is changing.

But I don’t believe for a minute that the value of professional content is lower than in the past; I just think the delivery of that content is in flux.

So the question, then, is how to deliver professional content in this new world – and in a way where the producers, reviewers and editors of the content are compensated for their effort. Time isn’t free, not for you as the reader, nor for those of us engaged in professionally producing that content.

We’ll all find out whether Paul’s experiment works or not over time. But he’s not alone in looking for ways to adapt to this new world, and you can expect to see some experiments from other people as well – including me – in the relatively near future.

Wednesday, July 19, 2006 5:31:00 PM (Central Standard Time, UTC-06:00)
 Tuesday, July 18, 2006

My good friend Paul Sheriff is trying something new - a subscription-based web site where you can tap into his expertise on various topics of tremendous interest to developers. The site is Paul Sheriff's Inner Circle, and it is worth taking a look.

Tuesday, July 18, 2006 7:04:57 PM (Central Standard Time, UTC-06:00)

Over the past few years the community around my CSLA .NET framework has become very large and very active. Recently the online forum for CSLA .NET, which is the center of the community in many ways, moved to a new home: http://forums.lhotka.net.

Just this week another community effort has become reality: CSLAcontrib. CSLAcontrib is a "project of projects" into which the community can contribute tools, templates and add-ons for CSLA .NET. I assume no ownership over anything on CSLAcontrib - this is a pure community effort.

This community effort has my full support, and the people involved have my gratitude. I find the community efforts around CSLA .NET to be both humbling and inspiring, and I appreciate each and every person who takes the time to get involved! Thank you!

CSLAcontrib is hosted on Microsoft's CodePlex site: http://www.codeplex.com/Wiki/View.aspx?ProjectName=CSLAcontrib, and operates under an open-source license.

The goal of CSLAcontrib is to provide a clear, powerful and friendly home for all the wonderful community contributions around CSLA .NET. There are many existing contributions out there, some of which can benefit from having a more visible home. Others were hosted on gotdotnet, and CodePlex offers a far superior environment in terms of performance, stability and ease of use.

But I think this is just the tip of the iceberg. CSLA .NET 2.0 has some interesting extensibility points, including common validation rule methods, data portal channels, and sub-classes of the six core base classes. I think it would be very interesting and useful to see more advanced and powerful validation rule methods in particular. And a colleague of mine had, at one point, suggested building a data portal channel that ran over an IM protocol :-)

So if you are looking for a way to contribute your cool tool, template or add-on, CSLAcontrib is a great option. If you are looking to use some great tools, templates or add-ons for CSLA .NET, CSLAcontrib should become the go-to place.

Tuesday, July 18, 2006 8:54:10 AM (Central Standard Time, UTC-06:00)
 Friday, July 14, 2006

CSLA .NET version 2.0.3 is now available for download at www.lhotka.net/cslanet/download.aspx. This is a bug fix update to address various errata and issues that readers have found since version 2.0.2.

Most notably, this version (hopefully) fixes the issues with CslaDataSource, where it would sometimes fail to find the business assembly. Please note that the type name is case-sensitive and typos are now the primary reason the type/assembly can't be found.

To give you some idea what's going on with CslaDataSource, here's the short story:

In CSLA .NET 2.0 (and in the books), I implemented CslaDataSource so it loaded the business assembly at design time (in VS 2005) to get the assembly's metadata. The metadata is needed for the web forms designer so the GridView and DetailsView controls can show the right columns/rows. And that worked, but had the problem that changes to the business assembly during development weren't picked up by CslaDataSource.

In 2.0.1 I altered CslaDataSource to load the business assembly in a temporary AppDomain. That allows me to unload the AppDomain and thus unload the assembly - so CslaDataSource can always reflect against the latest assembly. Which is a great idea, except that there's no supported (or even unsupported really) way to find the current business assembly. It turns out that VS 2005 puts the assemblies into a shadow directory, creating a new shadow directory each time you build your project or update an external assembly. To work around this, I am doing a sort of hack by inferring the shadow directory location and then finding the most recently created shadow directory (by date/time).

Now, in 2.0.3 I altered CslaDataSource yet again, to better infer the shadow directory location. My previous approach didn't handle having Csla.dll in the GAC, and failed at other (seemingly random) times as well. The new approach infers the shadow directory inside the primary VS 2005 AppDomain, using the business assembly/type as the source location (rather than Csla.dll). This change appears to have fixed the random failures and should allow Csla.dll to be in the GAC. I doubt it allows the business assembly to be in the GAC, but (imo) that's not a good idea anyway.
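To make the "most recently created shadow directory" heuristic concrete: the actual CslaDataSource implementation is C# and isn't reproduced here, but the core idea - scan the shadow-copy root and pick the newest folder, since VS 2005 creates a fresh one on every build - can be sketched like this (a Python illustration only; the function name is mine):

```python
import os

def newest_shadow_dir(shadow_root):
    """Return the most recently created subdirectory of shadow_root.

    VS 2005 creates a new shadow-copy folder each time the project is
    built, so the newest folder should hold the current build of the
    business assembly. This is an illustration of the heuristic
    described in the post, not the actual CslaDataSource code.
    """
    subdirs = [
        os.path.join(shadow_root, name)
        for name in os.listdir(shadow_root)
        if os.path.isdir(os.path.join(shadow_root, name))
    ]
    # st_ctime is the closest portable stand-in for "creation time"
    return max(subdirs, key=lambda d: os.stat(d).st_ctime)
```

As the post notes, this kind of date-based inference is inherently a hack - anything else creating folders under the shadow root can confuse it - which is part of why the 2.0.3 change anchors the inference on the business assembly itself rather than Csla.dll.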

Friday, July 14, 2006 8:02:57 AM (Central Standard Time, UTC-06:00)
 Tuesday, July 11, 2006

After spending (what to me was) an unbelievable number of hours fighting with CSS, I have a new web site: www.lhotka.net is now ASP.NET 2.0, with master pages, themes and a pure CSS layout. Special thanks to Mike Gale for helping me with some issues, and to http://www.positioniseverything.net/ for providing tools that generate various types of CSS layouts. It continues to amaze me that you need a tool (and various browser-specific hacks) to generate simple tabular page layouts - but I guess that's the web for you...

Anyway, the end result is that the web site is no longer hosted on a ~7 year old dual P3/450 machine with 512 megs of RAM - but rather is hosted on a (relatively) new Athlon 3100 with a couple gigs of RAM. So not only does it look different, but it is a bit faster too :)

However, some of the speed is bled off by my choice to jump into virtualization. The web "server" is actually now a virtual machine. (yes, I know you are supposed to keep these sorts of details private to slow hackers - but sharing information is the key to forward motion...)

I chose to do this because it means I have a lot more flexibility going forward. In particular, if this physical machine were to crash, I could rehost the vhd files on another machine and get running again relatively fast. Trying to do that with an OS directly installed on the hard drive is non-trivial (requiring something like Ghost or Acronis TrueImage).

With the virtualization, I can script it so the VM shuts down, the vhd files are copied elsewhere (as a backup) and the VM is restarted. Full image backups in just a couple minutes! Hard to beat that as a feature!
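That backup loop is easy to automate. Here's a rough sketch of the copy step in Python (the .vhd folder layout is an assumption, and the VM stop/start commands depend entirely on the virtualization product, so they appear only as placeholders):

```python
import os
import shutil
import time

def backup_vhds(vm_dir, backup_root):
    """Copy every .vhd file in vm_dir into a new timestamped folder
    under backup_root, and return that folder's path. Meant to run
    between a scripted VM shutdown and restart."""
    dest = os.path.join(backup_root, time.strftime("backup-%Y%m%d-%H%M%S"))
    os.makedirs(dest)
    for name in os.listdir(vm_dir):
        if name.lower().endswith(".vhd"):
            shutil.copy2(os.path.join(vm_dir, name), dest)
    return dest

# The surrounding automation would look roughly like:
#   stop_vm("web")                   # placeholder: product-specific command
#   backup_vhds("d:/vms/web", "e:/backups")
#   start_vm("web")                  # placeholder: product-specific command
```

The timestamped folder name means each run produces a distinct full-image backup, matching the "full image backups in a couple minutes" workflow described above.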

It is also the case that this server box is largely underutilized. It isn’t like my web site or blog are popular enough to overwhelm the machine. So now I have the ability to host other virtual machines on the same physical box, for testing of beta software or hosting other types of servers if I have the need going forward.

I had also considered moving from cvs to svn at the same time as the rest of the upgrade process. In researching that, I decided there was simply too much risk. While Subversion installed OK, the cvs2svn tool was unable to handle the CSLA .NET code repository. As with many of these tools, it was written by Un*x people for their needs, and has never really been made to work with the Windows equivalent software (cvsnt in my case). While it works for some people, my repository is apparently too old and complex (tagging/branching/deleting of files/directories).

The end result is that switching to svn means abandoning all the history I have in cvs. I’m not quite ready to do that, nor do I want to lose the cvsgraph abilities from viewcv (because there’s no equivalent for svn at present I guess). So for now, at least, I’m sticking with cvs.

Of course the obvious question is why I didn’t switch to Microsoft VSTS/TFS. There are perhaps two good reasons: licensing and functionality. Licensing isn’t a problem for me directly – I have a workgroup license that would serve me fine. But it isn’t clear that I could expose the code repository externally and still satisfy the license – and I think there’s tremendous value in www.lhotka.net/cslacvs... And functionality is an issue too – primarily in that I don’t know of a viewcv equivalent for TFS. So even if licensing is/was a non-issue – the lack of a read-only web viewer for the repository is a show-stopper.

Given time I am sure these issues will work themselves out – so by the time I’m ready to move off cvs perhaps TFS will be a viable choice alongside Subversion. Time will tell…

Tuesday, July 11, 2006 12:31:12 PM (Central Standard Time, UTC-06:00)

Disclaimer
The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2014, Marimer LLC
