Rockford Lhotka

 Friday, May 30, 2008

I have hesitated to publicly discuss my experiences with Vista, acting under the theory that if you can't say something nice you shouldn't say anything at all. But at this point I have some nice things to say (though not all nice), and I think there's some value in sharing my experiences and thoughts.

On the whole, my experience with Vista has been decidedly mixed.

Vista is very pretty. It is clearly the future in many ways (especially around IIS 7 and WAS and security in general). And it has some nice usability features – like a far better replacement for ntbackup, and pre-enabled shadowing (so you can retrieve old files if you lose/overwrite them). And quite a few OS features are easier to find/use than in XP (once you get used to the changes).

However, it is slower and more resource-intensive than XP. So you can’t upgrade from XP and expect the same level of performance or responsiveness on the same hardware. If your hardware is more than a few months old, I really can't recommend an upgrade.

I upgraded my (now two-year-old) laptop when Vista came out, and have been generally displeased with the results. It is running Vista Business. However, as Vista has aged, Microsoft has issued patches, fixes, and updates that have helped with stability and performance. I would say it is now tolerable, or at least I've learned to live with it. While it is workable, it isn't really satisfying - to me at least. The laptop is a dual-core machine with 3 gigs of RAM and a low-end GPU.

One thing I'll note is that upgrading the laptop from 2 to 3 gigs of RAM made a huge difference in performance. Vista really likes memory, and the more you can get in your machine the happier you'll be.

I didn’t upgrade my desktop to Vista until I replaced my desktop machine. I do almost all my work on this machine, and wasn't about to deal with the performance issues on a constant basis.

My new desktop machine, which is running Vista Ultimate, is a quad core with 4 gigs of RAM and a high end GPU. I find that Vista runs quite adequately on this (admittedly high-end) machine. My current bottlenecks are memory speed (but DDR3 is too expensive) and disk IO (but 10k RPM disks are too loud – I’m one of those “silent computer” nuts).

I have colleagues who are running Vista 64 bit. Apparently that is faster and more stable (partially due to fewer iffy drivers, and because 64 bit gets all your 4+ gigs of RAM). But I'm a gamer, so I'm kind of stuck with 32 bit until there are 64 bit versions of Battlefield 2142, Supreme Commander, Sim City 4 and Civ IV :)

One thing that has really helped my Vista experience is the discovery of TeraCopy. Vista is notorious for slow file copies (especially when copying multiple files). It is a bit sad that Vista's slow file copies have enabled a product niche for something like a file copy utility (how 1990 is that!), but whatever, it works.

I have UAC turned on. I realize many devs turn it off. But if our users are to live with it, I think developers should too. And personally I think it should be illegal for Microsoft employees to turn it off – they should know what they are doing to their customers.

The thing is, I have not found UAC to be overly troublesome. Yes, there are some extra dialogs when installing software – but that's not a big deal imo, and is an acceptable trade-off for the security. The bigger frustrations with UAC are simpler things – like trying to create a favorite in IE, or copying a shortcut to the Programs menu – both of which turn out to be really hard due to UAC.

I have done some work with VS 2005 under Vista. You have to run as admin to debug web apps, which means you can't double-click sln files. There may have been some other quirks too; I don't recall. Long ago I created a virtual machine with XP where I have VS 2005 installed, and any 2005 work is done there (for .NET 2.0/3.0 - primarily maintenance for CSLA .NET 3.0).

For months now I've been using VS 2008, largely under Vista. The experience is quite smooth. You do have to run 2008 as admin to debug a web app running in IIS, but not in the dev web server. It really isn’t a big deal to run as admin for VS if you need to debug a web app. This is the intended “escape hatch” for developers that need to do things a normal user should not be able to do. It is a little frustrating to not be able to double-click a sln file, but I can deal with that small issue.

And for WPF or Windows Forms work (and a lot of web work where the dev server can be used) you don't need to run as admin at all.

In the final analysis, if you have a relatively new machine with high-end hardware and lots of RAM, then I think Vista is a fine OS, even for a developer. But if your machine is more than a few months old, has less than 3 gigs of RAM or has an older GPU, I'd hesitate to leave XP.

Friday, May 30, 2008 10:45:18 AM (Central Standard Time, UTC-06:00)
 Wednesday, May 28, 2008
Magenic is holding a full-day, two-track mini-conference on June 20.
That is just 3 weeks away, but there are still some open seats, so reserve yours now!

Our first keynote speaker is Jay Schmelzer, GPM of the RAD tools group at Microsoft. He'll be talking about the future of RAD tools (Visual Studio and more).

Our second keynote speaker is me, Rockford Lhotka. I'll be talking about the future of CSLA .NET and CSLA Light, a version of CSLA that will run in Silverlight.

The rest of the day is divided into two tracks for a total of 8 high-quality technical sessions. This will be a full day of hard-core technical content and fun!

This FREE event is being held in Downers Grove near Chicago, IL. It starts at 8:30 AM, we're providing lunch, and the event runs through to a reception at the end of the day at around 5 PM.

If you'd like an invitation to attend, please email

Click here for more information about the event.

Wednesday, May 28, 2008 10:36:55 AM (Central Standard Time, UTC-06:00)
 Tuesday, May 27, 2008

Sue, my wife's best friend, and mother of my goddaughter, was diagnosed with breast cancer late last year. Fortunately they caught it early and with surgery and radiation she beat the cancer. It was a rough period of time, but she got through it.

Sue has convinced my wife to do a three day, 60 mile, Walk for the Cure this fall. It is a fundraiser for cancer research, and a worthy cause. I would think there must be better ways to raise money than to walk 20 miles a day for three days, but apparently not...

Here's my wife Teresa's post about the walk, including links to a web site where people (you perhaps?) can donate to the cause.

In fact, just to make it as easy as possible, here's the direct link to the donation page.

I rarely post personal items on my blog, preferring to keep it focused on cool technical stuff. But cancer is such an important issue. Sue beat it through good fortune and perseverance. My mother is living in an ongoing battle against a different type of cancer. A battle that appears unwinnable, and which has altered our lives dramatically over the past few years.

Any help in raising money for this cancer research effort is most appreciated! Thank you!

Tuesday, May 27, 2008 9:36:26 PM (Central Standard Time, UTC-06:00)
 Wednesday, May 21, 2008

OK, this is really cool:

It appears that Microsoft, having reserved the convention center for the two weeks of Tech Ed (Dev week and IT Pro week), is allowing the user groups in the region to take advantage of the idle space in the intervening weekend. That is so cool!!

But now, to be fair, Microsoft should provide two days of conference center space and AV in <insert your city here>, don't you think? :)

Wednesday, May 21, 2008 8:12:30 PM (Central Standard Time, UTC-06:00)
 Tuesday, May 20, 2008

I was just in a discussion about ClickOnce with several people, including Brian Noyes, who wrote the book on ClickOnce.

I was under the mistaken impression, and I know quite a few other people who have this same misconception, that ClickOnce downloads complete new versions of your application each time you publish a new version. In fact, I know of at least a couple companies who specifically chose not to use ClickOnce because their app is quite large, and re-downloading the whole thing each time a new version is published is unrealistic.

It turns out though, that ClickOnce does optimize the download. When you publish a new version of your app, all the new files are written to the server, that is true. But the client only downloads changed files. All unchanged files are copied from the previous install folder on the client to the new install folder on the client.

In other words, all unchanged files are reused from the copy already on the client, and so are not downloaded again. Only changed files are downloaded from the server.

The trick to making this work is to rebuild only the assemblies that have actually changed before you publish. Don't rebuild unchanged assemblies, because a rebuild can alter the assembly - and even a one-byte change would cause it to be downloaded again, because the file hash would be different.

Saying that gives me flashbacks to binary compatibility issues with VB6, but it makes complete sense that they'd have to use something like a file hash to decide whether to re-download each file.
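The hash comparison can be sketched like this. This is purely illustrative - the real manifest format and hash algorithm are ClickOnce internals, and SHA-256 plus the method names here are my own assumptions:

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

static class UpdateCheck
{
    // ClickOnce publishes a manifest with a per-file hash; the client
    // compares hashes to decide which files to fetch from the server.
    public static byte[] HashOf(byte[] fileContents)
    {
        using (var sha = SHA256.Create())
            return sha.ComputeHash(fileContents);
    }

    // A file is re-downloaded only when its hash differs from the
    // copy already installed on the client (or no copy exists yet).
    public static bool NeedsDownload(byte[] serverFile, byte[] clientFile)
    {
        return clientFile == null ||
               !HashOf(serverFile).SequenceEqual(HashOf(clientFile));
    }
}
```

This is also why rebuilding an unchanged assembly hurts: even if the source didn't change, a rebuilt binary can differ by a few bytes, the hashes stop matching, and the file gets downloaded again.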

Tuesday, May 20, 2008 9:00:29 AM (Central Standard Time, UTC-06:00)
 Tuesday, May 13, 2008

Magenic is holding a full-day, two-track mini-conference on June 20. We have put together a great lineup of speakers and topics, including 2 keynotes and 8 sessions. This will be a full day of hard-core technical content and fun!

The event is being held in Downers Grove near Chicago, IL. It starts at 8:30 AM and runs through to a reception at the end of the day at around 5 PM.

The event is by invitation only - specifically invitation by one of Magenic's sales people. If you are already a Magenic customer and you'd like an invitation, please contact your Magenic AE and let them know. If you are not a Magenic customer please email and let us know you'd like an invitation.

The event is free, and includes both lunch and a reception at the end of the day. You are responsible for any travel expenses involved in getting to the event. Magenic is arranging a block of rooms at a nearby hotel with special pricing and ground transportation between the conference and hotel.

The following link has more information about the event.

Tuesday, May 13, 2008 7:45:53 AM (Central Standard Time, UTC-06:00)
 Thursday, April 24, 2008

Dunn Training has been offering a very good 3 day class on CSLA .NET for some time now, with lots of great feedback. And this class continues (with a sold-out class coming up in Toronto).

As a complement to that class, Dunn is now lining up a bigger and deeper 5 day master class. The plan is to offer just two of these each year.

This master class is quite different from the 3 day class. It will have more lecture, deeper labs and a faster pace. They tell me the intent is to cover everything from OO design to CSLA object creation to WPF/Windows/Web/WCF/WF interface design to LINQ in one intense week.

Not only will this be the ultimate in CSLA .NET training, it'll be some incredibly awesome training on .NET itself!!

Thursday, April 24, 2008 2:45:39 PM (Central Standard Time, UTC-06:00)
 Wednesday, April 23, 2008

I spent some time over the past few days using my prototype Silverlight serializer to build a prototype Silverlight data portal. It is still fairly far from complete, but at least I've proved out the basic concept and uncovered some interesting side-effects of living in Silverlight.

The good news is that the basic concept of the data portal works. Defining objects that physically move between the Silverlight client and a .NET web server is practical, and works in a manner similar to the pure .NET data portal.

The bad news is that it can't work exactly like the pure .NET data portal, and the technique does require some manual effort when creating the business assemblies (yes, plural).

The approach I'm taking involves having two business assemblies (VS projects) that share many of the same code files. Suppose you want to have a Person object move between the client and server. You need Person in a Silverlight class library and in a .NET class library. This means two projects are required, even if they have the same code file.

Visual Studio makes this reasonable, because you can create the file in one project (say the Silverlight class library) and then use Add Existing Item with the Link option to include that same file in the .NET class library project.

I also make the class be a partial class, so I can add extra code to the .NET class library implementation. The result is:

BusinessLibrary.Client (Silverlight class library)
  -> Person.cs

BusinessLibrary.Server (.NET class library)
  -> Person.cs (linked from BusinessLibrary.Client)
  -> Person.Server.cs

One key thing is that both projects build a file called BusinessLibrary.dll. Also, because Person.cs is a shared file, it obviously has the same namespace. This is all very important, because the serializer requires that the fully qualified type name ("namespace.type,assembly") be the same on client and server. In my case it is "BusinessLibrary.Person,BusinessLibrary".

The Person.Server.cs file contains the server-only parts of the Person class - it is just the rest of the partial class. The only catch is that it cannot define any fields, because that would confuse the serializer - those fields wouldn't exist on the client. Well, actually it could define fields, as long as they are marked as NonSerialized.

Of course you could also have a partial Person.Client.cs in the Silverlight class library - though I haven't found a need for that just yet.
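As a sketch of that layout, with both halves shown in one place (the property names, the server-only member, and the use of Serializable/NonSerialized attributes are all my own illustration, not the actual CSLA Light code):

```csharp
using System;

namespace BusinessLibrary
{
    // Person.cs - the shared file, linked into both the Silverlight
    // class library and the .NET class library projects.
    [Serializable]
    public partial class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Person.Server.cs - compiled only into the .NET class library.
    public partial class Person
    {
        // Any field here must be NonSerialized, because it won't
        // exist in the Silverlight copy of the class.
        [NonSerialized]
        private DateTime _lastFetched;

        public DateTime LastFetched
        {
            get { return _lastFetched; }
        }

        // Hypothetical server-only behavior.
        public void MarkFetched() { _lastFetched = DateTime.Now; }
    }
}
```

Because both projects build BusinessLibrary.dll and the shared file keeps its namespace, the fully qualified type name "BusinessLibrary.Person,BusinessLibrary" comes out identical on both sides, which is what the serializer requires.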

One thing I'm debating is whether the .NET side of the data portal should just directly delegate Silverlight calls into the "real" data portal - effectively acting as a passive router between Silverlight and the .NET objects. Or the .NET side of the data portal could invoke specific methods (like Silverlight_Create(), Silverlight_Update(), etc.) so the business developer can include code to decide whether the calls should be processed on the server at all.

The first approach is simple, and certainly makes for a compelling story because it works very much like CSLA today. The Silverlight client gets/updates objects in a very direct manner.

The second approach is a little more complex, but might be better because I'm not sure you should blindly trust anything coming from the Silverlight client. You can make a good argument that Silverlight is always outside the trust boundary of your server application, so blindly passing calls from the client through the data portal may not be advisable.
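The second approach might look something like this server-side gatekeeper. The Silverlight_Fetch name follows the naming pattern mentioned above, but the signature, the factory class, and the stand-in Person type are all invented for illustration:

```csharp
using System;

// Minimal stand-in for the business object (illustration only).
public class Person
{
    public int Id { get; set; }
}

// Hypothetical server-side factory: instead of blindly routing every
// Silverlight request into the .NET data portal, the server exposes
// named methods so the business developer can vet each call first.
public class PersonFactory
{
    public Person Silverlight_Fetch(int id)
    {
        // The Silverlight client is outside the server's trust
        // boundary, so validate the request before processing it.
        if (id <= 0)
            throw new InvalidOperationException(
                "Rejected request from untrusted client");

        // Only then delegate to the "real" server-side fetch.
        return new Person { Id = id };
    }
}
```

The cost is an extra method per operation in the business layer; the benefit is a natural place to reject or reshape anything the client sends before it touches server-side objects.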

Either way, what's really cool is that the original .NET data portal remains fully intact. This means that the following two physical deployment scenarios are available:

Silverlight -> Web server -> database
Silverlight -> Web server -> App server -> database

Whether the web/app server is in 2- or 3-tier configuration is just a matter of how the original .NET data portal (running on the web server) is configured. I think that's awesome, as it easily enables two very common web server configurations.

The big difference in how the Silverlight data portal works as compared to the .NET data portal is on the client. In Silverlight you should never block the main UI thread, which means calls to the server should be asynchronous - so the UI code can't just do this:

var person = Person.GetPerson(123);

That sort of synchronous call would block the UI thread and lock the browser. Instead, my current approach requires the UI developer to write code like this:

var dp = new Csla.DataPortal();
dp.FetchCompleted +=
  new EventHandler<Csla.DataPortalResult<Person>>(dp_FetchCompleted);
dp.BeginFetch<Person>(new SingleCriteria<int>(123));

with a dp_FetchCompleted() method like:

private void dp_FetchCompleted(object sender, Csla.DataPortalResult<Person> e)
{
  if (e.Error != null)
  {
    // e.Error is an exception - deal with the issue
  }
  else
  {
    // e.Object is your result - use it
  }
}

So the UI code is more cumbersome than in .NET, but it follows the basic service-calling technique used in any current Silverlight code, and I don't think it is too bad. It isn't clear how to make this any simpler really.

Wednesday, April 23, 2008 8:08:36 AM (Central Standard Time, UTC-06:00)
 Tuesday, April 22, 2008

Yes, I'm a comic book collector :)


It seems that every year FREE COMIC BOOK DAY gets bigger and bigger, and it looks like this year is going to be no exception! It's a great opportunity to introduce someone new to the world of comic books! Over 2.5 million comic books will be given away this year. In addition, this year features the widest variety of Free Comic Book Day comics ever! Support your local comic book store, grab some family and friends, and get some free comic books and a great time!


Tuesday, April 22, 2008 1:08:03 PM (Central Standard Time, UTC-06:00)