Rockford Lhotka

 Tuesday, July 7, 2009

If you’ve read the news in the last day or two, you’ve probably run across an article talking about an “unprecedented step” taken by Microsoft, in that they are talking about a Windows vulnerability before they have a patch or fix available.

When I read the article on msnbc.com it mentioned that there is a workaround (not a fix – but a way to be safer), and that information could be found on Microsoft’s web site.

So off I went to www.microsoft.com – Microsoft’s web site. Where I found nothing on the topic, but I did find a link to the security home page.

So off I went to the security home page. Where I found nothing that was obviously on the topic. Yes, there’s a lot of information there, including some information on viruses, infection attacks and an apparent rise in fake attacks (which got me wondering whether MSNBC had been faked out).

At no point did I realize that one of the articles on the security home page was actually the article I was looking for! It turns out that this particular vulnerability is through an ActiveX video component, a fact not mentioned in the MSNBC article. So while I saw information about such a thing on the Microsoft site, I had no way to link it to the vague mainstream press article that started this whole adventure…

Fortunately I know people :)

The vulnerability is an ActiveX video component issue. And the workaround is documented here:

http://support.microsoft.com/kb/972890

And now that I know I’m looking for information related to an ActiveX video component issue, it is clear that there are relevant bits of information on these sites too:

Microsoft Security Response Center blog:

http://blogs.technet.com/msrc/default.aspx

Microsoft TechNet Security alerts:

http://www.microsoft.com/technet/security/advisory/default.mspx

I still think the communication here is flawed. The mainstream press screwed up by providing insufficient and vague information, making it virtually impossible to find the correct documentation from Microsoft on the issue. But perhaps Microsoft was vague with the press too – hard to say.

And I think Microsoft could have been much more clear on their sites, providing some conceptual “back link” to indicate which bits of information pertain to this particular issue.

There’s no doubt in my mind that my neighbors, for example, would never find the right information based on the mainstream articles in the press. So Microsoft’s “unprecedented step” of talking about this issue will, for most people, just cause fear, without providing any meaningful way to address that fear. And that’s just sad – lowering technology issues to the level typically reserved for political punditry.

Tuesday, July 7, 2009 9:06:16 AM (Central Standard Time, UTC-06:00)
 Wednesday, July 1, 2009

I’ve updated my prototype MCsla project to work on the “Oslo” May CTP. The update took some effort, because there are several subtle changes in the syntax of “Oslo” grammars and instance data. What complicated this a little is that I am using a custom DSL compiler, because the standard mgx.exe utility can’t handle my grammar.

Still, I spent less than 8 hours getting my grammar, schemas, compiler and runtime fixed up and working with the CTP (thanks to some help from the “Oslo” team).

At this point I chose to put the MCsla project into my public code repository. You can use the web view to see the various code elements if you are interested.

The prototype has limited scope – it supports only the CSLA .NET editable root stereotype, which means it can be used to create simple CRUD screens over single records of data. But even that is pretty cool I think, because it illustrates the end-to-end flow of the whole “Oslo” platform concept.

A business developer writes DSL code like this:

Object Product in Test
{
  Public ReadOnly int Id;
  Public string Name;
  Public double ListPrice;
} Identity Id;

(this is the simplest form – the DSL grammar also allows per-type and per-property authorization rules, along with per-property business and validation rules)

Then they run a batch file to compile this code and insert the resulting metadata into the “Oslo” repository.

The user runs the MCslaRuntime WPF application, which reads the metadata from the repository and dynamically creates a XAML UI, CSLA .NET business object and related data access object that talks to a SQL Server database.
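The dynamic-UI half of this is conceptually simple. Here’s a minimal sketch (not the actual MCsla runtime code – the metadata shape and names here are assumptions) of how property metadata could be turned into a live WPF form at runtime using XamlReader:

```csharp
// Sketch only, assuming hypothetical metadata: illustrates the general idea
// of generating XAML text from repository metadata and parsing it into a
// visual tree at runtime. Not the actual MCsla implementation.
using System.Text;
using System.Windows;
using System.Windows.Markup;

static class DynamicUi
{
    // propertyNames stands in for metadata read from the repository.
    public static UIElement BuildForm(string[] propertyNames)
    {
        var xaml = new StringBuilder();
        xaml.Append("<StackPanel xmlns='http://schemas.microsoft.com/winfx/2006/xaml/presentation'>");
        foreach (var name in propertyNames)
        {
            // One label/editor pair per property, bound to the business object
            // supplied as the DataContext.
            xaml.Append($"<TextBlock Text='{name}' />");
            xaml.Append($"<TextBox Text='{{Binding {name}}}' />");
        }
        xaml.Append("</StackPanel>");
        return (UIElement)XamlReader.Parse(xaml.ToString());
    }
}
```

In the real runtime the metadata also drives authorization, business and validation rule wiring, but the core trick is the same: generate the UI description from metadata rather than hand-writing XAML.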


The runtime uses all the basic functionality you get automatically from CSLA .NET. This includes application of authorization, business and validation rules, automatic enable/disable of the Save/Cancel buttons based on the business object’s rules, and so forth.

If the business developer “recompiles” their DSL code, the new metadata goes into the repository. The user can click a Refresh App button to reload the metadata, immediately enjoying the new or changed functionality provided by the business developer.

The point is that the business developer writes that tiny bit of DSL code instead of pages of XAML and C#. If you calculate the difference in terms of lines of code, the business developer writes perhaps 5% of the code they’d have written by hand. That 95% savings in effort is what makes me so interested in the overall “Oslo” platform story!

Wednesday, July 1, 2009 5:15:07 PM (Central Standard Time, UTC-06:00)
 Monday, June 29, 2009

I use register.com for my email – though after today that may have to change…

Why? Because the register.com email service is down, and has been for several hours. There was a brief moment earlier this afternoon when I thought they had it fixed, because a few emails squeaked through, but otherwise it is deader than a doornail.

register.com is apparently doing some sort of email system upgrade – fancier AJAX web UI, etc. And that’d be fine, but all I really care about is reliable POP/SMTP service, and the “upgrade” appears to have been a major step backward in that regard…

This affects my personal email, the email for the CSLA .NET forum and email for my online store.

So if you sent me email and expect/need a response, don’t get your hopes too high. Maybe they’ll get it fixed, but I’m beginning to suspect that they really messed themselves up. This may be the push I need to explore other email options – probably ones that are cheaper and better (since register.com is not a great value in that regard – they are just convenient).

Monday, June 29, 2009 2:32:01 PM (Central Standard Time, UTC-06:00)
 Friday, June 26, 2009

I’ve had quite the experience over the past couple weeks.

Three weeks ago I was in Las Vegas speaking at VS Live. While there, I realized I’d forgotten to copy some key files to my laptop before leaving home, but Windows Home Server made that a non-issue, since it provides a secure web interface to my files. Awesome!

Then I got home and discovered that one of the two additional hard drives I added to my WHS machine was failing. This was unpleasant, but not cause for alarm since all my key files are set to duplicate.

(I only discovered the failure because WHS started crashing and I looked in the Windows system event log, where I found the drive failure notifications. They’d been occurring for several days, but I don’t check my system event log daily, so I didn’t know. This is the one place where WHS really let me down – I still don’t know why Windows knew the drives were going to fail, while WHS blindly ignored this clear intelligence…)

Unfortunately I couldn’t get WHS to dismount (remove) the failing hard drive. After 3-4 tries, it finally did remove the drive. This took 2.5 days, since each failure took 12-24 hours, as did the final success.

I should also note that I was under serious time pressure, because I was flying out to Norway for the NDC conference and only had about 3.5 days to solve the problem!

After the failed drive was removed, things were obviously not right on the WHS machine. Clearly the removal hadn’t worked correctly. Poking around a bit further, I found that the second additional hard drive was also failing. What are the odds of two drives failing at once? Small, and yet there I was.

I quickly bought and installed a brand new hard drive (Seagate this time, since the dual failures were Western Digitals) and tried to remove the second failing drive. The attempt was still running when I flew to Norway.

Fortunately Live Mesh allowed me to use remote desktop to get back into my network, and I kept trying to remove the drive (failure after failure) while in Norway.

When I returned from Norway I removed the drive manually – clearly it wasn’t going to come out through software. I can’t say this made matters worse, but it sure didn’t make them better either. WHS still wouldn’t remove the drive even though it showed as “missing”; it had “locked files” and couldn’t be removed.

Thanks to some excellent help from the Microsoft WHS forum (thanks Ken!) I came to realize that my only option at this point was to repair the WHS OS – basically do a reinstall. I have the cute little HP appliance, and it comes with a server restore disk – pop it into my desktop machine, run the wizard and in very little time I had my server back, just like when I bought it originally.

OK, so now I have a functioning WHS again, but it is empty, blank – all my data is gone!

I’ve been here before (a couple times) with other servers though, so I have backups for my backups. All “critical” data is always in 3 places. So I just restored my server backup and got back my “critical” files – everything for my work, all the family photos and home videos of the kids, etc.

Here’s the catch though – I rapidly discovered that my “non-critical” data is actually pretty critical. Things like music, videos and miscellaneous files.

I hoped to recover the music from a Zune device. I tried my own Zune first, but that was a mistake. As soon as I connected it to my desktop machine it synced – and since it discovered I’d “deleted” all my music, it cleared the device. Damn!

Fortunately my son also has a fully-synced Zune, and I connected his to my desktop machine as a guest. No automatic sync, and so I was able to highlight all music on his device and say “copy to my collection”. Just like that all our music was back on the server.

I still don’t have any videos or miscellaneous files. They are gone. Arguably this isn’t the end of the world, as technically I can get back anything that really matters – by re-downloading, or getting files from friends, etc. But that’s all a pain in the butt and a waste of time, so it is unfortunate.

(it might be that I can recover some of them from the two defunct hard drives – using various data recovery tools I may be able to connect them to my desktop machine and retrieve some of the files – but that’s also a big hassle and may not be worth the effort)

So what did I learn out of all this?

  1. WHS is awesome, and I still really love it
  2. WHS can’t handle two hard drives failing at once – if that happens you better have a backup for your server
  3. Files I’d classified as “non-critical”, like music and maybe videos, are actually pretty critical – and external hard drives to back up the server are relatively cheap. Just get a 2 TB external drive and back up everything – that’s my new motto!

Oh, and I’m now using IDrive to get offsite backups for my truly critical files. I know, I didn’t need it in this case, but the whole experience got me thinking about floods, tornadoes, fire, etc. What if I did lose my family photos or home videos? The last 15 years of my life is digital, and nearly all record would be lost in such a case. Having automatic backups of that data, along with other important documents and files seems really wise.

So now my super-critical files are in at least 4 places (one offsite). My critical files (using my newly expanded definition) are in at least 3 places. And my non-critical files are in 2 places. I’m so redundant I’m starting to feel like NASA :)

Friday, June 26, 2009 3:37:24 PM (Central Standard Time, UTC-06:00)
 Wednesday, June 10, 2009

The CSLA .NET for Silverlight video series is now complete! Segment 7, covering authentication and authorization, is now online and this completes all video segments – over 8 hours of content!

The CSLA .NET for Silverlight video series is an invaluable resource for getting started with CSLA .NET on the Silverlight platform. The series starts with the basics of setting up a Silverlight solution, covers the creation of client-only applications using CSLA .NET and then moves to a discussion of creating 2-, 3- and 4-tier applications using CSLA .NET on the client and on the server(s).

Segments 5 and 6 cover CSLA .NET object stereotypes and data access respectively. These segments are also available for purchase as individual videos, because they are useful to any CSLA .NET developer, including ASP.NET, WPF, Windows Forms and more.

Segment 7 covers the use of Windows authentication, MembershipProvider authentication and custom authentication using CSLA .NET for Silverlight against an ASP.NET web or application server. It also covers the use of per-property and per-type authorization in business classes, and talks about how the PropertyStatus, ObjectStatus and CslaDataProvider controls interact with those rules.

Buy the video series before June 20 and save $50 off the regular purchase price of $300.

Wednesday, June 10, 2009 1:09:50 PM (Central Standard Time, UTC-06:00)
 Friday, June 5, 2009

Segment 6 of the CSLA .NET for Silverlight video series is now available.

Also, both the Business Object Types and N-Tier Data Access videos (segments 5 and 6) can now be purchased separately, as each of these segments contains information valuable to any CSLA .NET developer.

Segment 6 details the various options supported by CSLA .NET for data access in n-tier scenarios. Watching this video, you will learn how to put data access code into your business class, or into a separate data access assembly, along with the pros and cons of each technique. You will also learn about the ObjectFactory attribute and base class, which can be used to create pluggable data access layers for an application.

This video is 1 hour and 49 minutes in length, so you can imagine just how much great content exists!

Not only does the video talk about editable objects and child objects and lists, it covers the common parent-child-grandchild scenario.

And it includes data access code using raw ADO.NET (for performance and long-term stability) as well as a complete walkthrough using ADO.NET Entity Framework as a data access layer.

The pre-release purchase offer of $50 off the regular price of $300 is still available. If you buy before June 20, your price is $250 for the entire video series, and you get the first 6 (of 7) video segments, nearly 7 hours of content, immediately!

And again, you can purchase segments 5 and 6 individually if you are not interested in the complete Silverlight video series.

Friday, June 5, 2009 4:30:24 PM (Central Standard Time, UTC-06:00)
 Tuesday, June 2, 2009

Segment 5 of the CSLA .NET for Silverlight video series is now available. This segment covers all the CSLA .NET object stereotypes, including:

  • Editable objects (single and list)
  • Read-only objects (single and list)
  • Name/value list
  • Command
  • Dynamic list (EditableRootListBase)
  • etc

The focus is primarily on the business class structure and features, with some time spent discussing XAML data binding and the use of the CslaDataProvider and other UI controls.

This segment is 1:37 hours in length, yes, 97 minutes. The vast majority of this time is in Visual Studio, walking through code and providing information about class development that will be immediately useful to you.

Because segments 6 and 7 are not yet complete, I’m offering a pre-release purchase offer of $50 off the regular price of $300. If you buy before June 20, your price is $250 for the entire video series, and you get the first five segments, over 5 hours of content, immediately!

 

Also, I’m looking for feedback. Most of the content in segment 5 (and in segment 6) applies to any user of CSLA .NET – Silverlight, WPF, ASP.NET, Windows Forms, etc. Yes, there’s some Silverlight/XAML specific data binding discussion, but most of the video is focused on business class implementation.

If you are not a Silverlight developer, would you be interested in purchasing these two video segments even if some of the content didn’t apply to you? Let me know what you think, thanks!

Tuesday, June 2, 2009 7:50:02 AM (Central Standard Time, UTC-06:00)
 Wednesday, May 27, 2009

Since WPF came out there’s been one quirk, one “optimization” in data binding that has been a serious pain.

Interestingly enough, the same quirk is in Windows Forms, but the WPF team tells me that the reason it is also in WPF is entirely independent of how and why it is in Windows Forms.

The “optimization” is that when a user changes a value in the UI, say in a TextBox, that value is put into the underlying source object’s property (whatever property is bound to the Text property of the TextBox). If the source object changes the value in the setter, the change will never appear in the UI. Even if the setter raises PropertyChanged, WPF ignores it and leaves the original (bad) value in the UI.

To overcome this, you’ve had to put a ValueConverter on every binding expression in WPF. In CSLA .NET I created an IdentityConverter, which is a value converter that does nothing, so you can safely attach a converter to a binding when you really didn’t want a converter there at all, but you were forced into it to overcome this WPF data binding quirk.
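For reference, a do-nothing converter is tiny. This is a sketch along the lines of the IdentityConverter described above (reconstructed from memory, not copied from the CSLA .NET source):

```csharp
using System;
using System.Globalization;
using System.Windows.Data;

// A pass-through value converter. Attaching any converter to a binding
// defeats the WPF optimization, so a PropertyChanged raised from the
// property setter is honored and the UI refreshes with the setter's value.
public class IdentityConverter : IValueConverter
{
    public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
    {
        return value; // pass the value through unchanged
    }

    public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
    {
        return value; // pass the value through unchanged
    }
}
```

It attaches in XAML like any converter, for example Text="{Binding Name, Converter={StaticResource IdentityConverter}}", with the converter declared as a resource.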

WPF 4.0 fixes the issue. Karl Shifflett describes the change very nicely in this blog post.

This should mean that I can remove the (rather silly) IdentityConverter class from CSLA .NET 4.0, and that makes me happy.

Wednesday, May 27, 2009 9:28:10 AM (Central Standard Time, UTC-06:00)