Rockford Lhotka's Blog


 Monday, 18 June 2018

In my microservices presentations at conferences I talk about API design like this. The presentations go into more depth on the background, but these are the high-level points of that section of the talk.

Since 1996, with the advent of MTS, Jaguar, and EJB, a lot of people have created public service APIs with endpoints like this pseudo-code:

int MyService(int x, double y)

That is not a service; that is RPC (remote procedure call) modeling. It is horrible. But people understand it, and the technologies have supported it forever (going back decades, and rolling forward to today). So LOTS of people create "services" that expose that sort of endpoint. Horrible!!

A better endpoint would be this:

Response MyService(Request r)

At least in this case the Request and Response concepts are abstract, and can be thought of as message definitions rather than types. Hardly anybody actually thinks that way, but they should.

With this approach you can at least apply the VB6 COM rules for evolving an API (which is to say you can add new stuff, but can't change or remove any existing stuff) without breaking clients.
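For example, the message types might evolve like this (a sketch with illustrative member names, not code from any specific system):

public class Request
{
  public int X { get; set; }         // original member: must never change
  public double Y { get; set; }      // original member: must never be removed
  public string Label { get; set; }  // added in a later version: safe, old clients ignore it
}

public class Response
{
  public int Result { get; set; }    // original member
  public string Detail { get; set; } // added later without breaking existing clients
}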

However, that is still a two-way synchronous API definition, so achieving things like fault tolerance, scaling, and load balancing is overly complex and WAY overly expensive.

So the correct API endpoint is this:

void MyService(Request r)

In this case the service is a one-way call that can easily be made async (queued). That reinforces the mental model that Request is a message definition. It also makes it extremely easy and cost-effective to get fault tolerance, scaling, and load balancing, because the software architecture directly enables those concepts.
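In C# terms, a minimal sketch of such an endpoint might look like this, reusing the Request message from above (the queue here is a stand-in; in real life it would be a durable queue such as RabbitMQ or Azure Service Bus):

using System.Collections.Concurrent;

public class MyService
{
  // Stand-in for a durable message queue
  private static readonly ConcurrentQueue<Request> _queue =
    new ConcurrentQueue<Request>();

  // One-way endpoint: accept the message and return immediately;
  // workers process queued messages asynchronously, so the queuing
  // infrastructure provides fault tolerance, scaling, and load balancing
  public void Accept(Request r)
  {
    _queue.Enqueue(r);
  }
}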

Monday, 18 June 2018 09:54:09 (Central Standard Time, UTC-06:00)
 Monday, 11 June 2018

Windows Server is a wonderful server operating system. However, I think it is closing in on END OF LINE (with a nod toward Tron fans).

Why do I say this? Here's my train of thought.

  1. .NET Core runs on either Windows or Linux interchangeably
  2. Linux servers are cheaper to run than Windows Servers (especially in public clouds)
  3. Docker is the future of deployment
    1. Linux containers are more mature and generally better than Windows containers
    2. Linux containers are cheaper to run
    3. Azure runs Linux and Windows Server, and Microsoft seems to care more about you using Azure than which OS you use on Azure
  4. If you are writing new server code, why wouldn't you write it in .NET Core?
  5. If you are writing .NET Core code, why wouldn't you run in (cheaper) Linux containers on (cheaper) Linux hosts?

Now I get it. You say that you have tons of full .NET 1.x, 2.x, 3.x, or 4.x code. That stuff can only run on Windows, not Linux. So obviously Windows Server isn't EOL.

I agree. It isn't yet. But neither is the green-screen AS/400 software my auto mechanic uses to file tickets when I bring my car in to get the oil changed. Has that software been updated in the past 20 years? Probably not. Does it still work? Yes, clearly. Is it the vibrant present or future of software? Hahahahahahahaa NO!

When I say Windows Server is headed toward EOL, I mean it is headed toward the same place as the AS/400, the VAX, and other legacy platforms. It'll continue to run legacy software for decades, until it eventually becomes cost effective to rewrite that software in the then-current technologies.

But if I were starting a new project today, you'd have to come up with some terribly compelling reasons why I wouldn't:

  1. Write it in .NET Core (really in netstandard)
  2. Deploy it via Linux Docker containers (see the sketch after this list)
  3. Use a Linux Docker host
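To make that concrete, here's a minimal Dockerfile sketch for items 2 and 3, assuming an ASP.NET Core app named MyApp (the image tags and names are placeholders to adjust for your own project):

# Build stage: publish the app using the .NET Core SDK image
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Runtime stage: a much smaller Linux image with just the runtime
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyApp.dll"]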

That's not to say there might not be some compelling, if short-term, arguments, such as:

  1. Your IT staff only knows Windows (a career limiting move (CLM) for them!!)
  2. Your IT infrastructure is centered around Windows deployment (Docker and Kubernetes will eat you for dinner, sorry)
  3. Your IT infrastructure is centered around Windows management (valid for a while, but also a CLM)
  4. You value that Windows Server can run both Linux and Windows Docker containers (valid argument imo, for the host)

To reiterate, as a .NET developer I feel comfortable saying that the future of server-side code is .NET Standard, .NET Core, and the ability to run my code on Linux or Windows equally. And I feel comfortable saying that Docker is the best server-side deployment scenario I've yet seen in my 30+ year career.

So I guess at the end of the day, the future of Windows Server rests entirely in the hands of IT Pros, who'll be using some host OS to run my Linux Docker containers with my .NET Core apps.

Ultimately, either Windows Server or Linux will offer the better overall value proposition, which is to say that one of them will be cheaper to run on a per-hour basis in a cloud. Today that's Linux. Barring change, Windows Server is headed toward END OF LINE.

Monday, 11 June 2018 21:05:44 (Central Standard Time, UTC-06:00)
 Tuesday, 05 June 2018

Hello all!

I'm coordinating a codeathon on Saturday, June 23, to work on the Mobile Kids Id Kit app.

This project is sponsored by the Humanitarian Toolbox and Missing Children Minnesota, and provides you with a literal opportunity to make the world a better place for children and parents.

The app is partially complete, and has been created with Xamarin Forms. There are a number of important backlog items remaining to be completed, and we need to do the work to get the apps into the Apple, Google, and Microsoft stores.

If you are looking for a way to use your Xamarin (or ASP.NET Core) skills to make the world a better place, this is your chance! Sign up here.

Tuesday, 05 June 2018 15:45:52 (Central Standard Time, UTC-06:00)
 Monday, 14 May 2018

At Build 2018 Satya Nadella announced a new "AI for Accessibility" program. Having had a few days to reflect since then, there are a couple of ways I think about it.

One is personal. A little over 2 years ago I had major surgery, including about a 15% chance of being partially paralyzed as a result. Fortunately the surgery went well and paralysis didn’t happen, but it got me thinking about the importance of technology and software as an equalizer in life. Something like partial paralysis often ends people’s lives as they know it. With technology though, there’s the very real possibility of people with severe medical conditions living life at a level they never could without those technologies.

The other way I think about this is from the perspective of Magenic. We build custom software for our clients. Sometimes that software is fairly run-of-the-mill business software. Sometimes it is part of a solution that has direct impact on making people’s lives better, in big or small ways. When you get to work on a project that makes people’s lives better, that’s amazingly rewarding!

Every time I’ve had the opportunity to talk to Satya I’ve been impressed by his thoughtfulness and candor. As a result, when I hear him talk about ethical and responsible computing and AI, I believe he’s sincerely committed to that goal. And I very much appreciate that perspective on his part.

Again, veering a bit into my personal life, I read a lot of science fiction, speculative fiction, and cyberpunk. A great deal of that literature deals with the impacts of unfettered technology and AI; consequences both miraculous and terrifying. I also don’t dismiss the (non-fiction) concept of a potential “singularity”, where technology (via transhumanism or AI) results in a whole new class of being that is beyond simple humanity. Yeah, I know, I might have gone off the deep end there, but I think there’s a very real probability that augmented humans and/or AI will transform the world in ways we can’t currently comprehend.

Assuming any of this comes to pass, we can’t overstate the need to approach AI and human augmentation in an ethical and thoughtful manner. Our goal must be to improve the state of mankind through the application of these technologies.

The fact that Microsoft is publicly having these conversations, and is putting money behind their words, is an important step in the right direction.

Monday, 14 May 2018 09:23:38 (Central Standard Time, UTC-06:00)
 Friday, 20 April 2018

On Saturday April 21, 2018 I'm giving a talk at TCCC about WebAssembly (my current favorite topic).

Friday, 20 April 2018 15:14:44 (Central Standard Time, UTC-06:00)
 Sunday, 08 April 2018

Overview

As you've probably noticed from my recent blogging, I'm very excited about the potential of WebAssembly, Blazor, and Ooui. A not-insignificant part of my enthusiasm is because the CSLA .NET value proposition is best when a true smart client is involved, and these technologies offer a compelling story about having rich smart client code running on Windows, Mac, Linux, and every other platform via the browser.

This weekend I decided to see if I could get CSLA running in Blazor. I'm pleased to say that the experiment has been a success!

You can see my log of experiences and how to get the code working via this GitHub issue: https://github.com/MarimerLLC/csla/issues/829

Issues

At first glance it would appear that CSLA should already "just work" because you can reference a netstandard 2.0 assembly from Blazor, and the CSLA-Core-NS NuGet package is netstandard 2.0. However, it is important to remember that Blazor is experimental, and it is running on an experimental implementation of mono for wasm. So not everything quite works as one might hope. In particular, I ran into some issues:

  1. System.Linq.Expressions is unavailable in Blazor; that's the API CSLA (and Newtonsoft.Json) use to avoid reflection in many cases (https://github.com/aspnet/Blazor/issues/513)
  2. The DataContractSerializer doesn't work "out of the box" in Blazor (https://github.com/aspnet/Blazor/issues/511)
  3. The Blazor solution template for a .NET Core host with Blazor client doesn't work if you reference one version of Csla.dll on the server, and a different version of Csla.dll on the client - so you have to use the same Csla.dll in both projects (https://github.com/aspnet/Blazor/issues/508)
  4. The HttpClient type isn't fully implemented in mono-wasm, and it only supports passing string data, not a byte array
  5. You can't just create an instance of HttpClient, you need to use the instance provided via injection into each Blazor page

To address these issues I've created a new Csla.Wasm project that builds Csla.dll specifically to work in Blazor.

Issue 1 wasn't so bad, because CSLA used to use reflection on some platforms where System.Linq.Expressions wasn't available. I was able to use compiler directives to bring that older code back for Csla.Wasm, thus eliminating any use of Linq expressions. There's a performance hit of course, but the upside is that things work at all!
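Conceptually the change looks like this - a sketch, not the actual CSLA source, and the WASM compilation symbol is an assumption for illustration:

using System;
#if !WASM
using System.Linq.Expressions;
#endif
using System.Reflection;

public static class DynamicMethodCaller
{
#if WASM
  public static object CallMethod(object target, MethodInfo method, object[] args)
  {
    // mono-wasm: System.Linq.Expressions is unavailable, so fall
    // back to plain reflection - slower, but it works at all
    return method.Invoke(target, args);
  }
#else
  public static object CallMethod(object target, MethodInfo method, object[] args)
  {
    // Elsewhere: build and compile an expression tree so the call
    // avoids reflection (caching of the compiled delegate omitted
    // for brevity; assumes a non-void method)
    var targetParam = Expression.Parameter(typeof(object), "target");
    var argsParam = Expression.Parameter(typeof(object[]), "args");
    var parameters = method.GetParameters();
    var argExprs = new Expression[parameters.Length];
    for (int i = 0; i < parameters.Length; i++)
      argExprs[i] = Expression.Convert(
        Expression.ArrayIndex(argsParam, Expression.Constant(i)),
        parameters[i].ParameterType);
    var call = Expression.Call(
      Expression.Convert(targetParam, method.DeclaringType), method, argExprs);
    var lambda = Expression.Lambda<Func<object, object[], object>>(
      Expression.Convert(call, typeof(object)), targetParam, argsParam);
    return lambda.Compile()(target, args);
  }
#endif
}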

Issue 2 was a bit more complex. It turns out there is a workaround to get the DCS working in Blazor (see issue 511), but before learning about that I used Newtonsoft.Json as a workaround. Fortunately this only impacts the MobileList type in CSLA.

Now keep in mind that Newtonsoft.Json doesn't universally work in Blazor either, because when it serializes complex types it also uses System.Linq.Expressions and thus fails. But it is capable of serializing primitive types such as a List<string>, and that's exactly the behavior I required.
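For example, this kind of primitive serialization is all that was needed, and it works under mono-wasm (a sketch, not the actual CSLA code):

using System.Collections.Generic;
using Newtonsoft.Json;

public static class PrimitiveListSerializer
{
  // Serializing a primitive collection needs no expression trees,
  // so it works on mono-wasm
  public static string Serialize(List<string> values)
  {
    return JsonConvert.SerializeObject(values);
  }

  public static List<string> Deserialize(string json)
  {
    return JsonConvert.DeserializeObject<List<string>>(json);
  }
}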

Issue 3 is kind of a PITA, but for an experiment I'm ok with referencing the wasm implementation of Csla.dll on the server. Sure, it uses reflection instead of Linq, but this is an experiment and I'll live with a small performance hit. Remember that the wasm version of Csla.dll targets netstandard 2.0, so it can run nearly anywhere - just with the minor changes needed to make it work on mono-wasm.

Issue 4 required tweaking the data portal slightly. Well, the right answer is to create a new proxy/host channel for the data portal, but for this experiment I directly tweaked the HttpProxy type in CSLA - and that'll need to be corrected at some point. Really no change to the actual data portal should be required at all.

Issue 5 required tweaking the CSLA HttpProxy type to make it possible for the UI code to provide the data portal with an instance of the HttpClient object to use. This isn't a bad change overall, because I could see how this would be useful in other scenarios as well.
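The shape of that change is roughly this (a sketch, not the actual CSLA source):

using System.Net.Http;

public class HttpProxy // simplified; the real type is CSLA's data portal proxy
{
  private static HttpClient _client;

  // Called from the Blazor page so the proxy uses the HttpClient
  // instance Blazor injected, rather than creating its own
  public static void SetHttpClient(HttpClient client)
  {
    _client = client;
  }

  private HttpClient GetClient()
  {
    // Outside of Blazor, fall back to creating an instance
    return _client ?? (_client = new HttpClient());
  }
}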

The BlazorExample project

The end result is a working Blazor sample, and you can see the code here: https://github.com/rockfordlhotka/csla/tree/blazor/Samples/BlazorExample

This solution is mostly the Microsoft template.

  • BlazorExample.Client is the Blazor client app that runs in the browser
  • BlazorExample.Server is an ASP.NET Core app running on the server, from which the client app is deployed, and it also hosts the CSLA data portal endpoint
  • BlazorExample.Shared is a netstandard 2.0 class library referenced by both client and server, so any code here is available to both

Code shared by client and server

In BlazorExample.Shared you'll find a Person class - just a simple CSLA business domain class:

using System;
using System.Collections.Generic;
using System.Text;
using Csla;

namespace BlazorExample.Shared
{
  [Serializable]
  public class Person : BusinessBase<Person>
  {
    // Managed property declaration using CSLA's property system
    public static readonly PropertyInfo<string> NameProperty = RegisterProperty<string>(c => c.Name);
    public string Name
    {
      get { return GetProperty(NameProperty); }
      set { SetProperty(NameProperty, value); }
    }

    private static int _count;

    // Runs on the server when the client calls DataPortal.FetchAsync
    private void DataPortal_Fetch(string name)
    {
      using (BypassPropertyChecks)
      {
        _count++;
        Name = name + _count.ToString();
      }
    }
  }
}

This type is available to the client and server, enabling the normal CSLA mobile object behaviors via the data portal. Anyone using CSLA over the years should see how this is familiar and makes sense.

Also notice that there's nothing unique about this code, this is exactly what you'd write for Windows Forms, ASP.NET, Xamarin, WPF, UWP, etc. One of the key benefits of CSLA - reuse your business classes across every platform where you can run .NET code.

The Blazor client app

In the client project there's a Program.cs file with the app's startup code. Here's where I configure the data portal and ensure there's a serializable principal object available:

Csla.DataPortal.ProxyTypeName =
  typeof(Csla.DataPortalClient.HttpProxy).AssemblyQualifiedName;
Csla.DataPortalClient.HttpProxy.DefaultUrl = 
  "/api/DataPortal";
Csla.ApplicationContext.User = 
  new Csla.Security.UnauthenticatedPrincipal();

This is standard CSLA initialization code that you'll find in nearly any modern app. Same as WPF, UWP, Xamarin, etc.

I chose to do my UI experiments in the Pages/Counter.cshtml page.

The real highlight here, from a CSLA perspective, is the LoadPerson method; a handler for the "Load person" button:

async void LoadPerson()
{
  try
  {
    // Provide injected HttpClient to data portal proxy
    Csla.DataPortalClient.HttpProxy.SetHttpClient(Http);
    // Get person object from server
    person = await Csla.DataPortal.FetchAsync<BlazorExample.Shared.Person>("Fred");
  }
  catch (Exception ex)
  {
    errorText = ex.Message + ":: " + ex.ToString();
  }
  StateHasChanged();
}

The unique thing here is where the SetHttpClient method is called to provide the data portal proxy with access to the HttpClient object injected at the top of the page:

@inject HttpClient Http

This particular HttpClient instance has been initialized by Blazor, so it has all the correct settings to talk easily to the deployment web server, which is also where I hosted the data portal endpoint.

The page also makes use of Blazor data binding. In particular, there's a person field available to the Razor code:

BlazorExample.Shared.Person person = new BlazorExample.Shared.Person();

And then in the Razor "html" this is used to display the business object's Name property:

<p>Name: @person.Name</p>

Because the LoadPerson method is async, it is necessary to tell Blazor's data binding to refresh the UI when the data has been retrieved. That call to StateHasChanged at the bottom of the method is what triggers the data binding UI refresh.

The ASP.NET Core web/app server

The server project has a couple unique things.

I had to work around the fact that a byte array can't be passed over the network from Blazor. So there's a modification to the CSLA HttpProxy class (client-side) to pass base64 encoded data to/from the server. For example:

//httpRequest.Content = new ByteArrayContent(serialized);
httpRequest.Content = new StringContent(System.Convert.ToBase64String(serialized));

Then in the server project there's a custom HttpPortalController class, copied from CSLA and also tweaked to work with base64 encoded data. For example:

// Read the base64 encoded request body as a string, then decode
// it back into the binary data portal payload
string requestString;
using (var reader = new StreamReader(requestStream))
  requestString = await reader.ReadToEndAsync();
var requestArray = System.Convert.FromBase64String(requestString);
var requestBuffer = new MemoryStream(requestArray);
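The response side is the mirror image (again a sketch rather than the exact controller code; responseBuffer and responseStream are illustrative names):

// Serialize the data portal result as usual, then return it to the
// client as a base64 string rather than a raw byte array
var responseString = System.Convert.ToBase64String(responseBuffer.ToArray());
using (var writer = new StreamWriter(responseStream))
  await writer.WriteAsync(responseString);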

This controller is then exposed as an API endpoint via the DataPortalController class in the Controllers folder:

using Microsoft.AspNetCore.Mvc;

namespace BlazorExample.Server.Controllers
{
  [Route("api/[controller]")]
  public class DataPortalController : Csla.Server.Hosts.HttpPortalController
  {
  }
}

This is no different from hosting the data portal in ASP.NET Core (or ASP.NET MVC) in any other setting - except that it uses that custom controller base class that works with base64 strings instead of the normal byte arrays.

Because the BlazorExample.Shared assembly is referenced by the server project, the data portal automatically has access to the same Person type that's being used by the client, so again, the normal CSLA mobile object concept just works as expected.

Summary

I estimate I spent around 20 hours fighting through the various issues listed in this blog post. As per normal, most of the solutions weren't that hard in the end, but isolating the problems, researching possible solutions, testing the various solutions, and settling on the answer - that takes time and persistence.

Also - the support from the Blazor community on gitter is really great! And the team itself via GitHub - also really great!

One comment on this - there's no debugger support in Blazor right now, hence my tweet in the middle of my work.

That did make things a lot more tedious than normal modern development. It was like a throwback to 1990 or something!

The end result though, totally worth the effort! It is so cool to see normal CSLA code running in Blazor, data bound to a UI, and interacting with an app server via the data portal!

Sunday, 08 April 2018 13:23:57 (Central Standard Time, UTC-06:00)
 Monday, 02 April 2018

I want to put a stake in the ground. I could be right, I could be wrong, but at least I'll have recorded this stand for future reference.

WebAssembly plus PWA technology means that the future of the "client platform" is the browser.

All existing client platforms are now officially "legacy" and will just fade away over the next couple decades. Windows, macOS, iOS, Android all are just walking dead now.

I'm not saying this will be fast, or even painless. But Microsoft is relegating Windows to second-class status, even though it runs on over a billion devices. The Mac, nice as people think it is, has never exceeded 12% market share. The smartphone market appears to be maturing - meaning it isn't growing any more, so there won't be massive income to drive new iOS or Android features.

And ultimately people have never purchased computers (including smartphones) for the OS. They purchase them because the devices run valuable apps - they do valuable things, and that means software.

Over the next few years as wasm and PWA concepts mature, we'll have browser deployed smart client apps with native (enough) performance.

Essentially, all the reasons why web sites haven't been able to compete with apps are being resolved by the wasm/PWA concepts. Web-style deployment? Browsers have that. App-style deployment? PWAs fix that. Performance? wasm fixes that. Porting existing massive codebases into the future? .NET (for example) on wasm helps address that.

I strongly suspect that Microsoft's recent org changes around Windows are just a recognition that the "client platform" of the future is whatever is necessary to run a browser that supports wasm/PWA - which is essentially all modern browsers.

What we think of as an OS today is merely a browser-delivery vehicle for the future. Something has to provide all those drivers and whatnot so browser-based apps can rely on them.

Yeah, I hear the gaming community screaming. And as one of them, I agree - I expect to continue to pump money into a "real computer" for a long time to come so I can play Battlefield and Elite Dangerous.

But for everyone else (and that's most people) the era of a "real computer" with a "real operating system" is coming to a close.

The browser with wasm/PWA is the new client platform, everything else is a vestige of a time that's come and gone.

fwiw, this was a topic of hot discussion in 1999 before the .com bubble burst. I remember speaking at various conferences and hearing big debates between fellow speakers, as well as attendees in the hallways, about whether the browser should remain a colorful-but-dumb terminal, or should become a true client platform for app dev.

That conversation was largely derailed by the .com collapse (the dot bomb?). Why it has finally come back around nearly 20 years later is an interesting question unto itself.

But here we are, with the browser clearly an app dev client platform, capable of hosting rich, interactive smart client apps with web-like and app-like deployment, and (via wasm) performance at a native enough level to be viable.

Monday, 02 April 2018 20:12:20 (Central Standard Time, UTC-06:00)
 Wednesday, 28 March 2018

Power BI is a pretty amazing tool, even for data analysis novices like me. The ability to take data from an Excel spreadsheet and create useful and compelling dashboards as a power user is really wonderful!

However, I don't create such a dashboard all that often, and I find that each time I do it I fall over one particular issue: how to connect my PBI report to a dataset that can auto-refresh.

My dataset is an Excel file in O365 SharePoint (actually in a folder/site managed by Microsoft Teams). In theory it is easy to have PBI connect to such a spreadsheet, and when the report is published to PBI in the web it can be set to auto-refresh the data daily, weekly, etc.

In practice however, this is fraught with peril. The instructions commonly available from Microsoft and other blogs leave out a couple key issues that (I find at least) derail the process and lead to hours of frustration.

I use Power BI Desktop to create my report. So step 1 is to connect to the Excel file hosted in O365 SharePoint. And it is step 1 where things go wrong, because there are so many different ways to connect to a spreadsheet, and so many URL variations for a spreadsheet hosted in SharePoint - most of which won't actually work in the end.

But what's worse is that all these myriad variations work from Power BI Desktop, so you are fooled into thinking all is well. When in reality, after you publish the report to PBI in the web, it will fail! Very frustrating!

My colleague Scott Diehl and I spent some time this afternoon sorting through what works and what only appears to work, and I am documenting it in this blog post so next time I need to create a dashboard I'll have a place to find the answer without re-learning it.

So here's the thing. I have this spreadsheet in Teams.

And if I click on the "Open in SharePoint" link I can see it in O365 SharePoint via the browser.

If I click the "Copy Link" button here I get a link to the file, such as:

https://mysite.sharepoint.com/:x:/s/TechologyLeadershipTeam/EWToC-py8VVMrdQz-_yXfPoB6_qQky37CgRZyrOtX8FZQA?e=vKQUGF

And you can use that link to add a dataset in PBI Desktop, and it works great.

BUT THIS LINK WON'T WORK WHEN YOU PUBLISH TO PBI WEB!

So instead what you must do is open the spreadsheet in actual desktop Excel. Not in Excel Online, not in O365 SharePoint, but in actual Excel.

Then go to the File tab in the ribbon and you should see something like this.

Notice the text I've highlighted. Left-click on this and choose "Copy Link to Clipboard". The result is a URL like this:

https://mysite.sharepoint.com/sites/TechologyLeadershipTeam/Shared%20Documents/General/2018/2018%20Activity%20Tracking.xlsx?web=1

This is just a different URL to the same exact file, but this URL is one that PBI Web will actually accept.

Note: You do need to remove the ?web=1 bit from the end of the URL before using this to create your dataset in PBI Desktop.

From here you can create your report and dashboard, and publish to PBI Web. Then in PBI Web you can edit the dataset to provide your organization credentials, and to set up a refresh schedule for the data.

One final note. If you do what I did, and create your report and dashboard using the first style of URL that works in Desktop but not Web, you are not totally out of luck. In that case, follow the above steps to get the functional URL from Excel, and then in PBI Desktop you can edit the dataset's source.


That'll allow you to replace the broken URL with the working URL, without having to recreate your entire report or lose your work.

Wednesday, 28 March 2018 13:27:54 (Central Standard Time, UTC-06:00)
