Rockford Lhotka's Blog


 Tuesday, August 28, 2007

Here's a link to a preliminary list of sessions for the upcoming Twin Cities code camp. Jason needs a few more speakers to fill out the schedule though, so if you are in the Twin Cities area and would like to present on any developer-related topic (business, non-business, Microsoft, non-Microsoft - it doesn't matter!), you can find contact information here.

Tuesday, August 28, 2007 7:04:29 AM (Central Standard Time, UTC-06:00)
 Friday, August 24, 2007

Thanks to my friend Billy Hollis, I appear to have found the radio station I've been looking for: Pandora.

This is an online streaming service with a brain. You tell it some of your favorite artists and/or songs and it assembles a radio station that plays music similar to (and including) your selections. It calls the artists and songs you pick "seeds", and apparently uses some sort of genome-based algorithm to find that similar music.

I've been listening for nearly an hour now, based on my seeds (Rush, Queensryche, Godsmack and Disturbed) and it absolutely rocks!

The only glitch I've found is that if you open it in two browsers, it plays both songs over the top of each other. But that's more a user error than their issue I think :)

Not since CyberRadio 2000 got squashed by the RIAA have I found quite the service I was looking for. I had hoped zune.net would do something like this, but Pandora beat them to it.

Friday, August 24, 2007 3:57:25 PM (Central Standard Time, UTC-06:00)
 Thursday, August 23, 2007

Magenic is providing a series of free online seminars on various IT-related topics. The next one is on the value of portals.

Suppose your organization’s existing IT investments could be put to use to make your customer and partner interactions more effective and increase your company’s profits. External Portals are browser-based applications that can enable your business processes to better meet the needs of your customers, your partners and your firm.

In this power-packed agenda, the IT experts from Magenic will introduce you to the current market trends influencing IT decisions and show how External Portals can help you:

  • Generate value by creating and fostering customer intimacy.
  • Maximize customer and partner interactions with a personalized online experience.
  • Increase brand satisfaction by creating unexpected opportunities to serve customers.
  • Better understand your prospects and customers.
  • Provide visibility across your extended value chain.
  • Lower cost and improve overall quality, efficiency and profitability.
  • Understand the extraordinary value to be gained through timely, relevant information exchange.
  • Take simple steps today to get started.

Click here for full information about the seminar.

Thursday, August 23, 2007 7:12:20 PM (Central Standard Time, UTC-06:00)

I am working on my Using CSLA .NET 3.0 ebook and wrote some content that I don't think I'm going to use in the book now. But I don't want to waste the effort, so I'm posting it here.

The text discusses an answer to a common question: how do I get my read-only list of X to know that some X data has changed? Obviously the following technique can't detect changes made by some other user, but it does detect and handle the case where the current user changes data that would impact an existing read-only list in the same process.

 

Another common example occurs when there is a read-only list that should be updated when an associated editable object is saved. For instance, you might have a CustomerList object that should be updated any time a Customer is saved.

To solve that issue, the CustomerList object needs to subscribe to an event that is raised any time a Customer is saved. Such an event should be a static event on the Customer class:

  public class Customer : BusinessBase<Customer>
  {
    public static event EventHandler<Csla.Core.SavedEventArgs> CustomerSaved;

    protected static void OnCustomerSaved(Customer sender, Csla.Core.SavedEventArgs e)
    {
      if (CustomerSaved != null)
        CustomerSaved(sender, e);
    }

    // ...

    public override Customer Save()
    {
      Customer result = base.Save();
      OnCustomerSaved(this, new Csla.Core.SavedEventArgs(result));
      return result;
    }
  }

The CustomerSaved event is declared using the standard EventHandler<T> model, where the argument parameter is of type Csla.Core.SavedEventArgs. The reason for the use of this type is that it includes a reference to the new object instance created as a result of the Save() call.

Then the Save() method is overridden so the CustomerSaved event can be raised after the call to base.Save().

The CustomerList class can then handle this static event to be notified as any Customer object is saved:

  public class CustomerList : ReadOnlyListBase<CustomerList, CustomerInfo>
  {
    private CustomerList()
    {
      Customer.CustomerSaved +=
        new EventHandler<Csla.Core.SavedEventArgs>(Customer_Saved);
    }

    private void Customer_Saved(object sender, Csla.Core.SavedEventArgs e)
    {
      // find item corresponding to sender
      // and update item with e.NewObject
      Customer old = (Customer)sender;
      if (old.IsNew)
      {
        // it is a new item
        IsReadOnly = false;
        Add(CustomerInfo.LoadInfo((Customer)e.NewObject));
        IsReadOnly = true;
      }
      else
      {
        // it is an existing item
        foreach (CustomerInfo child in this)
          if (child.Id == old.Id)
          {
            child.UpdateValues((Customer)e.NewObject);
            break;
          }
      }
    }
  }

The final requirement is that the read-only CustomerInfo business object implement a LoadInfo() factory method and an UpdateValues() method.

The LoadInfo() factory method is used to initialize a new instance of the read-only object with the data from the new Customer object. Loading the data from the Customer object avoids having to reload the data from the database when it is already available in memory:

  internal static CustomerInfo LoadInfo(Customer cust)
  {
    CustomerInfo info = new CustomerInfo();
    info.UpdateValues(cust);
    return info;
  }

The UpdateValues() method sets the read-only object’s fields based on the values from the Customer object:

  internal void UpdateValues(Customer cust)
  {
    _id = cust.Id;
    _name = cust.Name;
    // and so on ...
  }

The end result is that the CustomerList object is updated to reflect changes to the saved Customer object. The CustomerSaved event notifies CustomerList of the change, and (in most cases) CustomerInfo can update itself without hitting the database by loading the data from the Customer object that is already in memory.
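To put the pieces together, here is a rough usage sketch. The factory methods (GetList(), GetCustomer()) and the Name property are assumptions for illustration only; the key point is that saving a Customer in the same process automatically refreshes any CustomerList that is already in memory:

  // hypothetical usage sketch - factory method names are assumed
  CustomerList list = CustomerList.GetList();   // subscribes to Customer.CustomerSaved

  Customer cust = Customer.GetCustomer(customerId);
  cust.Name = "New name";
  cust = cust.Save();   // raises CustomerSaved

  // the matching CustomerInfo in 'list' has now been updated (or a new
  // item added) by Customer_Saved, without another database query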

Thursday, August 23, 2007 3:35:30 PM (Central Standard Time, UTC-06:00)
 Thursday, August 09, 2007

I just spent the past few days pulling my hair out trying to get a custom principal to work in WCF.

Google returned all sorts of interesting, but often outdated and/or overly complex results. I kept looking at the techniques people were using, thinking this can't be so hard!!!

Well, it turns out that it isn't that hard, but it is terribly obscure... Fortunately I was able to get help from various people, including Clemens Vasters, Juval Lowy and (in this case most importantly) Christian Weyer. Even these noted WCF experts provided an array of options rather than a unified, simple answer like I'd expected.

My conclusion: while WCF really is cool as can be, it is also a deep plumbing technology that begs for abstraction for use by "normal" people.

Anyway, as a result of my queries, Christian got one of his colleagues to write the blog post I wish I had found a few days ago: www.leastprivilege.com - Custom Principals and WCF.

One of my motivations in researching this issue was for the WCF chapter in my upcoming Using CSLA .NET 3.0 ebook. There's now a comprehensive discussion of the topic in that chapter, starting with the creation and use of X.509 certificates and walking through the whole process of implementing custom authentication and using a custom principal in a WCF service. Dominick's blog post is great, but only covers about a third of the overall solution in the end.
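For those who just want the core idea, the approach Dominick describes centers on a custom IAuthorizationPolicy that attaches your principal to each operation (combined with principalPermissionMode="Custom" in the service's serviceAuthorization behavior). The sketch below is my own minimal illustration, not the code from the chapter, and it uses a GenericPrincipal as a stand-in for a real custom principal type:

  using System;
  using System.Collections.Generic;
  using System.IdentityModel.Claims;
  using System.IdentityModel.Policy;
  using System.Security.Principal;

  public class CustomPrincipalPolicy : IAuthorizationPolicy
  {
    private readonly string _id = Guid.NewGuid().ToString();

    public string Id { get { return _id; } }
    public ClaimSet Issuer { get { return ClaimSet.System; } }

    public bool Evaluate(EvaluationContext evaluationContext, ref object state)
    {
      // grab the identity WCF authenticated for this call (if any)
      IIdentity identity = null;
      object value;
      if (evaluationContext.Properties.TryGetValue("Identities", out value))
      {
        IList<IIdentity> identities = value as IList<IIdentity>;
        if (identities != null && identities.Count > 0)
          identity = identities[0];
      }

      // attach the principal WCF should use for this operation;
      // a real implementation would create its custom principal type here
      evaluationContext.Properties["Principal"] =
        new GenericPrincipal(
          identity ?? new GenericIdentity("anonymous"), new string[] { });

      return true;
    }
  }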

The ebook should be out toward the end of September, for those who are wondering.

Thursday, August 09, 2007 2:28:24 PM (Central Standard Time, UTC-06:00)
 Tuesday, August 07, 2007

Aaron Erickson, a fellow Magenic consultant, has done a lot of research and work on how to index LINQ queries over objects. He just published an article on the topic: Indexed LINQ.

While indexing has an up-front cost, because the index must be built, it can be very beneficial if you are running a lot of queries over a large set of data.
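To make the trade-off concrete, consider a plain LINQ to Objects query like the one below (the Customer type and GetCustomers() helper are hypothetical). Without an index, every execution of the query scans the entire list; an indexed collection such as the one Aaron describes can satisfy equality predicates like this from an index instead:

  // hypothetical example - Customer and GetCustomers() are placeholders
  List<Customer> customers = GetCustomers();

  // without an index this where clause is an O(n) scan on every execution
  var smiths =
    from c in customers
    where c.LastName == "Smith"
    select c;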

Tuesday, August 07, 2007 10:13:56 AM (Central Standard Time, UTC-06:00)
 Wednesday, August 01, 2007

For those that don't know, I live in the Twin Cities area: Minneapolis and St. Paul, Minnesota.

This evening one of the major bridges crossing the Mississippi River collapsed. It is a horrible disaster. The kind of thing that you simply hope never to see.

My family and I are fine. We've called our friends who might have been affected and they are fine.

My heart goes out to the people who were caught in this disaster, and their families and friends.

Wednesday, August 01, 2007 9:18:16 PM (Central Standard Time, UTC-06:00)

I wrote the following for the Using CSLA .NET 3.0 ebook, but I don't think I'm going to use it now, because I've wrapped most of this into a new class in CSLA .NET. Rather than letting this go to waste though, I thought I'd post it here. Remember that it is just draft content, so it may have typos or errors, but perhaps it will be useful to someone:

Basic Workflow Execution

Executing a workflow is a little tricky, because workflows default to running on a background thread. That means you must take steps to ensure that the workflow completes before the host process terminates.

One way to solve this issue is to always execute a workflow synchronously. Another is to use a thread synchronization object to prevent the process from terminating until the workflow completes.

Note: It is also possible to suspend and resume workflows, and even to unload them from memory so they store their state in a database. Later you can reload that workflow instance and resume it. These advanced scenarios are outside the scope of this book.

Synchronous Execution

The code to synchronously execute a workflow follows a standard pattern:

1. Create an instance of the WorkflowRuntime.
2. Create a synchronization object.
3. Set up event handlers.
4. Create workflow instance.
5. Ensure you have a valid principal object.
6. Start the workflow.
7. Wait for the workflow to complete.

The only step unique to CSLA .NET is number 5, and that is only required if you are using custom authentication. The WF runtime will automatically ensure that the background thread that executes the workflow has the same principal as the thread that calls the workflow’s Start() method, but you must ensure that the principal is set on the current thread before calling Start().

The following code implements these steps to execute the ProjectWorkflow implemented earlier in this chapter:

      using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
      {
        Exception error = null;

        AutoResetEvent waitHandle = new AutoResetEvent(false);
        workflowRuntime.WorkflowCompleted +=
          delegate(object sender, WorkflowCompletedEventArgs e)
          {
            waitHandle.Set();
          };
        workflowRuntime.WorkflowTerminated +=
          delegate(object sender, WorkflowTerminatedEventArgs e)
          {
            error = e.Exception;
            waitHandle.Set();
          };

        // create workflow instance
        Dictionary<string,object> parameters = new Dictionary<string,object>();
        parameters.Add("ProjectId", projectId);
        WorkflowInstance instance =
          workflowRuntime.CreateWorkflow(
            typeof(PTWorkflow.ProjectWorkflow),
            parameters);

        // login before starting WF instance
        ProjectTracker.Library.Security.PTPrincipal.Login("pm", "pm");

        // execute workflow
        instance.Start();

        // wait for workflow to complete
        waitHandle.WaitOne();

        // throw any workflow exception
        if (error != null)
          throw error;
      }

Creating the workflow instance involves setting up a Dictionary<string, object> that contains name/value pairs for any parameters to be passed into the workflow instance:

        // create workflow instance
        Dictionary<string,object> parameters = new Dictionary<string,object>();
        parameters.Add("ProjectId", projectId);
        WorkflowInstance instance =
          workflowRuntime.CreateWorkflow(
            typeof(PTWorkflow.ProjectWorkflow),
            parameters);

The name in the dictionary must correspond to the name of a dependency property defined by the workflow, and of course the type of the value must match the dependency property type.
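For reference, here is a minimal sketch of how such a dependency property might be declared on the workflow. The ProjectWorkflow itself is implemented earlier in the chapter; this fragment only illustrates the "ProjectId" property that the dictionary entry above targets, and the actual implementation in the book may differ:

  using System;
  using System.Workflow.Activities;
  using System.Workflow.ComponentModel;

  public partial class ProjectWorkflow : SequentialWorkflowActivity
  {
    // WF 3.x dependency property; the name "ProjectId" must match
    // the key used in the parameters dictionary
    public static readonly DependencyProperty ProjectIdProperty =
      DependencyProperty.Register(
        "ProjectId", typeof(Guid), typeof(ProjectWorkflow));

    public Guid ProjectId
    {
      get { return (Guid)GetValue(ProjectIdProperty); }
      set { SetValue(ProjectIdProperty, value); }
    }
  }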

Keep in mind that creating an instance of a workflow does not start the workflow. The workflow won’t start executing until the Start() method is called later in the code.

The waitHandle synchronization object is the key to making this process synchronous. The waitHandle object starts out unset because false is passed to its constructor as its initial state:

        AutoResetEvent waitHandle = new AutoResetEvent(false);

At the bottom of the code is a line that calls WaitOne(), thus blocking the current thread until waitHandle is set:

        // execute workflow
        instance.Start();

        // wait for workflow to complete
        waitHandle.WaitOne();

While the current (starting) thread is blocked, the workflow is busy executing on a background thread. In other words, the Start() call returns immediately, having just started the workflow instance executing on a background thread. Without the WaitOne() call, the current thread would exit the code block, which would dispose the WF engine instance while it is executing the workflow. The result would be an exception.

Notice how the event handlers for the WorkflowCompleted and WorkflowTerminated events both call waitHandle.Set(). These events are raised by the WF engine when the workflow either completes or terminates unexpectedly. Either way, by calling the Set() method, the current thread is released so it can continue running.

In the case of a workflow terminating unexpectedly, the exception from the workflow is made available to the WorkflowTerminated event handler. You can choose what to do with this information as appropriate for your application. One technique is shown here, which is to store the Exception object in a field:

          delegate(object sender, WorkflowTerminatedEventArgs e)
          {
            error = e.Exception;
            waitHandle.Set();
          };

And then have the current thread throw the exception once it is unblocked:

        // wait for workflow to complete
        waitHandle.WaitOne();

        // throw any workflow exception
        if (error != null)
          throw error;

The result of this code is that the workflow appears to run synchronously, even though it really executes on a background thread.

Asynchronous Execution

Allowing a workflow to run asynchronously is just a slightly more complex version of running the workflow synchronously. The important thing is to ensure that your process doesn’t exit until the workflow is complete. This means that the synchronization object must be available at a broader scope so you can write code to prevent the application from closing if the workflow is still running.

You also must come up with a way to deal with any exception object in the case that the workflow terminates unexpectedly. One solution is to elevate the error field from the previous example to a broader scope as well.

Finally, the WorkflowRuntime instance must remain in memory until the workflow completes.

This means that you must define these fields so they exist at an application level, for instance using static fields:

  private static AutoResetEvent _waitHandle = null;
  private static Exception _workflowError = null;
  private static WorkflowRuntime _workflowRuntime = null;

Then you can create a method to start the workflow:

    public static void BeginWorkflow(Guid projectId)
    {
      _workflowRuntime = new WorkflowRuntime();
      _waitHandle = new AutoResetEvent(false);
      _workflowRuntime.WorkflowCompleted +=
        delegate(object sender, WorkflowCompletedEventArgs e)
        {
          _waitHandle.Set();
        };
      _workflowRuntime.WorkflowTerminated +=
        delegate(object sender, WorkflowTerminatedEventArgs e)
        {
          _workflowError = e.Exception;
          _waitHandle.Set();
        };

      // create workflow instance
      Dictionary<string,object> parameters = new Dictionary<string,object>();
      parameters.Add("ProjectId", projectId);
      WorkflowInstance instance =
        _workflowRuntime.CreateWorkflow(
          typeof(PTWorkflow.ProjectWorkflow),
          parameters);

      // login before starting WF instance
      ProjectTracker.Library.Security.PTPrincipal.Login("pm", "pm");

      // execute workflow
      instance.Start();
    }

Notice that the WorkflowRuntime object is no longer in a using block, so it can remain in memory, not disposed, while the workflow instance is running on the background thread.

The workflow instance is created the same as before, and its Start() method is called. At that point this method simply ends, returning to the caller.

Once you call BeginWorkflow() the workflow is started on a background thread, but your current thread (often the UI thread) is free to continue working.

The final piece to the puzzle is a method your application can call before it exits, or when it otherwise can’t continue without the workflow having completed:

    public static void EndWorkflow()
    {
      // wait for workflow to complete
      _waitHandle.WaitOne();

      // dispose runtime
      _workflowRuntime.Dispose();

      if (_workflowError != null)
        throw _workflowError;
    }

It is important to realize that this method will block the current thread until _waitHandle is set. If the workflow completes before this method is called, then _waitHandle is already set, and this method runs immediately, but if the workflow is still running, this method will block until the workflow completes or terminates.

For this to work, you must call EndWorkflow() before your process terminates to properly dispose the runtime and to determine if the workflow terminated unexpectedly.
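A minimal usage sketch, assuming BeginWorkflow() and EndWorkflow() are static methods visible to the calling code:

    // start the workflow; the calling (often UI) thread keeps running
    BeginWorkflow(projectId);

    // ... do other work here while the workflow runs ...

    // before the process exits, block until the workflow finishes;
    // this disposes the runtime and rethrows any workflow exception
    EndWorkflow();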

 

Wednesday, August 01, 2007 4:15:26 PM (Central Standard Time, UTC-06:00)