Rockford Lhotka

 Tuesday, September 21, 2004

I read a lot of science fiction. I travel a lot (which you can tell from my list of speaking engagements) and I find it very hard to work on a plane. You might think I’d read computer books, but I regard computer books as reference material, so I very rarely read a computer book cover to cover; I skim it, or use the index to find answers to specific questions… Besides, they are all so darn big, and it gets heavy carrying them around airports :)


So what’s the point of this post? I recently read an excellent book that I believe has a great deal of meaning for the computer industry. Sure, it is science fiction, but it also paints a picture of what computing could look like if today’s service-oriented concepts come to full fruition.


The book is Vernor Vinge’s A Deepness in the Sky, which is a prequel to A Fire Upon the Deep. Fire is an excellent novel, and was written first. I highly recommend it. However, Deepness is the book to which I’m referring when I talk about the links to the computer industry.


I doubt the links are accidental. Vernor Vinge is a retired professor of Computer Science in San Diego, and obviously has some pretty serious understanding of computer science and related issues.


I should point out that what follows could give away key points in the story. The story is awesome, so if you are worried about having it lose some of its impact then STOP READING NOW. Seriously. Take my word for it, go buy and read the book, then come back.


The technology used by the Qeng Ho (the protagonist race, pronounced “cheng ho”) in the book is based on two core concepts: aggregate systems and the wrapping/abstraction of older subsystems.


First is the idea that all the systems are aggregates of smaller systems, which are aggregates of smaller systems and so forth. This is service-orientation incarnate. If you know the protocol/interface to a system, you can incorporate it into another system, thus creating a new system that is an aggregate of both. Extending this into something as complex as a starship or a fleet of starships gives you the effect Vinge describes in the book.


And this is a compelling vision. I have been in a bit of a funk over the past few months. It seems like our industry is lost in the same kind of rabid fanaticism that dominates the US political scene, and that is very depressing. You are either Java or .NET. You are either VB or C#. That gets very old, very fast.


But Vinge has reminded me why I got into computers in the first place. My original vision – way back when – was to actually link two Asteroids arcade games together so multiple people could play at once. It may sound corny, but I have been all about distributed systems since before such things existed.


And Vinge’s vision of a future where massively complex systems are constructed by enabling communication between autonomous distributed systems is exactly what gets me excited! It is like object-oriented design meets distributed architecture in a truly productive and awesome manner. If this really is the future of service-orientation then sign me up!


The second main theme is the idea that most systems wrap older sub-systems. Rather than rewriting or enhancing a subsystem, it is just wrapped by a newer and more useful system. Often the new system is more abstract, leaving hidden functionality available to those who know how to tap directly into the older subsystem.


This second theme enables the first. Unless systems (and subsystems) are viewed as autonomous entities, it is impossible to envision a scenario where service-oriented architecture is a dominant idea. For better or worse, this includes dealing with the fact that you may not like the way a system works. You just deal with it, because it is autonomous.


To digress a bit, what we’ve been trying to do for decades now is get computers to model real life. We tried with procedures, then objects, then components. They all miss the boat, because the real world is full of unpredictable behavior that we all just deal with.


We deal with the jerks that use the shoulder as an illegal turn lane. We deal with the fact that our federal tax cuts just get redirected so we can pay local taxes and school levies. We deal with the fact that some lady is so busy on the cell phone that she rear-ends you at 40 mph when you are stopped at a red light. We deal with the fact that the networks run R rated movie commercials during primetime when our kids are watching TV.


The point is that people and all our real world systems and devices and machines are autonomous. All our interactions in the real world are done through a set of protocols and we just hope to high heaven that the other autonomous entities around us react appropriately.


Service-oriented architecture/design/programming is the same thing. It has the potential to be the most accurate model of real life yet. But this won’t be painless, because any software we write must be ready to deal with the autonomous nature of all the other entities in the virtual setting. Those other autonomous entities are likely to do the direct equivalent of using the shoulder as a turn lane – they’ll break the law for personal advantage and from time to time they’ll cause an accident and every now and then they’ll kill someone. This is true in the real world, and it is the future of software in the service-oriented universe.
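To put some (purely illustrative) code behind that: here is a minimal sketch, in Python, of what it means to program against an autonomous entity. The `flaky_service` and its failure modes are invented for the example; the point is that the caller retries, validates, and falls back, because it cannot control the other party.

```python
import random

def flaky_service(order_total):
    """Stand-in for an autonomous service we don't control: it may fail
    outright, break the protocol, or behave normally."""
    roll = random.random()
    if roll < 0.3:
        raise ConnectionError("service unavailable")
    if roll < 0.4:
        return None  # a misbehaving peer: legal call, broken protocol
    return order_total * 1.07  # normal response

def call_with_retries(fn, arg, attempts=3, fallback=None):
    """Call an autonomous service defensively: retry transient failures,
    validate the response, and degrade gracefully instead of crashing."""
    for _ in range(attempts):
        try:
            result = fn(arg)
        except ConnectionError:
            continue  # transient failure: try again
        if result is not None:  # protocol check on the response
            return result
    return fallback  # the other party never behaved; use the fallback

print(call_with_retries(flaky_service, 100.0, fallback=100.0))
```

None of this defensiveness is needed when you own both sides of the call; all of it is needed when the other side is autonomous.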


To bring this back to Vinge’s Deepness, the various combatants in the book make full use of both the distributed autonomous nature of their computer systems, and of the fact that the systems wrap old – even ancient – subsystems with hidden features. It isn't understanding a computer language that counts; it is understanding the ancient, lost features of deeply buried subsystems that gives you power.


We are given a vision of a future where systems and subsystems are swapped and tweaked and changed and the overall big system just keeps on working. At most, there’s adaptation to some new protocol or interface, but thanks to autonomy and abstraction the overall system keeps working.


Is it perfect? Absolutely not. In fact I never saw the big twist coming – nor did the protagonists or antagonists in the book. The overall system was too complex, too autonomous. There was no way they could have anticipated or monitored what eventually occurred. Very cool (if a bit scary). But I don’t want to spoil it entirely, so I’ll leave it there.


Go read the book – hopefully it will inspire some of the same excitement I got from reading it. It certainly gave me an appreciation for the cool (and scary) aspects of a truly service-oriented world-view!


Tuesday, September 21, 2004 7:59:04 AM (Central Standard Time, UTC-06:00)
Saturday, September 18, 2004

Some people are still trying to have browser wars. Now it is apparently between Firefox and IE. What a misguided concept.


Star Trek covered the issue years ago in an episode titled “Let That Be Your Last Battlefield” (which is probably the only good episode in all of season 3). In this episode there are two characters. One, named Lokai, has black skin on the left side and white skin on the right. Another, Bele, has black on the right and white on the left.


The point of the episode is to illustrate just how inane the concept of racism really is. But the lesson is easily extended to any scenario where meaningless differences are used as a divisive technique. This is true of the silly arguments between VB and C#, Ford and Chevy, and is equally true of the so-called “browser wars” of today.


When two things are identical except for superficial differences, it is a massive waste of time and energy to get worked up over who picks which thing. People who consider themselves “superior” for picking one set of superficial differences are simply (not to pull any punches) idiots.


Years ago when I was a DEC VAX guy we used DEC branded VT terminals (VT52, then VT100 and VT220).


Another company, Wyse, had VT terminals that were cheaper so we switched to them. These terminals also had a couple odd features we didn't use – after all, we still had lots of DEC terminals, so we stuck with the common (standard) ESC sequences.


Browsers are the same thing. Since HTML has stagnated (or was that standardized?), it doesn't really matter what browser you use. Who cares? Virtually all web sites out there use HTML 3.2, because that’s the de facto standard that works reasonably well on all terminals – oops, I mean browsers.


The differences between having tabs or not, the specific icons on the toolbar, or how favorites are organized are immaterial. In the end, all the current browsers pretty much understand the same ESC sequences (except now those sequences are HTML – whoop de doo).


Sure, some people are foolish enough to use the browser as a programming platform (as in using client-side script to do a rich UI). Those poor people are stuck with IE (or whatever browser they targeted), but that is a poor strategy anyway. Note the total lack of development tools support for client-side programming. There's not a vendor out there who is encouraging or enabling client-side programming. It is a total dead-end wasted investment in the long run.


No reputable Internet company is foolish enough to go down the client-side script road. Only misguided IT shops are doing this, and they are going to get burned over time...


If you want a rich client, use Windows or GNOME or KDE. That’s what these technologies were designed for!!!! Don’t bastardize a terminal/browser into doing something way beyond its design parameters. That’s like using a duck as a pack animal when there are perfectly good mules and horses standing right there.


In the end, when you look at IE or Firefox, either switch, or don't switch. I honestly don't see where it matters. This is fundamentally the same debate as whether to switch from VT terminal vendor X to vendor Y - only now the price for both products is zero.


IE is black on the left side, and Firefox is black on the right. Other than superficial differences they are the same damn thing.


In the Star Trek episode the two “races” had fought so long and hard that they’d literally destroyed their planet. Lokai and Bele were the only members of their world left alive. Thankfully the “browser war” is unlikely to decimate Earth or even the IT industry, but it certainly does have the potential to waste more time and energy than free products can possibly be worth…

Saturday, September 18, 2004 8:27:26 AM (Central Standard Time, UTC-06:00)
Thursday, September 16, 2004

From a recent MSDN article on UI design comes this little quote:

“[...] We then discussed two types of UI designs—deductive and inductive. Generally speaking, the former puts the onus on the user to manage and learn a task, while the latter takes on the onus to guide the user through a task. The latter consequently turns out to be a great UI design choice for infrequently used tasks [...]”

I think this is very interesting, because I've seen numerous examples where an inductive UI was used for frequently performed tasks. And while these UIs look very cool, they are really, really inefficient if you have to use them very often...

Thursday, September 16, 2004 5:43:46 PM (Central Standard Time, UTC-06:00)

I just had a conversation with a member of a Microsoft product team earlier this week. He said that he wished they were doing something comparable to ndoc in Visual Studio 2005, but they aren't.


For those who don’t know, ndoc is a very cool tool that takes the XML comments and assembly metadata from your .NET code and creates MSDN-style help in HTML and CHM format. XML comment support is available in VB if you use vbxc, and C# supports it directly. ndoc is open-source software (OSS), and it is very good. I use it on a regular basis and recommend it.


In my mind, the idea that it was somehow bad that Microsoft hadn’t created an alternative to ndoc triggered the question: “Why is it that Microsoft feels the overwhelming need to compete with and replace perfectly good tools that already exist? Especially free ones?”


Not to say that there might not be perfectly good reasons to compete with ndoc on some level. I guess I don't know. But what's the point? To crush the spirit and community effort put forth by some group of loyal .NET developers? That certainly makes little sense...


But I keep forgetting - OSS is evil. Donating time and effort for no immediate monetary return is bad. If you aren’t making money directly off your work then there’s something wrong with you and you should be crushed.


But wait! I am a Microsoft Regional Director (RD) – a relatively small group of people around the planet who help evangelize Microsoft tools and technologies. And I am a Microsoft Most Valuable Professional (MVP) – another group of people around the world who help support the Microsoft community.


The whole point of the RD and MVP programs is for Microsoft to acknowledge people who donate time and effort to Microsoft and the community for no immediate monetary return.


So now I'm conflicted... Donated time/effort is evil when done independently. But Microsoft sanctioned donation of time/effort is encouraged. Say what!?


Honestly, Microsoft just needs to get past this knee-jerk reactionary stance on OSS and realize that it has strong benefits for all of us – including Microsoft.


Part of the recent success of .NET has been due to OSS. Tools like ndoc and a host of others have made .NET development truly productive for many organizations. Productive in ways that even Visual Studio 2005 is unlikely to match. This is only good, as it has spurred adoption of .NET where it otherwise may not have been used.


It is also good in that it has forced some traditionally anti-Microsoft people to rethink their world-view. If OSS can thrive in the .NET space as well or better than it can in the Java space then is .NET really such an evil thing? I’ve personally used the existence and broad support for various .NET OSS tools to bring some Java-focused people to a realization that .NET is a pretty damn cool platform.


And finally there’s the competitive aspect. Microsoft is only good when it has competition. Without competition, Microsoft tends to seriously lag.


Look at Office and the sorry improvements in that space over the past few years. No competition, and the products get incremental and generally lame improvements (with the exception of Outlook 2003, which is really nice!). I don't think I've used a single new feature of Word since Office 97. I just keep upgrading to stay current, not because I get any value.


On the other hand, look at .NET. Due to the competition from Java and J2EE, we Microsoft-loyalists now have the (arguably) best programming platform and tools ever created. Not that .NET is perfect by any means, but it is seriously cool and fun and productive!! All thanks to the Java world, which provided competition and drove Microsoft to make radical shifts in tools and technologies that really benefit us.


And due to the continued pressure from OSS (and other vectors), we’re seeing substantial improvements coming in the 2005 series of .NET tools. For example, integrated unit testing (to compete with nunit).


And I do think this competitive view is healthy, but also frustrating. It is healthy because it drives innovation and integration of cool tools - increasing my productivity. It is frustrating because Microsoft somehow doesn't exude a sense of competition as much as “OSS is evil and must be destroyed” - which is totally counter-productive on all levels.


Competitors aren’t evil, they just are.


I think competition, especially with OSS, should be viewed as a net win overall.


Suppose Microsoft does (at some point) create something comparable to ndoc, but integrated into Visual Studio. We (as users) would get an integrated and probably more polished documentation tool, and the guys who built ndoc would be freed up to go create some new and even cooler OSS tool to fill in some other missing functionality in the Microsoft development tools. Everyone wins – at least if you look at the bright side of things :-)


In the meantime I have work to do - including building some updated online help files by using ndoc.


Thursday, September 16, 2004 9:28:55 AM (Central Standard Time, UTC-06:00)
Thursday, September 9, 2004

I just love a good discussion, and thanks to Ted I’m now part of one :-)


A few days ago I posted another blog entry about service-oriented concepts, drawing a parallel to procedural design. Apparently and unsurprisingly, this parallel doesn’t sit too well with everyone out there.


I agree that SO has the potential to be something new and interesting. Maybe even on par with the fundamental shifts that event-driven programming and object-oriented design have been over the past couple of decades. However, I am far from convinced that it really is such a big, radical shift – at least in most people’s minds.


After all, at its core SO is just about enabling my code to talk to your code – a problem space that has been hashed and rehashed for more years than most of us have been in the industry.


Most people view SO as SOAP over HTTP, which basically makes it the newest RPC/RMI/DCOM technology. And if that’s how it is viewed, then it is a pretty poor replacement for any of them…


Some people view SO as a technology independent concept that can be used to describe business systems, technology systems – almost anything. Pat Helland makes this case in some of his presentations. I love the analogy he makes by using SO to describe an old-fashioned experience of coming in and ordering breakfast in a diner. That makes all the sense in the world to me, and there’s not a computer or XML fragment involved in the whole thing.


Yet other people view SO as a messaging technology. Basically this camp is all about SOAP, but isn’t tied to HTTP (even though there are precious few other transports that are available for SOAP today).


Obviously for SO to be a radical new thing, one of the latter two views must be adopted, since otherwise RPC has it covered.


By far the most interesting view (to me) is of SO as a modeling construct, not a technological one. If SO is a way of describing and decomposing problem spaces in a new way, then that is interesting. Not that there’s a whole lot of tool/product money to be had in this view – but there’s lots of consulting/speaking/writing money to be made. This could be the next RUP or something :-)


Less interesting, but more tangible, is the messaging view. Having worked with companies who have their primary infrastructure built on messaging (using MQ Series or MSMQ), I find this view of SO to be less than inspiring.


It has been done, people! And it works very nicely – but let’s face it, this view of SO is no more interesting or innovative than the RPC view. The idea of passing messages between autonomous applications is not new. And adding angle brackets around our data merely helps the network hardware vendors sell more equipment; it doesn’t fundamentally change life.


Again, I call back to FORTRAN and procedural programming. The whole idea behind procedural programming was to have a set of autonomous procedures that could be called in an appropriate order (orchestrated) in order to achieve the desired functionality. If we had not cheated – if we had actually passed all our data as parameters from procedure to procedure, it might have actually worked. But we didn’t (for a variety of good reasons), and so it collapsed under its own weight.


Maybe SO can avoid this fate, since its distributed nature makes cheating much more difficult. But even so, the design concepts developed 20 years ago for procedural design and programming should apply to SO today, since SO is merely procedural programming with a wire between the procedures.


So in short, other than SO as an analysis tool, I continue to seriously struggle with how it is a transformative technology.


Either SO is a new RPC technology, enabling cross-network component access using XML, or SO is a new messaging technology, enabling the same autonomous communication we have with queuing technologies – but with the dubious advantage of angle brackets. Neither is overly compelling in and of itself.


And yet intuitively SO feels like a bigger thing. Maybe it is the vendor hype: IBM with their services and Microsoft with their products – everyone wants to turn the S in SOA into a $.


But there is this concept of emergent behavior, which is something I’ve put a fair amount of energy into. Emergent behavior is the idea that simple elements – even rehashed technology concepts – can combine in unexpected ways such that new and interesting behaviors emerge.


Maybe, just maybe, SO will turn out to be emergent by combining the concepts of RPC, messaging and angle brackets into something new and revolutionary. Then again, maybe it is a passing fad, and SO is really just another acronym for EDI or EAI.

Thursday, September 9, 2004 8:42:46 AM (Central Standard Time, UTC-06:00)
Wednesday, September 8, 2004

Every now and then I get asked for career advice – often from people just getting into the computer industry, but sometimes from experienced developers who are considering getting into writing, speaking or similar activities.


My overarching career advice is to be patient, focused and persistent. There really is no substitute for experience, so patience is important. At the same time, a focused person can learn more in a year than someone who's just coasting through a job from day to day. And you never get anywhere if you don't keep steady pressure on yourself and your surroundings to move slowly toward your goal, so persistence is key.


In the Clinton-era economic boom I encountered numerous people with 0-3 years of experience as a developer who wanted to be an “architect”. Now! Instantly! That is an absurd concept, since an actual architect can only function due to having years of experience on many different types of projects in different settings and environments. There really is no substitute for experience.


However, I’ve worked with people who have 5-7 years of experience that are absolutely qualified to be application architects. I’ve also worked with people who have 10+ years of experience that are not qualified for anything beyond the developer role.


What’s the difference? Focus.


Some people have a career, others have a series of jobs. People who have a career understand that everything is a learning experience, and that focusing on what each task or job can teach you pays off in the long run. People who just work in a string of jobs don’t have this focus and don’t learn nearly as much.


Like most people, I’ve had my share of crappy projects where I worked on old technology rather than the newest and best stuff. But I’ll tell you that each and every one of those projects taught me something. Maybe not about some specific tool, but often about business, or software design, or architectural principles.


To be truly successful in the computer industry you need to understand far more than just tools and design patterns. You need to understand the interactions between users, and systems, and networks, and operating systems, and tools, and fads/trends. You need to appreciate the cyclic nature of our technology so you can recognize when some “new” thing is an upgraded rehash of an old thing (like SOA and procedural programming).


These things only come through focus. Focus on your career, not just on the job/project at hand. A career is a long-term play. Who cares if you are a developer for a few years before becoming a lead, and then an architect? We’re talking about a 30-40 year span of time here, so 5-10 years to become an architect is nothing.


Yes, I know that careers used to be a 30 year proposition, but due to longer lifespans and the erosion of retirement security (e.g. the impending crash of social security and the destruction of pensions due to things like Enron) most of us (in the US at least) can expect to spend more like 40 years in the workforce in one capacity or another. Personally I’d rather spend more time doing cool computer stuff and less working at McDonalds as a “returning worker”, so I stay focused on computers and automation.


What I’m getting at here is that things take time and effort. We have time, and you can choose to put forth effort. So that’s OK.


But while you must be patient, you absolutely need to keep pushing toward your long-term goals. Persistence is what makes your patience and focus pay off. Purely patient people can succumb to inertia. Purely focused people can be sidetracked by frustration. But persistent people have the ability to patiently focus over the long-term in order to get where they want to be.


Wednesday, September 8, 2004 8:48:38 AM (Central Standard Time, UTC-06:00)
Friday, September 3, 2004

In some previous posts on service-oriented architecture (SOA), I got comments from various people indicating that SO could/should be applied inside applications as well as between applications. I am skeptical about this - here's why.

Attempting to apply service-oriented concepts inside an application is non-trivial. This is exactly what we tried to do in FORTRAN, C and other languages years ago.

At the time we all argued that it was a good idea - procedural programming made a lot of sense. If you passed all required data in via parameters, and got all data out as a result, then procedures were a great way to encapsulate behavior. In this context a procedure of yesteryear and a service of tomorrow are really the same thing.

Of course it didn't actually work that way. Packaging all the data into the parameters was often a huge task, and so virtually everyone cheated by using a common block of memory or other global variable techniques. Thus procedural programming was brought to its knees, because procedures didn't “own” the data on which they operated.
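The “cheating” is easy to show in miniature. This is just an illustrative sketch in Python (the invoice example is invented), but the contrast is the same one we had in FORTRAN: the first procedure owns nothing and is handed everything; the second quietly depends on shared state, the moral equivalent of a common block.

```python
# The honest procedure: everything it needs comes in as parameters,
# everything it produces goes out as the return value.
def compute_invoice(price, quantity, tax_rate):
    subtotal = price * quantity
    return subtotal + subtotal * tax_rate

# The "cheating" procedure: it reads and writes shared globals,
# hidden inputs and outputs that no caller can see in the signature.
TAX_RATE = 0.07
last_subtotal = 0.0

def compute_invoice_cheating(price, quantity):
    global last_subtotal
    last_subtotal = price * quantity                  # hidden output
    return last_subtotal + last_subtotal * TAX_RATE   # hidden input

print(compute_invoice(10.0, 2, 0.07))
print(compute_invoice_cheating(10.0, 2))
```

The second version works until two callers interleave, or until someone changes TAX_RATE without knowing who reads it - which is exactly how procedural systems collapsed under their own weight.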

This is a core flaw in object-oriented (OO) designs as well. It is very common to pass an object as a parameter and have some method act on the object. But what you really pass is an object reference. The actual object is conceptually similar to a common block in that case.

Now I agree that OO avoids many flaws of procedural programming, but the fact that methods end up acting on the same physical set of data (objects) can be problematic.
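Here's a tiny Python sketch (again, an invented example) of that aliasing: the “parameter” is really a reference, so the method operates on the caller's one physical object.

```python
class Order:
    def __init__(self, items):
        self.items = items

def add_item(order, item):
    # "order" looks like a parameter, but it is a reference to the
    # caller's object - the method mutates shared state, common-block style.
    order.items.append(item)

shared = Order(["widget"])
add_item(shared, "gadget")
print(shared.items)  # the caller's object changed underneath it
```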

So to apply SO (service-oriented) concepts inside an application means that we're going to take the procedural approach from FORTRAN, but this time we're not going to cheat. Instead we're going to package all the required data into a message (not an object, because the message must be passed by value!!!) and pass it to the procedure (service).

As with FORTRAN years ago, I agree that this could be done. But we didn't have the discipline then, and I very much doubt we have the discipline now. It is just too damn much work to create an XML document (or whatever) each time I need to call a method, and then to unpack an XML document to get at the result. It is simply unrealistic to expect that developers are going to go through this much work.

And people today complain about code bloat. Think of the bloat we're talking about here. Rather than putting an integer on the stack to call a method, we're now going to put the integer into XML and put the XML on the stack? Good thing Moore's Law is out there to save our collective asses...
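To make the bloat concrete, here is a sketch in Python (the `add` “service” and its message format are invented for illustration). Instead of pushing two integers onto the stack, we copy them into a by-value message and then parse them back out:

```python
import xml.etree.ElementTree as ET

def add(a, b):
    return a + b

# Service-style call: copy the parameters into a by-value message...
request = ET.Element("add")
ET.SubElement(request, "a").text = "2"
ET.SubElement(request, "b").text = "3"
wire = ET.tostring(request)

# ...then unpack it again on the other side before doing any real work.
parsed = ET.fromstring(wire)
result = add(int(parsed.find("a").text), int(parsed.find("b").text))

print(len(wire), "bytes on the wire; result =", result)
```

A serialize step, a parse step, and a message many times the size of the two integers it carries - that is the tax on not cheating.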

This is not to say that SO won't work between applications. In that case it will (I think) work very nicely, because you can't cheat. It isn't possible to use a common block or reference to an object across process or network boundaries, so we have to pack parameters and pass them by value.

In the end, SO is what FORTRAN and C were meant to be - and without the possibility of cheating it might really work. Long live procedural programming!

Friday, September 3, 2004 8:04:15 AM (Central Standard Time, UTC-06:00)

Pat Helland from Microsoft is a recognized expert in the emerging area of Service-Oriented Architecture. His new paper is quite thorough...

Data on the Outside vs. Data on the Inside


Friday, September 3, 2004 7:59:30 AM (Central Standard Time, UTC-06:00)
Tuesday, August 31, 2004

The service packs are:

Microsoft .NET Framework 1.0 Service Pack 3 (SP3)

Microsoft .NET Framework 1.1 Service Pack 1 (SP1)

Microsoft .NET Framework 1.1 Service Pack 1 (SP1) for Windows Server 2003

These service packs increase overall security, and support changes made to Windows in Windows XP SP2.

Tuesday, August 31, 2004 12:33:37 PM (Central Standard Time, UTC-06:00)

Years ago it was Carl and Gary's that was the central hub for the VB community. The place we all started browsing and then jumped off to other locations. There really hasn't been an equivalent hub (or portal) for a very long time.

But I think there's hope - check out Robert Green's blog entry on the new VB community site updates.

Robert has been working (along with Duncan) on this for quite a while now, soliciting input from a lot of people in the VB community - including authors, speakers and others. The site has been slowly evolving, and now is really starting to show some great promise as a central hub for the VB community.


Tuesday, August 31, 2004 12:29:04 PM (Central Standard Time, UTC-06:00)
Thursday, August 26, 2004

OK, now I feel better. Perhaps I jumped the gun with my previous post.


Rich Turner gave an awesome presentation – totally on the mark from start to end.


He was very, very clear that the prescriptive guidance is to use asmx (web services) to cross service boundaries and to use Enterprise Services (COM+), MSMQ, Remoting or asmx inside a service boundary.


Note that inside a service might be multiple tiers. Multiple physical tiers. You might cross network boundaries (though that should be minimized), but that’s OK. This is all inside your service, within your control. Since it is inside your control, you should choose the appropriate technology based on all criteria (such as performance, transactional support, security, infrastructure support, deployment and so forth).


This is the best and most clear guidance I’ve heard from Microsoft yet. Very nice!


Thursday, August 26, 2004 10:32:31 PM (Central Standard Time, UTC-06:00)

I’d laugh except that it makes me want to cry.


I’m at a Microsoft training event, being briefed on various technologies by people on the product teams – including content on Indigo.


The unit manager gave an overview, and someone asked about the recommended architecture guidance around today’s Remoting technology. He reiterated that the recommendation is to only use it within a process. This, after he’d just finished pointing out that there are scenarios today that are only solved by remoting.


Say what?


Then several other Indigo team members covered various features of Indigo and how they map to today’s technology and how we may get from today to Indigo. Numerous times it comes up that Indigo incorporates much of the Remoting model (because it is good), and that most code using Remoting today will transparently migrate to Indigo when it gets here.


So what now?


First, the prescriptive guidance is nuts. They are saying conflicting things and just feeding confusion. Remoting is sometimes the only answer, but don't use it?


I'm sorry, I have to build real apps between now and whenever Indigo shows up. If Remoting is the answer, then it is the answer. End of story.


Second, it turns out that you are fine with Remoting: as long as you don’t create custom sinks or formatters, your Remoting code will move to Indigo just as easily as any asmx code you write today (which is to say with minimal code changes).


And of course you should avoid the TCP channel and custom hosts – use IIS, the HttpChannel and the BinaryFormatter and life is good.


Finally, (as I’ve discussed before), Remoting is for communication between tiers of your own application. If you are communicating across trust boundaries (between separate applications) then you should use web services – or better yet use WSE 2.0.


Conversely, if you are using web services or WSE 2.0, then you have inserted a trust boundary and you shouldn’t be pretending that you are communicating between tiers – you are now communicating between separate applications.

Thursday, August 26, 2004 9:29:00 PM (Central Standard Time, UTC-06:00)