Wednesday, August 31, 2005

Less Is More -- by Barry Schwartz

The following edited excerpt comes from a presentation given at Pop!Tech 2004 in Camden, Maine.  The speaker was psychology professor Barry Schwartz of Swarthmore College.

I've always been thoroughly fascinated by psychological explanations of human behavior, dating back to my days as a Ph.D. graduate student studying cognitive psychology.

You can listen to this speech yourself by visiting IT Conversations, a fabulous free resource providing superb IT-related lectures delivered as podcasts.
Since birth, modern women have been told that they can do and be anything they want.  Be an astronaut, the head of an Internet company, or a stay-at-home mom.  There aren't any rules anymore and the choices are endless.

Is it possible that we've become so spoiled by choices that we're unable to make one?  

A part of us knows that once you choose something -- another option goes away.

The desire to have it all, and the illusion that we can, is one of the principal sources of torture in modern, affluent, free, and autonomous western societies.

We have much more freedom and flexibility about where we work and when we work than we ever did before.  There's obviously something very good about that.  But what that means is that every minute of every day we are faced with a choice of whether or not to be working.  Where you are is no longer an excuse for not working.  The time of day is no longer an excuse for not working.  That you are already doing something else at the same time is no longer an excuse for not working.

We are now in charge of what we look like in a way that we weren't before.  You used to be in charge of only the surface of your physical appearance: the clothes you wore, or your haircut.  But now you get to shape your body as well.  If you have too much tissue in one place, you just suck it out and reinject it someplace else where you don't have quite enough.  You can paralyze your facial muscles to make the little lines go away.  Anything is possible, and what this means is that physical appearance is now a matter of personal choice and personal responsibility.  And what that means is that if you're unattractive, it is your fault.  Unattractiveness is a matter of choice.  It didn't use to be.

People could always choose whether to marry.  Having decided to marry, they could always choose to end the marriage.  And they could always choose whether and when to have children.  In the old days, although these choices were available, the default assumption was so powerful that most people didn't feel like there was any choice.  There was a choice of mates.  But after that everything else just ran its course.  You got married as soon as you could.  You started having kids as soon as you could.  It's obvious that none of that is true anymore in our society.  Should I marry or shouldn't I?  Should I marry now or should I marry later?  Should I have children or shouldn't I?  Should I have them now or should I have them later?  Each and every one of these decisions is very consequential and very real in the minds of today's young people.

Religion comes in a lot more flavors than it used to.  Even identity -- who we are -- is up for grabs.  We don't automatically inherit the identities our parents gave us.  We are free to re-identify and reshape ourselves on almost a daily basis.  

These are consequential decisions because they tell the world something about who we are.  No matter how trivial the decision may seem, it says something to the world about you, or at least some people think it does, and that makes unimportant decisions become important decisions.   This creates the paradox.  

We have more freedom in America, more freedom of choice, than any people has ever had anywhere on earth before.  We also have more money than any people anywhere on earth has ever had before, which is not insignificant, because freedom without the money with which to exercise choice is a kind of empty freedom.  So this is, as you might imagine, the best of all possible worlds.

Well, no.  Americans are sadder than Americans have ever been before.  Clinical depression is more than twice as frequent now as it was a generation ago.  Some people have estimated that it is a hundred times more frequent than it was a century ago.  This is also true of suicide.  Not only is the frequency of these things higher than ever before, but it is also being observed in younger and younger people than ever before.  

The problem of making choices makes a direct contribution to the fact that in the face of plenty, Americans are more and more dissatisfied with their lives.  

What Does Too Much Choice Do?
The classic study involved buying jam at a gourmet food store in Palo Alto.  This store would set up a little table where people could sample products and then, if they wanted to buy the products, they could.  Experimenters set up a table that offered imported jams.  One week there were 24 different flavors of these jams sitting on the table.  If you stopped by you could sample as many as you wanted, and then you'd get a coupon that would give you a dollar off on any jam you bought.  The next week, the same setup, except instead of 24 jams there were only six.  Again, you could stop by, sample, and if you liked the product you could buy the jam and get a dollar off using the coupon.  What they found is that when there were 24 jams on display, more people were attracted than when there were six.  More tasting, more coupons dispensed, and one tenth as many people bought jam.  A profusion of choices produces a kind of paralysis.  Which jam should I buy?  How am I going to decide?

Too much choice makes it impossible for people to choose.  With all of this choice they may do better when they finally pull the trigger and make a decision, but they will feel worse.  In other words, you do better objectively yet you feel worse about the results of your decision.  The question is why.

One reason why is regret.  You choose the boysenberry jam.  You take it home.  It's good, but maybe the blueberry would have been better.  And you regret not having chosen the blueberry.  What that does is subtract from the satisfaction you get from the boysenberry even though boysenberry's a perfectly good choice.  So, regret poisons good results, and even worse than that, anticipated regret -- the worry that you will end up sorry that you didn't choose the blueberry -- makes choosing itself almost impossible.  

There's a concept economists call opportunity costs.  Anytime you choose one thing, you are passing up the attractive features of other things.  These missed opportunities are the opportunity costs, and the more attractive alternatives there are, the more opportunity costs there will be.  These costs accumulate and detract from the satisfaction you get out of what might be a perfectly good decision.

Choice causes an escalation in expectations.  A profusion of choices leads to a decrease in satisfaction with decisions.  Often, satisfaction is determined not by the objective results of the decision but by how the results of our decision line up with our expectations.  If expectations keep leaping ahead of objective results, then at best we're running in place, and often we're falling behind.

Not to romanticize the past, but the reason everything seemed better back when everything was worse is that when everything was worse, people's expectations were much more moderate than they are now.  So it was actually possible to have experiences that exceeded expectations.  It is no longer possible for people living in our society to have experiences that exceed expectations, because expectations are so high.

Finally, you go out to buy something.  You make a choice.  You bring it back.  It's disappointing.  It doesn't live up to your expectations and you have to ask this question.  Whose fault is it that this choice failed to live up to expectations?  The feeling today is the fault is yours.  There is no excuse for failure in a world where the choices are essentially infinite.  There is no excuse for anything less than perfection.  

The paradox in all of this is that what really, really produces flourishing -- what really makes people happy -- what really produces satisfaction -- is close relations to other people.  That's the single most important determinant of well-being that anyone has identified in 40 years of research.  

The thing to notice about close relations with other people is that they constrain.  They don't liberate.  What it means to be close to someone is that you are not free to make all of these choices for yourself.  You have to consider the needs, interests, and desires of others.  So, your choice set is limited by the fact that you care about other people and other people care about you.  

In an affluent society like this one, anything that constrains choices is itself a benefit.  One of the benefits of being involved with other people in an intimate way is that it limits your possibilities.  We are now desperate for things in society that will limit our possibilities.  

How Can Choice Be Both Good and Bad?
Economists have a term called diminishing marginal utility.  Imagine a curve where the x-axis is the number of choices you have, and the y-axis is your subjective state (how good you feel).  Living with no choice is infinitely bad.  You can't be a human being if you live in a world with no opportunity to determine your destiny.  As the number of choices in life increases, our well-being also increases.  But, eventually, the marginal benefits of additional choices become vanishingly small.  They go almost to zero.  Adding more and more options doesn't increase our well-being.

At the same time, for each additional choice we face, there's a negative attached to it -- uncertainty -- regret -- raised expectations.  The more choices you have, the bigger the negative effect.  Going from no choice to some choice dramatically improves our well-being.  But a point is reached where well-being actually crosses the zero point and starts to go negative as a function of increased choice.  

So, you can be anything you want to be.  No limits.  The only problem is that being anything you want to be is only possible within a world in which there are limits.  If you take away all the limits it becomes impossible to realize potential.  

It is only within a set of constraints that the realization of human potential is possible.  And we have, as a society, moved in the direction of assuming all constraints are bad and our task in life is to shatter as many constraints as possible.  The result has been to make us more and more dissatisfied with life even as the material circumstances of life get better and better.  

There's an optimal amount of choice.  Some amount of choice is better than no choice unequivocally.  But, we have assumed that since some choice is good, more choice is better.  We've assumed that the relation between choice and well-being is monotonic.  The curve goes in one direction.  That's false.  
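
As an aside from the excerpt: here is a minimal numeric sketch of that non-monotonic curve.  The benefit and cost functions below are assumptions I chose purely for illustration; they are not Schwartz's data.

  // Illustrative sketch only: a saturating benefit of choice minus a small
  // per-option cost (regret, opportunity cost, raised expectations).
  const benefit = (n: number): number => 10 * (1 - Math.exp(-n / 5)); // diminishing returns
  const cost = (n: number): number => 0.4 * n;                        // grows with each option

  for (const n of [0, 1, 3, 6, 12, 24, 48]) {
    console.log(`${n} options -> net well-being ${(benefit(n) - cost(n)).toFixed(2)}`);
  }
  // The output rises sharply at first, flattens, and eventually goes negative,
  // which is the non-monotonic curve described above.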


Monday, August 29, 2005

'Stay Hungry. Stay Foolish.'

Just in case you missed it last June, Steve Jobs's commencement address to Stanford University's class of 2005 is once again in the news. Fortune Magazine republished it in its entirety this past week.

I strongly encourage you to take the time to read the complete transcript. You can access a copy by clicking here. A video of the commencement address can also be viewed by clicking here.

Steve Jobs, former and present CEO of Apple Computer, as well as CEO of Pixar, shared three life lessons in his commencement address that struck a powerful chord -- not only with Stanford's graduating class but also with tech workers everywhere.



  1. The first story is about connecting the dots. You can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something -- your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

  2. The second story is about love and loss. Sometimes life hits you in the head with a brick. Don't lose faith. You've got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.


  3. The third story is about death. To help make the big choices in life, remember almost everything -- all external expectations, all pride, all fear of embarrassment or failure -- these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart. Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma -- which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.


What is "IT Governance"?

Charlie Betz runs a Yahoo!Group called erp4it · ERP for IT, along with a corresponding blog called erp4it: Architecting IT Governance. There's been a lively debate in response to his recent posting entitled Defining "IT Governance". Below are edited excerpts from the discussion:
The definition in Peter Weill and Jeanne Ross's recent book, IT Governance: How Top Performers Manage IT Decision Rights for Superior Results, refers to a "firm's allocation of IT decision rights and accountability." The purpose is simple and straightforward -- "to encourage desirable behavior in using IT."

Weill and Ross ask the question, "Do your IT investments target enterprisewide strategic priorities -- or does your firm squander resources on diverse tactical initiatives?"

According to the authors, "firms manage assets -- people, money, plant, and customer relationships -- but information and the technologies that collect, store, and disseminate information may be the assets that perplex them the most. Business needs constantly change, while systems, once in place, remain relatively rigid."

As Weill and Ross point out, "IT governance encourages and leverages the ingenuity of the enterprise's people in IT usage and ensures compliance with the enterprise's overall vision and values."

They continue, "All enterprises have IT governance. Those with effective governance encourage behavior consistent with their enterprise's mission, strategy, values, norms, and culture. In contrast, enterprises that govern IT by default more often find that IT can sabotage business strategy. Good governance allows enterprises to deliver superior results on their IT investments."

Weill and Ross conclude, "Governance is the single most important predictor of the value an organization generates from IT."

Charlie Betz views the term "Governance" slightly differently, by extending its definition "to include IT service management, portfolio management, and the software development lifecycle."

Craig Symons, a Forrester analyst, wrote a US$349.00 research report entitled 'IT Governance Framework,' which states: "IT governance, at its most basic, is the process of making decisions about IT. By this simple definition, every organization has some form of IT governance. Good IT governance ensures that IT investments are optimized, aligned with business strategy, and delivering value within acceptable risk boundaries -- taking into account culture, organizational structure, maturity, and strategy."

Daniel Rolles commented on the "analogy between IT governance and corporate governance. If corporate governance is about business unit & firm level risk management, accounting standards, ethics charters, etc., then IT governance is about application & infrastructure risk management, project management standards, etc." He defines corporate governance as "the process by which agencies are directed and controlled. It is generally understood to encompass authority, accountability, stewardship, leadership, direction and control."

Nick Gall, a Gartner analyst, added "governance is 'the management of management'". He points out how "nicely this fits the definition of the role of the board of directors vis-a-vis management":
  • the board governs -- manages the management
  • the managers manage -- manage the company

On the one hand, I agree with Peter Weill and Jeanne Ross's definition of IT Governance focusing as it does on decision behavior and accountability. On the other hand, compliance depends on operationalizing IT Governance.

While process control frameworks such as CMMI (Capability Maturity Model Integration), ITIL (IT Infrastructure Library), ISO 9000, CobiT (Control Objectives for Information and related Technology), and Six Sigma all play an important role in IT Governance, I personally believe the most important and relatively inexpensive first step is mapping an enterprise's portfolio of past IT investments onto a technology architecture and then identifying IT standards.

No matter what performance metric you choose to measure, standardization improves efficiency by lowering costs, shortening cycle times, and reducing staffing. Simultaneously, standardization coupled with consolidation increases effectiveness, expands interoperability, and even improves security.

IT Standards ought to serve as the cornerstone for all IT Governance initiatives. If you want a responsive, agile IT organization, adopt a simplified, streamlined, standardized computing environment. Doing so will reduce inefficiencies and eliminate unnecessary duplication.

Please pardon my soapbox. I've just recently written on this topic in a whitepaper entitled The IT Standards Manifesto. I feel pretty strongly that IT Governance ought to start by eliminating the huge ongoing waste of resources spent supporting redundant products that deliver identical functionality, purchased from multiple vendors by multiple project teams. What's neither well understood nor well documented is how far the total cost of a product's ownership (its TCO) extends beyond the initial purchase price.

Thursday, August 25, 2005

Mess O'Potamia

I received one of those "Do not break this chain" emails which I've excerpted below. I have no idea who wrote it. But it is interesting.
  • The garden of Eden was in Iraq.
  • Mesopotamia, which is now Iraq, was the cradle of civilization!
  • Noah built the ark in Iraq.
  • The Tower of Babel was in Iraq.
  • Abraham was from Ur, which is in Southern Iraq!
  • Isaac's wife Rebekah is from Nahor, which is in Iraq!
  • Jacob met Rachel in Iraq.
  • Jonah preached in Nineveh -- which is in Iraq.
  • Assyria, which is in Iraq, conquered the ten tribes of Israel.
  • Amos cried out in Iraq!
  • Babylon, which is in Iraq, destroyed Jerusalem.
  • Daniel was in the lion's den in Iraq!
  • The three Hebrew children were in the fire in Iraq (Jesus had been in Iraq also as the fourth person in the fiery furnace!)
  • Belshazzar, the King of Babylon saw the "writing on the wall" in Iraq.
  • Nebuchadnezzar, King of Babylon, carried the Jews captive into Iraq.
  • Ezekiel preached in Iraq.
  • The wise men were from Iraq.
  • Peter preached in Iraq.
  • The "Empire of Man" described in Revelation is called Babylon, which was a city in Iraq!
Israel is the nation most often mentioned in the Bible. Iraq is second; however, that is not the name that appears in the Bible. Instead, the Bible uses the names Babylon, Land of Shinar, and Mesopotamia. The word Mesopotamia means "between the two rivers," more exactly between the Tigris and Euphrates Rivers. The name Iraq means "country with deep roots."


Wednesday, August 24, 2005

Technological Revolutions

Irving Wladawsky-Berger's blog quotes Carlota Perez, a Venezuelan scholar and expert who views technology revolutions as engines of growth, rejuvenating and transforming the whole economy, as well as re-shaping social behavior and the institutions of society.

Below is an edited excerpt based on an online lecture:
To better understand the dynamics of a technology revolution, we should split it into two different periods, each lasting 20 to 30 years.

The "installation period" is the time of creative destruction, when:
  • new technologies emerge from the lab into the marketplace
  • entrepreneurs start many new businesses based on these new technologies
  • venture capitalists encourage experimentation with new business models and speculation in new money-making schemes
Inevitably, this all leads to the kind of financial bubble and crash we are all quite familiar with from our recent experience.

After the crash, comes the "deployment period," which she views as a time of institutional recomposition:
  • the now well accepted technologies and economic paradigms become the norm
  • infrastructures and industries start getting better defined and more stable
  • production capital drives long-term growth and expansion by spreading and multiplying the successful business models
Carlota Perez believes that we may not yet have entered the deployment period, as the crash phase doesn't seem to have resolved itself. She mentions three particular structural tensions that we still need to work out in order to move on:
  • investments continue to be focused on short-term gain, not on long-term production and growth
  • the social system continues to foster an unstable environment in which the rich get richer and the poor get poorer
  • there is too much "idle money" chasing and inflating assets like housing and not going into expanding the demand needed to soak up all the excess supply being produced


Monday, August 22, 2005

An Architectural Pattern

MDC (Model, Document, and Communicate) is to Architecture what MVC (Model, View, and Controller) is to Software Development. Both represent important patterns for others to follow. Both are elegant and simple.

Architecture is different from software development. Architectural patterns are similar to, but not exactly like, what The Gang of Four wrote about in their seminal book Design Patterns. Rather than describing the elements of reusable object-oriented software, architectural patterns deal with the activities typically performed by architects (instead of developers).

The Gang of Four set out to use patterns in a generative way, in the sense that Christopher Alexander uses patterns for urban planning and building architecture. The term "generative" later came to mean "creational," distinguishing such patterns from "Gamma patterns" that merely capture observations.

Christopher Alexander coined the term "Pattern Language" to emphasize his belief that people had an innate ability for design that paralleled their ability to speak.

Chris's book The Timeless Way of Building is the most instructive in describing his notion of a pattern language and its application to designing and building buildings and towns.

He defined a 'pattern' as a three part construct:
  1. first comes the 'context' -- under what conditions does this pattern hold
  2. next comes a 'system of forces' -- these represent the 'problem' or 'goal'
  3. the third part is the 'solution' -- a configuration that balances the system of forces or solves the problems presented
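
As a rough illustration only (my own sketch, not something from Alexander or the Gang of Four), the three-part construct maps naturally onto a simple data type; the sample pattern is a paraphrase of one of Alexander's:

  // A minimal sketch of the three-part construct as a TypeScript type.
  // Field names follow the list above; the sample pattern is included
  // purely as an illustration.
  interface Pattern {
    name: string;
    context: string;     // under what conditions the pattern holds
    forces: string[];    // the system of forces (the problem or goal)
    solution: string;    // a configuration that balances the forces
  }

  const lightOnTwoSides: Pattern = {
    name: "Light on Two Sides of Every Room",
    context: "Laying out the individual rooms of a building",
    forces: [
      "People gravitate toward natural light",
      "Rooms lit from only one side feel flat and glare-prone",
    ],
    solution: "Give every room windows on at least two sides",
  };
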
According to The Gang of Four, Patterns and Pattern Languages are ways to describe best practices and good designs, and to capture experience in a way that makes it possible for others to reuse that experience.

Patterns represent a common vocabulary for expressing architectural concepts. The goal of architectural patterns is to create a body of literature to help architects resolve recurring problems. Patterns help create a shared language for communicating insight and experience about problems and solutions.

Let's look at MDC -- Model, Document, and Communicate.

Architects deal with models. Different types of architects deal with different types of models: Business architects model business processes; Data architects model business objects (i.e., entities); Application architects model business solutions; Technology architects model technology portfolios.

The goal of architecture is to document everything. Formally codifying models provides a framework for capturing and organizing content.

Modeling and documenting is worthless unless the captured and organized information can be easily communicated to the people who need to access it.
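
To make the pattern slightly more concrete, here is a minimal sketch of MDC as an interface. This is entirely my own illustration; the type and method names are hypothetical.

  // A hypothetical rendering of MDC (Model, Document, Communicate) as an
  // interface that each kind of architect could implement.
  interface ArchitecturePractice<M> {
    model(): M;                                          // capture the domain as a model
    document(model: M): string;                          // codify the model in an organized form
    communicate(doc: string, audience: string[]): void;  // deliver it to the people who need it
  }

Each type of architect listed above would supply its own kind of model; the point is simply that documenting and communicating are first-class operations rather than afterthoughts.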

Wednesday, August 17, 2005

Stupid Is As Stupid Does


Can you imagine Forrest Gump as President of the United States? "Too smart," you say!  I agree. Even Forrest Gump isn't stupider than George W. Bush.

The New Yorker published a transcript indicating that Bush received a cumulative score of 77 for his first three years at Yale and a roughly similar average under a non-numerical rating system during his senior year. Of course, John Kerry, who graduated two years before Bush, received a cumulative score of only 76 for his four years. Leave it to the Democrats to out-stupid the Republicans.

Under Bush's leadership, America has become the world's greatest debtor nation sucking up more than 80% of all global savings. What happens when that figure hits 100%?  This administration claims there's no scientific proof that global warming is happening and thus no need to legislate reductions in greenhouse emissions. This from a president who believes that 'Intelligent Design' ought to be taught in science classes. I don't even want to mention the stupidity that led us into Iraq or how that war is currently being waged. Let's just leave it that stupid is as stupid does.

So, how about Forrest Gump as CIO?  "Still way too smart," you say.

Just like Forrest Gump saying "Life is like a box of chocolates", it's even more true that "IT is like a box of chocolates". CIOs haven't a clue what's on the inside of their systems.

The majority of IT organizations do not possess a Technology Architecture. Nobody knows what the enterprise owns, let alone what products to use under what conditions. The communication of IT standards in most enterprises today is a nonexistent process.

It seems as if architects are looking for some magic silver bullet solution. No one is willing to do any real work -- like trying to create a Technology Architecture that models and communicates what technology to use under what conditions and why. Instead, architects are looking for some simple solution to go off and scour the IT environment to discover what the enterprise already owns, as if that's the main problem.

Let's assume an automated process finds that an enterprise is using Oracle, SQL Server, and DB2. What good does that do? They just show up as three line items on an inventory report. How does someone intuitively know that these three different products all deliver virtually identical functionality? This problem is greatly magnified by the scores of different product categories beyond relational DBMSs.

The federal government is investing billions in enterprise architecture. Yet, Technology Architecture is at the bottom of the FEAF (Federal Enterprise Architecture Framework), almost as an afterthought.

Technology Architecture is not the same as Asset Management; the difference between the two is comparable to the difference between metadata and data. Technology Architecture needs to be a model that describes products in terms of functionality so that similar tools cluster together. This requires a taxonomy.
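
A rough sketch of what such a taxonomy might look like follows; the categories and entries are purely illustrative, not a real inventory.

  // Illustrative sketch only: classify products by functional category so that
  // equivalent tools cluster together instead of appearing as unrelated line items.
  type Category = "Relational DBMS" | "Application Server" | "ETL Tool";

  interface Product {
    name: string;
    vendor: string;
    category: Category; // the functional classification that drives clustering
  }

  const portfolio: Product[] = [
    { name: "Oracle Database", vendor: "Oracle", category: "Relational DBMS" },
    { name: "SQL Server", vendor: "Microsoft", category: "Relational DBMS" },
    { name: "DB2", vendor: "IBM", category: "Relational DBMS" },
  ];

  // Group by category: three separate inventory entries collapse into one cluster
  // of functionally equivalent products, an obvious candidate for standardization.
  const clusters = new Map<Category, Product[]>();
  for (const p of portfolio) {
    clusters.set(p.category, [...(clusters.get(p.category) ?? []), p]);
  }
  console.log(clusters);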

Monday, August 15, 2005

Virtuoso Teams



The Harvard Business School Working Knowledge newsletter asks the question:
Can Superstars Play the Team Game?


Below is an edited excerpt:
In nearly any area of human achievement -- business, the arts, science, athletics, politics -- you can find teams that produce outstanding and innovative results. Such work groups are referred to as virtuoso teams, and they are fundamentally different from the garden-variety groups that most organizations form. Virtuoso teams comprise elite experts specially convened for ambitious projects. Their work style has a frenetic rhythm. They emanate a discernible energy. They are utterly unique in the ambitiousness of their goals, the intensity of their conversations, the degree of their spirit, and the extraordinary results they deliver.

Despite such potential, most companies deliberately avoid virtuoso teams, thinking that the risks are too high. For one thing, it's tough to keep virtuoso teams together once they achieve their goals -- burnout and the lure of new challenges rapidly winnow the ranks. For another, most firms consider expert individuals to be too elitist, temperamental, egocentric, and difficult to work with. Force such people to collaborate on a high-stakes project and they're bound to come to fisticuffs. Even the very notion of managing such a group seems unimaginable. So most organizations fall into default mode, setting up project teams of people who get along nicely. The result is mediocrity.

Virtuoso teams play by a different set of rules than other teams. Unlike traditional teams -- which are typically made up of whoever's available, regardless of talent -- virtuoso teams consist of star performers who are handpicked to play specific, key roles. These teams are intense and intimate, and they work best when members are forced together in cramped spaces under strict time constraints. They assume that their customers are every bit as smart and sophisticated as they are, so they don't cater to a stereotypical "average." Leaders of virtuoso teams put a premium on great collaboration -- and they're not afraid to encourage creative confrontation to get it.

Traditional teams typically operate under the tyranny of the "we" -- that is, they put group consensus and constraint above individual freedom. Team harmony is important; conviviality compensates for missing talent. This produces teams with great attitudes and happy members, but from a polite team comes a polite result.

When virtuoso teams begin their work, individuals are in and group consensus is out. As the project progresses, however, the individual stars harness themselves to the product of the group. Sooner or later, the members break through their own egocentrism and become a plurality with a single-minded focus on the goal. In short, they morph into a powerful team with a shared identity.


Tuesday, August 09, 2005

A is for Asynchronous

Asynchronous communication is the key to agility.

Asynchronous communication will be the cornerstone of any future distributed application architecture if your objective is to deliver robust, flexible, and reliable systems. The challenge lies in getting software developers to shift away from the traditional, synchronous, request-reply paradigm that has dominated programming since the introduction of client/server.

AJAX -- Asynchronous JavaScript And XML -- is based on an architectural model where a typical HTTP request to a server is replaced by a JavaScript call to an AJAX engine. The AJAX engine handles the call by issuing an asynchronous request to the server and exchanging data as XML. The user's interaction with an AJAX application never stalls waiting synchronously for a request to complete.
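
A minimal sketch of that interaction, using the browser's standard XMLHttpRequest object; the URL and callback below are hypothetical and not tied to any particular AJAX engine:

  // Illustrative sketch of an asynchronous request: the call returns immediately
  // and the callback patches the page when the response arrives.
  function loadFragment(url: string, render: (xml: Document | null) => void): void {
    const xhr = new XMLHttpRequest();
    xhr.open("GET", url, true); // third argument: asynchronous
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4 && xhr.status === 200) {
        render(xhr.responseXML); // update just the affected part of the page
      }
    };
    xhr.send(); // no blocking; the user keeps interacting while this is in flight
  }

  loadFragment("/map/tiles?x=3&y=7", (xml) => {
    // hypothetical: merge the returned fragment into the DOM, no full reload
  });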

AJAX is an example of MVC -- Model-View-Controller. MVC separates an application's data model, user interface and control logic into three distinct components. The Controller mediates between the Model and the View.
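
Stripped down to its essentials, the separation looks something like this. It's a generic illustration of the pattern, not code from any AJAX framework.

  // Generic MVC sketch: the controller mediates between the model and the view.
  class MapModel {
    private tiles: string[] = [];
    addTile(tile: string): void { this.tiles.push(tile); }
    all(): readonly string[] { return this.tiles; }
  }

  class MapView {
    render(tiles: readonly string[]): void {
      console.log(`Rendering ${tiles.length} tile(s): ${tiles.join(", ")}`);
    }
  }

  class MapController {
    constructor(private model: MapModel, private view: MapView) {}
    // A user action (for example, dragging the map) updates the model,
    // then refreshes only the view, never the whole page.
    onDrag(newTile: string): void {
      this.model.addTile(newTile);
      this.view.render(this.model.all());
    }
  }

  new MapController(new MapModel(), new MapView()).onDrag("tile(3,7)");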

AJAX significantly improves how web pages interact with data, rivaling programs that run natively on desktops. Look, for example, at how draggable maps work inside Google Maps. Users click and drag to view adjacent sections of maps, without having to wait for new areas to download.

AJAX overcomes a severe limitation in traditional web interfaces which must reload anytime they try to call up new data. By contrast, AJAX lets users manipulate data without clicking through to a new page or performing page refreshes.

While Microsoft wants to be part of the AJAX revolution, its major focus for developers is on helping them build desktop applications for Windows Vista (formerly code-named Longhorn), its next-generation .NET platform for delivering enterprise services that support complex distributed transactions, including compensating actions (instead of simple rollback operations). The platform includes a user interface development framework called Windows Presentation Foundation (formerly code-named Avalon). Microsoft's ultimate goal is to replace browser-based user interfaces with rich clients.

Meanwhile, recognizing the need in certain scenarios for a standardized, browser-based user interface, Microsoft also announced it is developing its own AJAX toolbox called Atlas for web developers who use Microsoft's ASP.NET technologies to build websites.

Regardless of who wins the battle between rich clients and browser-based AJAX clients, the future is all about asynchronous processing. Think of asynchronous processing as the key that will unlock the true power of SOA and web services.

Of course, in some sense you could say that the software industry is moving back to the future. After all, years ago, when IBM introduced its CICS TP monitor, its most salient feature was delivering online computing via asynchronous processing.

Monday, August 08, 2005

Now, if My Software Only Had a Brain ...

A practical way to master information from Web pages is so important that new programs for this purpose keep appearing: