It's not exactly accurate to use words like "legacy" when describing systems like IBM's i5 (it will always be the AS/400 to me). Our "legacy" systems are so critical to our ($1B) business that it's no overstatement to say our restaurants could not transact business without them. The majority of our development time, energy, and money is spent writing new RPG code, introducing new green screen applications, and finding new ways to make the 400 work with the rest of our expanding private cloud infrastructure. Calling something "legacy" usually implies that newer systems are taking the place of the "old" way of doing things. I suppose you could say that programmers use the word "legacy" interchangeably with "obsolete".
Our AS/400 is not going away. For that reason, it's silly to call it obsolete.
I've gotten some great feedback from the session on private cloud infrastructures I did at this year's SpringOne 2GX in Chicago. People are very interested in how these traditional systems can work with the new cloud services many are introducing into their enterprise. Plenty of organizations have decades of business knowledge and data tied up in "legacy" systems and they want to know how in the world they can get a fancy new cloud application server like tc Server to talk to their AS/400 (through more than SQL and JDBC).
The first myth I need to dispel is that cloud computing is limited in scope to virtualized machines. This is simply not true. Virtualization is what has given cloud computing wings, and it's certainly an integral part of cloud computing infrastructures, but systems that run operating systems on the bare metal and are not virtualized are still first-class citizens of private cloud architectures. The data and knowledge they hold are important to the business, and the virtualized systems that interact with them couldn't perform their functions fully without these (formerly) legacy systems.
Cloud computing is a worldview. It's a way for developers and system administrators to express new applications that is completely different from how they would have expressed those applications using more traditional tools, frameworks, and architectures.
In my session on the private cloud, I used an analogy that seems to resonate with people when describing what cloud "is".
Consider the Dutch Masters of the 17th century (try Vermeer, famous for his painting "Girl with a Pearl Earring"). These painters bent all their efforts to creating paintings with a high degree of visual fidelity. By that I mean the artists strove to make people look like people and trees look like trees. Rembrandt painted himself throughout his life and took extra care to paint every wrinkle and subtle variation in his face as he aged. They spent many hours practicing how best to paint the way sunlight through a window illuminates a still life.
This is traditional computing architecture. This is J2EE and the "enterprise". This is operating systems running on bare metal machines and running a small number of large application servers.
By the 19th century, however, Impressionism was taking hold. Monet wasn't all that focused on expressing, with great visual fidelity, the exact scene he saw while painting. People still looked mostly like people, and you can tell what Monet wants you to think is a tree. But Impressionist paintings are not journalistic documents like those of the Dutch Masters. They forever abandoned the chains of visual fidelity. As long as you could tell that two brown perpendicular lines in a wavy blue color field were a person sitting in a boat, the artist didn't try to render that person with a great level of visually accurate detail.
This is virtualization. Through virtualization, computing forever broke the chains of operating systems running on bare metal machines. But virtualized systems can still very much resemble those traditional architectures, just as an Impressionist painting can still very much resemble the real-life subject of the painting. Virtualization can fool applications and services into thinking they actually are running on bare metal boxes. Just as Impressionism was in art, virtualization was the innovation in computing that forked architectures and allowed system administrators to make better and more efficient use of their existing hardware.
But virtualization by itself isn't really a new worldview from a developer's perspective. You can still code and deploy applications onto virtual machines in the same way you do bare metal boxes. The fact that the application server runs in a virtualized environment means little to the developer.
Along comes Modernism. In the art world, Modernism was forever a new way to express the artist's vision. Visual fidelity to real objects, people, and landscapes was not just thrown off but trampled and burned. Modernism was a completely new perspective through which artists expressed their visions using many of the same basic tools as the Dutch Masters; they used canvas and oil paints and people as the subjects of their paintings. But there's no mistaking a Kandinsky for a Rembrandt!
Cloud is to computing what Modernism is to art: it's a new worldview that forever changes the way applications are designed and developed, from the application frameworks up through the systems they run on. It has echoes of the past in it, just like any development that has history. People might try to convince you that "cloud computing" is a nifty, trendy marketing slogan slapped on the side of existing products. Not true. Cloud computing is a paradigm shift that is forever changing the way developers and system administrators alike approach problem-solving.
Cloud computing matters to you because it's a new (though still somewhat ineffable) worldview. It's an all-encompassing prism through which we see architectures and application development: one that crosses system boundaries, cares not for old labels and traditions (much as Modernism forged a new path with somewhat callous disregard for the work of previous generations), and has indomitable momentum.
Much of my career has involved exposing AS/400 data and knowledge (in the form of existing programs) to users via web-ish interfaces. Now I'm able to make better use of our existing hardware thanks to virtualization and I'm able to build and scale applications in ways I never could have before cloud computing came along. Since that data in DB2/400 is an important part of the whole (PostgreSQL data figures highly in our enterprise, as well), whatever infrastructure I put in place has to accept the AS/400 as a first-class citizen. I have no choice in this. It simply Must Be.
I'm finding a lot of people are in the same boat. There are a lot of great innovations going on in the industry with this intense focus on public and private cloud computing. This push to take applications to the next level by building on top of virtualization and making full use of the possibilities it offers can give developers who work with traditional enterprise systems fantastic new tools to solve problems in ways that are not nearly as painful as they were just a few years ago. And who wants to inflict more pain on themselves than they have to?
Sadly, there's no getting around the fact that there is a gap between traditional enterprise systems and the momentum in cloud computing. Vendors seem to be rushing to provide services that are great for "greenfield" development (where you can design a system from scratch with no baggage) but are insensitive to the needs of real enterprises that can't simply re-tool on demand. Believe me when I say "I'm working on them"! I'm bending ears whenever I can to get some attention directed toward these not insignificant problems.
Cloud computing is here to stay. It's not a trend or a fad. It's not (just) a buzzword that has no meaning. Until such time as a better word is elected to describe a composite system of virtualized and non-virtualized machines that work together to implement a unified vision of new applications and architectures, I'll be using the much shorter moniker: "cloud".