Sunday, 13 July 2008

Mainframe futures?

I was thinking about the direction in which the mainframe is evolving, and I came to the conclusion that it will definitely survive well into the foreseeable future. In fact, it will take on an even more important and strategic role within organizations. And it will do this by embracing distributed platforms and the Internet.

So how do I arrive at this seemingly contradictory conclusion? If everyone is busy working on laptops and running Linux servers, where does the mainframe fit into that picture? Surely I’m misreading the signs!?!

But that’s the very reason mainframes will be able to survive. It’s precisely because so many users are working away on laptops that they need a central, reliable repository for their bet-the-business data: a platform where the security and performance problems were solved decades ago, and where the solutions continue to improve. And the mainframe needs these distributed platforms in order to survive. Think of a giant spider at the centre of an enormous web, and you have a picture of the computing environment of the future.

I was talking a little while ago about Enterprise Information Integration (EII) and how users need to combine information from different data sources in real time in order to make the best decisions. I also wrote a couple of weeks ago about a lingua franca for translating (migrating) data from one format to another. That integration work can be done off the mainframe, and a distributed platform is a sensible place to do it. But the data itself needs to be stored on the mainframe because of all the security and performance characteristics the mainframe possesses. CONNX Solutions has this kind of product.
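To make the idea concrete, here is a minimal sketch in Python of the kind of federated, real-time query an EII tool automates: the customer master stays on the mainframe, and the join with distributed data happens off-host. The DSN, schema, and column names are hypothetical, and a real EII product would add data translation, caching, and query optimization on top of this.

    import pyodbc  # generic ODBC access; the DSN names below are hypothetical

    # The mainframe database remains the system of record...
    host_db = pyodbc.connect("DSN=MAINFRAME_DB2")
    # ...while a distributed data source holds regional detail.
    local_db = pyodbc.connect("DSN=REGIONAL_SALES")

    # Pull the live customer records from the mainframe.
    customers = {row.CUST_ID: row.CUST_NAME
                 for row in host_db.execute(
                     "SELECT CUST_ID, CUST_NAME FROM PROD.CUSTOMER")}

    # Join them, off the mainframe, with data held elsewhere.
    for row in local_db.execute(
            "SELECT CUST_ID, ORDER_TOTAL FROM SALES.ORDERS"):
        name = customers.get(row.CUST_ID, "unknown customer")
        print(name, row.ORDER_TOTAL)

The point is simply that the heavy lifting of combining and translating data can happen on a cheap distributed box, while the data of record never leaves the mainframe.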

The same is true of BPEL (Business Process Execution Language), which I also blogged about a few weeks ago. BPEL is important for sites that are interested in SOA and want to integrate their mainframes with their distributed systems in a way that scales up, rather than building small point-to-point solutions that neither integrate nor scale. Again, the mainframe can and should be used as the main data source. DataDirect offers a product that uses industry standards for this.
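BPEL itself is an XML process language, but the orchestration pattern it describes is easy to sketch in ordinary code. Purely for illustration (the endpoint URLs and operation names are invented), here is the shape of a business process that invokes a mainframe transaction exposed as a service and a distributed fulfilment service through the same standards-based interface, rather than through point-to-point plumbing.

    import json
    import urllib.request

    def call_service(url, payload):
        """POST a JSON request to a service endpoint and return the reply.
        In a real SOA deployment this call would be generated from WSDL or
        similar contracts by the integration tooling, not hand-rolled."""
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def process_order(order):
        # Step 1: a credit check running as a mainframe transaction,
        # exposed as a service behind a (hypothetical) gateway.
        credit = call_service("http://mainframe-gw.example.com/checkCredit", order)
        if not credit["approved"]:
            return {"status": "rejected"}
        # Step 2: fulfilment running on a distributed platform.
        return call_service("http://fulfilment.example.com/shipOrder", order)

Because every participant is addressed the same way, adding a third or fourth service to the process is a change to the orchestration, not another piece of point-to-point plumbing.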

What about database auditing? What’s the best platform for that? Again, an interesting choice is to perform the auditing off the mainframe but keep your data on the mainframe. Products like Guardium for Mainframes let users audit activity against their mainframe databases and comply with the latest regulations, without impacting the performance of the database.
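As a purely illustrative sketch of the principle (the record format and field names are hypothetical, and a real auditing product captures activity directly rather than via an extract), the expensive part of auditing, scanning and analysing the access records, can run entirely on a distributed platform:

    import csv

    # Tables whose direct access we want to flag; the names are illustrative.
    SENSITIVE_TABLES = {"PROD.PAYROLL", "PROD.CUSTOMER"}

    def flag_suspicious(audit_extract):
        """Scan an audit extract shipped off the mainframe and flag direct
        reads of sensitive tables made outside the application's own ID."""
        alerts = []
        with open(audit_extract, newline="") as f:
            for rec in csv.DictReader(f):
                if rec["TABLE"] in SENSITIVE_TABLES and rec["USER"] != "APPSRV":
                    alerts.append((rec["TIMESTAMP"], rec["USER"], rec["TABLE"]))
        return alerts

    for when, user, table in flag_suspicious("db2_audit_extract.csv"):
        print(f"{when}: {user} read {table}")

None of that analysis consumes mainframe cycles, yet the database being audited never moves.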

The same thinking can apply to database archiving. You still want the database on your mainframe, but the archiving operation and the archived data can be on a distributed platform. NEON Enterprise Software has TITAN Archive, which can look after mainframe database archiving from a Linux platform.
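Stripped to its essentials (a product such as TITAN Archive adds retention policy, recall, and independence from the original database format, none of which appears here), the archiving flow is: copy aged rows from the mainframe database to archive storage on a distributed platform, then trim them from the active table. The connection DSN, table, and file names below are made up for the sketch.

    import csv
    import pyodbc  # ODBC access to the mainframe database

    ARCHIVE_CUTOFF = "2003-12-31"  # archive anything at or before this date

    # Pull the aged rows from the active table on the mainframe.
    host_db = pyodbc.connect("DSN=MAINFRAME_DB2")
    aged = host_db.execute(
        "SELECT ORDER_ID, CUST_ID, ORDER_DATE, ORDER_TOTAL "
        "FROM PROD.ORDERS WHERE ORDER_DATE <= ?", ARCHIVE_CUTOFF).fetchall()

    # Write them to archive storage on the distributed platform...
    with open("orders_archive_pre_2004.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ORDER_ID", "CUST_ID", "ORDER_DATE", "ORDER_TOTAL"])
        writer.writerows(aged)

    # ...and only then remove them from the active table, keeping it lean.
    host_db.execute(
        "DELETE FROM PROD.ORDERS WHERE ORDER_DATE <= ?", ARCHIVE_CUTOFF)
    host_db.commit()

The active database stays on the mainframe and stays small; the bulk of the archived data, and the work of managing it, sits on the cheaper platform.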

These are just four (of many) products currently available for different aspects of mainframe computing, but they illustrate the fact that the mainframe can run more efficiently with some activities taking place off-mainframe. But it’s only a short step (in terms of the thinking process) to realise that storing the active data on the mainframe in the first place is a really sensible approach. From the mainframe it can be made available over the Internet or locally to employees, customers, business partners, and whoever else it makes sense to share it with. And there’s an enormous lacuna in your computing environment if you don’t have a mainframe (with its many benefits) at the centre. The mainframe can be the heart of your computing environment – the spider at the centre of an intricate Web.

1 comment:

Anonymous said...

While I agree with your conclusion that the mainframe will continue to contribute important business value in the future, I think you do it a disservice by positioning it simply as a store for data. Admittedly, you were coming from an EII angle, so this is understandable. However, I see at least equal, if not greater, value coming from the mainframe as an APPLICATION store – all those trillions of dollars of investment can continue to deliver returns.

You did mention SOA and BPEL, but you then restated your focus on data.

The key tool, it seems to me, is the one that makes mainframe APPLICATIONS available from off-mainframe, preferably as SOA services. While I do not feel it appropriate to recommend a specific tool for this fifth category as you did for the other four, Lustratus offers a best-of-breed functionality guide (available at www.lustratus.com) that sets out what is needed in such a tool, for anyone wanting to evaluate specific vendor solutions in this area.