Monday 29 August 2011

IMS systems and costs - analysis

I blogged about IBM’s IMS (Information Management System) at the end of July, saying that it has been around since 1968 and originated as a bill-of-materials program for NASA’s Apollo programme. I said that IMS effectively comes in two parts – there’s the Transaction Manager (TM) part and the Data Base (DB) part. I talked about different types of database, and I mentioned the Virtual IMS user group at www.fundi.com/virtualims.

Today I want to pose two questions: how much does an IMS development/test system cost? And how many development/test systems does a site typically have installed?

It’s a bit like asking: how long is a piece of string? Every piece of string obviously has a length, but until it is measured, no quantitative answer can be given. By implication, any answer given here will carry a degree of uncertainty.

Our experience at iTech-Ed (where we administer the Virtual IMS user group) is that a single IMS development/test system can cost an organisation between US$1,000,000 and US$2,000,000 per year (and possibly more in some cases).

There are some sites that run their development systems on dedicated machines that can be larger than many average-sized organisations’ production systems.

However, there is an additional complication. We believe that, although IMS is a huge revenue earner for IBM, IBM will waive the software fee for organisations that are purely development shops and don’t use IMS in production.

We also estimate that the personnel costs for installing and maintaining IMS development systems can amount to about half a million US dollars per year.

And the number of IMS development/test systems varies hugely: smaller shops may have one or two true development systems (plus test, QA, etc), while larger customers may have anywhere from around ten to perhaps 30 or more. We know of some users with 300+ test IMS regions, but the distribution is heavily skewed towards much lower values. We believe the average is ten or slightly above because of the amount of administrative effort these test systems take to maintain.

The waters can be muddied further by the fact that organisations can negotiate deals on price with IBM, but are then discouraged from sharing information about those prices with others.

Our conclusion is that the cost to an organisation of running a development system depends on the size of the installation. US$1-2M is a good estimate of the annual cost of each IMS development/test system, and ten is a reasonable estimate of the average number of development/test systems per site.
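To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python. The figures are the rough estimates quoted above, and it assumes (our assumption, not a quoted figure) that the half-million-dollar personnel cost is a single site-wide overhead rather than a per-system cost:

# Back-of-the-envelope IMS development/test cost model.
# All figures are the rough estimates quoted above, not vendor pricing.

PER_SYSTEM_LOW = 1_000_000    # US$ per development/test system per year (low estimate)
PER_SYSTEM_HIGH = 2_000_000   # US$ per development/test system per year (high estimate)
PERSONNEL = 500_000           # US$ per year; assumed site-wide, not per system
AVERAGE_SYSTEMS = 10          # our estimate of the average number of systems per site

def annual_site_cost(systems, per_system, personnel=PERSONNEL):
    """Total yearly spend: per-system costs plus a fixed personnel overhead."""
    return systems * per_system + personnel

low = annual_site_cost(AVERAGE_SYSTEMS, PER_SYSTEM_LOW)
high = annual_site_cost(AVERAGE_SYSTEMS, PER_SYSTEM_HIGH)
print(f"Estimated annual spend for an average site: US${low:,} to US${high:,}")
# Prints: Estimated annual spend for an average site: US$10,500,000 to US$20,500,000

On those assumptions, an average site with ten systems would be spending somewhere between US$10.5M and US$20.5M a year on IMS development and test alone.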

And, of course, if you have any further information on this, we would be really interested to hear from you.

Sunday 21 August 2011

The PC at 30

In the future, IBM will be known as the PC maker of choice for most people, and those PCs will be running a GUI (Graphical User Interface) based on CP/M. Well, that was the view of some people 30 years ago when IBM gave birth to its first PC.

It was on 12 August 1981, in a ballroom at the Waldorf Astoria in New York, that the IBM 5150 made its first appearance. And because IBM was known for making mainframes, this device was called a ‘personal computer’.

IBM didn’t invent the idea of small personal computers; they just wanted a part of a new and growing marketplace. In those days you could buy small computers like the Sinclair ZX81, and slightly bigger boxes from Apple, Atari, Commodore, Osborne, and Tandy. The big mind-shift for the IBM engineers in Boca Raton, Florida, was to construct their PC from parts that were available off the shelf. Until then, IBM had always designed and made what they needed. However, time was precious, and development was faster if they shopped for the parts. It was a decision that allowed clone makers a foot in the door.

IBM wrote the BIOS (Basic Input/Output System), the part that’s loaded when the machine boots up. Next they needed an operating system. In the same way that they were buying hardware components, they thought they’d buy the OS. The best one around was CP/M (Control Program for Microcomputers) from Gary Kildall of Digital Research, Inc. The story goes that IBM’s representatives waited to see him, but he didn’t want to deal with men in suits. Remember how ‘cool’ computing was back then. As a consequence, IBM looked for another source for the operating system. They found Bill Gates, who provided PC-DOS, a rewrite of Seattle Computer Products’ (SCP) 86-DOS. The rest, as they say, is history.

Because the 5150 was made from these off-the-shelf components, other companies were quick to copy it. These clone makers badged their machines as ‘IBM compatible’. It seems a while since anyone put one of those stickers on a PC! However, because they couldn’t copy the IBM BIOS, the clones were never 100% compatible in those days. Now, of course, it’s not an issue. And many companies have come on the scene, made a lot of money making PCs, and disappeared again.

The first PC shipped with 16 kilobytes of memory, upgradeable to 64KB, and an Intel 8088 processor running at 4.77MHz; a typical configuration added two 5.25-inch floppy drives, a display/printer adapter card, and a 12-inch green CRT monitor. You could then buy IBM’s dot-matrix printer and the necessary cable. This meant you’d be looking at over $3000 for the whole lot!

And now IBM doesn’t have a PC business at all; it sold it to Lenovo in 2004. As for Digital Research, Caldera acquired its assets from Novell in 1996, later changed its own name to The SCO Group, and more recently became the TSG Group.

It’s always hard predicting the future, even if you invented it!

Sunday 14 August 2011

We’re all friends now

There used to be a time when selling software was a cut-throat game. A salesman would turn up saying how good their product was and quietly poison the prospective client’s mind against alternative products from other vendors, listing the competitors’ weaknesses and downplaying their strengths. In fact, I’ve even been paid to write documents for sales teams to use doing exactly that!

But now there is a much more grown-up approach to business. It seems that nowadays sales people are working together to move products. And where their own product has gaps, they are recommending the software of an erstwhile competitor. The benefit of this cooperative approach is that the customer gets a better service from vendors and a better understanding of the strengths and weaknesses of the products. It also means that those companies are able to sell more products – which, after all, is how you stay in business!

So what prompted these thoughts? At this week’s SHARE in Orlando, Florida, CA Technologies started off by announcing a new release of the CA VM:Manager Suite for Linux on System z and a new capability for CA Solve Operations Automation. There have been lots of anecdotes appearing on the Internet about organisations benefiting hugely from virtualizing their Linux servers on System z and eliminating the server sprawl that preceded it. And, clearly, Linux continues (after its slow start) to be one of the fastest-growing segments of the mainframe market. So anything that helps to make zLinux users’ lives easier has got to be a good thing.

According to CA’s press release: “The new release of the CA VM:Manager Suite includes enhancements across product areas, which extend integrated management capabilities designed to control costs, improve performance, increase user productivity, and more efficiently manage and secure z/VM systems that support Linux on System z. It also adds tape management capabilities for Linux on System z, along with improvements that help CA Technologies customers install, deploy, and service their CA z/VM products quickly and more effectively.”

The new capability in CA Solve Operations Automation means Linux applications can be managed as if they were System z applications, which reduces the need for mainframe Linux operations expertise.

The synergy comes with the announcement of partnerships between CA and both INNOVATION Data Processing and Velocity Software, which, they claim, are designed to help customers increase cost savings by optimizing Linux management. CA will distribute INNOVATION Data Processing’s UPSTREAM for Linux on Z and UPSTREAM for UNIX on Z, and Velocity Software’s zVPS Performance Suite.

UPSTREAM for Linux on Z is, they say, an intuitive, easy-to-use data protection solution for what was once distributed data and is now the foundation for Linux applications being consolidated on the mainframe. UPSTREAM for Linux on Z can help reduce backup, restore, and disaster recovery costs, while increasing business resiliency by enabling customers to leverage existing mainframe resources to meet their enterprise data protection needs. The solution is designed so that users can easily schedule timely backups and still meet the need for immediate, reliable recovery of a file, a disk volume, or an entire data centre.

zVPS offers, again according to the press release, easy-to-use graphical and Web-based tools to help analyse capacity requirements, establish operational alerts, and determine chargeback usage. Its detailed information helps IT staff optimize performance and effectively manage the cost of their Linux on System z environment. By gathering data from Linux on System z and distributed environments, such as VMware, Microsoft, Linux, and Unix servers, zVPS supports server consolidation projects and facilitates decisions on the most cost-effective placement of workloads.

Cynical observers, who are slightly longer in the tooth, will remember a time when Computer Associates would simply have bought the company (in that Victor Kiam, Remington sort of way!). Clearly, CA Technologies is now all about ‘working with’ other vendors.

Sunday 7 August 2011

IBM’s lawyers can take the day off!

IBM’s dark-suited legal team can relax a little following the news that three organisations have agreed to drop antitrust complaints filed against IBM in Europe and the USA. The companies involved are T3 Technologies, NEON Enterprise Software, and TurboHercules.

Back in October 2009, the US Department of Justice (DoJ) started investigating IBM’s mainframe monopoly following complaints from T3. In 2000, T3 had launched its tServer low-end mainframe based on the FLEX-ES technology from Fundamental Software Inc (FSI). IBM says T3 withdrew its appeal for a ruling in the US courts in May this year. IBM also says T3 has withdrawn its European Commission complaint alleging antitrust behaviour by IBM.

IBM also says that NEON has dropped its European Commission complaint. This makes sense because in June NEON agreed (in the sense that a person with their arm twisted agrees) to stop reselling and distributing its zPrime product and asked customers to remove and destroy their copies. zPrime had been controversial since it first appeared in July 2009 because it allowed mainframe users to run workloads on specialty processors rather than on the main processor. IBM’s revenue stream is based on main processor workloads, so you can see why users would welcome such a product (and the consequent savings they would make) and why IBM would not. As a consequence, claims and counterclaims flew back and forth between the two companies until the resolution in early June. Since then, NEON’s IMS products have been acquired by BMC.

Finally, TurboHercules has dropped its complaint about IBM with the EU. TurboHercules, a French company, was set up in 2009 by Roger Bowler, who created the open source Hercules mainframe hardware emulator. TurboHercules allows mainframe operating systems and applications to run on x64 and Itanium processors, with Windows, Linux, Mac OS, or Solaris as the host environment for Hercules. The organisation was part-funded by Microsoft (obviously, no lover of mainframe technology).

But it’s not all good news for IBM. It’s still the subject of antitrust probes by the US DoJ and the European Commission. So, those lawyers can’t take off too many days!

And on a completely different topic: don’t forget it’s the Virtual IMS user group meeting on Tuesday, with Scott Quillicy, CEO and Founder of SQData, talking about IMS replication for high availability. There are more details at www.fundi.com/virtualims.