Sunday, 24 February 2013

Guest blog - Cloud and mainframes: a perfect couple

This week, I’m publishing a second blog entry from Marcel den Hartog, Principal Product Marketing for CA Technologies Mainframe Solutions. You can read more of his blogs here.

I know what you’re thinking: another mainframe old-timer wanting to ride the cloud hype and make sure we don’t forget about his favourite platform. But bear with me while I explain there is a lot of logic behind this...

I am old enough to remember the time when many of us in IT were surprised by the rise of the distributed environment. And, like many others, I did my fair share of work building applications on both mainframes and distributed servers. I was, however, lucky enough to work in an environment with little or no bias to any platform; our management simply asked us to pick the best platform for any given application.

Soon after building the first distributed applications, we found that we needed the data that resided on the mainframe. Not a big surprise, because most of the mission-critical data was stored on the mainframe AND we had a requirement that almost all the mission-critical data in the company had to be up-to-date, always!

This was not easy, especially in the early phases. We did not have ODBC, JDBC, MQSeries, Web services, or anything else that allowed us to send data back and forth between distributed applications and the mainframe. We had to rely on HLLAPI (High-Level Language Application Programming Interface), a very low-level way of transmitting data using the protocols that came with 3270 emulation boards like IRMA or Attachmate. And for the data that did not require continuous updates, we relied on extracts that were pumped up and down to a couple of distributed servers every night, where the data was then processed to make it usable for different applications.
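For younger readers who never saw one of those nightly extract jobs: the distributed side typically had to take a fixed-width EBCDIC file sent down from the mainframe and turn it into something a PC application could digest. Here is a minimal Java sketch of that kind of transform; the 80-byte record length, the field offsets, and the Cp037 code page are all assumptions for illustration.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.Charset;

// Minimal sketch of a nightly extract transform: read fixed-width EBCDIC
// records pumped down from the mainframe and re-emit them as delimited
// text for a distributed application to load. Record layout is hypothetical.
public class ExtractTransform {
    private static final int RECORD_LENGTH = 80;                     // hypothetical fixed record size
    private static final Charset EBCDIC = Charset.forName("Cp037");  // IBM EBCDIC (US/Canada) code page

    public static void main(String[] args) throws IOException {
        try (InputStream in = new FileInputStream(args[0])) {
            byte[] record;
            // Read one fixed-width record at a time until the file is exhausted
            while ((record = in.readNBytes(RECORD_LENGTH)).length == RECORD_LENGTH) {
                String text = new String(record, EBCDIC);            // EBCDIC -> Unicode
                String customerId = text.substring(0, 10).trim();    // hypothetical field offsets
                String name = text.substring(10, 40).trim();
                System.out.println(customerId + "|" + name);         // delimited output for downstream loads
            }
        }
    }
}
```

The real routines also had to cope with packed-decimal fields, bad records, and restarts after failure, which is exactly the “work behind the scenes” described below.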

The “fit-for-purpose” IT infrastructure we had back then was not ideal, because it required a lot of work behind the scenes, mainly to make sure the data used by all these applications was up-to-date and “transformed” in the right way. But the technology kept improving, and we slowly moved from text-based Clipper applications that used HLLAPI to communicate with CICS transactions to more modern, visually attractive programming languages that used protocols like ODBC and JDBC. But since not all data on the mainframe or on distributed systems was accessible through these protocols (VSAM files and IMS databases, for example), for many applications we had to rely on data transport and transformation routines (and the servers to store and manage the data) for many years. Even today, companies pump terabytes of data up and down their various platforms for many different purposes.
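To show how much simpler those protocols made things, here is a minimal sketch of reading mainframe DB2 data over JDBC instead of screen-scraping or shipping extracts. The host name, port, location, table, and credentials are placeholders, and it assumes IBM’s type 4 JDBC driver is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch: query DB2 on the mainframe directly over JDBC.
// All connection details and the table layout are placeholders.
public class MainframeQuery {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:db2://mainframe.example.com:446/DB2LOC"; // placeholder host/port/location
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT CUST_ID, CUST_NAME FROM CUSTOMER WHERE REGION = ?")) {
            ps.setString(1, "EMEA"); // parameterized query, no extract or transform step needed
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("CUST_ID") + " " + rs.getString("CUST_NAME"));
                }
            }
        }
    }
}
```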

The next phase in IT was the switch from home-grown applications to bought-in packages like ERP, CRM, and other third-party (standardized) applications. And again, these applications not only required data from the mainframe and distributed applications, but we also had to extract data FROM them to update the other applications our companies relied on. After all, every company runs a mixture of applications on a mixture of platforms. And to complicate things even more, we do not buy all applications from the same vendors, so we have to exchange data between commercial applications as well. More complexity, more work, more money and more stuff to manage.

Welcome to the world of cloud, where again we will run (commercial) applications on a different platform, and where we are (again) asked to make sure that our company’s most valuable IT asset (our data) is up-to-date across all the different platforms: mainframe, cloud, and distributed. After all, we don’t want our customer data to differ across the different applications; the fact that we have to duplicate it across all our different environments is already bad enough.

The efforts to create a pool of IT resources that enables us to build the “fit-for-purpose” IT infrastructure we are all aiming for have, until now, resulted in a very complex IT infrastructure that requires a lot of management. This is where the latest IBM mainframe technology comes in. The “pool of resources” can be found externally in the form of cloud services, as well as internally in blades that are configured to run virtualized systems on the mainframe. Adding the mainframe to this pool has big advantages. If you run Linux on the specialized engines (IFLs) on the mainframe, you need a lot less infrastructure to allow the Linux servers to communicate with each other and with your existing z/OS applications: there is no need for cabling and other network infrastructure. This not only greatly reduces the amount (and added complexity) of the hardware needed, it also means less power consumption and more flexibility. At the beginning of this article we agreed that many applications need access to mainframe data. By bringing the applications that need this data closer TO the mainframe, and by reducing the number of infrastructure devices needed to connect it all, we make things simpler, faster, more manageable, and more reliable.
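The point about needing no cabling is worth illustrating. An internal mainframe network such as HiperSockets appears to a Linux guest as an ordinary IP interface, so application code does not change at all; only the address does, and the traffic never leaves the box. A sketch, with a hypothetical internal address, port, and line protocol:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Sketch: a Linux-on-IFL application talking to a z/OS service over the
// mainframe's internal network. Because the internal link looks like a
// normal IP interface, this is identical to any TCP client.
public class InternalClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("10.0.0.1", 3000); // hypothetical internal address and port
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println("PING");                            // made-up line protocol for illustration
            System.out.println("z/OS replied: " + in.readLine());
        }
    }
}
```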

To make the mainframe part of an internal pool of IT resources that is flexible enough to accommodate the business needs of our management, there is one last step we must take: we need the right software to orchestrate, provision, and decommission servers on demand. A number of vendors already offer software that helps you provision both internal and external (cloud) resources in a matter of minutes. Simple drag-and-drop applications that provision servers and configure the network and connection settings make it possible to create complex environments very quickly. But none of them supports the mainframe (yet). In the near future, however, the mainframe will also be supported, and this will open a new world of possibilities for everybody who already owns a mainframe. Soon, the mainframe will not just be used as part of an internal or hybrid cloud environment; it will act as one of the components in your pool of resources.
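Since none of these products supports the mainframe yet, there is no real API to show; but purely as a hypothetical sketch, the kind of interface such orchestration software exposes might look like this, with the mainframe’s IFLs eventually becoming just another placement target behind the same calls:

```java
import java.util.UUID;

// Purely hypothetical sketch of an orchestration interface: the same
// provision/decommission calls work whether the server lands on an
// external cloud, a distributed hypervisor, or (eventually) mainframe IFLs.
interface ResourcePool {
    UUID provision(String template, String network); // spin up a server from a template
    void decommission(UUID serverId);                // return capacity to the pool
}

class DemoOrchestration {
    public static void main(String[] args) {
        ResourcePool pool = new ResourcePool() {     // stub implementation for illustration only
            @Override public UUID provision(String template, String network) {
                UUID id = UUID.randomUUID();
                System.out.println("Provisioned " + template + " on " + network + " as " + id);
                return id;
            }
            @Override public void decommission(UUID serverId) {
                System.out.println("Decommissioned " + serverId);
            }
        };
        UUID web = pool.provision("linux-web", "internal-zlinux"); // could equally target an external cloud
        pool.decommission(web);
    }
}
```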

Yes, a stand-alone mainframe can offer many of the advantages that “cloud” offers: on-demand capacity, virtualization, flexibility and reliability. But not until you are able to use the mainframe as a component in a cloud infrastructure, so that you can bring the applications that need access to its data closer to the mainframe, will the two really be a perfect couple.

Sunday, 17 February 2013

Review: Nuance PDF Converter Professional 8

As a regular and frequent user of Adobe Acrobat – the full version, not just the reader – I was intrigued to see whether anyone could come up with a product that was more useful or cheaper, so I welcomed the chance to have a look at the PC version of Nuance’s PDF Converter Professional 8.

It was easy to install and opened up a variety of PDFs that I had lying around. So that meant it was at least as good as the Adobe Reader! Next I had a look at some of the facilities and features it promised. In these days of cloud computing and document sharing, it offers sharing easily and fully. There’s a drop-down menu called ‘collaborate’ that calls up the Gaaiho Collaboration feature. When I installed the product, I included the Nuance Cloud Connector, so I now have an ‘N:’ Nuance Cloud drive as one of my storage drive options. You can also save and open files on Dropbox.

The first thing I wanted to try was the ‘Advanced Edit’ option in the top row of the toolbar. This converts the PDF into an editable format. I changed the font and font size of text in a document. I was also able to insert whole paragraphs of text. I could even have added tables. I liked this feature.

It’s got quite a clever form-filling feature. You can download a form and start filling in the boxes, and it’s smart enough to tab to the next field. It can also convert an old PDF to a searchable PDF, and you can convert your PDF to Word, Excel, PowerPoint, and WordPerfect documents.

Nuance also do Dragon NaturallySpeaking voice recognition software, and I hear they have something to do with Siri on iPhones, so they’ve included voice recognition in this software. It allows you to annotate your PDFs. Not sure whether I really need it, but it’s fun! And if you don’t want to speak, you can type into a resizable text box to make notes.

And like Acrobat, you can reduce file size and split the document. Tucked away on the left-hand side is the Pages tool. That gives you the option to add, replace, and delete pages – a feature I use quite often.

On the right-hand side is a pull-out menu that provides stamp options (you can make a stamp from any area of the PDF), security options, envelope and watermark options, as well as recovery and clip-art options.

Overall, it’s a clever and feature-rich piece of software. Considering its price compared to Adobe Acrobat and the extra facilities it offers, I think it is well worth taking a look at if you’re a regular (or plan to be a regular) user of PDFs.

Sunday, 3 February 2013

BYOD growing in importance at mainframe sites

The Arcati Mainframe Yearbook 2013 is now available for download from www.arcati.com/newyearbook13 – and it’s FREE. Each new Yearbook is always greeted with enthusiasm by mainframers everywhere because it is such a unique source of information. And each year, many people find the results of the user survey especially interesting. And this year, for the first time, we asked about BYOD (Bring Your Own Device).

Firstly, we wanted to know how important sites thought it was to make mainframe data available to other platforms: 82 percent of sites said that it was very important to the way they work at the moment. When it comes to people using their own devices to access the mainframe, 10 percent of sites said it was very important to the way they work now – but 18 percent are in the planning stages, and a further 12 percent expect it to be important in the future.

There’s been a huge growth in the use of social media in recent years, and we wondered whether mainframers found social media (Facebook, Twitter, YouTube, etc) useful for their work on the mainframe. A fifth (20 percent) said that they did, with eight percent not sure, and the rest not using it at all. This is perhaps a surprisingly low figure, given that IBM has Facebook pages dedicated to IMS, CICS, and DB2.

The survey also looked at IBM’s PureSystems – said to combine the flexibility of a general-purpose system, the elasticity of cloud, and the simplicity of an appliance. They are integrated by design and come with built-in expertise. So, that’s the hype, but were survey respondents actually buying them? Only two percent of respondents said they were and, similarly, only two percent said they planned to. The survey also looked at the take-up of business analytics (IBM’s Smarter Analytics). Here the take-up was slightly higher, but again it makes disappointing reading for IBM, with four percent of sites saying they were currently using the technology and another four percent planning to.

Talk of Linux on mainframes has used the old joke of ‘taking ten years to become an overnight success’ for a little while now. But this year, the survey found only 34 percent of respondents (down from last year’s 44 percent) saying that they run Linux on System z (with another 16 percent, up from last year’s 6 percent, at the planning stage). There are considerable cost and management benefits to consolidating distributed Linux workloads onto the mainframe, and IBM made the IFL (Integrated Facility for Linux) specialty processor available as long ago as 2001. Even so, running Linux on a mainframe seems only now to have become a mainstream technology.

Oracle and DB2 have been at loggerheads for a while now, but while the survey was thinking about Linux on mainframes, it queried how many sites were running Oracle under Linux on System z. Just eight percent of sites surveyed said that they currently run Oracle on zLinux, with another eight percent planning to.

The survey was completed by 100 individuals between 1 November 2012 and 7 December 2012. Just under half (48 percent) were from North America and just over a third (36 percent) were from Europe, with 10 percent from the Asia/Pacific region and eight percent from the Middle East/Africa.

48 percent of the companies have in excess of 10,000 employees worldwide. Below that, 20 percent of respondents reported staff sizes of 1001-5000, 12 percent reported 5001-10,000, and 10 percent each reported 1-200 and 201-1000. 84 percent of our respondents were involved in running in-house data centres, with only six percent working in an outsourced operation.

So what are the main benefits for an organization of using a mainframe over other platforms? 88 percent of respondents highlighted the benefit of availability, with 74 percent identifying security as a benefit. Scalability and manageability came next, with scores of 68 and 66 percent respectively.

So why isn’t everyone using a mainframe? 82 percent thought mainframes are too expensive (or appear to be). 58 percent thought that there are cultural barriers between mainframers and other IT professionals. 24 percent identified concerns about future availability/support for mainframe applications, 18 percent said difficulty in obtaining or retaining the necessary skills, and 12 percent thought mainframes are too complex (or appear to be). Sadly, only two percent thought there were no obstacles to mainframe acceptance.

Full details of all the questions and responses can be found in the user survey section of the Yearbook. It’s well worth a read.

The Yearbook can only be free because some organizations have been prepared to sponsor it or advertise in it. This year’s sponsors were: Software AG, Software Diversified Services (SDS), and William Data Systems.