This is the first part of a two-part article.
I think most mainframe sites are realizing that using the cloud is not some kind of betrayal of their previous computing environment, but a way to enhance the computing power available to their organization. It also comes with ways to control the cost of IT spending, as well as the ability to increase or decrease computing capacity depending on the needs of the applications. In addition, some things, like data analytics and Artificial Intelligence (AI), work much better in a cloud-based environment.
Cloud advantages
The benefits of the Mainframe as a Service (MFaaS) model that are usually given by cloud companies include:
- Scaling – organizations can scale up or down according to their needs.
- Maintenance and upgrades – organizations pay only for the compute and storage they use. In addition, maintenance and upgrades to IT infrastructure are handled by the cloud provider.
- Business continuity – the cloud provider handles IT issues, reducing noticeable downtime.
- Predictable costs – the price for services is agreed ahead of time, which makes budgeting easier.
- Support – cloud companies provide continuous support.
- Security – cloud providers promote the security of their platforms as a key selling point.
Other advantages include better business agility and a larger pool of potential staff with cloud computing experience.
Cloud providers also talk about moving from outdated mainframe technology!
Mainframe advantages
The big argument for mainframes is that they are flexible, scalable, and efficient. The truth is that cloud computing can now claim those same characteristics. So, what reasons could people give for staying with mainframes and not moving everything to the cloud? The answer would probably include words like security, privacy, resilience, and failover. No other platform can offer the security features found on z14, z15, and z16 processors. The z16’s Telum chip comes with on-chip acceleration for AI inferencing while transactions are taking place, allowing financial institutions to move from a fraud detection posture to a fraud prevention posture.
The z14s saw the introduction of pervasive encryption, which enables extensive encryption of data in flight and at rest, substantially simplifying data protection and reducing the costs associated with protecting data and achieving compliance mandates.
z15s introduced Data Privacy Passports technology that can be used to gain control over how data is stored and shared. This gives users the ability to protect and provision data and revoke access to that data at any time. In addition, it not only works in the z15 environment, but also across an enterprise’s hybrid multi-cloud environment. This helps enterprises to secure their data wherever it travels.
Also new with the z15 is IBM Z Instant Recovery, which uses z15 system recovery technologies to limit the cost and impact of planned and unplanned downtime by accelerating the recovery of mission-critical applications, using full system capacity for a period of time. It enables general-purpose processors to run at full-capacity speed, and allows general-purpose workloads to run on zIIP processors. This boost period accelerates the entire recovery process in the partition(s) being boosted.
Cloud computing can't come close to those capabilities.
Other security questions you might ask about cloud-based working are around data: where it resides and who has access to it. There may also be questions over data latency, that is, how far away the data is from the application using it. It’s also important to recognize that the attack surface is larger and more complex in the cloud.
Hybrid working advantages
So, bearing those things in mind, why would a mainframe site consider utilizing a cloud environment? The answers are: to do things that the mainframe can’t, like data analytics; to do things that the mainframe isn’t as good at, like speedily developing and testing applications, and/or building multiplatform applications using APIs (Application Programming Interfaces); and to use cheaper staff who aren’t mainframe trained. The cloud makes it easy to access applications and data from anywhere at any time. Users can choose and implement only the features and services they need at any given time.
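As an illustration of the multiplatform API approach, a cloud-hosted application might call a mainframe transaction that has been exposed as a REST service through an API gateway. The sketch below is hypothetical: the URL, endpoint path, and JSON field names are invented for illustration and would depend on how a site actually exposes its services.

```python
import json
from urllib import request

# Hypothetical endpoint: a mainframe transaction exposed as a REST service
# through an API gateway (URL and field names are invented for illustration).
BASE_URL = "https://api.example.com/accounts"

def build_balance_request(account_id: str) -> request.Request:
    """Build a POST request asking the mainframe-hosted service for a balance."""
    payload = json.dumps({"accountId": account_id}).encode("utf-8")
    return request.Request(
        BASE_URL + "/balance",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_balance_request("12345678")
# The request would be sent with urllib.request.urlopen(req); that call is
# omitted here because the endpoint is illustrative.
```

The point is that the calling application neither knows nor cares that the service behind the API runs on a mainframe, which is what makes this style of multiplatform development attractive.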
Migration choices
Any site that does decide to migrate some or all of their applications to the cloud needs to decide how they want to go about it. The choices are:
- Refactor – this usually refers to modifying the applications so that they better support the cloud environment.
- Replatform – applications are moved to the cloud without major changes, but taking advantage of benefits of the cloud environment. Using a cloud-hosted emulator also allows organizations to start using .NET, Java, or other APIs to integrate with previously inaccessible programs and data.
- Rehost (lift and shift) – applications are moved to the cloud without making any changes. The code is recompiled to run in a mainframe emulator hosted in a cloud instance.
- Rewrite/rebuild – generally, not recommended because it is a risky approach. It’s complex, costly, and time consuming. It’s hard to predict the budget needed, how long it will take, and the return on investment. A better approach is to move the applications to a cloud-based emulator, migrate the database to a cloud-based database, then replace modules over time.
- Replace – completely replace the mainframe functionality with a program or suite of programs, typically a Software-as-a-Service (SaaS) application. This removes the issue of needing to maintain code, but makes it hard to customize beyond the options provided by the vendor.
- Retain (revisit) – keep applications on the mainframe. These might include applications that require major refactoring that you want to postpone until a later time, and legacy applications that you want to retain because there’s no business justification for migrating them. (This is discussed later.)
- Retire – decommission or remove applications that are no longer needed.
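A common task in the replatform and rehost options above is getting cloud-side code to read legacy fixed-width records that mainframe programs produce. A minimal sketch follows; the record layout (field names, offsets, and a balance stored in cents) is entirely hypothetical, standing in for whatever a real copybook would define.

```python
# Hypothetical fixed-width customer record layout (offsets invented for
# illustration): columns 0-7 account number, 8-27 name, 28-35 balance in cents.
FIELDS = [("account", 0, 8), ("name", 8, 28), ("balance_cents", 28, 36)]

def parse_record(line: str) -> dict:
    """Slice a fixed-width record into a dict, trimming blank padding."""
    rec = {name: line[start:end].strip() for name, start, end in FIELDS}
    rec["balance_cents"] = int(rec["balance_cents"])  # numeric field
    return rec

# A sample record built to the layout above: each field padded to its width.
sample = "00012345" + "JONES, A".ljust(20) + "00009950"
parsed = parse_record(sample)
```

Once records are parsed into native structures like this, they can be handed to .NET, Java, or cloud-native services as JSON, which is typically how previously inaccessible programs and data get opened up.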
The second part of the article will be published next week.