Sunday, 28 February 2021

Mainframe, Cloud, and the Post-Pandemic IT Landscape

With so much value being locked up in data, which is often held on tape, Model9 wrote in this year’s Arcati Mainframe Yearbook about how to leverage and monetize that data using the cloud.

The year ahead, one more step into what promises to be a tumultuous decade for mainframes, is full of opportunities to do groundwork and lay foundations. Our customers tell us two things, loud and clear: they know they need to do more to leverage their vast store of mainframe data, and they continue to feel squeezed by the costs and limitations of traditional storage options.

Put another way, they know the mainframe is a solid, reliable core for their business, but they recognize competing priorities and opportunities, most of which involve leveraging the cloud for agility, faster and better analytics, and potentially more cost-effective storage.

All of these trends were becoming visible in 2019, but the double hammer blows of Covid disruptions and lockdowns, together with the secondary economic impacts, are powering them to prominence.

While some organizations have found ways to eliminate the mainframe entirely, we believe there is still a strong case for most enterprises to have access to a dedicated processing powerhouse that safeguards mission-critical activities and aligns fully with corporate priorities. But when it comes to data stewardship within the mainframe environment, there are two closely related problems.

First and foremost, business units, divisions, and even individual contributors are “voting with their feet” – putting more and more data and more and more analytics in the cloud. It’s easy, simple, and can often produce immediate and valuable business results. But this independence limits the potential for cross-fertilization between the new and the old, the central and the peripheral, and between what’s core and what’s potentially just short-term. It is past time to relink those worlds – enhancing data quality for both and making mainframe even more relevant.

Second, historic dependence on tape for long-term and archive storage, as well as backup and recovery, results in heavy, ongoing investments in hardware, software, and the people to manage them. It also means that those petabytes of data, not to mention active data that is similarly walled off by proprietary forms and formats, are difficult to reach with new tools that live in the cloud.

A recent report from leading analyst firm Gartner predicts that, “by 2025, 35% of data center mainframe storage capacity for backup and archive will be deployed on the cloud to reduce costs and improve agility, which is an increase from less than 5% in 2020.” That’s momentous!

Cloud is no longer an experiment or a theory. It is a substantial part of the world’s information infrastructure. Even more than the changes wrought in the mainframe world by personal computers and client-server computing a generation ago, cloud promises tectonic change – but change that can benefit mainframe operations. Indeed, unlike mid-range computers and PCs in the past, the cloud shouldn’t be perceived as a threat to the mainframe but rather as a platform for integration and collaboration.

Mainframe organizations should consider where they stand with regard to existing technology, what their business needs and opportunities are, and the available paths forward. That decision making is the first step on what must become a journey.

Organizations face a continuing challenge in the stranglehold of tape and proprietary data formats, which are built around sequential read-write cycles that date back to 9-track tape. With this technology, both routine data management tasks as well as more ambitious data movements put an enormous processing drain on the mainframe itself and lead to increased MIPS costs.

The solution is to eliminate the direct costs of tape acquisition, maintenance, and management and instead deal directly with the data. That means mastering data movement, so that data can be used where it is needed and stored where it is most cost effective. This step can provide immediate dividends, even if you plan to go no farther.

Modern storage technologies for the mainframe provide flexibility, so you are no longer confined to a narrow list of choices. The technology also provides the connectivity that enables movement between the mainframe and the new storage platform. With this enabler, you can be master of your own data.

The next challenge is achieving real integration of mainframe data with other sources of data in the cloud.

The solution is to take advantage of Extract-Load-Transform (ELT) technology instead of complex, old, slow, compute-intensive ETL approaches. ELT can leverage processing capabilities within the mainframe outside of the CPU (e.g., zIIP engines) and TCP/IP to rapidly extract mainframe data and load it into a target. There, transformation to any desired format can occur economically and flexibly. The net result is more cost-effective and generally faster than ETL.

Building on your movement and transformation capability can help you better engage with cloud applications when and where it makes sense. It is an ideal way to move secondary data storage, archive, and even backup data to the cloud, and then transform it to universal formats.

Liberated from the complexities of the mainframe, this data can make the crucial difference between seeing the full business picture, with full business insights, and missing out entirely. This data can also provide a powerful addition to a data lake or can be exposed to the latest agile analytical tools – potentially delivering real competitive advantage to the enterprise, at modest cost. And all this without adversely impacting traditional mainframe operations, since ELT can move data in either direction as needed. Transformation can apply to any type of mainframe data, including VSAM, sequential, and partitioned data sets. That data can be converted to standard formats such as JSON and CSV.
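As a hedged illustration of that last point: once records have been extracted and loaded into cloud storage, even a short script can handle the transform step. The fixed-width record layout, field names, and sample values below are entirely hypothetical; a real data set would be decoded using its actual copybook layout and code page.

```python
import codecs
import csv
import io
import json

# Hypothetical fixed-width layout for one record of a sequential data set:
# (field name, start column, end column) -- invented for illustration.
LAYOUT = [("customer_id", 0, 6), ("name", 6, 26), ("balance_cents", 26, 34)]

def transform_record(raw: bytes) -> dict:
    """Decode one EBCDIC (code page 037) record and slice it into named fields."""
    text = codecs.decode(raw, "cp037")
    rec = {name: text[start:end].strip() for name, start, end in LAYOUT}
    rec["balance_cents"] = int(rec["balance_cents"])
    return rec

# Build a sample record by round-tripping readable text through EBCDIC,
# standing in for bytes extracted from the mainframe.
sample = codecs.encode("000042John Q Public       00012599", "cp037")
record = transform_record(sample)

print(json.dumps(record))  # standard JSON, ready for a data lake or analytics tool

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
print(buf.getvalue().strip())  # the same record as standard CSV
```

Because the heavy lifting happens after the load, on cloud compute, none of this work consumes mainframe CPU – which is the essence of the ELT argument above.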

Once mainframe data is no longer locked in a silo it can be leveraged and monetized for new business purposes, right in the cloud and in ways not possible in a traditional tape or VTL environment.

The final challenge, for some organizations (for example, those that have evolved to a highly decentralized operational model), is that a central, on-premises mainframe may no longer deliver benefits and may, in fact, add to latency, cost, and inflexibility.

The solution for these organizations is to recognize that data is the only non-negotiable element. If the data is in the cloud already, or if you can get it there, you can grow more and more native capability in the cloud, ranging from operational applications to analytics. Replicating or matching traditional mainframe functions with cloud-based functionality is challenging but achievable. As the most substantial step in a cloud journey, it is necessarily the most complex, especially when organizations decide to rewrite their existing applications for the cloud. But ELT and the liberation of data from mainframe silos can lay the groundwork and provide a solid basis for finding a workable path beyond the mainframe. In particular, moving historical data from voluminous and expensive tape infrastructure to the cloud permits consideration of a post-mainframe future, if so desired.

Complete migration to the cloud is not for all organizations. But for some it offers an opportunity to transform and grow in new ways.

By enabling the mainframe to play more effectively in the data business value game, cloud connectivity and cloud capabilities can make the mainframe an even more valuable and sustainable platform for the organization. Indeed, these options have the potential to make the mainframe even more integral to an organization’s future.

You can read the full article from Model9 here.

Sunday, 21 February 2021

Relationship Advice For You and Your Mainframe

Deborah Carbo, Director, Product Management & Strategy, Broadcom Mainframe Software, suggested in this year’s Arcati Mainframe Yearbook that the modern world of enterprise IT is a lot like relationships. The more you put into them, the more you get out of them. And in this relationship, the mainframe is a perfect match!

She confirmed that investing in your mainframe delivers strong ROI and integrating it into your hybrid infrastructure can advance your transformation goals, provide competitive advantage, and deliver on your SLAs. Investing in your mainframe is also an excellent way to ensure you’re prepared for growth. Sure… I’ve heard the detractors. I’ve also seen risky relationships where some fall head over heels for a new tech trend only to find the honeymoon phase fizzle quickly. You know what they say, the grass isn’t always greener...

As with any relationship, you need to (be willing to) give to get. You get an epic increase in value from your mainframe by modernizing it in place, opening it up, and bringing it closer to your front-end digital apps by integrating it with your hybrid cloud. To strengthen your relationship and gain even greater value from your mainframe, here’s the idea:

First – celebrate what you have and amplify the love. You have a tremendous amount of code invested in the mainframe, and you can depend on it. Your mainframe has been with you through thick and thin. While Cloud is an attractive focus that offers tons of new potential, it isn’t the answer to every question. Recognize that your mainframe offers extraordinary value. It delivers transactional scale, data and user protection, and always-on availability – and, yes, it also offers tremendous new potential. Cloud? It’s well suited for horizontal scaling, web and front-end app serving. Starting fresh may sound appealing, but ultimately requires rewriting enormous amounts of good, highly-optimized code with new code that is more generic, unproven, and potentially vulnerable. You’ll never recover that ROI. I’ve heard from customers time and again that modernizing in place is the best way forward. It’s easier, less risky – and burns fewer of your precious resources. Continuing to invest in enduring solutions like the mainframe and working with it enables you to get a substantial return on your relationship investment.

Second – open up to new experiences. Interaction with others is what makes life rich and exciting. Just as healthy relationships don’t exist in a vacuum, thriving IT platforms don’t operate in a silo. There is no need to limit yourself to Cloud OR mainframe. You should see your future as Cloud AND mainframe. When you get the mainframe out of the back office and connect it to your front-end world of apps and mobile devices you unleash all-new strategic value. Now, all your developers can build richer, more powerful digital applications that easily integrate with mainframe processes and data using the cloud-native tools they already know and love. You dramatically expand your talent pool. Suddenly you’re seeing each other in a whole new light.

Today, new open tools and technologies make it easy for the mainframe to open up and interact with others while also increasing security. Leveraging open APIs makes it easy to connect, automate, and combine operational and other data to integrate and manage across the hybrid environment. Your infrastructure is more predictive, efficient, and protected and your DevSec and Ops teams can work with the tools they know and love.

With Zowe, the first open-source project for z/OS, any developer, sys admin, or sys prog can work with the platform just as they would with any other platform or cloud. By using the Zowe API Mediation Layer and a growing list of third-party tools like ServiceNow, you can speed mainframe development and operations and automate tasks like systems maintenance and software installs.
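As a hedged sketch of what that automation can look like, the snippet below builds (but deliberately does not send) a job-submit request routed through a Zowe API Mediation Layer gateway to the z/OSMF jobs REST interface. The hostname, port, route prefix, credentials, and data-set name are all illustrative assumptions, not a definitive recipe.

```python
import base64
import json
import urllib.request

# Hypothetical gateway endpoint and credentials -- adjust for a real site,
# and never hard-code passwords in practice.
GATEWAY = "https://apiml.example.com:7554"
USER, PASSWORD = "jeannie", "secret"

def build_submit_request(jcl_data_set: str) -> urllib.request.Request:
    """Build a PUT request that would ask z/OSMF to submit the named JCL."""
    body = json.dumps({"file": f"//'{jcl_data_set}'"}).encode()
    token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    return urllib.request.Request(
        f"{GATEWAY}/api/v1/zosmf/restjobs/jobs",  # route prefix is an assumption
        data=body,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
            "X-CSRF-ZOSMF-HEADER": "",  # z/OSMF REST services expect this header
        },
        method="PUT",
    )

req = build_submit_request("OPS.MAINT.JCL(SYSMAINT)")
print(req.get_method(), req.full_url)
```

In practice the Zowe CLI wraps calls like this one, so a script or pipeline can submit maintenance jobs without hand-building requests at all.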

Imagine a young developer, Jeannie, hired straight out of university to work on the mainframe. She’s excited to discover that an open mainframe, with integrations between Endevor and VS Code front ends and integrated CI/CD pipelines, can take advantage of modern tools such as Git and Jenkins. The same goes for David, a systems programmer who, working with SYSVIEW and ServiceNow, can automate service tickets, freeing him up to work on more strategic projects. In both cases, these mainframers now use a common toolset that lets them interact and collaborate more with front-end teams and share in their culture.

Finally – look to the future and grow together. Especially in a world of cloud and continuous innovation, your mainframe has an important and indispensable role to play. It’s not only the most reliable partner in your IT relationship but allows you to deliver powerful new value to service customers in innovative ways as well. Realize that you make a great team and make the commitment to evolve together to achieve that next phase of growth.

The mainframe is well known as a workhorse for large transactional and batch workloads. These workloads are all about high throughput and derive value from the stability, security, and processing speed of the platform, so they belong close to the data: you get a single source of truth, along with better performance, trust, and integrity.

But it’s a hybrid world and nothing lives in isolation. When these workloads work together with other cloud-native workloads, like your mobile, web, and digital front ends, you not only ensure you’re delivering but also make the most efficient use of your resources.

Exploiting containers on the mainframe now takes the friction out of the hybrid environment and allows you to easily bring applications closer to the data they need for optimal performance. How? Containers let you better componentize apps and expose them, so you can more easily consume and deploy parts of those apps in the environment, bringing them closer to the data. From there they can iterate more quickly – including moving components that would traditionally run on the cloud to the mainframe. These apps and processes can then run on the mainframe, shortening the distance between the front ends and legacy code – communicating at memory speed, more securely, and with better performance, so better ROI again.

When it comes to containers on the mainframe, their capability and exploitation are still maturing. But there is significant excitement around them. They serve as another example of how it’s becoming easier and easier to work with the mainframe and bring the mainframe’s qualities closer to your front-end world.

You can read the full article here, including how you can make a date with Broadcom.

Sunday, 14 February 2021

Test Automation: The Key to Increasing Velocity, Quality and Efficiency

Compuware, a BMC company, used this year’s Arcati Mainframe Yearbook to tell us about research it had commissioned from independent research company Vanson Bourne, which conducted a global survey of 400 senior IT leaders responsible for application development at organizations with a mainframe. The survey looked at testing and automation practices on the mainframe and their impact on the speed of innovation.

They said that the world’s biggest and most successful enterprises continue to run their mission-critical workloads on the mainframe. They remain committed to the platform due to its unmatched digital strengths in reliability, performance, security, and transactional efficiency, as well as its decades of proven intellectual property in the form of fine-tuned business logic and data.

What’s often overlooked, they suggest, is that the majority of customer-facing distributed, web, and cloud-based applications are highly reliant on the mainframe. As a result, it’s critical that organizations are able to deliver change on the mainframe as quickly as they can in their distributed systems, to support the demands for a constant cycle of ‘new’ from the business and to satisfy their always beautifully, wonderfully dissatisfied customers.

Against this backdrop, speed and innovation have become the rallying call for IT departments across every industry. The shift to Agile and DevOps has improved the pace of software delivery and has become essential to competing in the Age of Software.

However, testing remains a critical development function where manual methods, especially within mainframe development environments, delay software delivery, and where the pressure for shortened lead times threatens quality. Test automation coupled with a “shift-left” approach, where developers write unit tests at the same time as they write source code, is not widely embraced for mainframe application development.

Further, an increase in mainframe workloads and a growing shortage of skilled professionals are increasing the need to ensure code has been thoroughly tested, which can put it at odds with the need to increase the pace of delivery.

In the report, they examined the processes that organizations have in place to support testing on the mainframe, while exploring the challenges they face in simultaneously increasing quality, velocity and efficiency. The report shows that traditional, manual testing swallows up skilled developers’ time and in turn puts pressure on teams to take shortcuts. Ultimately, it shows that the current manual approach to testing is often difficult, costly and time-consuming, and highlights the need to increase the use of automation to alleviate these pressures.

The key highlights from the study include:

  • 77% of application development managers find it increasingly difficult to simultaneously increase quality, velocity, and efficiency to meet business innovation goals when developing and testing mainframe application code.
  • 53% of application development managers say the time required to conduct thorough testing is the biggest barrier to integrating the mainframe into Agile and DevOps.
  • 80% of application development leaders say it is inevitable that unless they can automate more test cases, bad code will make its way into production.
  • 90% of application development managers say that automating more test cases could be the single most important factor in their success as the pressure on IT to accelerate innovation increases.
  • Only 7% of organizations automate the execution of test cases on mainframe code to support their ability to accelerate innovation.

IT teams are under constant pressure to deliver innovation at an ever-faster pace, but many are struggling under the weight of the challenge.

77% find it increasingly difficult to simultaneously increase quality, velocity, and efficiency to meet business innovation goals when developing and testing mainframe application code.

85% say that it is becoming harder to deliver innovation faster, without compromising on quality and increasing the risk of bugs in production.

The inordinate amount of time IT spends during the testing phase is a leading contributor to this challenge.

92% said their organization’s mainframe teams are spending more time testing code than they needed to in the past, to account for the growing complexity of cross-platform application environments.

In fact, development teams spend, on average, more than half their time (51%) on testing during the release of a new mainframe application, feature, or functionality. By comparison, teams reported spending 39% of their time on testing when mainframe code is not involved, still a very large percentage.

These findings clearly indicate that traditional, manual methods of testing are amongst the biggest challenges that organizations face when it comes to accelerating the speed of innovation. In particular, organizations face significant challenges when it comes to testing mainframe code.

Manually testing mainframe code also hinders businesses’ ability to integrate the mainframe into wider DevOps initiatives, thereby slowing the pace of innovation across the business.

The traditional manual style of testing mainframe code also means IT teams are often forced to take shortcuts to deliver innovation at the speed the business requires. Yet application development leaders expressed concern that this pressure to cut corners to deliver innovation faster creates numerous business risks.

Organizations see automation of the testing process as the key to their ability to deliver innovation faster while maintaining quality.

While having obvious benefits in terms of speeding up the development process, test automation also helps to reduce human error, improve quality, and increase output and productivity. It also enables key workers to focus on more value-generating activities, such as user experience design and adding more business functionality.

Looking at the specific areas where automated testing on the mainframe stands to bring the most benefit to their organizations’ ability to accelerate innovation, application development leaders rated unit testing quite low, calling out the automation of functional testing (63%) and integration testing (53%) as priority focus areas, ahead of unit testing (37%).

This is in contrast to the ethos of the shift-left approach, where unit testing is performed in the earliest stages of development, enabling developers to find bugs at the moment they are introduced into the code and giving them fast feedback at a lower cost.
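To make the shift-left ethos concrete, here is a minimal, hypothetical example – in Python rather than mainframe COBOL – where the unit test is written at the same time as the business rule it covers, so a defect surfaces the moment it is introduced. The function, its rule, and the figures are invented for illustration.

```python
import unittest

def net_premium(gross_cents: int, discount_pct: int) -> int:
    """Hypothetical business rule: apply a whole-percent discount, rounding down."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct must be between 0 and 100")
    return gross_cents * (100 - discount_pct) // 100

class NetPremiumTest(unittest.TestCase):
    """Written alongside net_premium itself, per the shift-left approach."""

    def test_typical_discount(self):
        self.assertEqual(net_premium(10_000, 15), 8_500)

    def test_no_discount(self):
        self.assertEqual(net_premium(10_000, 0), 10_000)

    def test_rejects_bad_percentage(self):
        with self.assertRaises(ValueError):
            net_premium(10_000, 101)

# Run the tests immediately -- the fast feedback loop that shift-left aims for.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(NetPremiumTest)
)
```

Wired into a CI pipeline, a suite like this runs on every commit, so the feedback arrives while the change is still fresh in the developer’s mind.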

Automated testing/software quality tools have been available for mobile, web and distributed application code for several years, but mainframe code has not been as well supported by test automation capabilities. It was no surprise, therefore, that 86% of respondents said they find it difficult to automate the testing of mainframe code.

Fractured and fragmented testing practices also threaten the impact that organizations are set to achieve by automating more test cases on the mainframe. Many organizations are failing to test code at every stage of the development process and don’t empower their developers to test their own code. This creates an increased risk of bugs finding their way into production and increases the frequency of software failures.

The last few years have seen automation become a major focus, as organizations look to accelerate software innovation and sharpen their competitive edge. Automation offers significant advantages through productivity, quality and efficiency improvements, and allows organizations to redeploy resources to support higher – value tasks that drive the business forward. It’s no surprise then that this research shows a growing interest in automated testing, as organizations look for a way to simultaneously accelerate the quality, velocity and efficiency of their efforts to bring new innovation to market, without fearing that faster speed equals lower quality.

However, despite the proven benefits of test automation, the research shows that many organizations are still conducting testing manually when it comes to the mainframe. Given the central role that the mainframe continues to play in powering modern digital services, these manual testing practices are creating a bottleneck that impacts quality, velocity, and efficiency. Ultimately, this is hindering the delivery of innovation and preventing organizations from meeting their business goals.

The research also highlights that businesses expect to achieve significant benefits by automating more testing on the mainframe at every stage of the development process. Not only does this remove the need for mainframe specialists to be directly involved every time that code is being worked on, it helps to improve quality, velocity and efficiency by enabling developers to fine tune their code as they’re working on it.

Ultimately, by “shifting left” with automated testing – and unit testing in particular – organizations can gain fast feedback on the mainframe, supercharging innovation without fearing the risk of introducing problems that disrupt operations, introduce security risks, hinder customer experiences, or impact business revenues. Better still, they can improve quality, velocity, and efficiency on the mainframe in spite of the growing shortage of experienced developers, giving themselves a major advantage as they look to drive competitive advantage through supercharged digital innovation.

You can read the full article here, including how Compuware suggests its products can help.