Sunday 27 June 2021

Computing tomorrow


Following the announcement of a £210 million partnership between the UK government and IBM to support businesses in the adoption of new digital technologies, we wondered what the future of computing might look like.

Amanda Solloway, the science minister, explained that the investment will be put towards the new Hartree National Centre for Digital Innovation (HNCDI). According to its website, HNCDI, based in Daresbury, Cheshire, will enable businesses to acquire the skills, knowledge, and technical capability required to adopt digital technologies like supercomputing, data analytics, artificial intelligence (AI), and quantum computing.

The website goes on to say that HNCDI will help organizations and individuals with an appetite for change – those ready to innovate and create useful solutions, enhance and adapt products and processes, adopt new digital technologies, and expand into new markets. It says it will work with everyone from start-ups and SMEs to large corporates, as well as public sector organizations such as NHS Trusts and local government. And it offers training on both an individual and a group basis.

The UK government is investing £172 million over five years through UK Research and Innovation (UKRI), and IBM is contributing £38 million. In addition, 60 more scientists, interns, and students will join IBM Research and the Hartree Centre in the joint Science and Technology Facilities Council (STFC) – IBM Programme.

During a Teams meeting, while we were all working from home, we started kicking around some ideas about what the future of computing – the kind HNCDI might be researching – could look like. Here are some of the things that were said.

Quantum computing is an obvious goal. The benefits of getting workable quantum computing are enormous, and companies like IBM and Google are regularly making small changes to how they define and measure quantum computing and then making announcements. Quantum computing will be a complete breakthrough when it arrives because so much more computing power will become available. One consequence is that a quantum computer could break many levels of encryption in a relatively short period of time – encryption that would take hundreds of years to break using currently available technology. So, that has worrying implications.
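To put rough numbers on that, here's a back-of-envelope sketch in Python, using entirely assumed hardware speeds, comparing classical brute-force key search with Grover's algorithm, which searches a keyspace of 2^n values in roughly 2^(n/2) steps:

```python
def years_to_search(operations: float, ops_per_second: float) -> float:
    """Convert an operation count into years at a given rate."""
    return operations / ops_per_second / (60 * 60 * 24 * 365)

KEY_BITS = 128
CLASSICAL_RATE = 1e12  # assumption: 10^12 classical key tests per second
QUANTUM_RATE = 1e9     # assumption: 10^9 Grover iterations per second

classical_ops = 2 ** (KEY_BITS - 1)   # average classical brute-force guesses
grover_ops = 2 ** (KEY_BITS // 2)     # Grover needs about sqrt(2^n) steps

print(f"Classical search: {years_to_search(classical_ops, CLASSICAL_RATE):.2e} years")
print(f"Grover search:    {years_to_search(grover_ops, QUANTUM_RATE):.2e} years")
```

Even with that quadratic speedup, long symmetric keys hold up fairly well; it's public-key schemes like RSA, which Shor's algorithm attacks far more efficiently, that are in more immediate danger.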

Artificial intelligence is another obvious goal. AI and machine learning mean that computers can not only do straightforward repetitive tasks, but can also learn, make decisions, and perform more complex tasks. This, in turn, allows things to happen more quickly (computers work faster than people, and for longer hours). It should make our lives easier. Cars really could drive themselves, planes could fly themselves, etc. The downside, of course, is that fewer people would be needed because the ‘machines’ would be doing the work, and that would have a huge impact on the economy. And, according to the movies, it could lead to the formation of Skynet and the arrival of the Terminators!

Data analytics is something that many organizations are already enjoying the benefits of. The traditional model of selling is that person A makes a product and takes it to person B. Person B sells the product in their shop to person C, who then takes it away and uses it. Data analytics allows person B to see what types of product person C has been buying and to suggest other products that they might like. And they can do this with vouchers, on social media, via an app on their phone, etc. It’s a way of understanding customers and encouraging them to buy more from you rather than from your competitors. Using the information available from data analytics helps a company be more competitive, and therefore more successful, than its rivals.
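As a simple illustration, here's a minimal sketch in Python, with made-up shopping baskets, of how "customers who bought X also bought Y" suggestions can fall out of nothing more than purchase histories:

```python
from collections import Counter, defaultdict
from itertools import combinations

# Illustrative purchase histories - each set is one customer's basket.
baskets = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "jam", "tea"},
    {"tea", "biscuits"},
]

# Count how often each pair of products appears in the same basket.
co_occurrence = defaultdict(Counter)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_occurrence[a][b] += 1
        co_occurrence[b][a] += 1

def suggest(product: str, n: int = 2) -> list[str]:
    """Return the products most often bought alongside the given one."""
    return [item for item, _ in co_occurrence[product].most_common(n)]

print(suggest("bread"))   # e.g. ['butter', 'jam']
```

Real retail systems use far more sophisticated models, but the underlying idea is the same: past purchases predict likely future ones.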

Supercomputing is all about getting as much computing power as possible so it can be used to answer difficult questions like, “what will the weather be like tomorrow?” IBM has built supercomputers such as Blue Gene/P and Summit. Other names you’ll hear are Cray (now part of HPE) and Fujitsu’s Fugaku. It’s thought that quantum computers could eventually replace the need for supercomputers.

In the near future, mainframes are likely to continue running the same amount of work that they do today. It’s also likely that mainframes and distributed systems will continue offloading parts of their work to the cloud. Replicating data to the cloud makes restoring data quicker and easier, and is vital in a business continuity situation.

Security seems to be the biggest challenge. Any computer that people can use (and that’s all of them) is likely to be hacked. Employees, in an unguarded moment, are likely to click on a link or open an attachment and download a piece of malware that starts an attack. It’s not just the software on a laptop that can be attacked; the firmware can be too, meaning that extra levels of security are required. And trusted members of staff can be cynically manipulated by bad actors into stealing data or giving them access. New security terms are being coined all the time to describe where the focus for security should be placed: ZTA (Zero-Trust Architecture), SASE (Secure Access Service Edge), APM (Application Performance Monitoring), XDR (Extended Detection and Response), and CASB (Cloud Access Security Broker), to name but a few.

Computers getting smarter clearly has a positive impact on businesses and, indirectly, on everyone: it makes life easier. The downside, unless steps are taken, is that smarter computers also make life easier for hackers. As mentioned earlier, quantum computers could break quite sophisticated levels of encryption. Criminals could use data analytics techniques to identify the people they could most easily manipulate into performing criminal acts. And AI could lead to various doomsday scenarios.

This has been the case with every development ever. It can be used for good or bad. Let’s hope that the developments coming from HNCDI are used for our good.

Sunday 20 June 2021

Auditors, compliance, and the mainframe

Mainframes have been successfully keeping organizations in business for over 50 years. Let’s just look at some statistics. Mainframes are used by 71 percent of Fortune 500 companies. They handle 90 percent of all credit card transactions. Each IBM z15 mainframe can handle 19 billion business transactions a day. And mainframes handle 68 percent of the world’s production IT workloads, yet they account for only 6 percent of IT costs.

Drilling down on those figures, we find that in terms of ATMs and IMS:

  •  $7.7 trillion in credit card payments annually
  •  29 billion ATM transactions annually
  •  12.6 billion transactions daily
  •  87 percent of credit card transactions processed on z/OS.

With so much work taking place on a mainframe and so much money being transacted, you’d assume that auditors would be all over the mainframe. You’d probably assume that auditors would know almost as much about how mainframes work as systems programmers do. You’d think that they would want to know the tiniest of intricacies in order to assure themselves that corporations using mainframes were absolutely compliant with all the regulations that applied to them – things like the Payment Card Industry Data Security Standard (PCI DSS).

Worryingly, in many cases, auditors are put off by the complexity of mainframes and don’t know the right questions to ask. Not that I’m suggesting that organizations are committing any kind of fraud on their mainframes. What I am suggesting is that they may not be completely compliant with the regulations that apply to them.

The very nub of the problem is that the PCI DSS requires the use of file-integrity monitoring (FIM) software on in-scope computing platforms, and hardly anyone using an IBM mainframe has that type of software installed. That seems strange, bearing in mind that mainframes are used by the majority of financial institutions in the world.

Let’s look at those PCI regulations in more detail. Section 10.5.5 asks: “Is file-integrity monitoring or change-detection software used on logs to ensure that existing log data cannot be changed without generating alerts (although new data being added should not cause an alert)?” And section 11.5 asks: “Is a change-detection mechanism (for example, file-integrity monitoring tools) deployed to detect unauthorized modification (including changes, additions, and deletions) of critical system files, configuration files, or content files?”
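To make the 10.5.5 requirement concrete, here's a minimal Python sketch (the audit.log file name is hypothetical) of one way to detect changes to existing log data while letting appends through: remember the log's size and a hash of that prefix, then verify the same prefix later.

```python
import hashlib
import os

def log_prefix_hash(path: str, length: int) -> str:
    """SHA-256 of the first `length` bytes of a log file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        digest.update(f.read(length))
    return digest.hexdigest()

# At baseline time, record the log's current size and the hash of its content.
baseline_length = os.path.getsize("audit.log")               # hypothetical file
baseline_digest = log_prefix_hash("audit.log", baseline_length)

# Later: records appended after the baseline are fine, but if the original
# prefix hashes differently, existing log data has been tampered with.
if log_prefix_hash("audit.log", baseline_length) != baseline_digest:
    print("ALERT: existing log data has been modified")
```

That is only the core idea; a real monitoring product would also protect the stored baseline itself from tampering.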

Clearly, most sites aren’t compliant because they aren’t running file-integrity monitoring software on their mainframes, and yet these organizations are signing off Section 3 of the validation form saying that they are. And the person signing is probably the CIO, CFO, or CEO!

Many mainframe sites try to get round this issue with what they call ‘compensating controls’. The truth is that these compensating controls are basically non-existent. The next ploy used by organizations is to declare mainframes ‘out of scope’. But, as the figures at the start of this article show, that clearly isn’t the case. If auditors understood what was actually happening on the mainframe, they would be able to ask the appropriate questions. The question they should be asking is: “If 90 percent of debit and credit transactions end up running on a mainframe, how can mainframes possibly be out of scope of a PCI audit?”

Worryingly for many mainframe sites and their auditors, Version 4.0 of the DSS is due out in the next year. It’s unlikely that the rules in 3.2.1 will change. What is likely to change is that the enforcement and scrutiny of compensating controls will be greatly strengthened.

Focusing on security for a moment: on 12 May, US President Biden issued an executive order which, amongst other security measures, requires Federal organizations to develop a plan to implement Zero Trust Architecture (ZTA). And zero trust seems to be the way that security is going. NIST (the National Institute of Standards and Technology) said earlier this year: “An enterprise monitors integrity and security posture of all owned and associated assets. No asset is inherently trusted.” How do we get to ZTA on a mainframe? PwC recently published some guidelines, and item 2 on its 4-point list says ‘File Integrity Monitoring’. That, I think, also highlights the pivotal role of file-integrity monitoring in mainframe security.

Lastly – and this is relevant because the majority of ATMs are connected to IMS running on a mainframe – there has been advice from the PCI Security Standards Council and the ATM Industry Association highlighting the need for file-integrity monitoring software on mainframes running ATM transactions. You may remember that last October the two associations issued an urgent bulletin – the first they had ever issued together, which underlines its significance – about cash-out attacks on ATMs. Thieves breached the security of banks or card processors, manipulated the fraud-detection controls, and took large amounts of cash from a number of ATMs. As we said, most ATM transactions are captured by IMS running on a mainframe. The advice given was that organizations should deploy file-integrity monitoring (FIM) software to combat the cash-out hack.

What is file-integrity monitoring software? As its name suggests, it identifies when a file has been changed. It does this by taking a baseline copy and keeping it securely in a vault. It then checks the baseline copy against the current version at user-defined intervals and alerts when any differences are found. Obviously, lots of changes will be authorized, so it can check against ServiceNow, BMC Helix, etc, and alert only on unauthorized changes. More advanced FIM software can identify exactly what was changed, when it was changed, and who changed it. And, following agreed policies, it can have the userid of the culprit suspended and the changes backed out. Ransomware attacks now corrupt backups before encrypting data; advanced FIM software can check backups at regular intervals to identify any unauthorized changes and so help stop ransomware attacks. And it can do all this very quickly.
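Here's a minimal sketch, in Python, of that baseline-and-compare idea. The file names are hypothetical, and a real FIM product would keep its baseline in a secured vault rather than a JSON file on the same system:

```python
import hashlib
import json
import pathlib

def checksum(path: str) -> str:
    """SHA-256 hash of a file's current contents."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def take_baseline(paths, baseline_file="baseline.json"):
    """Record a hash for every monitored file (the 'vault' copy)."""
    baseline = {path: checksum(path) for path in paths}
    pathlib.Path(baseline_file).write_text(json.dumps(baseline, indent=2))

def check(baseline_file="baseline.json"):
    """Compare current hashes against the baseline and flag differences."""
    baseline = json.loads(pathlib.Path(baseline_file).read_text())
    for path, recorded in baseline.items():
        if checksum(path) != recorded:
            # A real product would first check ServiceNow/BMC Helix for an
            # approved change record before raising the alert.
            print(f"ALERT: unauthorized change detected in {path}")

take_baseline(["payroll.cfg", "startup.parms"])   # hypothetical files
check()                                           # run at regular intervals
```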

Putting it all together, it seems that the PCI, the US government, NIST, PwC, and others are looking at FIM as part of the answer to mainframe security. It seems that auditors need to be better prepared to ask more searching questions about mainframe compliance with agreed standards. And it seems that mainframe sites need to realize the benefits they would get from using FIM software.

Monday 14 June 2021

How to connect to a mainframe


With the pandemic, everyone has been finding ways of working from home – or, in fact, working from anywhere. It’s not that mainframers had never worked remotely before, it’s just that recently more of them needed to do it most of the time.

Probably the best-known way to access a mainframe is to use 3270 emulation software running on a PC. Examples of this type of software are IBM’s Personal Communications (PComm) and Vista tn3270 from Tom Brennan Software. There are numerous others from software vendors like Rocket Software.

Some organizations have their IT staff connect from their home computer to their office computer. Once they’ve logged in to their work computer, they can connect to the mainframe in the usual way using the terminal emulator installed on that machine. Connecting from a home computer to the office can be done using something like Microsoft Remote Desktop Protocol (RDP). Other organizations have used a Virtual Private Network (VPN) to allow employees to connect securely from home to the corporate network, and then access the mainframe.
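As a quick sanity check before launching an emulator, a user on the VPN could confirm that the mainframe's secure TN3270 endpoint is reachable over TLS. Here's a minimal Python sketch; the host name is hypothetical, and port 992 is just the conventional TLS TN3270 port, so your site may use another:

```python
import socket
import ssl

HOST, PORT = "mainframe.example.com", 992   # hypothetical host and port

# Open a TCP connection and wrap it in TLS, verifying the server certificate.
context = ssl.create_default_context()
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print(f"Connected: {tls.version()}, cipher {tls.cipher()[0]}")
```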

These were all great ways to get connected as the pandemic and lockdown required that people be able to work from home, but now that we are, hopefully, coming out of the worst of things, it makes sense to re-evaluate our way of working and see whether there might be a better way of connecting mainframe-using employees to the mainframe.

Now is the time to ask the question, “what do users want?” as well as simply, “how can we do it?” And what users seem to want at the moment is to work from any device with a browser. It’s what they tend to do for everything else they spend time on – shopping, social media, etc. So, how can we give mainframe users a browser interface?

Virtel has been doing this for some time with Virtel Web Access (VWA), which securely serves 3270 screens as standard HTML webpages over encrypted HTTPS connections. Users see the interface that they are familiar with; in effect, the web browser becomes the 3270 terminal emulator.

Similarly, Rocket Software has Rocket Terminal Emulator (Web Edition), which used to be called Rocket® BlueZone Web. This, they say, delivers secure, browser-based emulation to any device, including desktops, laptops, tablets, or other mobile devices. This allows users to access mainframe applications from any browser, anywhere.

Macro 4 has Tubes, its session management software, which comes with a web interface. This can web-enable mainframe applications from the user’s point of view, so employees can access mainframes from any device with a web browser. Tubes provides a single point of entry to all the applications a user is allowed to access, and users can switch from one application to another without having to log off and on again. A web-enabled session manager allows users to work with 3270 applications on any size of screen and to use mouse clicks and even touch-screen gestures to get work done. Similar 3270 session manager software is available from other vendors.

A more complex solution for CICS users comes from HostBridge Technology. Its HostBridge JavaScript Engine (HB.js) enables the creation of JavaScript/JSON-based integration scripts and APIs. HB.js scripts work with all types of CICS applications without relying on screen scraping, and HB.js provides a complete solution for rapidly developing and deploying reusable Web services and scripts. That takes things a whole step further, integrating mainframe CICS applications with the outside world in general, not just with staff working from home using a browser.

Some people may want to connect to the mainframe to develop code. One way to do that is to use Visual Studio Code (VSCode), a source-code editor from Microsoft that runs on Windows, macOS, and Linux. Users can then interact with z/OS in the Terminal window using commands provided by the IBM RSE API plug-in for Zowe CLI (RSE CLI plug-in), Zowe CLI, or both. The Zowe Explorer extension for VSCode lets users interact with data sets, USS files, and jobs stored on z/OS; the extension installs directly into VSCode.

The real gold standard – the piece of software that seems to be missing from the choices available – is web browser gateway software actually running on z/OS. It would allow z/OS applications to interoperate with other platforms (like Windows, Linux, and the cloud) in real time, and users could interact with the mainframe in the same way that they do with other platforms. There would be standard interfaces such as XML, HTTPS, and REST APIs – interfaces that plenty of people are familiar with and could use easily, with no need for specialists from other platforms.
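If such a gateway existed, calling a mainframe application could look like any other REST request. Here's a purely hypothetical Python sketch – the host, path, token, and JSON field are all invented to illustrate the idea:

```python
import json
import urllib.request

# Hypothetical endpoint and bearer token, purely for illustration.
request = urllib.request.Request(
    "https://zos.example.com/api/accounts/12345",
    headers={"Authorization": "Bearer <token>", "Accept": "application/json"},
)

# The gateway would run the mainframe application and return its output as JSON.
with urllib.request.urlopen(request) as response:
    account = json.load(response)           # e.g. {"balance": 1234.56, ...}
    print(account["balance"])               # hypothetical field name
```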

And because this gateway software would actually run on the mainframe, there would be no need to access mainframe applications through a Linux or Windows server first. There would also be no need for any new hardware, support, or repackaging. As far as the mainframe team is concerned, they would simply install a piece of software on their mainframe – it could run either inside a z/OS-based server address space or in a standalone address space – and everyone could directly access any application functions they were authorized to use.

I guess everyone has their own favourite way of connecting remotely to a mainframe. ‘Green screens’ allow mainframe experts to work quickly and efficiently, so terminal emulation is very common – and has been for a long time. The important thing is to be able to get the work done remotely. However, there must be a better, more modern way of doing things, don’t you think?