Sunday 4 May 2014

Keeping down costs and getting ahead of the competition

We all know that everyone should be using mainframes for all their computing needs – and we also know that most people have a laptop for the smaller jobs, on which they use Microsoft Office. A recently published study has some interesting results showing how those users could save money.

The company is called SoftWatch (www.softwatch.com), and their benchmark study looked at options for moving from on-premises applications to cloud-based solutions, comparing the cost of using MS Office with that of using Google Apps. The study covered over 150,000 users, and found that, on average, an employee spends only 48 minutes a day in MS Office applications, most of it in Outlook for e-mail. It also revealed high numbers of inactive users in the organizations surveyed, and found that half of the employees did not use PowerPoint at all. Most users of the other applications used them primarily for viewing and light editing, with only a small proportion of heavy users: 2 percent for PowerPoint, 9 percent for Word, and 19 percent for Excel. According to SoftWatch, by moving light users from MS Office to Google Apps, organizations could save up to 90 percent on their Microsoft licensing fees.

By using SoftWatch’s software, companies can measure exactly how much their employees use MS Office applications. The data is displayed on a dashboard, making it easy to determine how many Microsoft licences an enterprise really needs, how to transition effectively to Google Apps, and how much money can be saved.
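To give a flavour of what that sort of measurement involves – and this is just a rough sketch of the general idea, not SoftWatch’s actual product; the log format, column names and file name are all my own assumptions – here is how per-application usage could be tallied from a simple activity log in Python:

    # Rough sketch only: assumes a CSV activity log with columns
    # user, application, minutes_active (all hypothetical names).
    import csv
    from collections import defaultdict

    def summarise_usage(log_path):
        """Return average minutes of use per user for each application."""
        minutes_per_app = defaultdict(float)
        users_per_app = defaultdict(set)
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                minutes_per_app[row["application"]] += float(row["minutes_active"])
                users_per_app[row["application"]].add(row["user"])
        return {app: minutes_per_app[app] / len(users_per_app[app])
                for app in minutes_per_app}

    for app, avg in summarise_usage("office_usage.csv").items():
        print(f"{app}: {avg:.1f} minutes per user")

From figures like these it is easy to see who the heavy users are and who could be moved to a cheaper alternative.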

Also this week, Adaptive Computing (www.adaptivecomputing.com) was thinking about Big Data, and more specifically about data paring. Their argument is that data is growing exponentially, and that storing and processing all of it would require budgets that simply aren’t realistic. While the total amount of data grows exponentially, the amount of interesting data doesn’t, so somehow the ‘noise’ needs to be ignored.

They suggest that paring and sifting algorithms are the way forward, and that they will only grow in significance over time. They add that data capture will always be fundamentally faster and easier than data analysis, and that data will continue to multiply exponentially. Not wasting time on irrelevant data will be one of the keys to staying ahead of the competition.
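As a trivial illustration of what ‘paring’ means in practice – my own toy example, not anything from Adaptive Computing; the ‘signal’ field and the threshold are assumptions – a filter like this can throw away uninteresting records before they ever reach the expensive analysis stage:

    def pare(records, threshold=0.5):
        """Yield only records whose 'signal' value clears the threshold."""
        for record in records:
            if record.get("signal", 0.0) >= threshold:
                yield record

    readings = [{"id": i, "signal": i / 10} for i in range(10)]
    kept = list(pare(readings))
    print(f"kept {len(kept)} of {len(readings)} records")   # kept 5 of 10 records

The point is that the cheap sifting step runs over everything, while the expensive analysis only ever sees the half that matters.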

They conclude that the scientific community has been working out how to remove irrelevant data for a long time – so long that the term ‘outlier’ is now mainstream. As Big Data moves to the forefront, organizations that can adapt these techniques to ignore outliers and draw intelligent conclusions from the more highly correlated data that remains are going to lead the way.
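For the curious, the kind of outlier handling they are alluding to can be as simple as the classic interquartile-range rule. Again, this is a hedged sketch of a standard statistical technique, not their specific method, and the sample numbers are made up:

    # Drop values outside [Q1 - k*IQR, Q3 + k*IQR], the textbook IQR rule.
    import statistics

    def drop_outliers(values, k=1.5):
        q1, _, q3 = statistics.quantiles(values, n=4)
        iqr = q3 - q1
        low, high = q1 - k * iqr, q3 + k * iqr
        return [v for v in values if low <= v <= high]

    data = [9.8, 10.1, 9.9, 10.0, 10.2, 97.0]   # 97.0 is the obvious outlier
    print(drop_outliers(data))                   # [9.8, 10.1, 9.9, 10.0, 10.2]

Simple as it is, filtering like this before any heavyweight analysis is exactly the kind of paring that keeps Big Data budgets sane.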

With the growth in the Internet of Things (IoT) and the ability to store more and more data, I can only agree with them that sorting the wheat from the chaff is going to be what separates successful organizations from the rest.

So there you are. I’ve given you some suggestions for cutting IT costs and keeping your organization ahead of the competition. Surely this must result in some money coming the way of the mainframe team for their IT needs – as your company looks to the future!
