Sunday, 29 April 2012

Extending the range

I last looked at extending wireless networks in “Wireless working” in October 2006 and “Extending a small network” in January 2007. So it’s about time I had a look at what’s currently available. Here at iTech-Ed Ltd, we got our hands on the Netgear Universal Wifi Range Extender – WN3000RP.

We’ve been using wifi to connect various laptops to the printer and to the router for a few years now and it has worked successfully. We also have high-speed broadband and everything is working well. But nowadays we also have smartphones and tablets that want to be on the wifi, and we could have a whole range of games boxes and other devices that would be naturally located at the limit of where our wifi reaches – or perhaps even just beyond. The answer to that problem is a device like Netgear’s range extender.

It looks like a big plug that would go into any socket, but it also has two small antennae that stick up on either side. Having said that, when it’s plugged into the socket, it can be quite unobtrusive – depending on where the socket is placed. There’s a picture of the device here.

Setting up the device is fairly straightforward. We found a socket on the wall close to the fringe of the current router’s wireless coverage, plugged the Extender in, and turned on the power using the button on the side of the device. The instructions say we could have set up the device using WPS (Wi-Fi Protected Setup), but we didn’t have that facility. Instead, we used the wireless connection manager facility on a laptop to ‘see’ the NETGEAR_EXT wireless network – the SSID (Service Set Identifier) the plug was broadcasting.

The next stage was to launch Firefox (in our case, but any browser would do) and try to connect to a Web page. This automatically redirected the browser to www.mywifiext.net, the set-up page for the device. We simply ran through the required set-up and saved our answers, which caused the device to reboot.

Finally, we went back to the wireless connection manager and looked for wireless networks. From where we were located, we could just see the original network from the router, and we could see the new one from the Range Extender. This second one now had the same name as our original router’s with “_EXT” on the end of it. We made sure that the laptop would connect to this new wireless device automatically and treated it as a ‘home’ (ie safe) network.
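That “_EXT” naming convention makes it easy to check, by eye or programmatically, that the Extender is up and configured. Here’s a small illustrative Python sketch (our own hypothetical helper, not anything Netgear supplies, and the ‘iTechEd’ SSID is made up) that picks the Extender’s network out of a scan list:

```python
def find_extender_ssid(router_ssid, visible_ssids):
    """Return the extender's network name if it is in range.

    After setup, the WN3000RP rebroadcasts the router's SSID with
    "_EXT" appended; before setup it announces itself as NETGEAR_EXT.
    """
    configured = router_ssid + "_EXT"
    if configured in visible_ssids:
        return configured
    if "NETGEAR_EXT" in visible_ssids:
        return "NETGEAR_EXT"  # still in its out-of-the-box state
    return None  # extender not visible from here

# Example: what our laptop's scan might look like after setup
scan = ["iTechEd", "iTechEd_EXT", "NextDoorWifi"]
print(find_extender_ssid("iTechEd", scan))  # iTechEd_EXT
```

Feed it the SSID list from whatever wireless connection manager you use, and it tells you whether you’re seeing the configured Extender, a factory-fresh one, or neither.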

There are lights on the front of the Extender that indicate when it’s powered on; when there’s a connection between it and the router; and when there’s a connection to a PC. It also comes with an Ethernet port, if you need it.

It really didn’t take very long, and the instructions were very clear about what to do. Our first test was to move the laptop we’d set everything up on into distant corners of the building. It could still see the new network (whereas before, it couldn’t see the old one). We then tried a smartphone, which easily found the new network. And then we tried an iPad. It, again, could now get a signal in the furthest corners of two different rooms. But having a signal is one thing; having a decent speed is something else again. We used our standard network test – which is to connect to the BBC iPlayer and see whether we can watch a TV programme without it needing to buffer. We found a recent edition of Horizon and tucked ourselves away in a corner. The good news is that the reception was great. We watched about five minutes’ worth and there was no buffering – and we assumed that if we watched longer it would be the same. Now, there are all sorts of factors that can affect speed, and I’m not saying that HD TV or streaming a Blu-ray disc would work as well. But for our kind of usage, it passed the test with flying colours.

I liked the speed and ease of the installation. There was no CD required, and I didn’t have to connect to the device with an Ethernet cable – I simply used the wireless connection manager. If anyone wanted to use this at home to send a signal to distant rooms, I have no doubt that most people would be able to follow the instructions with ease. And there are plenty of smaller offices with thick walls, or long distances between the router and their furthest reaches, that could benefit from such a device.

If you’re looking for a range extender, then the Netgear Universal Wifi Range Extender WN3000RP is a good place to start.

Sunday, 22 April 2012

Cloud wars

There was a time when mainframers simply claimed that they’d been using the cloud computing paradigm all along. Then IBM came up with SmartCloud, its vision for cloud computing, offering SmartCloud services and SmartCloud solutions. Last week saw the announcement of the PureSystems private cloud packages. But for non-mainframe users or non-IBM technology users, the clouds are getting darker!

Now, I’ve been using a Pogoplug as a cloud resource for a couple of years now. Next to my router, I have a small Pogoplug box. And attached to that box are a number of big memory sticks with files that I want available wherever I am. I can upload and download the files to any computer and I have the app on my phone and tablet – it’s all very convenient.

But for people who don’t have a Pogoplug, there’s been Microsoft’s SkyDrive, which originally offered online storage space to people with an MSN account. It’s handy to keep a copy of your files in the cloud and then access them when you go somewhere else. It also allows file sharing. But you may get irritated by the need to download Silverlight.

And that’s the area where Dropbox is such a success. It allows users to share folders with other people, and those folders can contain very large documents. Dropbox, in many ways, became the de facto standard – everyone had heard of Dropbox. And now, BitTorrent has launched Share, which is its equivalent – and is perhaps aimed at younger people who might be more familiar with using BitTorrent offerings.

The problem with Dropbox and the other similar providers is that any self-respecting IT department is going to have to stop its staff using them. Now sharing a photo of my new grandson is one thing, sharing a business-sensitive policy document on the cloud is something else. Who knows what organizations at some time in the future may have access to those cloud-based files? Internal IT security can no more allow staff to use cloud solutions for sharing files than it can allow staff to carry around unencrypted memory sticks.

There are some sites where you can upload your files anonymously, although you get a slow connection. You can open a free account, which gets you a slightly faster connection, or there’s the paid-for service with the fastest connection. Once a file is uploaded, you get a URL that you can share, and then people can download your file. Examples include FileDropper (up to 5GB), FileFactory (up to 50 files of 2GB each), FileServe (up to 500GB with an account), and HotFile (you pay when your files are downloaded).

Some organizations have limits on the size of files that can be sent or received as attachments. That’s why some of these other Web cloud hosts have sprung up. Providers like: Bigupload (up to 50MB), Box.net (1GB of storage, but the maximum file size is 25MB), Crocko (up to 1GB), DriveHQ (1GB), Humyo (10GB), Kontainer file storage (50MB), MediaFire (up to 100MB), MegaSWF (Flash SWF files up to 10MB), RapidShare (up to 200MB), ShareBase (up to 200MB), Sigmirror (5GB of free storage), TzFiles (2GB), and uploaded.to (250MB file size maximum).
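With so many differing caps, the practical question is which hosts will actually take a given file. This hypothetical Python helper encodes a subset of the figures quoted above (noting that the services themselves mix per-file caps with total-storage quotas, so treat the table as illustrative):

```python
# Illustrative caps in MB, taken loosely from the list above
LIMITS_MB = {
    "Bigupload": 50, "Box.net": 25, "Crocko": 1024, "DriveHQ": 1024,
    "Humyo": 10240, "Kontainer": 50, "MediaFire": 100, "MegaSWF": 10,
    "RapidShare": 200, "ShareBase": 200, "TzFiles": 2048, "uploaded.to": 250,
}

def hosts_for(file_mb):
    """Return the hosts whose cap is big enough for a file of file_mb megabytes."""
    return sorted(host for host, limit in LIMITS_MB.items() if limit >= file_mb)

# A 150MB file rules out the smaller hosts straight away
print(hosts_for(150))
```

So for, say, a 150MB video, Box.net’s 25MB per-file cap is no use, while Crocko or RapidShare would take it.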

And there are many other hosting sites that I haven’t mentioned. And many that charge varying amounts for storage.

Up to now, Google has offered Google Docs as a cloud solution. You could create your Word files and, assuming you had the Google Docs plug-in, store a copy in the cloud. Or you could just create documents using Google Docs and share them.

But now, it appears that Google is going to launch Google Drive – 5GB of free storage available to all users. Users will get a Google Drive icon on their desktop and use it as a virtual drive. And 5GB is quite a lot of storage – although arch-rival Amazon’s Cloud Drive offers exactly the same amount free.

Bitcasa is offering free storage while it’s still in beta, and paid-for unlimited storage afterwards.

With the growth in cloud storage, more IT departments will have to get involved in making sure that there is some way that business-related data can be kept secure in the cloud. And with Google joining the growing cloud space race, end users are going to expect cloud storage to be available to them.

Sunday, 15 April 2012

Storage and expertise – the PureSystems box

On Wednesday 11 April, IBM introduced to the world the PureSystems family of data centre infrastructure products. The idea behind them is that they will simplify the management, automation, and running of enterprise applications on a range of virtualization technologies.

This new line of integrated systems has the ability to automatically handle everyday tasks such as configuration and updates, which reduces the amount of time needed to get applications up and running and also reduces the management overhead. Therefore, IT staff are freed up to get on with other work. Users gain the expertise of 125 independent software vendors (such as VMware, SugarCRM, Infor, and Juniper Networks) who developed what IBM calls “patterns of expertise” that automate many common IT and industry tasks such as deployment, configuration, and upgrading of applications onto the appliances. “For example, a customer relationship management program that used to take three days to deploy can now be deployed in under one hour”, claims IBM. In addition, companies can scale their operations very quickly, allowing them to go from a small number of computer systems in one site to service on the cloud, with systems that can be accessed around the world.

IBM proudly described its PureSystems family as one of the most significant announcements of the last 20 years, and said it is the result of a $2bn research and development spend over many years.

The announcement might be viewed as a way for IBM to match competitors Oracle, HP, and Cisco Systems, who have all been promoting converged infrastructure – integrating server, storage, networking, and other technologies into a single managed architecture. As a sweetener, IBM says it will buy back servers, ie those sourced from HP and Oracle, from clients who migrate to PureSystems.

As mentioned above, each PureSystems package combines servers, storage, networking, and virtualization technologies into a single appliance, with additional services from IBM. The PureSystems initially come in two versions – PureFlex System (which is a basic infrastructure platform for self-service private clouds), and PureApplication System (which includes IBM’s WebSphere middleware and DB2 database and can be used for Web and database applications). The systems support the Hyper-V, KVM, PowerVM, and ESX hypervisors from Microsoft, Red Hat, IBM, and VMware, respectively, and are based on either Intel or IBM Power processors. Storage is provided by IBM’s Storwize V7000 appliances, and networking can be a choice of Brocade, Cisco, or Juniper kit.

IBM has included a cloud self-service and provisioning interface in the PureSystems, based on the same technology used in IBM’s public SmartCloud services, giving customers a ready-to-go cloud computing system in a box, they said.

Customers are able to define the PureFlex configuration, while the PureApplication System is available in four configurations ranging from 96 CPU cores and 1.5TB of memory, up to 608 cores and 9.7TB of memory.

IT departments will be pleased to know that IBM is offering a single support hotline for the entire system, whether an issue is with the hardware or software, while there is also just a single procurement process for the entire system.

“You can order just one box with one pin number, and it has on it all you need to get an out-of-box experience straight away with the software, the middleware, the hardware, storage, the network fabric”, claimed Graham Spittle, chief technology officer for IBM in Europe.

There is also just a single management console, according to IBM, although they can also integrate with IBM’s Tivoli platform for customers who’ve standardized on that for management.

Pricing for the PureSystem family starts at about $100,000.

Sunday, 1 April 2012

Cloud is the answer!

So if cloud is the answer, what exactly is the question? That’s what a new study (which will be published in April) by Forrester Consulting on behalf of BMC has tried to find out. Now, obviously, I’ve been mixing with the wrong kind of people, but I thought the days when someone in IT said they worked in IT rather than naming the company that paid their salary had gone. I can remember when smart programmers could move from one well-paid job to the next and never worry about the name over the door as they went to work. They just knew about IT and their services were very much in demand. I thought those days were well behind us. I thought the CIO and CFO and everyone else on the board worked together to move an organization forward. But apparently that’s not the case.

It seems that there’s still very much an ‘us and them’ culture, where you’re either part of IT or not. There are plenty of companies out there struggling to get their IT aligned with business needs and opportunities. Suddenly I feel like I’m back in the early 1990s. But that’s kind of what this survey has found.

The report suggests that cloud computing is exacerbating the divide between the business and IT, suggesting: “CIOs are concerned that business leaders see cloud computing as a way to circumvent IT”. We’ve talked about cloud and Bring Your Own Devices in previous blogs, but it seems that some (many?) members of staff view IT’s need for security as a need for control and one that they’re fighting against rather than working with.

The truth is there are many sites using Microsoft products with only a few ITers who are very stretched in order to satisfy the requirements of a workforce that ranges in ability from the fairly IT-literate to barely able to use a mouse (the type of people who demand a keyboard for their iPad!). For those IT staff, saying ‘no’ to an idea can often be the easiest option in order to save time to do everything else that is required of them in a day.

So that’s what leads to results highlighted in this survey of “enterprise infrastructure executives and architects”, which “reveals increasing tension between business and IT stakeholders”. Rather than the cloud being an opportunity for IT to expand and take advantage of this new paradigm, the report spells out that “high expectations for speedy, low-cost implementation of new software systems in the cloud are putting unique pressures on IT departments within the enterprise”.

The bald facts are that nearly three quarters of high-level executives see the public cloud as a way of getting around IT, despite most acknowledging that it doesn’t provide adequate security controls – according to the report. As we all know only too well, increasing pressure is being placed on IT departments to reduce costs and complexity, yet at the same time implement fast, low-cost software systems onto the cloud.

Eighty-one percent of the 327 respondents see a cloud strategy as a high priority. Seemingly, 58 percent of respondents are running mission-critical workloads on the public cloud without a security policy – 36 percent claimed they had a security policy (and six percent seemingly didn’t answer that question!).

Quite sensibly, most respondents thought their IT departments should be responsible for ensuring public clouds met their company’s requirements for performance, security, and availability. Just over a third (37 percent) looked to hybrid clouds as the future. Nearly two-thirds (61 percent) were realistic enough to believe that it would be difficult to provide the same level of management across both public and private cloud services.

I guess, in many ways, IT liked the dark ages when computing was very much a black-box technology. End users asked for something and, in the fullness of time, they got it. Nowadays, there are plenty of people who can read the IT news on Google or wherever and see what can be done – and they’re wondering why their company isn’t doing it. For mainframe sites, the problem might be the skills shortage caused by too few staff who really understand the technology. For sites running other platforms, it’s so often the paucity of staff they have running their operations. It’s not that those guys aren’t keen to embrace the technology, they just haven’t got the time. But the CIO needs to look out because organizations could outsource their IT to the cloud and there would be no need for in-house expertise – no need for an IT department as such, just a few application champions spread around the departments. Now, I’m not suggesting IT professionals will be without jobs – those cloud providers are going to need staff. But life could very well be different. And we don’t want the future of IT to be driven by non-ITers, do we?

Monday, 26 March 2012

IBM takes the QI approach!

The BBC in the UK produces a wonderful programme called QI – IQ reversed, but officially standing for Quite Interesting. The point of the show, which is a humorous quiz show, is to not only illustrate how little we know, but also highlight the things we think we know that are actually wrong! So, for example, the teams might be asked to name an animal that buries its head in the sand. A contestant saying ostrich will have points deducted because no-one has ever seen an ostrich bury its head in the sand. And yet the myth dates back to Pliny the Elder. You get the idea of the show.

In a presentation at the SHARE user conference last Wednesday entitled “Hex, Lies and Videoblogs”, IBM’s chief architect for cloud computing, Frank DeGilio, set about (in the style of QI) debunking some of those mainframe myths that we all come up against time after time.

For many years, the much-missed Xephon organization published the Dinosaur Myth, which crunched the figures for small, medium, and mainframe installations, looking at hardware, software, maintenance, and running costs. And every time it did the sums, it found that mainframes cost less overall. Similarly, DeGilio pointed out that most people ignored any figures apart from the basic hardware and software ones. He argued that, particularly for large-scale infrastructures, management complexity and personnel costs are often critically important parts of a system's final price tag. As Xephon’s publication identified, an expanding infrastructure requires more people if it’s distributed than if it is a mainframe.
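The Dinosaur Myth arithmetic is easy to sketch. The Python below uses purely illustrative figures (not Xephon’s or IBM’s) to show why personnel costs come to dominate as a distributed estate grows, even when the distributed hardware and software look cheaper on paper:

```python
def total_cost(hardware, software, boxes, admins_per_box, admin_salary, years=5):
    """Rough multi-year cost: capital outlay plus the people needed to run it."""
    staff = boxes * admins_per_box
    return hardware + software + staff * admin_salary * years

# Illustrative numbers only: a mainframe is pricier up front, but a sprawl
# of distributed servers needs far more people to keep it running.
mainframe = total_cost(hardware=2_000_000, software=1_000_000,
                       boxes=1, admins_per_box=6, admin_salary=60_000)
distributed = total_cost(hardware=800_000, software=600_000,
                         boxes=200, admins_per_box=0.1, admin_salary=60_000)
print(mainframe, distributed)  # 4800000 7400000.0
```

With these (made-up) inputs, the distributed estate’s 20 administrators cost $6m over five years, swamping its lower capital outlay – exactly the pattern Xephon’s figures kept finding.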

There’s also the general ignorance that mainframes are your dad’s technology, and if the code isn’t written in Latin or Ancient Greek, then it’s the digital equivalent. While it’s perfectly true that mainframes run a lot of COBOL and Assembler programs, they have embraced modern trends as they have occurred over the past 50 years. This means, as Frank DeGilio pointed out, that J2EE, Linux and other modern open standards are all widely supported. Perhaps more importantly, Frank asserted that there's nothing outdated about the way mainframes handle workload management. In fact, their ability to fine-tune resource allocation based on application need is far more granular and sophisticated than that of most distributed systems.

Old iron tends to break rather than bend. But that’s not the case with mainframes, which are highly flexible and well able to balance workloads. As DeGilio says, “the very concept of capacity upgrade on demand was ‘pioneered’ by the mainframe”.

It doesn’t matter whether a computer doing nothing is a Raspberry Pi or the fastest supercomputer in the world – it’s still doing nothing! What really counts is how the box handles real-life mixed workloads. Various figures are produced suggesting that specific hardware and software combinations can set specific benchmarking records, but what’s needed is something that can handle everyday workloads as speedily as possible. DeGilio argued that the mainframe’s flexibility means that its speed in handling multiple real-world tasks is greater than what might be indicated by testing a box to perform a single activity.

Like Stephen Fry, who chairs QI, it’s really the job of mainframe professionals to go out there and debunk these myths. There’s a clear need to highlight incorrect thinking and identify ways that the mainframe could be a better answer to IT problems. QI isn’t dull – so you don’t have to be the most boring person in the world just because you want people to change their thinking and see the bigger picture!

And finally (from the QI stock of knowledge), where do Panama hats come from?
The answer is Ecuador!

BTW: if you get a chance, can you ‘Like’ iTech-Ed Ltd on Facebook? You can find it here. Thanks.

Sunday, 18 March 2012

Trevor Eddolls - IBM Champion 2012

The e-mail arrived this past week confirming that I’d been recognized as an IBM Champion 2012. I’ve been an IBM Champion since 2009, although the name has changed from IBM Data Champion, through IBM Information Champion, to just IBM Champion over the years.

But what does it mean? According to IBM: “An IBM Champion is someone who makes exceptional contributions to the technical community. Contributions can come in a variety of forms, and popular contributions include blogging, speaking at conferences or events, moderating forums, leading user groups, and authoring books or magazines. Educators can also become IBM Champions; for example, academic faculty may become IBM Champions by including IBM products and technologies in course curricula and encouraging students to build skills and expertise in these areas.

“An IBM Champion is not an IBMer, and can live in any country. IBM Champions share their accomplishments and activities in their public profiles on IBM developerWorks, making it easy for the IT professional community to learn more about them and their contributions, and engage with them.”

So why am I an IBM Champion? Well, I don't work for IBM, but I do write about mainframe hardware and software. I blog at mainframeupdate.blogspot.com and it.toolbox.com/blogs/mainframe-world. I also blog once a month on the Destination z Web site (www.destinationz.org). I’m Editorial Director for the well-respected Arcati Mainframe Yearbook (www.arcati.com/newyearbook12). I’ve also written technical articles that have been published in a variety of journals including z/Journal (www.mainframezone.com/it-management/the-z114-delivering-game-changing-opportunities). And I chair the Virtual IMS user group (www.fundi.com/virtualims) and the Virtual CICS user group (www.fundi.com/virtualcics). I also look after their social networking – you can find information about the groups on Twitter, Facebook, and LinkedIn.

IBM Champions receive the title for one year, during which they can enjoy the benefits associated with the program – rather than any direct payment from IBM. Existing Champions are eligible to renew their status for the following year, as long as they can demonstrate that they have made significant contributions to the community over the past 12 months.

Are IBM Champions compensated for their role? Sadly (from my point of view) the answer is no. Do IBM Champions have any obligations to IBM? Again the answer is no – IBM Champions have no obligations to IBM; the title simply recognizes their contributions to the community over the previous 12 months. Do IBM Champions have any formal relationship with IBM? No. IBM Champions don’t formally represent IBM, nor do they speak on behalf of IBM.

The e-mail did say that as a 2012 Champion I will receive ‘IBM Champion merchandise’ including a shirt, travel umbrella, messenger bag, framed certificate, lanyard, leather luggage tag, assorted paper products, and pin. So, that’ll be nice.

There may not be a financial benefit to being an IBM Champion, but I think it’s a nice way for IBM to recognize people around the world who are helping to promote IBM products and help share information about the products amongst their users.

You can see my profile at http://tinyurl.com/IBMchampion.

On a completely different note... If you get a chance, can you ‘Like’ iTech-Ed Ltd on Facebook? You can find it here. Thanks.

Sunday, 11 March 2012

Operating systems on a stick

You’re probably familiar with IBM’s z Personal Development Tool Adapter, which allows users to develop mainframe software without a mainframe. In effect, users plug a very expensive memory stick into their PC and it acts like a mainframe.

But now, IBM has extended the idea by allowing users with the appropriate memory stick to load a cloud-hosted Windows or Linux operating system onto their PC – although they will need a Windows or Linux computer with a 64-bit processor. It’s called the Secure Enterprise Desktop (SED) and comes packaged as an extension to IBM’s Smart Business Desktop Cloud service.

The memory stick plugs into a USB port (as you’d expect) and comes with its own HTTPS stack, bootloader, and the necessary proprietary code to create a secure VPN channel connection between a partitioned drive on the user’s PC and a remotely-located server.

That’s nice, you say, but what’s the point? Well, it’s another way of allowing BYOD (Bring Your Own Device). This is an issue that I blogged about a little while ago, and one that is beginning to raise its head at many sites. Users like the devices they’ve bought themselves and are familiar with, rather than the products IT allocates them. And they want to use those devices to access their work-based data and applications.

Running the bootloader from the memory stick protects the business from the problem of home machines being riddled with viruses and trojans. The PC establishes a connection to the server, then there’s two-way authentication to ensure you’re who you say you are and the server is really the right one for your company (and not anyone else’s). Once this connection is established, the user downloads a small (kernel-based virtual machine) hypervisor, which allows the user to choose a Linux or Windows operating system. Any changes the user makes to data are written in an AES-256 encrypted format to a portion of the local hard drive, with the key retained on the stick, and these changes are replicated back to the cloud-hosted operating system.
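The interesting design point is the separation of key and ciphertext: the key lives on the stick, the encrypted data on the hard drive, so neither is any use without the other. This toy Python sketch illustrates the principle with a simple hash-based XOR keystream – emphatically not AES-256, and not how IBM’s code works, just the shape of the idea:

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Toy keystream derived from the key (a stand-in for AES-256;
    do not use this for real security)."""
    for block in count():
        yield from hashlib.sha256(key + block.to_bytes(8, "big")).digest()

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; the same call encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

stick_key = b"key-held-on-the-usb-stick"   # never written to the hard drive
plaintext = b"changes made during the session"
on_disk = xor_crypt(stick_key, plaintext)  # what lands on the local drive

assert on_disk != plaintext                         # unreadable without the stick
assert xor_crypt(stick_key, on_disk) == plaintext   # re-insert the stick: recovered
```

Pull the stick and all an attacker has is the ciphertext on the partitioned drive; re-insert it and the same key regenerates the keystream and the data comes back.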

The device offers a range of authentication options, including a built-in card reader as well as a PIN.

If the memory stick gets removed, the operating system instantly stops because the connection to the remote server has been severed. Re-inserting the stick allows re-authentication to occur and the user can carry on as before.

Users have the option to download the host operating system from the cloud, so they can continue to work without an Internet connection – if that’s what they require.

At the server end, a Linux server with Apache and OpenLDAP (an open-source implementation of the Lightweight Directory Access Protocol) is required.

It seems like a very useful innovation. What do you think?