Why it’s time to retire the ‘other people’s computers’ criticism of the cloud



“There is no such thing as the cloud, only other people’s computers.” I’ve hosted seminars with senior IT professionals for several years now and whenever the subject is ‘the cloud’, someone always delivers this line. I don’t know who said it first, but it’s become a cliché.

A Google search finds people writing, at least as far back as 2010, that there is no such thing as the cloud: there are several clouds, run by different companies and with different operating processes and risks. That makes sense and I’d guess that most IT professionals know that by now.

A few years later, we get the first appearance of the ‘other people’s computers’ line in stories about one company or another suffering a cloud service failure.

The line is unhelpful. Whatever wisdom it once offered is overshadowed by the impression it creates: that storing data on ‘other people’s computers’ is asking for trouble. That obviously depends on whose computers they are.

Consider this: there is no such thing as the haulage industry, only other people’s lorries. “Other people’s lorries” can crash or be stolen, and their contents can be lost, but if you run a widget business you might still hire a specialist to transport your widgets from factory to store. Companies have come to accept those risks – mitigated by contracts, regulation and so on – so they can focus on being widget experts instead of becoming transport experts too.

Try this one: there is no such thing as ‘the bank’, only other people’s safes. Do you keep your money in a vault at home or put it in the bank? I’m guessing the latter. But, to take just one thing that can go wrong, thieves could steal your identity and use it to take money from your account. Your bank might refund the money, but they might not, especially if they decide you were at fault. Again, there are risks and benefits.

You can argue that neither of those analogies perfectly matches the situation of cloud computing and that’s true. A lorry is not the same as a server. And protections around the banking industry have grown up over hundreds of years – something that the cloud sector cannot yet match.

Still, they are close enough analogies for the key points. First, businesses outsource complex tasks to specialists all the time. Not because nothing can go wrong – things can always go wrong with suppliers – but because doing so is typically more efficient and effective than handling the task themselves.

Second, people who secure things for a living – like data or money – have an incentive to be good at it. Just as your bank is probably more secure than your house, so a cloud supplier is likely to have better security than your business.

These are among the reasons why so many large corporations are happy to move compute to the cloud. Increasingly, businesses are moving entire portfolios out of internal data centres and into the cloud. At the same time, we are seeing the rise of specialist clouds for areas such as financial regulation and high performance computing (HPC).

If the ‘other people’s computers’ line has any value, then it’s as a reminder that putting your data in the cloud does not absolve you of responsibility for it. You still need to know what you are storing, where it is stored and so on. You must be aware of exceptional cases, such as whether your data could cross national borders in the event of a network failure.
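
To make that concrete, here is a minimal sketch of the kind of automated ‘where does our data live?’ check a business might run. It assumes AWS S3 via the boto3 library and a hypothetical EU-only residency policy – the allow-list and the policy are illustrative, not anything prescribed here; other clouds expose equivalent region APIs.

```python
# A minimal sketch of a data-residency audit: flag any S3 bucket stored
# outside the regions our (hypothetical) policy allows.
import boto3

ALLOWED_REGIONS = {"eu-west-1", "eu-west-2"}  # hypothetical policy: EU only


def audit_bucket_regions():
    """Print a warning for every bucket that sits outside the allowed regions."""
    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # get_bucket_location returns None for the legacy us-east-1 region
        location = s3.get_bucket_location(Bucket=name)["LocationConstraint"]
        region = location or "us-east-1"
        if region not in ALLOWED_REGIONS:
            print(f"WARNING: bucket {name!r} is in {region}, outside policy")


if __name__ == "__main__":
    audit_bucket_regions()
```

A scheduled check like this doesn’t remove the responsibility the line hints at, but it does make it routine rather than an afterthought.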

Likewise, some companies are caught up in cloud fever, believing that moving to the cloud is a corporate panacea. It isn’t, if it’s entered blindly. Cloud costs can mount up, and many companies are paying for capacity they don’t need and features they will never use because they have not scrutinised the cloud service as they would their own internal infrastructure. As mentioned above, there are many clouds, and it is important to choose the right one.
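
The scale of that waste is easy to estimate. Here is a back-of-the-envelope sketch using entirely hypothetical figures – substitute your own provider’s pricing and your own monitoring data:

```python
# Rough cost of paying for capacity you don't use. All numbers are
# hypothetical placeholders for illustration only.
PROVISIONED_VCPUS = 128       # what the business pays for
AVG_UTILISED_VCPUS = 40       # what monitoring says it actually uses
PRICE_PER_VCPU_HOUR = 0.05    # hypothetical on-demand rate (USD)
HOURS_PER_MONTH = 730

idle_vcpus = PROVISIONED_VCPUS - AVG_UTILISED_VCPUS
wasted_per_month = idle_vcpus * PRICE_PER_VCPU_HOUR * HOURS_PER_MONTH
print(f"Paying for {idle_vcpus} idle vCPUs ≈ ${wasted_per_month:,.0f}/month")
```

Even at these modest, made-up rates the idle capacity costs thousands of dollars a month – the sort of figure that only surfaces if someone actually does the arithmetic.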

Hopefully, this kind of understanding is becoming common knowledge now, particularly with the arrival of GDPR, which has focused a lot of corporate minds on data.

We’re in the early stages of regulation of cloud storage and data. Much of what we have seen so far has been focused on getting regulators and regulations to catch up with a tech world that has outpaced existing rules. A big driver has been consumer privacy – and in the wake of the furore around Facebook and Cambridge Analytica, the scrutiny of what data companies store, where they store it and who they share it with will not lessen.

There is value in making people aware of those issues but they shouldn’t deter a business from cloud storage. And that’s my frustration with ‘other people’s computers’: it carries an admonishing tone, one that implies irresponsibility.

It is not inconceivable that the regulations around storing data could soon become complex enough that doing it yourself would be irresponsible. The day will come when the company that stores its own data starts to look like the person who eschews the bank in favour of a pile of cash under the mattress. What’s important is not ‘other people’s computers’ but the right people’s computers.


Written by Shane Richmond (Guest)


Shane Richmond is a freelance technology writer and former Technology Editor of The Daily Telegraph. You can follow him at @shanerichmond

