HPC continues to drive Formula 1 success


Formula 1 seems to crop up quite often in my work. I’ve been given a tour of McLaren’s headquarters, discussed wheel-nut troubles with the CIO of Williams and even interviewed the people responsible for making the batteries that store brake energy in the cars.

Oh, and there was the time I wrote about the F1 pit crew that was training midwives to be more efficient.

I’m a technology writer, not a sports or motoring journalist, but F1 is perhaps the most tech-enabled sport in the world, so it comes up frequently. And even to someone like me, who has little interest in what happens on the track, the sport is fascinating off the circuit.

At McLaren I saw how the team receives as much as 3TB of data from the car’s previous performance - in practice or in a race - and sifts through it for potential improvements. Tweaks are proposed and run through computer simulations, with the most promising being 3D-printed and tested in the wind tunnel. Then new parts can be made and flown out to the circuit, ready for the team to use in the next race or practice session.

Most teams now use high-performance computing (HPC) to stay competitive. As Nick Dale wrote on this blog back in January, computational fluid dynamics (CFD) - a kind of digital wind tunnel - is quickly becoming an essential part of the process of making the car as aerodynamically efficient as possible. Data analytics is now vital to how the cars are built and driven, and the teams have invested in supercomputer capacity to help them.

But simulations can only get you so far. When I spoke to Williams CIO Graeme Hackland back in 2015, the team was having problems with its tyre changes, which were taking tenths of a second longer than they should have been. In a sport like F1, those tiny differences cost points. The team had traced the problem to the wheel-nuts, which were heating up during the race and thus taking longer to remove. However, in simulations, they didn’t heat up at all. Something that was happening in the real world wasn’t being reflected in the data.

I was reminded of that when I read that a “software glitch” this season had cost Lewis Hamilton a race win. In the Australian Grand Prix earlier this year, Sebastian Vettel took the lead from Hamilton after Hamilton went into the pits for new tyres. Vettel still needed to pit himself, which would typically allow Hamilton to regain first place.

However, Vettel managed to get into the pits while the ‘virtual safety car’ was out - a time during which drivers must slow down and cannot overtake. Hamilton’s software told him that he had time to regain the lead because Vettel would spend at least 15 seconds in the pits. But Vettel took 11 seconds and held his position. Again, the real world did not match the simulation.
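At its core, the call the strategy software had to make is a simple comparison: will a rival who pits now lose more time than his lead over you? Here is a minimal sketch of that decision rule. The function name and the gap figure are hypothetical; only the ~15-second prediction and ~11-second actual pit loss come from the account above, and real strategy software models tyre wear, traffic and safety-car phases in far more detail.

```python
def rival_stays_ahead(gap_s: float, pit_loss_s: float) -> bool:
    """Return True if a rival who pits now rejoins the track still in front.

    gap_s      -- the rival's lead over us on track, in seconds (hypothetical here)
    pit_loss_s -- total time the rival loses by pitting (pit lane plus stop)
    """
    return gap_s > pit_loss_s

# Suppose Vettel led by roughly 12 seconds (an illustrative figure).
# Mercedes' model assumed a pit stop would cost him at least ~15 seconds,
# so it predicted he would rejoin behind Hamilton:
print(rival_stays_ahead(gap_s=12.0, pit_loss_s=15.0))  # False: predicted to lose the lead

# Under the virtual safety car everyone on track slows down, so the
# relative cost of pitting shrinks -- closer to ~11 seconds in Melbourne:
print(rival_stays_ahead(gap_s=12.0, pit_loss_s=11.0))  # True: he keeps the lead
```

The bug, in effect, was in the second input: the model used the normal-racing pit loss when the virtual safety car had changed it.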

Toto Wolff, the boss of Hamilton’s Mercedes team, said: “It was down to a software bug or an algorithm that was simply wrong.” It seems extraordinary that a sporting event can be won or lost by an algorithm but that’s the reality of modern F1. What began as a test of driver skill and courage has morphed into a test of the ability to operate a computer at 200mph. Oh - and driver skill is still a prerequisite. It’s an astonishingly demanding job.

As Stacey Higginbotham wrote in Fortune magazine in 2015, “Most Formula 1 spectators expect the race to be won or lost in the hills and hairpins of a Grand Prix circuit. What few realise is that it’s playing out in powerful, interconnected computers around the world.”

Higginbotham added: “The sport’s use of such information is so sophisticated that some teams are exporting their knowledge to other industries where analysing enormous amounts of information in the blink of an eye can mean the difference between life and death.”

For McLaren that means working, through its spin-off McLaren Applied Technologies, with oil companies to optimise oil-rig performance, with Heathrow airport to maximise efficiency, and even with pharmaceutical companies to speed up drug trials.

It was once the case that F1’s innovations fed into road cars - we owe traction control and multi-function steering wheels, among other things, to racing teams. Nowadays, F1 is spreading its expertise everywhere. For a technology writer like me, that means I’m likely to find myself covering it more and more...

Written by Shane Richmond (Guest)


Shane Richmond is a freelance technology writer and former Technology Editor of The Daily Telegraph. You can follow him at @shanerichmond
