Inside Caterham HQ: the off-the-shelf tech that powers an F1 team

The Formula 1 circus rolls into Silverstone this weekend for the British Grand Prix. But what does it take to put a car on the track? We went behind the scenes at Caterham to find out

There’s something distinctly British about the Caterham Formula 1 team.

And it’s not the Eurovision-style ‘nul points’ they currently have in the world championship.

Racing's Silicon Valley

While McLaren operates from a Bond villain-esque base, all curved glass and glinting metal, Caterham’s more modest HQ is nestled among the racing-green fields of Oxfordshire. Despite investment from Tony Fernandes (the man bankrolling Queens Park Rangers football club), they’re more like the Norwich to McLaren’s money-no-object Man City. And that isn’t the only connection Caterham has to East Anglia.

“Norfolk’s a very nice place but in motor racing terms it may as well be in Europe,” explains John Iley, Caterham’s Performance Director, referring to the team’s former Hingham base: “As soon as you move into Oxfordshire you know you’re in motor racing’s Silicon Valley.” And it shows.

Raw power

On a tour of the factory, which is guarded by a pond full of monstrous Koi carp, Stuff saw how a partnership with Dell is helping the team to develop its cars on a strict budget, from the 1500-core supercomputer to the gaming PCs that power the huge, articulating simulator used by the drivers.


With restrictions on how much testing teams can carry out during a season, the 20-teraflop Dell HPC (High Performance Cluster) is invaluable. Made up of 186 servers and capable of ten billion calculations every 12 hours, it performs virtual wind tunnel testing 24/7, with that data used to decide which parts will be made for checking on a 50% scale model (pictured) worth £250,000. The model is made from steel, with frequently changed parts churned out using a 3D printer, scanned with a FARO Edge laser scanner and compared to the original CAD files to ensure they match.
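That scan-versus-CAD comparison boils down to checking how far each measured point sits from where the design says it should be. Here’s a toy sketch of that kind of tolerance check; the coordinates, tolerance value, and point-to-point matching are all invented for illustration (the real FARO/CAD workflow aligns full point clouds against surfaces, which is far more involved).

```python
# Toy sketch: compare laser-scanned points against nominal CAD coordinates
# and flag any deviation beyond a tolerance. All numbers are hypothetical.
import math

TOLERANCE_MM = 0.1  # invented acceptable deviation

# Nominal positions from the CAD file vs. measured positions from the scan
cad_points = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 5.0, 2.0)]
scan_points = [(0.02, -0.01, 0.0), (10.0, 0.0, 0.15), (10.01, 5.0, 2.0)]

def deviation(a, b):
    # Euclidean distance between a CAD point and its scanned counterpart
    return math.dist(a, b)

failures = [
    (i, deviation(c, s))
    for i, (c, s) in enumerate(zip(cad_points, scan_points))
    if deviation(c, s) > TOLERANCE_MM
]

for i, d in failures:
    print(f"point {i}: deviation {d:.3f}mm exceeds tolerance")
```

In this made-up data, only the second point (0.15mm out) would fail, so that part of the model would go back for reprinting.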


Birth of a supercar

Parts that pass the test and make it onto the car are invariably made from carbon fibre (around 80%), which is manufactured on-site in the autoclave room. Sheets are cut with the help of software from clothing brand American Apparel, before being layered, cured in the autoclave itself (essentially a large, pressurised oven) and tested for defects using ultrasound. Almost everything else is cut from titanium.

But it's not all wonkaflops and hypercomputers. Caterham's HPC might be made up of off-the-shelf kit but this blinking bank of lights and servers is hardly the kind of thing you'd find at a LAN party. For that you need to visit the simulator room.

And you thought your gaming rig was fast...

With on-track testing restricted to just three sessions before the first race in Australia, a simulator (pictured) is essential for mid-season development. Caterham’s might look like it’s on loan from NASA but it’s actually powered by a fleet of Alienware gaming PCs: three Area-51s, each with 3.4GHz Intel Core i7 processors, 12GB of RAM and dual 4GB GDDR5 AMD Radeon HD 6990 graphics cards. Each runs one of three HD projectors, which combine to create an ultra-wide display for the driver. Another overclocked machine, packed full of high-speed RAM, looks after the physics.

Plenty to spare


For a race weekend up to 35 tonnes of gear is shipped to each circuit, including 30,000 individual spare components. That’s enough for 2.5 cars, so it’d have to be a fairly catastrophic shunt to put them both out of action for the race.

Despite those big numbers, Caterham’s footprint at a Grand Prix is one of the smallest in the paddock. Trackside, the team uses a McLaren-developed software package called ATLAS to analyse the 25GB of data that gets fed back from the 150 sensors on each car per race weekend. With historic as well as real-time data to look at, that means 10-15 terabytes of storage – and only one IT support worker actually at the track to look after the 120 laptops and half-rack of servers required to churn through it. With that kind of responsibility, it’s probably more than just a case of turning it off and on again.
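Those figures put the numbers in perspective. A back-of-envelope calculation, using only the amounts quoted above (and assuming, purely for illustration, that data were spread evenly across sensors):

```python
# Rough arithmetic from the figures quoted in the article.
GB = 10**9
data_per_weekend = 25 * GB   # per car, per race weekend
sensors = 150

# Illustrative even split - real sensors won't log at the same rate
per_sensor_mb = data_per_weekend / sensors / 10**6
print(f"~{per_sensor_mb:.0f} MB per sensor per weekend")

# Midpoint of the 10-15TB storage figure quoted
storage = 12.5 * 10**12
weekends_of_history = storage / data_per_weekend
print(f"room for roughly {weekends_of_history:.0f} weekends of car data")
```

On those assumptions, each sensor accounts for around 167MB a weekend, and the storage could hold several hundred race weekends’ worth of history.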
