The Third Angle
Best Business Podcast (Gold), British Podcast Awards 2023
How do you build a fully electric motorcycle with no compromises on performance? How can we truly experience what the virtual world feels like? What does it take to design the first commercially available flying car? And how do you build a lightsaber? These are some of the questions this podcast answers as we share the moments where digital transforms physical, and meet the brilliant minds behind some of the most innovative products around the world - each powered by PTC technology.
The Third Angle
BAE Systems: Vision of the future for fighter pilots
When a fighter pilot is flying a high-speed jet, every second counts. In futuristic TV and movies we’ve got used to seeing pilots reading displays right in front of their eyes (think Iron Man and Top Gun). Well, real life is catching up with the revolutionary Striker II Digital Helmet-Mounted Display made by BAE Systems.
BAE Systems is one of the world’s leading aerospace innovators, but their history goes back a long way at their site in Rochester, England. Once a plant that built aircraft during WW2, it’s now the birthplace of augmented reality in aviation, shaping the future of both military and commercial aviation.
Nigel Kidd has worked on the groundbreaking technology that powers Striker II since its inception. Alongside him, Paul Harrison manages the flight simulation facility at Rochester, where they simulate various flight scenarios, from fast jets to commercial aircraft.
They explain how Striker II integrates advanced technologies to reduce pilot stress and enhance safety, including digital night vision that replaces bulky goggles and a potential 3D audio system for spatial sound recognition.
Find out more about BAE Systems and Striker II here.
Find out more about Creo here.
Your host is Paul Haimes from industrial software company PTC.
Episodes are released bi-weekly. Follow us on LinkedIn and Twitter for updates.
This is an 18Sixty production for PTC. Executive producer is Jacqui Cook. Sound design and editing by Ollie Guillou. Location recording by Hannah Dean. Music by Rowan Bishop.
Welcome to The Third Angle, where you find us navigating through the night sky like Iron Man.
I’m your host, Paul Haimes from industrial software company PTC. In this podcast, we share the moments where digital transforms physical, and meet the brilliant minds behind some of the most innovative products around the world, each powered by PTC technology.
Have you ever wondered what it would be like to be a fighter pilot like in the Top Gun films? In the past, when flying a high-speed jet, the pilot would have to look down at the dials on their dashboard to see critical readouts such as altitude and remaining fuel levels. More modern aircraft contain a display fixed in the cockpit in which symbology – i.e., lines and markers – is overlaid onto the windshield, allowing the pilot to see all the information they need without having to look down at the dashboard and away from piloting the plane. BAE Systems have taken this a step further by creating the Striker II, a helmet-mounted display that enables pilots to see symbology overlaid onto the real world in 360 degrees.
Similar to Iron Man’s helmet, the full-HD colour display gives the pilot instant information about their surroundings. For example, it might overlay a red symbol over an incoming aircraft to warn the pilot that it is hostile. It contains pioneering new features, including a night vision mode and a 3D audio system which uses spatially positioned sounds to communicate to the pilot where other aircraft are. We sent our producer Hannah Dean to the BAE Systems simulation facility at Rochester in the UK to meet Nigel Kidd. Nigel is the product director for helmet-mounted displays at BAE Systems and told Hannah how Striker II works to make pilots safer. She even got to try the Top Gun-style experience herself by wearing Striker II.
Oh wow, I can literally see when I move my head wearing this helmet I can point it down towards my feet and it’s as if I’m looking through my feet, and the aircraft, and looking down below me to the fields and houses and roads below.
I’m Nigel Kidd, I’m the product director for helmet-mounted displays here at BAE Systems in Rochester. I’ve been with the company for 26 years now, just over 26 years. I started as an apprentice who grew up locally and was interested in engineering, familiar with this site being a centre of excellence for high-end engineering, electronics engineering. I look after all of our head-mounted display product developments. We’re in the simulation facility here at Rochester. So this is an area where we have simulators where we can try out all of our equipment. We’ve got fast jets, helicopters and commercial aircraft here that pilots can get in to evaluate our systems.
I’m Paul Harrison. I manage the flight simulation facility here at Rochester. And my role is mainly to develop and maintain these simulators, but also to take our customers and visitors through to allow them to really experience those products’ capabilities that we produce here and what they do for the end pilot community. And we do everything from STEM work all the way through to pilot, government and industry.
(Nigel) Here’s some pictures of the airfield as it started. This is the airfield after it was built in 1930. There’s the first aircraft to land here, which came from Gravesend Airport. Early gyrocopter. Aircraft here, the RFUs for fighter training in 1938. The hangars that you can see today were actually being built in 1940, and here is an image of four-engine Stirling bombers being produced in that space. Ironically, this space here is where our new production facility is. Rather than producing full-size aircraft, it produces flight controls and avionics and helmets and hubs, mission computing, and all the things we make here in the same space. The airfield is as it is today, a general aviation training facility and a place where things like the air ambulance can come and go, and military aircraft and helicopters come and go for fuel. But yes, it’s got a rich history of aviation here from 1930 onwards.
(Paul) So people have seen the Top Gun films. They will have seen what the pilot looks through on his head-up display, fixed in front of the cockpit. What the pilot used to get was that sort of information – what we call symbology, which is basically lines and markers overlaid onto the real world. So as the pilot looks out into the real world, there is maybe a symbol over another aircraft, or there is speed and altitude information presented in front of his eyes, so the pilot doesn’t have to look down inside the cockpit. When you go to Striker II, going from that old analogue technology to modern HD technology – which has also gone from basically a black-and-white display to a full colour display – we’re able to provide far more information which is relevant to the pilot. The best way of describing that is probably the Marvel films, Iron Man: what Robert Downey Jr. is looking through when he’s looking through his helmet, seeing symbols overlaid on the real world giving all sorts of information in colour. He’s not looking at anything else; he’s looking at the real world and the information that display is giving him while he’s flying around at speed. That’s the kind of thing that Striker II does.
(Paul) It used to be a case of what you could display was limited by the technology of the display and the information that the aircraft was able to provide to the helmet for it to display. So it was quite limited. Now, we can display all sorts of stuff. We could fill up the display with all sorts of symbols and information. And what we need to remember is that the pilots who are using this are in a stressful situation and this is all about providing information to the pilot which makes their job easier. It needs to be intuitive so they don’t have to think about what they are doing; the information makes it straightforward. A simple example of that is the older display, as I say it was black and white – it was actually green and nothing. So all the images were in green. So if the pilot had a symbol which came up which was a symbol for a warning, it’d be green, so you’d have to look at it and think, “Okay, well, that’s a warning symbol, so I need to take action in this certain way.” In Striker II, we can just put up a red symbol, and they know that red means danger, and it grabs their attention, and then they do what they need to do against it.
(Nigel) Do you want to have a go at flying it and see it move dynamically?
(Hannah) I can give it a go, why not?
(Nigel) Have a seat. Now they’re split up a bit, you can see the blue one, which is a friend, a yellow one, which is currently unidentified, and a lot of red ones. There’s a couple of blues out there, but what it’s really doing for you is rather than having to work out what each shape means, the recognition of something being red instantly tells you it’s a potential threat. If it’s blue, it’s a friend; yellow, you need to work out what it is. But the key thing is where they’re positioned is physically where they are relative to you. But now you see it’s all the way up to there. So in real time, it’s giving you that situational awareness of where things are around you. So you don’t have to think about it. You don’t have to process it. You just know. And that’s what it’s doing. It’s reducing your workload and taking you straight to decision-making point. If it was night time outside, it would be like that. So now, out of the window, you can see lights from towns, but not a lot else. You’ve only got the symbology. But if we give you the night vision system, you can now see.
(Hannah) Yes, I’ve got mountains in front of me. I can see a lot more.
(Nigel) So you can see at night with it, and scan the night sky in the same way you’d scan the day sky.
(Paul) When you’re operating at night, the pilot needs to be able to have that imagery that the helmet can provide but also wants to be able to see the outside world. So previously, with Striker I and other systems, the pilot would have to clip on night vision goggles. And those night vision goggles are heavy – they’re half a kilo – they hang off the front of the helmet, and they unbalance the helmet quite significantly. These aircraft can regularly pull 9G, so that means that everything becomes nine times heavier – including your own head. So the neck strain that can come from that is immense. And generally, when pilots are flying with goggles, they limit the G that the aircraft can pull down to about 4.5G, which is not ideal for anyone. So what we’ve done with Striker II is included a digital night vision sensor. So those old bulky analogue goggles are thrown away; they’re not required anymore. And we’re able to provide that night vision image over the entire visor that the pilot is looking through at the flick of a switch.
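The numbers Paul quotes translate into a simple back-of-the-envelope calculation – apparent weight scales linearly with sustained G load. A minimal sketch (the half-kilo goggle mass and G figures are from the conversation; everything else is illustrative):

```python
# Apparent weight of helmet-mounted kit under sustained G load.
# The 0.5 kg goggle mass and the 9G / 4.5G figures are quoted in the
# episode; this is only an illustrative calculation, not BAE data.

def effective_weight_kg(mass_kg: float, g_load: float) -> float:
    """Apparent weight of a mass during a sustained g_load manoeuvre."""
    return mass_kg * g_load

goggles_kg = 0.5  # analogue night vision goggles clipped to the helmet

# At 9G the half-kilo goggles feel like 4.5 kg hanging off the helmet front.
at_9g = effective_weight_kg(goggles_kg, 9.0)

# At the goggle-limited 4.5G ceiling they still feel like 2.25 kg.
at_limit = effective_weight_kg(goggles_kg, 4.5)
```

The same scaling applies to the pilot's own head, which is why removing the goggles entirely (rather than lightening them) is the significant win.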
(Nigel) If I do this, you’ve probably got a sound in there now.
(Hannah) Yes, I’ve got like a constant beeping sound.
(Nigel) That’s tied to one of the aircraft. So it’s coming from a certain direction. If you move your head in that direction, when the sound is in front of you you’ll be looking at the aircraft it’s coming from because it’s spatialized where the source is.
(Paul) One of the things that is a growth option for Striker, and we can give a demonstration of it in our simulator here, is a thing called 3D audio, where we can spatially position sounds very accurately. It’s more than just stereo. So we can do stereo, obviously. But this is within a fair degree of accuracy. We can position sounds around the pilots such that if we give them a symbol on their display, we can make a sound come from exactly where that symbol is in the real world. So again, it’s about grabbing attention and making it intuitive, not having to think, “Okay, I’ve got a tone, where’s that tone coming from?” It’s immediately apparent and you can look straight at it.
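The episode doesn’t describe how Striker II actually renders spatial audio, but the core idea Paul outlines – making a cue appear to come from where a symbol is – can be sketched with constant-power stereo panning driven by the source’s azimuth relative to the pilot’s head. This is an illustrative assumption, not BAE’s implementation (real systems use far richer 3D techniques such as HRTFs):

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power left/right gains for a source at the given azimuth,
    where 0 is dead ahead, -90 is hard left and +90 is hard right.
    Purely illustrative: a real 3D audio system would also encode
    elevation and front/back cues."""
    # Map azimuth in [-90, +90] to a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    # cos/sin panning keeps total power constant as the source moves.
    return math.cos(theta), math.sin(theta)

# A threat off the pilot's right shoulder is louder in the right ear;
# as the pilot turns toward it, azimuth approaches 0 and the gains
# equalise, which is the "look straight at it" behaviour Paul describes.
left, right = pan_gains(60.0)
```

The intuition is that the ear-level difference shrinks to zero exactly when the head is pointed at the source, so the sound itself steers the pilot’s gaze.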
(Paul) This has historically been the preserve for what we class as high-end, fast jet aircraft. However, the utilisation of what we’re able to display now means that what we’re really doing is we’re able to provide information which can keep people safe, reduce stresses in stressful environments, reduce the chances of mishaps. So this type of technology moving from the high-end, fast jet fighter aircraft down into some of the other more utility aircraft, both military and non, we absolutely see that being the case. And the technology which we’re developing as part of these latest head-mounted displays, that has applications not just for aviation, but for keeping people safe in all sorts of environments: soldiers, ships, understanding where shipping lanes are and what other ships are out there, armoured vehicles, where you have to see through the armour and have situational awareness. There’s lots and lots of opportunity for this type of technology, this type of system, to be transformational in a number of arenas.
(Paul) The commercial world is full of discussions around augmented reality and extended reality and mixed reality. We’ve been doing that here on site at Rochester since before those terms were coined. We developed the very first head-up display here on-site at Rochester, in the 50s/60s for the Buccaneer aircraft. And that was augmented reality, for the very, very first time ever. Talking about the future, we’re always looking at what the next thing is. So we are already working on what comes next. So we’re looking at the Tempest programme, the GCAP aircraft, that will come in the 2030s. And what does the pilot of the future need to do? And what do they need from their display system? What technology and capability can we provide as a part of that? So I guess there are two aspects of it. One is, that we see the pilot of the future probably more as a mission commander than as a pure pilot. So AI systems will be helping them to fly the aircraft. What they will be doing is they will be controlling unmanned aerial vehicles around them, drones, potentially around them, in a very data-rich environment – again, which is even more information flowing all the time. So they’ll be a battlespace commander as opposed to just a pilot. So how they need to utilise that information is key. And secondly, we’ve already been looking at, for that type of future platform, things like a virtual cockpit environment. So instead of at the moment, and in our simulator, you can see there are what we call head-down displays, basically big television screens in the cockpit, as well as head-up displays that you look up through the front as well as what you can get on the helmet. Imagine that cockpit just as a blank environment, just black, nothing there at all, and the helmet is able to provide everything. So all of your map information, all of your speed gauges, everything provided virtually on the visor of your helmet. 
And instead of having to change boxes and be limited with a display in a certain area, you can virtually move displays around or have a different set of displays depending on what type of mission is being performed. So a virtual cockpit is definitely one of those key things for the future.
(Paul Haimes) That was Nigel Kidd and Paul Harrison from BAE Systems. Striker II is considered one of the world’s most advanced helmets. It not only protects the pilot’s head, but it also displays mission-critical data in colour on the pilot’s visor using augmented reality. Developing a helmet that provides protection and getting all the electronics into it is no mean feat. And as you can imagine, one of the most complex parts of the helmet is the carbon fibre shell. This is where PTC’s 3D CAD solution, Creo, comes into play. Time to meet our expert, Brian Thomson, who can tell us more.
(Paul) Brian, BAE Systems uses Creo’s style tool, which is the ISDX module. Can you give the listener an overview as to what the tool delivers, and how it might aid a complex product such as Striker II?
(Brian) Sure, I would love to. ISDX is PTC’s top-tier surfacing module. It gives design engineers – and surfacing experts, frankly – very, very precise control over the flow and management of surfaces, starting with curves and building all the way out into controlling all the intricate connections between surfaces so that they can achieve a very, very nice aesthetic effect while also – and this is really the important part for a company like BAE – creating those surfaces parametrically. The real power here actually is not just in being able to create what we would call Class A surfaces – super high-quality surfaces intended for the exterior of any product. The real power is that you have a highly technical application here in the development of this helmet, and what’s underneath has to be designed also with incredible precision and with safety in mind for the pilots. And the design techniques for what’s inside the helmet might be very, very different, might use very, very different tools. Sure, ISDX has these great tools for advanced surfacing, but the fact that it’s integrated so deeply into Creo, and provides real parametric surfaces as a result of the design techniques used in ISDX – surfaces that can be linked and associatively developed alongside other types of geometry built for designing the other parts of the helmet – that’s where the real power comes in. I’m sure BAE has really leveraged the connective tissue between the internal structures and the beautiful external surfaces of this helmet. So it’s a really great application, and we love to see it.
(Paul Haimes) As someone who, many, many years ago, worked on the styling of a face mask for one of the military customers that we have here in the UK, I know how tough it was to do that without the ISDX capability at the time. But the other thing that you get with ISDX is all of the information about the quality of the surfaces – the reflection lines, the zebra striping, which shows the connections and the accuracy – which I think gives the designer confidence that the aesthetic is exactly what they’re expecting.
(Brian) Yes, you’re right. There’s a whole suite of tools in there to make sure the design engineer can really see clearly how the surfaces are flowing. Sometimes your eye can be confused, so we have these great plots that we can overlay onto the surfaces to make it obvious how the geometry is flowing, so they can really get a sense of what it will look like in certain lighting and in reflective environments and so forth. It’s just part of what surfacing experts expect. And again, because it’s integrated into the tool, you can instantly see how the flow of the surfaces is changing with these different visual effects as you do your design work. So it’s a really, really great set of integrated tools.
Huge thanks to Nigel Kidd and Paul Harrison for showing us around the simulation facility at BAE Systems in Rochester.