
Living With AI Podcast: Challenges of Living with Artificial Intelligence
How Trustworthy are Drones? Should I be Worried?
00:44 - Joel Fischer
00:49 - Christine Evers
01:00 - Sean Riley
01:25 - Presidential debate 2020: Trump and Biden final debate fact-checked (BBC)
03:22 - James "Jim" Scanlan
03:55 - 2011: Revolutionising aircraft design with the world’s first printed aircraft
05:30 - DJI Mavic Air 2 review: a video director’s perspective (The Verge)
06:50 - Students to transform drone design in fight against climate change and poaching
08:05 - Drone successfully completes first delivery of medical equipment to the Isle of Wight
09:00 - Snowboarding Drone Filming
11:00 - Iris Automation Vision Systems
12:10 - Boeing 737 Max (Wikipedia)
12:45 - Byzantine generals problem (Wikipedia)
17:55 - Coronavirus disease (COVID-19) pandemic (WHO)
19:05 - Pseudo Satellite / Atmospheric Satellites (Wikipedia)
21:25 - Airplane! Movie (Wikipedia)
24:05 - Maritime & Coastguard Agency (MCA)
24:50 - Public Safety Drones Save Four Lives In One Day (DJI Website)
31:00 - Gatwick Airport drone incident (Wikipedia)
32:30 - Namibia Drone Filming (YouTube)
Podcast production by boardie.com
Podcast Host: Sean Riley
Producer: Louise Male
If you want to get in touch with us here at the Living with AI Podcast, you can visit the TAS Hub website at www.tas.ac.uk where you can also find out more about the Trustworthy Autonomous Systems Hub Living With AI Podcast.
The UKRI Trustworthy Autonomous Systems (TAS) Hub Website
This podcast digs into key issues that arise when building, operating, and using machines and apps that are powered by artificial intelligence. We look at industry, homes and cities. AI is increasingly being used to help optimise our lives, making software and machines faster, more precise, and generally easier to use. However, they also raise concerns when they fail, misuse our data, or are too complex for the users to understand their implications. Set up by the UKRI Trustworthy Autonomous Systems Hub this podcast brings in experts in the field from Industry & Academia to discuss Robots in Space, Driverless Cars, Autonomous Ships, Drones, Covid-19 Track & Trace and much more.
Season: 1, Episode: 3
Episode Transcript:
Sean: Welcome to Living With AI, a podcast where we get together to look at how artificial intelligence is changing our lives, altering society, changing personal freedom and the impact it has on our general well-being. Today we're looking at drones: cute flying spy cameras, silent weapons or just another fad? Shortly we'll hear from Jim Scanlan, Professor of Aerospace Design and Co-Director of the Southampton Rolls-Royce University Technology Centre in Design. As well as producing the world's first 3D printed aircraft, his team conducts research into unmanned aircraft systems. Before that though, it would be rude not to introduce our panel. Joining me today are Joel Fischer and Christine Evers.
Joel is Associate Professor of Human and Computer Interaction at the University of Nottingham. He likes rock climbing and running and Christine is a Computer Science Lecturer at the University of Southampton. She focuses on machine listening, equipping robots with the ability to make sense of sounds. Christine enjoys road cycling. And asking convoluted questions of clever folk, it's me Sean Riley, normally to be found at the blunt end of a video camera but hiding behind a microphone today. Now bearing in mind you could be listening to this in the future, we're recording this at the end of October 2020, just before the US presidential election. So has there been anything that we've noticed in the world of AI this week? Joel, anything you've spotted?
Joel: There was the Trump and Biden debate, the most recent one, and there was a headline I saw afterwards saying that they would be getting fact-checked live. So I thought that was intriguing.
Sean: Sounds like a perfect use of AI, right?
Joel: Yes, my assumption was this was an AI but then actually when I looked at the stories it seemed to be just journalists fact-checking what they were saying during the debate and then kind of live tweeting about it.
Sean: Now I've got visions of a room full of journalists kind of thumbing through filing cabinets looking for the references, you know, even in a digital way. Is it something you could do with AI, Christine? Where would you start?
Christine: I suppose in principle you could. I mean, speech recognition technology is already relatively advanced; it can transcribe speech to text, which then allows machines to actually extract content from the spoken word and make sense of what is being said. I suppose from the fact-checking perspective the question arises of where you check your facts and where your fact sources actually come from. So you would have to build up a very large database of resources that you trust, and where you have very high confidence in the reliability of the data in them, in order to then cross-check it against statements uttered by a human speaker, for example, in a political debate.
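Christine's pipeline, transcribe the speech, extract the claim, cross-check it against trusted sources, can be sketched in miniature. Everything below is an illustrative stand-in: the fact store, the exact-match lookup and the function name are invented for the example, and a real system would need genuine claim extraction and a far larger curated database.

```python
# Minimal sketch of a fact-checking lookup over a trusted store.
# The store and matching are deliberately naive: real systems match
# paraphrased claims, not exact strings.

TRUSTED_FACTS = {
    "drones can fly at night": True,
    "drones may fly above 400ft without permission": False,
}

def check_claim(transcribed_sentence: str):
    """Return True/False if the claim is in the store, else None.

    None means 'unverifiable from our sources', which is an
    important third outcome for any fact-checker.
    """
    key = transcribed_sentence.lower().strip(" .")
    return TRUSTED_FACTS.get(key)

print(check_claim("Drones can fly at night."))  # -> True
print(check_claim("Pigs can fly."))             # -> None
```

The hard part Christine identifies is not the lookup but curating the store: the machine's verdict is only as trustworthy as the sources behind it.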
Sean: The way these things tend to fall down, as far as I see, is context, because it's a bit like any Google search, you know, unless you're incredibly specific about what you're looking for you're going to get a lot of flim-flam around it, right? So yeah, challenges across the board there. Time now to hear from Jim Scanlan on our featured topic. Today we're discussing drones or, in technical parlance, UAVs, Unmanned Aerial Vehicles. Jim was formerly head of manufacturing research at Airbus and he currently co-directs the Rolls-Royce University Technology Centre in Design at Southampton. His work has led to patents being filed and his research group conducts work on Unmanned Aircraft Systems, focusing on the safety and reliability of unmanned air vehicles, modelling and architecting future airspace, controls and regulations, and sensing for collision avoidance. Now one of the things Jim worked on is a 3D printed, well, laser-sintered unmanned aircraft. So some amazing achievements, Jim. Welcome to Living with AI.
Jim: Thank you very much. Delighted to be here.
Sean: Well, as mentioned, the reason we've asked you here, we're going to talk about UAVs or drones. Now I found it interesting doing a bit of research that if I Google UAV, I get lots of things to do with military grade and size aircraft but when I Google drone, I seem to get a lot of toys at Argos and the latest offerings from DJI. Then listening to a couple of drone podcasts, it seems to be that split goes, actually full disclosure, I own a DJI Mavic for filming myself, but is there any real difference between a UAV and a drone or are they just handy terms of reference?
Jim: Well, for years we battled the use of the word drone because it was somewhat pejorative but we've given up now, so we just call anything that flies automatically in the sky a drone and it ranges from little things, little toys, to big serious stuff.
Sean: So it's not a delineation? Because I did a filming project a few years ago on some archaeologists who were working on mapping a region of Italy and they were at pains not to say the word drone under any circumstances because of this. Where are we at the moment with UAVs then or drones? What's the current state of the art?
Jim: Well, it's a great question. So I see five waves coming towards us and the first wave has happened and that's a big industry. I, like you, own a DJI Mavic; we use it for filming like everyone does. It has jaw-dropping capabilities. It'll film HD video in a gale and, you know, I'm sure, Sean, you've operated in less than ideal conditions. The drone's bobbing around but the picture's rock solid. Amazing capability, and I've even flown it, you know, several miles away and still got HD video downlink. So, that sector has been done. So if you want to do aerial filming or inspection within line of sight, it's done. There's a great industry out there. You can buy some fantastic products. You have permission to operate them provided you're flying, you know, and meet certain conditions. So that's done.
So the next wave is beyond visual line of sight where you send a drone off and you lose sight of it and there are some issues there, some research issues actually, about how we do that. And so my team, we're working on, that's essentially what we're working on at the moment, beyond visual line of sight flying. And to give you an example, we're flying drones in Guatemala to monitor volcanoes and the public good aspect of that is that it's an early warning system. So the idea is that we give local people a set of drones that are relatively robust and easy to operate and they just hit a button. It goes and flies and says, “Is the volcano likely to blow tomorrow?” In which case there's some emergency action to take.
So that flies about 20km, has to climb up to the top of the volcano, fly around. So it's quite a long endurance drone and come back and it has to be cheap. And then that's operating in remote regions where the safety case is not that critical. So if it falls out in the sky, it's likely to fall on jungle or into the water. The next wave of applications and permissions is about operating in high risk areas. So operating over roads, schools. So the next project we're working on is to actually deliver an NHS delivery network in the Solent region where we will be flying in urban regions and if something goes wrong with the drone, it could kill people. So some big challenges associated with that.
Sean: I think it's interesting, you know, talking about the use cases of the difference between something that's going in a remote place and over an area which is potentially full of people. The thing that strikes me is that the technology is obviously capable of dealing with all of these, depending on how much money we throw at it, in terms of how many rotors, perhaps, you know, for redundancy, for instance. You know, you lose a rotor, the thing keeps flying, etc. So we can have a chip in a watch, right, that can do all of the computation needed to fly one of these drones. It seems to me that it sort of comes down to the software.
I mean, to give you a for instance, I took my drone flying when I went skiing and the software was unable to recognise that although it had started from a point up the mountain, when I skied down, the flight path was to go below the point of starting and it seems to me that was purely a software thing to understand that it had started on a mountain and therefore it could fly lower than that. I suppose what I'm trying to get at is, is software, is the software the problem here rather than the technology?
Jim: It's a huge chunk of the problem for sure. I've got a slide that I use at conferences that represents what I call the nightmare scenario for me. So we're flying a very large drone, so nearly half a tonne of drone, and we're flying, we've been flying to the Isle of Wight but we're going to be flying to the Scilly Isles and Shetlands, Orkneys. And the nightmare scenario is this: it's beyond visual line of sight, so you can't see it. And there happens to be a paramotor flyer, so one of these lunatics that puts a big fan on their back and flies around for fun. And they're perfectly entitled to be there, having fun, you know, at 400ft out over the sea. They're not talking to anyone, so they have no radio. They don't have a strobe, so they're not very visible. They have a low radar cross section, so you know, even using radar, they're not easy to see. They won't carry transponders, so what we call electronic conspicuity devices.
And therefore, how do I avoid automatically that person? And the answer, possibly, is vision systems. And the world leader, I would say, in vision systems is a company called Iris Automation in the States, who have a lovely vision system, very expensive. And what it has to do is essentially software, so what am I seeing and what is it? Is it a paramotor and how do I avoid it? But what if an insect splats on the lens of that camera and it stops working? What if the camera stops working? Do I need two of these? If I have two of these, which one do I believe? Et cetera, et cetera.
So there are some big challenges still around essentially replacing the pilot. I mean, we trust pilots with a couple of, you know, pretty good eyes to do all that donkey work on the airplane and we're still, to be honest, struggling to answer that fundamental question. How to do it cheaply, reliably, and that's the scenario we're dealing with at the moment.
Sean: Even with a pilot, we've heard horror stories, particularly fairly recently with, was it Boeing and the 737 MAX, I think, have I got that right? Where, you know, the software decided to override the pilot and with horrible consequences. So it's, yeah, it's not unthinkable, but I think the key comes down to keeping it at cost, I'm guessing, because you could put 10 cameras on it, 50 cameras on it, 100 cameras on it, right?
Jim: Exactly, and I'm glad you mentioned the 737 MAX aircraft, because we have studied that in huge detail. And that, from an academic perspective, is a class of problem that has been called the Byzantine generals problem. And what that means is that you've got people that not only might be faulty and therefore not responding, but people that tell lies. And in the case of the 737 MAX, there were angle of attack sensors, and one was telling the truth and one was telling a lie. Now, how do you work out who's telling the truth? And it's a difficult problem. It's sort of sensor fusion, but it's much more subtle than that. And that's something we're very specifically working on; we've come up with some algorithms that help to decide who to trust.
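Jim's point about two disagreeing angle-of-attack sensors has a classic textbook remedy: add redundancy and vote. The sketch below is a toy illustration of that idea (median voting over 2f + 1 sensors), not the algorithms Jim's team has developed; the function name and values are invented for the example.

```python
# Why two sensors aren't enough: with two disagreeing readings there
# is no way to tell truth from lie. A third reading lets a median
# vote mask one arbitrarily faulty ("Byzantine") sensor.

from statistics import median

def fused_angle_of_attack(readings: list[float]) -> float:
    """Fault-tolerant estimate from redundant sensors.

    With 2f + 1 sensors, the median masks up to f Byzantine faults:
    a lying value can only drag the median as far as the nearest
    honest reading, never beyond it.
    """
    if len(readings) < 3:
        raise ValueError("need at least 3 sensors to outvote one liar")
    return median(readings)

# Two honest sensors read ~5 degrees; one faulty sensor reports 40.
print(fused_angle_of_attack([5.1, 4.9, 40.0]))  # -> 5.1
```

The real problem is subtler than this, as Jim says: sensors can fail intermittently, drift rather than jump, and collude with correlated errors, which is why simple voting is only a starting point.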
Sean: Presumably, you test the water by making a change and seeing what the consequences are. But, I mean, there's a huge amount of research, and we're not here to talk about driverless vehicles on the road, but there's a huge amount of research, presumably using radar and vision systems, et cetera, going into that at the moment. Is that something you can use in the air? Is it a much more difficult problem in the air? Or are there ways where it's actually more simple in the air?
Jim: Well, I would argue that it's simpler and more complex. So it's more complex because in a car, you've got a much more cluttered landscape, lots of things going on. But actually more difficult in some respects, because the closing speeds could be 200 knots, and you've got to very quickly detect something, decide where it is and do something about it. So it's just different, really. But there's some fantastic work, I mean, the world of image processing and sensing that is being pioneered in many respects by the automotive community has a crossover. Absolutely. Yes.
Sean: And how ready is the regulation for drones in commercial applications? Are we getting towards a point where they'll be able to be used for things like this? So you mentioned you're doing some trials of the Isle of Wight with the NHS and presumably delivering supplies, etc, medical supplies. How far off is getting that to be the norm?
Jim: Sadly, I think there's quite a long way to go, because a lot of it is the inertia of rulemaking and consultation. Essentially, sadly, the huge challenge we have ahead of us is we have to redesign the airspace, full stop. And it's something that we're not very good at. And so that's something, again, we're working on. So we're doing a very large simulation of the airspace, including all airspace users to work out objectively, what are the risks? Because there's no point in introducing rules that don't reduce the risks.
And I'll give you an example. There have been some airspace changes recently around Farnborough. And those changes improve safety in the local area, with rules around aircraft and where you can and can't go, without realising that they've created pinch points further away that make the airspace more dangerous. So from a macro perspective, it's a stupid rule change. From a micro perspective, it's a very sensible thing to do.
Sean: This is the whole point of complex systems, right? There are ripples, right?
Jim: Exactly.
Sean: If you changed everything to autonomous overnight, which is obviously incredibly impractical, it would be straightforward, because you decide on the rules and then you go from that. But then you don't have that microlight or parascender or whatever it is in the system, because he, you know, can't be doing that. You're not allowed to recreationally fly anymore.
Jim: Well, I think there's lots of low hanging fruit. So there are lots of useful things drones can do beyond visual line of sight in low risk areas. And actually, you can do, you can segregate the airspace without annoying people. I fly light aircraft so it really annoys me when people steal bits of airspace from me. Because, you know, that's, that's compromising my freedom of the skies. But in areas where no one cares about the airspace in remote regions, you know, if I say, this is a drone corridor, and I consult people, then that's fine, we can do that. And we also have a concept of airspace multiplexing, so you use the airspace when no other people want to use it. So drones are perfectly happy to fly at night, for example, where the airspace is less crowded. So yeah, lots of fascinating challenges. But I think the COVID situation and I think the disruption in manned aviation is just accelerating this stuff.
Sean: So it's almost an opportunity, the COVID situation, isn't it? Because as much as anything, you've got fewer aircraft up there and therefore there is the time and the chance to explore these things. But also, it's a required thing now, you kind of need you need more of the help from the drones, I guess?
Jim: Yeah, we're getting such a strong pull from all sorts of charity organisations and others to just accelerate this stuff. I mean, funnily enough, I have two sisters that live in remote regions. One lives in the Shetlands, and one lives on Fair Isle, which is between the Orkney and Shetlands, really isolated community. And she phones me up every now and again, nags me and says, “When can we have these drones? We want deliveries, cheap deliveries, connectivity, it's a bit foggy, so the aircraft can't come in.” Drone would happily fly in foggy conditions.
Sean: There's also, I mean, doing my little bit of research in this, we've got what do they call it kind of pseudo satellite drones as well and solar powered drones and all sorts. What can you tell us about those?
Jim: Yeah, so the most extreme example, there's an organisation called Stratospheric Drones, who have produced something about the size of a jumbo jet, which is going to be hydrogen and solar powered, that'll be up at stratospheric altitude, and stratospheric means tens of thousands of feet, so 50,000ft plus, just operating there for months, operating essentially as a communications relay, so low altitude satellites effectively, cheaper than satellites. So there's a whole industry around that. So stratospheric platforms, and there are at least two other UK organisations that have already flown examples of these.
Their frailty is that they have to be made very, very lightweight to fly at those sorts of altitudes. But they have some advantages in that you can position them more easily where you want them to be or where you need them to be. And also, because they're at lower altitude, the communication losses are more favourable.
Sean: So they're almost gliding. I think the thing is, though, you know, jumping to a clickbait question off the back of that, something the size of a jumbo jet up there in the stratosphere, how worried should we be about things like that falling out of the sky?
Jim: That is obviously an issue. I mean, these tend to be very, very lightweight structures and lightweight aircraft. So there's a UK company that produced one that, you know, looks like a human powered aircraft. It's very, very lightweight. So it's not a huge hazard in terms of ground risk. But it's something they have to worry about. They have to make it airworthy. They need to make it as safe as a 747 jumbo jet.
Sean: We're used to those jumbo jets being piloted by several people, you know, and, the Airplane! movies to one side, there always tends to be somebody who can take over if there's a problem. Whereas we are now again relying on software with these things flying.
Jim: Well, yeah. I mean, I'm somewhat ghoulish in that I'm an avid reader of accident reports, aviation accident reports and most of them these days are human factors. So the pilot did something wrong. So no longer are we in the realm of, you know, the system going wrong or the aircraft going wrong or the software or hardware going wrong, it's down to humans. So in some respects, it's a very strong safety argument to simply remove the pilot from the airplane. You talked earlier about the 737 MAX aircraft, and in fact, Airbus lost, a very famous accident, Air France 447, where the pilots actually made things worse and if the automation had been left alone, it would have recovered.
Sean: It's an interesting switch, isn't it? But what does the future look like now? What should we expect to see in that sort of five to ten years?
Jim: Well, I think increasingly, we're just going to see drones. I mentioned the low hanging fruit, so we're going to see drones operating in remote regions increasingly. The great thing about that is that it gives us confidence and gives us safety cases to convince the regulators that now we need to put drones in semi-rural areas, and gradually build that up. The MCA, the Maritime and Coastguard Agency, are due to renew the contract for search and rescue helicopters, and they've explicitly written drones into the new contract. So whoever wins that bid is expected to operate drones. So they'll be operating off our coastal regions. And as you said earlier, the technology is all there to do that safely, as we stand.
Sean: Do you think that will help the drones or the UAVs escape these negative connotations? Because I've flown my drone in a massive field, perfectly legal space and still seen people looking unhappy about it. Unfortunately, they have this reputation, don't they?
Jim: Yeah, I think safety and privacy concerns are huge. But, you know, these drones, that the MCA will operate in the future will be helping search and rescue helicopters and help people that are, you know, drowning or in distress at sea. So that's a public good thing. Personally, I'm not a great fan of drones being used in a military or security context and I don't work on any projects of that nature.
Sean: I seem to remember a story where a drone delivered a life vest to someone in, you know, in rough surf fairly recently. So maybe if more and more of that's happening, people will see this as a positive rather than a negative?
Jim: Well, you're quite right. So DJI, who make the Mavic that you operate and I operate, did a really good report, I think, 18 months ago, and there's proof that their drones have saved 50 lives with those sorts of interventions.
Sean: So well, hopefully, as people see more and more lives being saved by drones, then, you know, these negative connotations will disappear. Jim, thank you very much for joining us today on Living With AI and thanks once again.
Jim: All right, thank you very much. I really enjoyed it.
Sean: Over to our panel now, what did you think about what Jim was saying there, the current reputation drones have and the future use of drones?
Christine: I thought it was very interesting. It's interesting to hear both from the technology perspective, as well as from the perspective of societal concerns, and I think Jim very nicely addressed a number of issues that I think the public are concerned about. For example, what you said about drones dropping out of the sky. I think more broadly, for me, there are concerns. I mean, there's a huge potential for drones to be applied for societal good. For example, the search and rescue scenarios where a drone could either detect humans who are in dangerous situations, or supply materials or equipment that they require to help themselves.
There's also the situation of medical delivery, for example, for people who live in remote areas where an ambulance could not reach the person fast enough to actually supply first aid. So in terms of the good that they supply, there's an enormous potential. But I do think that there are still questions about what drones are used for more broadly. For example, would the public feel comfortable if drones were applied for, say, police purposes? Now, of course, I believe there were surveys where people indicated that they're generally relatively happy with police drones being deployed for particular scenarios, dangerous scenarios.
But in general, it's not widely accepted that police drones would actually patrol neighbourhoods, for example, since there's obviously quite significant infringement of privacy there, potentially. So it was promising to hear that Jim is not actually working on military drones for that purpose as well. So that's very interesting.
Sean: You can imagine that even on search and rescue, taking it to the next sort of step, if you've got a drone, I don't know, overflying a beach, keeping an eye that there's nobody drowning and then using computer vision to work out if somebody's in trouble or not, then it's not a far cry from saying, “Well, hang on, this is spying on me.” What do you think, Joel? How far are people prepared to accept this sort of automation?
Joel: Yeah, I think it's an interesting question. And I wanted to actually go back to a point you made earlier. Search and rescue is a good example. A point that you made earlier was about, isn't it all about the software? And isn't that, you know, the technology is kind of there, the hardware is kind of there, but the software isn't sort of clever enough. And then we had Jim talking about the challenges, though, that they have in their research around, you know, essentially replacing the pilot. And I think what's intriguing about, and I would contest that it's just about the software, I would say it's not at all just about the software, but it's about clever ways of linking humans into the loop.
And so take the search and rescue example. So you have drones, which are, you know, they're flying in an automated fashion, they've got some kind of vision system. So they're surveying the area, and they're maybe looking for survivors, you know, injured people so that they can then, you know, let the ground staff know and send in the emergency response. So that's the principle. But then, in actual reality, you're going to have a lot of uncertainty regarding the data that comes from the vision system. And how do you solve that uncertainty? You don't solve it with algorithms, and you don't get rid of that certainty. But what you can do is where you have uncertainty, you can then involve a human to take a look at the data.
Sean: And interpret it, yes, yes.
Joel: And so that would happen at critical moments where you have high uncertainty in the system, to actually then bring a human into the loop. So you don't need to have an operator that looks at it the whole time, but just looks at it at critical moments flagged by the system. So you're kind of almost using the flaws in the technology to then drive a person to look at it. I think that's where we're going to see some of the technical challenges being solved, not by having better algorithms, but by having clever ways of bringing humans into the loop.
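Joel's idea, escalate to a human only when the system is uncertain, can be sketched as a simple confidence triage. The class, function and thresholds below are illustrative assumptions for the sake of the example, not a real search and rescue API.

```python
# Route each vision-system detection to one of three bins:
# confident enough to act on, uncertain (human review), or noise.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person"
    confidence: float  # model confidence in [0, 1]

def triage(detections, auto_threshold=0.9, ignore_threshold=0.3):
    """Split detections into (auto-accepted, human-review, ignored)."""
    auto, review, ignored = [], [], []
    for d in detections:
        if d.confidence >= auto_threshold:
            auto.append(d)      # act on it without waiting for a person
        elif d.confidence >= ignore_threshold:
            review.append(d)    # uncertain: bring a human into the loop
        else:
            ignored.append(d)   # likely noise
    return auto, review, ignored

auto, review, ignored = triage([
    Detection("person", 0.95),
    Detection("person", 0.55),
    Detection("debris", 0.10),
])
print(len(auto), len(review), len(ignored))  # -> 1 1 1
```

The operator only ever sees the middle bin, which is exactly Joel's point: the system's own uncertainty decides when a person needs to look.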
Sean: Okay. I mean, it was an overly simplistic thing to say about the software. I suppose what I meant was as opposed to the hardware. But I think, yeah, implementation is key here. And talking of implementation, the moment we start using things for policing, then it's not going to take long before somebody decides, oh, we could equip it with a weapon of some sorts. And then suddenly, without being overly dramatic, we're in Robocop territory, aren't we?
Joel: Well, of course. And we already have UAVs that are equipped with weapons. I mean, they are a reality of military systems now.
Christine: But then on the other hand, if you think back to recent events in the last few years, you don't actually need to equip a drone with active weapons in order to weaponise it. I mean, I'm thinking about the scenario two or three years ago when commercial drones were deployed in order to attack Gatwick and Heathrow, which caused an absolutely enormous disruption for passengers as well as for air traffic and the security of air traffic. If you think about a landing airplane, what happens if a drone were to actually enter the turbine at that moment? It could be disastrous.
And similarly, there are concerns about, well, you've got these drones that can now reach areas that humans may not be able to reach. Could you deploy drones that are not weaponised in order to sniff, for example, Wi-Fi signals from critical facilities like oil rigs? So there is a potential for harm, both physically as well as in terms of data and privacy, that drones pose and that needs to be addressed.
Sean: The whole thing is quite interesting because it's just a technology that's happened so quickly or it's accelerated so quickly. I mean, Jim mentioned the DJI Mavic being able to fly several miles. My point about the software is challenges I've encountered, I suppose. I mean, for instance, I used my drone in Namibia to track a moving vehicle, which I was in. And then the cleverness, I'm holding air quotes up here, the cleverness of the software determined when it wouldn't have sufficient battery to return to where it set off from, which is exactly not what I wanted it to do because I was in the vehicle. So it turned around and set off back to base, which happened to be a spot by the dusty roadside in the middle of nowhere. We had to turn the car around and chase the drone. So the software side of it, I suppose, yes, coming up with those hybrid techniques, coming up with those ways of working, it's about the user interface then?
Joel: Yes, I think what you're describing there is the importance of context and any kind of usage of a technology, context is important. But I think especially in these kind of technologies where location matters so much and you've got a mobile piece of technology flying through the air, I mean, the context changes around it all the time. And having ways to then adapt and override, perhaps, the way the system would normally assume it should respond, like it makes an assumption, that there's an inbuilt assumption that it should always return to base. But what if the base moves? In this case, you're the base.
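The design flaw Sean ran into, return-to-home logic that hard-codes the launch point, can be made concrete with a small sketch. Everything here (function names, the planar distance shortcut, the safety margin) is an illustrative assumption, not how any particular drone firmware works.

```python
# A return-to-home check that triggers when remaining range runs out.
# The bug Sean describes is passing the *launch* point as home_pos;
# passing the operator's live position keeps "home" where the pilot is.

import math

def distance_m(a, b):
    """Rough planar distance in metres between (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def must_return(drone_pos, home_pos, battery_m_remaining, margin=1.2):
    """True when the drone should head home now.

    Triggers once the distance home (padded by a safety margin)
    meets or exceeds the range the battery can still cover.
    """
    return battery_m_remaining <= margin * distance_m(drone_pos, home_pos)

launch = (0, 0)
operator_now = (4000, 0)   # operator has driven 4km down the road
drone = (4100, 0)

print(must_return(drone, launch, battery_m_remaining=3000))        # -> True
print(must_return(drone, operator_now, battery_m_remaining=3000))  # -> False
```

With the static home the drone abandons the operator and flies 4.1km back to a dusty roadside; with a live home position it has 100m to cover and keeps filming, which is exactly the context-awareness Joel is arguing for.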
Sean: Or if it's a ship at sea, I think I'd be slightly concerned that that's it, bye-bye drone.
Joel: Yes.
Sean: But, you know, perhaps that's just me not knowing how to work that system properly. There's probably overrides and things you can do for that. But, yes, it just highlights that these things are really important. Okay, Joel and Christine, thanks again for being part of the panel on today's Living With AI podcast and we'll see you very soon.