Living With AI Podcast: Challenges of Living with Artificial Intelligence
This podcast digs into key issues that arise when building, operating, and using machines and apps that are powered by artificial intelligence. We look at industry, homes and cities. AI is increasingly being used to help optimise our lives, making software and machines faster, more precise, and generally easier to use. However, they also raise concerns when they fail, misuse our data, or are too complex for users to understand their implications. Set up by the UKRI Trustworthy Autonomous Systems Hub, this podcast brings in experts in the field from Industry & Academia to discuss Robots in Space, Driverless Cars, Autonomous Ships, Drones, Covid-19 Track & Trace and much more.
Season: 4, Episode: 6
Trustworthy and Useful Tools for Mobile Phone Extraction
Mobile phones store valuable information about geographical movements, communications behaviours and online browsing history. They are increasingly used as a source of evidence in criminal investigations. The Mobile Phone Extraction (MPE) process involves making copies of devices belonging to suspects, victims or witnesses. The extracted data is examined by police and others in the criminal justice system during ongoing investigations.
There is a crisis of trust and practice in MPE. This project addresses the crisis by supporting development of a trustworthy and useful platform for MPE. The project partnered with experts in software development (Telemarq) and digital forensics (Hargs Solutions).
Project webpage: Trustworthy and Useful Tools for Mobile Phone Extraction – UKRI Trustworthy Autonomous Systems Hub (tas.ac.uk)
Joining us on the podcast:
Helena Webb, Transitional Assistant Professor, University of Nottingham
Liz Dowthwaite, Senior Research Fellow, University of Nottingham
Anna-Maria Piskopani, Research Fellow in IT Law, Horizon Institute of Digital Economy, University of Nottingham
Podcast production by boardie.com
Podcast Host: Sean Riley
Producer: Stacha Hicks
If you want to get in touch with us here at the Living with AI Podcast, you can visit the TAS Hub website at www.tas.ac.uk where you can also find out more about the Trustworthy Autonomous Systems Hub Living With AI Podcast.
Episode Transcript:
Sean: Welcome to the Living With AI podcast from the Trustworthy Autonomous Systems Hub. This is season four, so plenty of back episodes there if you want to go and binge on those. I'm your host Sean Riley. This podcast discusses AI in relation to trust and trustworthiness, and today we're going to be talking about trustworthy and useful tools for mobile phone extraction. We're recording this on 9 April 2024. This edition is a projects episode, so that means we hear from some researchers on their TAS Hub projects and we hear them discuss some of the findings and challenges they came across. Each of them will introduce themselves, then we're going to hear a little bit about the project they've been working on and discuss how AI is levelling things up. So please welcome Helena Webb, Liz Dowthwaite and Anna-Maria Piskopani. Thanks for being part of Living With AI. Can I just ask you each to give us a bit of an introduction, what's your name, and tell us a bit about what you do. We'll come to the project afterwards. Let's start with you Liz.
Liz: Hello, so my name is Liz Dowthwaite, I’m a senior research fellow in the Horizon Digital Economy Research Institute at the University of Nottingham. Although I work in a computer science department I’m actually a psychologist, I’m a social psychologist, and most of what I do is try to understand how people interact with and through new technologies, with a particular focus on human flourishing. So how can we make sure that all these wonderful new technologies are not detrimental to our wellbeing.
Helen: Hello, my name's Helen Webb and I also work at the University of Nottingham. I am a social scientist by background but I now work in the school of computer science. The projects that I do are all about interdisciplinarity, bringing together insights from the social sciences, computer sciences, law, humanities and so on to give us a very in-depth and granular understanding of the lived experience of technology, ways in which we can ensure technologies are responsible and better suited to human needs and so on.
Anna-Maria: Hello, my name is Anna-Maria Piskopani and I am a legal scholar. I am also a research fellow at the Horizon Institute of Digital Economy at the University of Nottingham, and my expertise is data protection law. Generally I'm involved in projects and try to bring out all the legal issues they raise.
Sean: Superb. And you’ve all been working on this same project, Trustworthy and Useful Tools for Mobile Phone Extraction and as I understand it, that’s getting data from mobile phones. Maybe I’m over-simplifying it there. Can one of you give us a bit of an explanation as to what that means in layman’s terms?
Helen: Yes, so it's all about the ways in which mobile phone data is used in criminal investigations. So by the police, and then in law cases afterwards. So if the police are investigating a crime, they might ask the complainant in the case, the suspect or a witness to hand over their mobile phone so they can take information from the phone as part of the investigation. So if you think about everything that you have on your phone, actually this can provide a lot of useful evidence to understand a crime. So there could be messages between the complainant and the suspect, for instance, information about what a suspect's been looking at online, where they've been physically and so on. So all of this information can be very, very helpful in investigating a crime and securing a conviction when it goes to court. However, there are lots of issues associated with it as well. So if you think about how much information you have on your phone, you know, going back years and years, lots of text messages, photos, videos, lots of information, apps and so on, first of all that's a massive amount of data that's available, and until quite recently the default by the police was often to take everything off the mobile phone. So it could be data that relates to events that happened well before the crime being investigated, for instance. So you have lots and lots of data, and this could mean that it takes a very long time for the police to analyse it because there's just so much to wade through, so it creates backlogs in cases. Another issue is the privacy aspect of it as well, because you have instances in which the police and others in the criminal justice system are able to look at information that doesn't have any relevance to the case whatsoever. And again, if you think about what we have on our mobile phones, most of us have details on there, information on there, that's actually quite private and we don't want everybody to see. You know, private information relating to us and also to other people as well. We all have information about third parties on our phones. So you have these issues around the efficiency of the process and also the lack of privacy in the process as well. So there have been, in recent years, a lot of concerns expressed by organisations such as Privacy International, Big Brother Watch and ultimately the ICO as well over the ways in which these mobile phone extraction processes were taking place, and that's led to renewed attention to this area, and this is where the project came in as well. So we were interested in these processes of mobile phone extraction and how you can make them more trustworthy. So how can you ensure that people feel better, that they can trust the process, that they would be willing to hand over their phones, because in fact a lot of people, because of these kinds of issues, were quite reluctant to hand over their phones to the police. So what can you do to make that process more trustworthy? And is there a capacity for digital tools to aid in the process as well, by making that process more efficient in terms of analysis? So can we have an AI-informed tool to help look through the data? And also tools that ensure a more privacy preserving mobile phone extraction process, so one that just gets information that's relevant to the case.
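To make the selective extraction Helen describes concrete, here is a minimal sketch, in Python, of the kind of filter such a tool might apply. The data shape, field names and example values are all hypothetical, not drawn from RIME or any commercial product.

```python
from datetime import datetime

def selectively_extract(messages, window_start, window_end, parties_of_interest):
    """Keep only messages inside the investigation window that involve a named party."""
    relevant = []
    for msg in messages:
        in_window = window_start <= msg["timestamp"] <= window_end
        involves_party = (msg["sender"] in parties_of_interest
                          or msg["recipient"] in parties_of_interest)
        if in_window and involves_party:
            relevant.append(msg)
    return relevant

# Hypothetical example: a two-week window around the alleged offence.
messages = [
    {"timestamp": datetime(2024, 3, 2, 21, 15), "sender": "owner", "recipient": "contact_a", "text": "..."},
    {"timestamp": datetime(2021, 6, 1, 9, 0), "sender": "owner", "recipient": "contact_b", "text": "..."},
]
relevant = selectively_extract(
    messages,
    window_start=datetime(2024, 3, 1),
    window_end=datetime(2024, 3, 14),
    parties_of_interest={"contact_a"},
)
# Only the first message is copied; the unrelated 2021 conversation never leaves the phone.
```

The point of the design is as much what is excluded as what is taken: anything outside the window, or not involving a named party, stays on the device.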
Sean: I think it’s really interesting thinking about that relevancy because who determines what’s relevant? I mean thinking about a real world rather than a cyber example, if a potential suspect lived in a shared house and there was a search warrant for the house, of course that’s going to affect everybody who’s living in that house and just because you say no, no I didn’t put anything in that room, you shouldn’t go in that room, isn’t there a kind of digital equivalent? Anna-Maria, is there precedent for that?
Anna-Maria: Well, the thing is that- An exact equivalent of that, I wouldn't say. But one thing you can do with those devices is select exactly what is necessary and proportionate to the specific investigation that you're doing. That's the general principle, I would say anyway, in order to respect everybody's rights, as you can very easily understand.
Sean: Yes, I mean maybe I'm being cynical, and I'm going to ask Liz this from a technical point of view, but surely if I just said no, no, you know, you don't need to look at my photos, none of the evidence would be in the photos, I could of course potentially have hidden something in the photos. Is that overly cynical?
Liz: No I don't think so. I think it depends who you are in a case. So obviously the victims and witnesses often just want you to take anything that might be relevant, and actually most of the people we've worked with are looking at the victims' phones and the witnesses' phones, and they talk through in very great depth what might actually be on the phone before they start the extraction process. Where it does become more of an issue is where it's the suspect, and there may be things that they don't want the police to see on their phone, and that's a bit more of a complex situation and often, you know, they're told not to comply and not to hand over their phones. So there needs to be another step in the process before we can even get access to the phone. And in that case, they do tend to take a more- Not let's take everything, but a more broad strokes approach to that. But certainly people we've talked to who deal with victims, they're very careful to sort of identify everything that might be relevant and to understand why you might not want to take other things.
Sean: I can understand that. I mean I suppose I've come at this from thinking of that story from a few years ago where the FBI managed to use an external company to crack an iPhone that a suspect wasn't giving a password for. And yet actually, this is broader than that, isn't it? This is people who are helping police with their enquiries, if that's not too much of a euphemism.
Helen: There are differences around who the phone owner is. The police do have- Under PACE they do have the right to seize mobile phones if they believe that they might provide useful evidence in relation to a crime, but they tend to go towards more of a consent process rather than seizing phones. Particularly if it's a complainant or a witness, they'll go for a consent process of asking the phone owner to hand over their phone. If it's a suspect, they might still ask, doing it on a consent basis. Very often, as Liz said, the defence will be advising them don't hand over your phone, follow a non-cooperation tactic, so that will lead to different processes in which the phone is gathered and the data from it is collected as well. So one of the things we're beginning to see in the project discussions is that there's a lot of complexity around it, because it depends on who the phone owner is, what the crime is, how the investigation unfolds and so on. So it's quite hard to generalise across all cases.
Sean: I'm wondering what the binary for no comment is. Maybe we should talk about some of the challenges you've faced in this project. It's an ongoing project as we speak, you know, where have some of the difficulties lain? Has it been on the legislative side of things? Or has it been on the technical side, or is there some social engineering problem that I've not foreseen?
[00:10:02]
Helen: So we've been doing a number of different things in the project. Anna-Maria's been leading a scoping of the legal situation around mobile phone extraction, which has been changing a lot in recent years, it's quite a complex picture. Liz and I have been working on engaging the stakeholders in this area. So we've been talking to people across the legal profession, so police, lawyers and victim support organisations and so on, understanding their perspectives on mobile phone extraction. And then we're also working with our project partners Telemarq, who are producing a tool that's designed for selective extraction of mobile phone data. So the idea is that all of these packages gel together and we can use them to identify different ways in which this process could be more trustworthy, in the sense of privacy protecting and efficient. And I think we're having lots of successes with the project, but also lots of challenges as well, and the first one that I'll mention before passing on to the others is understanding really that there's a limitation to what digital tools themselves can do. So our starting point is to try to see, well, how can digital tools aid with this process? How can they make the process of mobile phone extraction more efficient or trustworthy, privacy preserving and so on. And there is scope for tools to be able to help with this, but what we're finding is that it's such a complex situation, in which you have stakeholders with very different perspectives and different objectives, and you've got certain issues with the way the criminal justice system is organised that create very entrenched positions and embedded problems that a tool by itself can't resolve. So one of the things that we're very aware of when we talk about our responsibility as researchers is not to overstate the capacity for the digital tools to address these problems. They can achieve certain things but it's not going to overcome a lot of the tensions that we see in the project.
Sean: Not a panacea, yes, understood. I was thinking, you know, even just a single photograph can contain so much information, you know, metadata that goes with the image itself, GPS location, you know, what kind of camera, all this sort of information in one photograph, and many of us have thousands of photographs on our phones, access to more, and that's just the photos and the camera information, never mind what we're doing with the apps and any other trackers we're using.
Liz: Well I just- I think that's really key. I think a lot of what we hear from- So, the laypeople as we say, people who might be asked to hand over their phones, is there's so much on there about other people. So photos aren't just of you, they're of your friends, they're of your family, they're of your dog, they're of, you know, all sorts of things, and you might not want to invade the privacy of other people. So I'm fine with you taking anything to do with me, but actually there's a lot on there that is entrenched socially with other people, and, you know, you might be divulging minor crimes that other people have, say, taken part in, or not so minor, and there's a lot of concern about I don't want to bring other people into this. So certainly, as you talk about photos, but even text message conversations and things. You know, even if part of a text message is about an assault, you might have been talking about your massive illegal rave the day before and things like that. So they don't want to necessarily divulge all of that. But yeah, on the data side, you're right, there's masses and masses of stuff, and part of the problem is a lot of it is still necessarily manual. So from the people we spoke to, any redaction, any anonymisation at the moment is mostly manual, so the police have to go through and decide what to take out of these vast texts and vast photo libraries. So that's somewhere where we feel that tech might actually be quite useful.
Sean: But I'm imagining the tool perhaps takes a kind of- Well, potentially takes a slice out of time and also of, I'm going to call it a layer, maybe, you know, the messaging layer of your phone, or a slice out of time, I don't know, maybe I'm guessing too much. But I've noticed in my phone, I've had problems where it's made mistakes. So I think there's a problem also with people believing these devices are infallible, I'm not going to mention the Post Office Horizon scandal at this point but I will just allude to it. But looking at, say, for instance, I use an Android phone, I use Google Maps, I let it track me so I can see where I've been and how far I've run perhaps, and I've seen huge spikes where it seems to jump from one place to another, which is completely, well, impossible, but it could potentially place you in a place that you weren't, if that's not too strangled a way to put it. Are there implications in that from a legal point of view, I'm thinking, Anna-Maria? Is that something that we need to be worried about, that these devices might make mistakes that are then used in error?
Anna-Maria: Definitely. And one of the things that we have to keep always in mind is exactly that. The police who investigate these cases, but also anyone who's involved in a court case, have to bear in mind that there might be restrictions. Those have to be highlighted. When we extract the data, you have an obligation under GDPR to keep logs and report exactly what you did while you were extracting that data from the device, so anything that is missing, or any technical issues, have to be reported in that report, so anyone can [unclear 00:15:49] all this information the right way. And one of the things that we generally say in the scope of responsible AI is that these are tools that are helping us enhance our possibilities, but investigators have a lot of capacity themselves, so they should never over-rely on those kinds of tools to make the [unclear 00:16:15]. They have always to keep in mind that it's just a tool, that it helps them to find some information that should be weighed against a lot of other information in order to get to some kind of decision or some kind of judgement, and that should be very, very highlighted every time.
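As a concrete illustration of the logging obligation Anna-Maria describes, and of recording technical issues such as the implausible GPS jumps Sean mentions, here is a minimal sketch. The log format, field names and speed threshold are all hypothetical, an assumption for illustration rather than a GDPR template or anything from the project's tooling.

```python
import math
from datetime import datetime, timezone

log = []

def record(action, detail):
    # Append a timestamped audit entry; a real system would also protect
    # these entries against tampering (e.g. signing or hash-chaining).
    log.append({"time": datetime.now(timezone.utc).isoformat(),
                "action": action, "detail": detail})

def km_between(a, b):
    """Great-circle (haversine) distance between two lat/lon fixes, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a["lat"], a["lon"], b["lat"], b["lon"]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(h))

def flag_implausible_jumps(fixes, max_speed_kmh=1000):
    """Record any pair of consecutive fixes that implies an impossible travel speed."""
    for a, b in zip(fixes, fixes[1:]):
        hours = (b["time"] - a["time"]).total_seconds() / 3600
        if hours > 0 and km_between(a, b) / hours > max_speed_kmh:
            record("anomaly", f"implausible jump between {a['time']} and {b['time']}")

# Hypothetical usage: every step of the extraction leaves a trace.
record("extraction_started", "messages and location history, by consent")
fixes = [
    {"time": datetime(2024, 3, 2, 12, 0), "lat": 52.95, "lon": -1.15},  # Nottingham
    {"time": datetime(2024, 3, 2, 12, 5), "lat": 48.85, "lon": 2.35},   # Paris, 5 minutes later
]
flag_implausible_jumps(fixes)  # ~500 km in 5 minutes gets logged as an anomaly
record("extraction_finished", f"{len(log)} entries retained for audit")
```

The anomaly never silently disappears: it sits in the same audit trail as the extraction steps, so a defence team or auditor reviewing the record sees both what was taken and what the tool could not vouch for.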
Sean: Yeah, the whole case shouldn't hinge on it, it's supporting evidence as much as anything. I mean that brings us nicely onto the idea of responsible AI and how, you know, how we can use this. I mean, what's the AI element here? Is the tool using some kind of machine learning? Can we talk about that a little bit?
Helen: Yes, so the tool that's being developed is called RIME, which stands for Responsible Investigation of Mobile Environments, and it's being designed as a tool that supports selective extraction of data from a mobile phone. So the new guidance for police is that they should, wherever possible, selectively extract from a mobile phone rather than taking all of the data from a mobile phone. So RIME is a system that can support that. And then it supports forms of analysis to work through the data. So it has- We're experimenting. This is the fun part in the project, that we get to try out different possibilities of ways of looking at the data based on what we hear from the police in terms of what they need to see when they're working through the data. So we're experimenting with different possibilities such as visualisations of a dataset. So getting, like, a word cloud, for instance, that can show you the common terms that are being used in text messages between two people of interest, timelines that show people's movements over a period of interest, different kinds of features, including privacy protection features as well. So one of the things that RIME can do is to pseudonymise names when information is being shown. So you've got two people who've been sending text messages to each other. You know the name of the suspect. The other person, you don't know if they're a person of interest yet. RIME can give them a pseudonym so you don't see their real name, but it's indexed, so that every time- It's not a random alternative, because it's indexed to their original name, if you see what I mean, so if you do find out that they're of interest then you can go back and uncover their real name. So all of those features use AI technology in some form or another. Not necessarily particularly complex forms. In fact, really what we're interested in is the usability of it. So are we meeting requirements that police investigators have, in the sense of how can we make this process more efficient for them by harnessing available technologies.
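A minimal sketch of the indexed pseudonymisation Helen describes, assuming nothing about RIME's actual implementation: the pseudonym is stable for a given person, and the mapping is retained so a name can be uncovered later if that person turns out to be of interest.

```python
class PseudonymIndex:
    """Stable, reversible pseudonyms: indexed to the original name, not random."""

    def __init__(self):
        self._forward = {}   # real name -> pseudonym
        self._reverse = {}   # pseudonym -> real name

    def pseudonym_for(self, real_name):
        """Return a stable pseudonym, minting one the first time a name is seen."""
        if real_name not in self._forward:
            # Simple naming scheme for illustration only (runs out after 26 people).
            alias = f"Person {chr(ord('A') + len(self._forward))}"
            self._forward[real_name] = alias
            self._reverse[alias] = real_name
        return self._forward[real_name]

    def uncover(self, alias):
        """Reverse the mapping once someone is confirmed as a person of interest."""
        return self._reverse[alias]

index = PseudonymIndex()
print(index.pseudonym_for("Jo Bloggs"))   # Person A
print(index.pseudonym_for("Sam Smith"))   # Person B
print(index.pseudonym_for("Jo Bloggs"))   # Person A again: consistent, not random
print(index.uncover("Person B"))          # Sam Smith, only when justified
```

Because the pseudonym is consistent, an investigator can still follow a conversation thread or a timeline involving "Person B" without ever learning who they are until there is a reason to.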
Sean: I can imagine that being a really powerful thing to do visually to help unpick what’s happened in a case. I’ve seen too many crime dramas, so I’m imagining the wall in the police station with the pictures on it and the pieces of string pulling across. But from a digital point of view I can imagine that being really helpful to be able to, as you say, look at the timeline. Can you also intersect multiple devices and connect those together?
Helen: Yes.
Sean: Yeah, I would imagine that’s really good. So that sounds like that’s gone well. What’s been the feedback from some of the people you’ve worked with about RIME?
[00:19:50]
Helen: So it's being developed ongoing, and I should say that there are tools available commercially that do similar things, that we know some police forces are using. They're very expensive and also, I've tried using them, quite incomprehensible to use as well. They're not necessarily user-friendly. So we've been trying to develop RIME in the first place as a kind of experiment, to see what additional features we might be able to add that these existing tools don't have. Also to work on the usability aspects too, and also as a kind of accountability measure in itself, because RIME is an open source tool. It's available on the Horizon Digital Economy GitHub, anybody can go and take a look at it, because one of the things we're interested in is where you have these technologies, as you yourself said, there is capacity for errors to be made in them. If they're open source that means they're available to inspect by anybody, so it could be people using them, it could be defence barristers who are representing someone who's been in part convicted by information that's been detected from a tool, auditors and so on. So how could open source add to this kind of transparency element, and ultimately the trustworthiness element of it. So that's what we're trying to achieve with RIME, and so far, when we've taken it back to different stakeholders and shown what we're doing, we've had very positive responses from them, in the sense of like yes, we want to be able to work with a selective extraction from the phone, and there are different kinds of analysis techniques that we want to be able to use, and we also want it to be highly usable, because we're talking about, you know, in this country, it's not like CSI sadly, with the walls and the people moving around video images. We've got a very underfunded police force, often people who don't necessarily have a lot of training in this as well, because the police force in this country is quite young, so they don't necessarily have time to build up the training and the experience. So the more that we can deliver something that's very usable to large groups of people, that's really where we see the contribution that we can make in helping the efficiency of this process.
Sean: Liz?
Liz: From the other side as well, we've spoken to some, as I say, laypeople, people who potentially might be on the other side of this process, and as we said at the beginning, often it's more like, well, I don't really trust the police so I'd never do this. But as you start to discuss what a tool like this could do, the selective extraction, the fact that you are actually maintaining your privacy and you'll be doing it with the police and it'll be all this- It really helps them to be reassured that even if you're there with a junior officer who wants to take everything, there are processes in place. You have to fill out all these forms before you can do any of it. And we've been talking about how we can make it more visible to these people, so they're never without their phone, which is a real concern. So, like, I don't want you to take my phone for a week, two weeks, two years, can you do it with me in situ? And actually yes, these tools allow that to happen, and allow victims or complainants to see exactly what has been taken from their phone and what hasn't, and that really helps to build that trust in the system.
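As a sketch of the visibility Liz describes, a tool could hand the phone owner a plain-language summary of exactly what was and was not copied during an in-situ extraction. The categories, counts and wording here are hypothetical, purely to illustrate the idea.

```python
def extraction_receipt(selected, declined):
    """Build a human-readable summary of what was copied and what was left alone."""
    lines = ["Data copied from your phone:"]
    for category, count in selected.items():
        lines.append(f"  - {category}: {count} items")
    lines.append("Explicitly NOT copied:")
    for category in declined:
        lines.append(f"  - {category}")
    return "\n".join(lines)

# Hypothetical example, shown to the phone owner before they leave.
print(extraction_receipt(
    selected={"messages with contact A (1-14 Mar 2024)": 212,
              "call log (March 2024)": 48},
    declined=["photos", "browsing history", "all other contacts"],
))
```

A receipt like this costs the tool almost nothing to produce, but it turns "trust us" into something the phone owner can read, keep and later check against what appears in the case.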
Sean: Anna-Maria?
Anna-Maria: And also, this open source tool can also enhance privacy by default and by design, because it's exactly trying to pseudonymise third parties, and as Liz says, it can give more trust to someone to actually give their mobile phone for that purpose. And most of all, it can help other researchers in the area who want to advance things, or find out more in other ways, and add more future features of privacy by default and design. One of the key issues it raises- From the beginning, the primary thinking was that we don't really want people to decide not to file a complaint just because they don't want their phones to be given to the police, because that means that a crime is not reported, with all the consequences that has for society.
Sean: That's a really interesting point, yeah, because obviously, yeah, you don't want people to think that they might be incriminating themselves in some sense. They may not want to be separated from their phone. It might just be simple, I'm not saying it's a crutch, but we all depend heavily on phones for everything from parking through to paying for things.
Helen: Yes, and I think that point is a really important one, and it's another one of the issues that we found in this whole process of mobile phone extraction, particularly from the victim point of view. So reports put together by, for instance, Privacy International and the Centre for Women's Justice were saying that victims who had experienced crime were being told, you need to hand over your phone, and if you don't hand over your phone we won't pursue the case. And also, they were handing over their phone but not getting it back for a long period of time, which takes us back to the inefficiency of the process, because sometimes there aren't people available to do the extraction. Well, imagine if you've just experienced- You've had a serious assault. Actually your phone is a crutch, because it's got all of the details of everybody in your support network. If you don't have your phone for a year then you lose that contact. And these are precisely the types of issues that were leading people to not report a crime in the first instance, or to go to report a crime but then actually withdraw their complaint, because they felt they couldn't go through with this process. So it's another really important part of the issues that we want to address in the project. It goes straight to the issue of trustworthiness and trust in the police.
Sean: I’m wondering if you had any barriers that came up while you were- While you’ve been working on this project. Have there been any problems? Things that haven’t gone so well?
Liz: Yeah. I mean the major one is actually being able to reach all the stakeholders we wanted to talk to. Victim support agencies for example are predominantly volunteer led, they’re incredibly busy, they don’t have the time to spend two hours chatting to us in their day. So actually getting involved- Although they’re very interested, they’re sort of saying we don’t have the capacity, sorry, we can’t engage with it at this moment. There’s also people who won’t talk to us because they’re just like no, I don’t trust the police, I don’t want to work with you anyway because you’re working with the police. So being able to reach all the stakeholders we wanted to has been quite a challenge. And if anyone’s out there who wants to talk to us, because the project is still ongoing, please do get in touch.
Sean: Well we can put a link in the show notes. We’ll put a link to the GitHub for the software and we’ll also put a link where people can contact you about the project.
Liz: Yeah, brilliant, particularly as I say, victim support agencies and people like that. And also, yeah, lay people who have an interest. We’d love to get more viewpoints and perspectives on that. But I think for me that was the major challenge, it was just getting people involved.
Sean: Yeah, yeah, connecting to people who, you know, who probably need your help in some respects, it's perhaps they just don't realise they need your help. Was there anything that's surprised you on the project? Anything that you've come across that you weren't expecting?
Helen: I think for me a lot of it is around the complexity involved. For instance, with the way the police forces are organised. We have, I think it's 43 separate police forces, and they all have their own procurement processes, which actually, when you think about it, makes a great deal of sense, because, you know, a police force in a city is going to be quite different from one in a rural area, so their needs for technology are going to be very different. But because of this, you have very large differences across police forces in the kind of tech that they're using, the way that budgets are set for it, the way that digital forensics and so on is organised too. So that's been a very big surprise, because I think I was assuming it was a much more centralised process, but it's really very different, and it's interesting for us to learn about it, but it also means there's a challenge to the generalisability of our findings, because we're talking to forces who do things very differently, so there's not a one size fits all solution to things.
Sean: And what do you think is next for the project? Obviously it’s not quite completed yet but you know, how long is left on the project and what do you see happening afterwards?
Helen: So we're currently set to go on until the end of June, and we're actually hoping to go on a little bit longer. As Liz says, we're still doing the data collection, so if anybody would like to talk to us, anybody across the criminal justice system would like to be interviewed or take part in a workshop, we'd be really delighted if you'd like to get in touch. So we're working on that element, and we're also working on further development of RIME. So we're using the findings that we're gathering to send over, you know, requirements, user requirements, features that should be in the RIME tool, and working to test out the different functionalities of RIME going forward as well.
Anna-Maria: I would just add that one of the key challenges, as Helen mentioned at the beginning, is that it's a very complex legal framework. We had some reports and we've been conducting an analysis of relevant literature, case law and reports by competent authorities like the ICO, and they're helping us map developments in the legislation. But one of the things we don't have is an exact view of what exactly the practices are right now across the board, which is the major challenge in this, and we're hoping that there'll be more investigation into that.
[00:29:54]
Sean: And one last kind of question on the legal side of things, is there an issue with getting the output of this tool recognised in a courtroom?
Anna-Maria: Well, as I said, when the procedure is followed properly, by regulation, by being compliant with data protection law, then you've got evidence that can be used in court when it's used lawfully, and then your evidence becomes more under-
Liz: I think that's something that's really interesting that's come out recently actually, as we've been talking about visualising the outputs of the tool. So as we said, it comes out in massive reams of information, people have to analyse it, people have to present it. If a tool can help us to present it in court, to juries or to other people, to help them understand what actually the phone- The data is saying and what it isn't saying, and we certainly know it's not perfect, if a tool can help with that, I think that would be a really great step forward as well.
Helen: I just wanted to add to Anna-Maria's point, there are industry standards for digital forensics as well, so with RIME we have to look to be compliant with those standards. I can't remember the number of the standard because I don't have it written in front of me, I'm going to take a stab and say it's ISO 17025.
Sean: It's been really great hearing about the project and also the RIME tool that you've been developing. It just remains for me to say thank you to each of you for joining us today on the Living With AI podcast, so thank you very much Liz.
Liz: Thank you, that was really fun.
Sean: Great stuff, thank you Helena.
Helena: Thank you very much I really enjoyed it.
Sean: And thanks Anna-Maria.
Anna-Maria: Thank you so much.
Sean: If you want to get in touch with us here at the Living With AI podcast, you can visit the TAS hub website at tas.ac.uk, where you can also find out more about the Trustworthy Autonomous Systems Hub. The Living With AI podcast is a production of the Trustworthy Autonomous Systems Hub, audio engineering was by Boardie Ltd and it was presented by me, Sean Riley.
[00:32:11]