Living With AI Podcast: Challenges of Living with Artificial Intelligence
Embodied Trust in TAS: Robots, Dance, Different Bodies
Exploring themes of risk and trust in autonomous systems, we're discussing a project where professional dancers with disabilities dance with robots. The project exposes interesting challenges with choreography and safety as well as expression and creativity.
Guests:
Professor Sarah Whatley, Director and Professor, Centre for Dance Research, Coventry University
Paul Tennent (Associate Professor, Faculty of Science, University of Nottingham)
Kate Marsh (Assistant Professor, Centre for Dance Research, Coventry University)
More information about the project: Robots as dancing partners; somatic interventions – UKRI Trustworthy Autonomous Systems Hub (tas.ac.uk)
Podcast production by boardie.com
Podcast Host: Sean Riley
Producer: Stacha Hicks
If you want to get in touch with us here at the Living with AI Podcast, you can visit the TAS Hub website at www.tas.ac.uk where you can also find out more about the Trustworthy Autonomous Systems Hub Living With AI Podcast.
The UKRI Trustworthy Autonomous Systems (TAS) Hub Website
This podcast digs into key issues that arise when building, operating, and using machines and apps that are powered by artificial intelligence. We look at industry, homes and cities. AI is increasingly being used to help optimise our lives, making software and machines faster, more precise, and generally easier to use. However, they also raise concerns when they fail, misuse our data, or are too complex for the users to understand their implications. Set up by the UKRI Trustworthy Autonomous Systems Hub this podcast brings in experts in the field from Industry & Academia to discuss Robots in Space, Driverless Cars, Autonomous Ships, Drones, Covid-19 Track & Trace and much more.
Season: 4, Episode: 3
Episode Transcript:
Sean: Welcome to the Living With AI podcast from the Trustworthy Autonomous Systems Hub, or TAS Hub. I’m your host, Sean Riley, and today we’re discussing the project Embodied Trust in TAS: Robots, Dance and Different Bodies. So just to let you know, we’re recording this on April 10, 2024. Regular listeners will know that on the podcast we discuss AI in relation to trust and trustworthiness. Today’s topic involves dancers interacting with robots, and I’ve seen some video of this project and it definitely raises some interesting challenges that may not be that obvious until you get into the detail. So without further ado, let’s welcome the guests for today. So we’ve got Sarah joining us, we’ve got Kate joining us and we’ve got Paul joining us. Thanks all of you for being part of Living With AI. So I’m just going to go round the virtual room and ask you to give us a brief introduction. What’s your name and what do you do? So we’re going to start with Sarah. Sarah, give us a little intro.
Sarah: Hi Sean, yes I’m Sarah Whatley, I’m the director of the Centre for Dance Research at Coventry University and I also happen to be the lead for this project.
Kate: Hi, I’m Kate Marsh, I’m an assistant professor also at CDR, the Centre for Dance Research. My main area of practice research is around dance and disability, but very broadly intersecting with other research areas.
Paul: I’m Paul Tennent and I’m an associate professor of mixed reality, or human-computer interaction, and I suppose I’m the nerd on the project, so I get excited by the robots, but also very excited by everything we’re doing with them and with the dancers.
Sean: This is an ongoing project as I understand it, and I mentioned at the beginning I’ve seen some videos. Is that something- Can people go and watch some video of what’s going on? Is there stuff available? Is it shared already or will that be happening at the end of the project?
Sarah: Yeah, so it’s not yet publicly available. We’ve shared it in various public sharing events where we’ve brought- We’ve invited people into that space and we’ve shared some of it, partly through the TAS showcase but also through bringing the public into our last workshop at the end. So we have shared some of it, but our intention is to be able to share more of it once we’ve done some editing work and we’ve got a big corpus of material now which we’ve been gathering over the various workshops that we’ve been working on. So yes, eventually we’ll be making sure more of it is available.
Paul: You definitely want to see it because some of them are just beautiful.
Sean: That means that right now what we need to do is explain what people will see when they do see it, because I think just hearing the title, embodied, and robots and dancing, I mean it evokes certain things. Can you give us an overview of the project? Sarah perhaps you could give us that?
Sarah: Well you’ve mentioned what it is at heart. So what we’re doing is we’re bringing together dancers who have a direct lived experience of what it means to interact with machines, if you like, because all of them have a lived experience of disability, so they’re either prosthesis users or wheelchair users, so they have that knowledge and it’s an extraordinary expert knowledge already, but they haven’t had the experience of working with robots before. So this project is bringing them into that exchange with different robots and exploring what happens in that exchange. What happens in that partnership, in that duetting? And how do we think about the kind of sense of embodiment and what embodiment means when you’re working with things that are non-human and human, and that interface between the human and the non-human, and how that tells us something about that slippery liminal space between what is human and non-human. So we’ve been exploring with different robots and finding out which of those robots, in a way, felt most interesting, let’s be honest. You know, aesthetically interesting or confusing or challenging. So different ways in which those encounters stimulated different ways of thinking about that interaction, and we’ve spent more time with the arms, the Franka arm robots, because there seems to be something, I think- I mean, Kate can say more about this, but I think there’s something almost more human-like about them because they have this kind of jointedness. And it was also interesting for us to think about how do you place those robots in space in relation to the dancers. So there were all sorts of interesting conversations about actually how that space is organised.
And then there’s all sorts of other considerations that no doubt we’ll talk about to do with health and safety, about how we manage risk, about the ethics of that, and that stimulated so many interesting conversations about what we mean by risk and what we mean by trust, of course, which is at the heart of the project. So that’s really what we’ve been playing with, and then having lots of conversations about how does this feel? What does this mean for us? How do we see things differently? And I will let Kate maybe say something, because what’s been fundamental to this series of workshops is that we all start from a place of being in the space moving together. But maybe that’s a prompt for Kate to say more about that.
Sean: Even if you’re not a huge amount into dance, I think a lot of the public have seen things like Strictly Come Dancing and dancing is an intimate thing isn’t it? So it’s interesting to replace or augment with a device, effectively. Kate, what’s- How does that work? That communication? Because that’s part of it, isn’t it? When you’ve got two people dancing or more than two people dancing together, there’s a communication there, even if it’s just in the training?
Kate: Yeah, I mean there’s a complexity, I suppose, when you ask people entrenched in dance research; it’s tempting to say well, maybe it’s about communication, because it’s about so many things. And I think to bring it back to the project, I think it’s interesting that you should talk about intimacy because that was something I observed a lot with the dancers, the two- Well, there’s a core group of three, but with the dancers that we were working with, there’s a real sense of intimacy, and there were moments when actually, our focus just went onto these very beautiful duets between the Franka arm and one of the dancers. When you talk about communication, I think that came up in a lot of our conversations. For me, as a non-robotics expert, although slightly more expert than I was 18 months ago, it’s about what we read into that communication, what we’re seeing, that these things are communicating with some kind of equity. But of course, that’s such a rich area of exploration, because part of that communication is coming from a handler who may be non-visible to the performance space that we’re making up; as I think often happens with performance, we’re reading this communication in. If I can just go very briefly back to Sarah’s point, because I do think it’s a really important one: in the workshops where we all came together, we absolutely resisted the model where the robot experts, the sciency people in the room, sat behind their laptops and kind of watched the action. So we all moved together. We warmed up together, and that felt really, really important and actually not at all tokenistic. I think it had a really big impact on how we worked together and also how we were evaluating what we were finding. That sense of embodiment and being close to each other.
Working with contact, working with touch, and then coming into the play, the exploration with the machines was really, really important and, for me, such a joyful part of the process actually.
Sean: Probably time to bring Paul in here. But one thing you just said struck me and you mentioned, you know, the robot’s handler. I was wondering whether the robots had any agency themselves or whether there was another, I’m going to use, third-party, second-party? Third-party- Another person there as part of it. Paul, can you talk us through that a little bit please?
Paul: Yeah, absolutely. I think broadly we’ve been working with robots which are effectively being remote controlled by people, so when we talk about robot handlers or robot wranglers, they are absolutely a third party within the dance, and it’s been really interesting looking at the language that gets used. So we have dancers who are communicating both bodily and verbally, talking about what it is they’re going to do, and then we’ve got these robot handlers who are attempting to interpret that, not from a perspective of coming from a dance background themselves, although one of our robot wranglers is a martial arts expert, which is really exciting, because his bodily skills are excellent, and another of our handlers has been a model and a dancer, so there is some experience there. But listening to the communication differences. When we talk about robots, we think about them sort of, you know, moving a limb by 30 degrees, or you know, walking forward a particular pace. We don’t talk about them in terms of the language we use in dance. So when we were doing those all moving together experiences, one particularly memorable one for me was Kate shouting out words, like slinky, or high, or across. A whole range of what seemed like fairly random words, which we had to interpret with dance. Now, I’d made jokes for years about giving my lectures through interpretive dance, but this is the first time I felt like I’d really been doing my research through it. But how do you turn those words- So we do it kind of intrinsically, don’t we? We do it with our bodies. Well, slinky just means sort of moving slinkily. That’s easy for us to do. It’s remarkably hard for us to tell a robot how to do that. There’s been a real challenge around that.
[00:10:28]
Sarah: I think it’s reminding me of a conversation that has been present throughout the whole process. There’s something around autonomy that really interested me, and the link between autonomous machines and the notion of autonomy and the experience of disability, and I don’t know, maybe we can talk more about that later. But I think to go back to Paul’s point, which is so interesting, and really drew me in such a lot to this work: neither the robots that we were working with nor the dancers with lived experience of disability fit the normative language to talk about bodies or dance. So actually it causes a problem- You can’t see inverted comma fingers on the podcast, but it problematises this notion that we all understand how we might interpret slinky. And I think that was so fascinating, I think for all of us, for the dancers to understand that with these words, they have to make an adaptation in their body to find some kind of representation of what that word might manifest as, and it is the same with the robots, and that, for me, has been fascinating: how does agency, or not, show up for the experience of disability and being a robot or a machine?
Sean: One thing that you’ve mentioned there is whenever disciplines meet, communication is always a really key part because terminology means different things in different disciplines, from very simple examples right through to quite complicated ones. But one thing, and Paul might be about to talk about this anyway, I think in the terms of things like robotics, we often talk about moving something from A to B, not how you move it from A to B. Was that something that- Because the how is everything in dance, right?
Paul: Absolutely, and funnily enough, that is more or less where I was going to go. So one of the later experiments we’ve been doing is recording motions on one of the Franka robot arms and then playing them back. So they don’t behave quite autonomously, but they are not being directly, at the time, controlled by a human, and we wanted to see how the dancers would interact with that. Now that led to a whole bunch of really interesting things about the robot not communicating what it was going to do and the dancer just having to remember what it was going to do. In a way, it can’t give the sort of bodily cues that you might give as a real dancer, and that’s one of the reasons we haven’t been that keen to push into autonomous systems yet; it’s because we need to explore the language of that kind of communication. But just to pick up the idea of the motions, we recorded them in two different ways. We recorded one by moving the arm to a series of points and letting it do its robotic way of moving between them, that is, it would use a technique called inverse kinematics to work out what its arm should look like, and it would move its pincer, because that’s what’s on the end of it, from point A to point B in the most efficient way possible. And we contrasted that with recording the state of the robot every frame. So the dancer was able to move the arm in a much more beautiful set of movements, and the resulting dances were contrasting and I think really interesting, both in terms of what the robot looked like when dancing but also in terms of how the dancer performed with it.
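[The two recording approaches Paul describes could be sketched roughly like this. This is an illustrative Python sketch, not the project's actual code; the function names and joint-angle values are made up, and simple linear interpolation stands in for the robot's own inverse-kinematics-based planning.]

```python
# Illustrative sketch: sparse waypoint playback (the planner fills in an
# "efficient" path) vs dense frame-by-frame playback (the human-guided,
# expressive path is reproduced exactly). Poses are lists of joint angles.

def waypoint_playback(waypoints, steps=10):
    """Sparse waypoints: interpolate the most direct path between recorded
    poses, standing in for the controller's own trajectory generation."""
    path = []
    for a, b in zip(waypoints, waypoints[1:]):
        for i in range(steps):
            t = i / steps
            # per-joint linear interpolation between pose a and pose b
            path.append([ja + t * (jb - ja) for ja, jb in zip(a, b)])
    path.append(waypoints[-1])
    return path

def frame_playback(frames):
    """Dense recording: every frame of the guided motion is stored, so
    playback is verbatim, curves and hesitations included."""
    return list(frames)

# Two recorded poses for a hypothetical 2-joint arm (degrees)
sparse = [[0.0, 0.0], [90.0, 45.0]]
# A curved, expressive path through the same endpoints, captured per frame
dense = [[0.0, 0.0], [10.0, -5.0], [40.0, 20.0], [90.0, 45.0]]

print(len(waypoint_playback(sparse)))   # 10 interpolated frames + endpoint
print(frame_playback(dense) == dense)   # True: dense playback is verbatim
```

The contrast Paul draws falls out of the data: the sparse version only ever knows the endpoints, so everything between them is the machine's "most efficient" guess, while the dense version preserves whatever shape the human gave the motion.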
Kate: Yeah I think that’s fascinating. And at what point does that look like or begin to feel like an equal duet? Or a duet where each partner in that has some kind of, going back to the word agency, in it? And I was also thinking about how, as the dancers became more expert, because there was a sort of growing expertise, and to some extent a sort of confidence and comfort in those rather alien experiences, how they became more expert in noticing where a glitch might happen, or where the robot might be pushed too far. And over time, I remember one or two comments about how I felt I knew it was at its limit. I began to feel where it was at its limit. And that was quite interesting. So there was a growing expertise in a way of the dancer becoming a knowledgeable partner with a robot. So it wasn’t that there were simply two agents sort of trying to find a common place but actually they were beginning to sort of learn from each other and feed from each other and read from each other.
Sean: And there’s presumably a parallel there in what happens in actual human to human dancing. You learn each other’s limits, you learn each other’s habits.
Paul: Well the exciting thing, of course, about learning with the robots is that you’re learning about this completely different body. So the Franka arm has seven elbows which is far too many elbows. You’re dealing with the robot dog, the Boston Dynamics dog, Spot, dealing with various other robots that moved in different ways, but the unique characteristic about them was that none of them were androids, none of them were humanlike bodies. So we make some assumptions when we dance with a human about how the body bends. We know how elbows bend. But of course that is why we’re doing this project with the people we’re doing this project with, because they have this unique perspective on bodies that are not, necessarily what we expect them to be.
Sean: One thing that I wanted to bring up, and I’m sure it’s something that came up time and time again within this project, is when we see robots on television or in factories, they’re often either covered in yellow and black sticky hazard warning signs or with boxes around them, do not go near. I’ve worked in TV studios with robotic cameras where you’re not allowed to go anywhere near them, despite them barely being strong enough to knock over a cup of tea, basically. But there is this hazard, this worry, this concern. How do you deal with that in this situation, Sarah?
Sarah: Yeah, I mean of course it was a conversation that we were having all the way through, and I think Paul will probably have much more to say from his expert side of things. All I would say is that when we started this project, I think coming from a dancer perspective, dancers are natural risk takers, if you like, and really enjoy pushing at the boundaries of what should happen, again in inverted commas, what should happen or what is allowable and permissible. And it was very interesting in those early experiments that we probably hadn’t quite taken account of where we needed to intervene and make sure that all of that kind of health and safety and risk management process was transparent, was clear, and was supportive all the way through. And it was really interesting because all the time we were having to be very aware of where the limits were of what was possible and what wasn’t, whilst also experimenting with what risk might mean and what those boundaries might mean. And where- What is acceptable and unacceptable? And of course, taking care, because nobody wants to be hurt. Taking care of real bodies and robot bodies. But it was definitely a conversation that we were having all the way through. But I’m sure Paul has more to say from that side of things, because we were rather rebellious, I think, as the dancers in the project. And there’s this rather wonderful thing- You may have seen it on the film where we have the kill buttons. And so when we were in this scenario of working with the Franka arms, we have the dancers and the duetting and we have this kind of almost performative moment of standing with the kill button at the back, and the kind of anxiety that you can almost read in the bodies of those of us standing at the back with the kill button. So all of this becomes part of that performance of experience in a way.
Sean: I mean, there’s obvious risk in standard dance, you know, big lifts can go wrong, there are problems that can happen. But I think it feels a bit more intuitive what’s going to go wrong. Whereas a robot, potentially, you know, making the wrong move and doing some serious damage by just getting something trapped in one of its joints or something is a problem, isn’t it?
Paul: I think it goes in both directions. I think the robots are inherently actually quite delicate, and humans can apply quite a lot of force. And so there was definitely a bit of concern over the £70,000 Franka arm, that a little bit too incautious a dance move may do it some harm. But of course the reverse of that’s also true. It’s got extremely strong motors in it, and it can deliver a great deal of force in certain ways. It doesn’t feel like it can push very hard, but if you stick your finger between its motors you’re going to know about it. So yeah, we had quite an extensive conversation around the health and safety and risk and the language of risk, and how we have different language for risks and how we might inherently understand risk as humans, but the robot doesn’t inherently understand risks. So it’s swinging its arm. We as a person might know to stop swinging our arm; the robot doesn’t necessarily know that. And it has particular ways of handling risk. So when it feels a certain amount of force it stops. That’s fine. But that’s only one way in which it can kind of prevent these things. So we ended up having really regular conversations, and they were conversations, they weren’t kind of strict orders from the roboticists in the room to say you must do it this way. There was a little bit of that.
[00:20:19]
But for me as a human-computer interaction person, rather than necessarily a robotics person, it was really interesting watching the negotiation of risk happen between the engineering roboticists group and the performative dancing group, the CDR group, who would tend to be inclined to push back a little bit. And we recently wrote a paper- I can’t remember the name of the paper now, but it was about reframing risk from the traditional way in which you look at risk, which is these risk matrices. We’ve almost certainly all filled in risk matrices for projects we’re going to do, where we have to say oh, the building might burn down or you might trip over a cable, all these things, and we classify them by severity and by likelihood. So if something has a severe risk of injury and is likely to happen then it’s in the red category. If it’s severe risk but unlikely to happen it’s in the orange category, and if it’s low risk but likely to happen it’s in the green category. So we have this kind of extant risk framing system and it’s quite scary. Oh, nasty red stuff, scary orange stuff. But we said what if we thought about that as the adventure matrix. Because actually, we do a lot of things that have some risk inherent in them. We play sports. Sports are quite likely to give you small injuries, but that’s okay, we take that risk. We cross the road. That’s fairly unlikely but absolutely could give you a major injury, but we take that risk. In order to live our lives in a world which is inherently risky, we’re okay with that as long as we understand the risks. So we kind of suggested that there’s space for taking risks and space for making interesting experiences. That’s a design space. And yeah, probably we don’t want to hang around the red space where it’s definitely likely to hurt you, but people do. People do wingsuit flying and all these kinds of things which are incredibly dangerous. So we consider risk to be a design space.
Then having that conversation with the engineers who consider all risk to be unacceptable. There is no level of risk that is acceptable. And so we have this kind of tension between us. We’re looking at this like risk is a creative space and they’re like risk is a no space.
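[The severity × likelihood matrix Paul describes is a standard risk-assessment tool, and its mechanics can be sketched in a few lines. This is a hedged, illustrative Python sketch: the thresholds, the 1–5 scales, and the example scenarios are assumptions for illustration, not the classification scheme the project actually used.]

```python
# Illustrative sketch of a traditional severity x likelihood risk matrix.
# Both inputs run 1 (low) to 5 (high); the colour bands are assumed
# thresholds, since real matrices vary between organisations.

def classify(severity, likelihood):
    """Map a (severity, likelihood) pair to a risk-matrix colour band."""
    score = severity * likelihood
    if severity >= 4 and likelihood >= 4:
        return "red"      # severe and likely: the band engineers rule out
    if severity >= 4 or score >= 12:
        return "orange"   # severe but unlikely, or moderately risky overall
    return "green"        # low severity and/or unlikely

# Hypothetical examples in the spirit of the discussion:
print(classify(5, 5))  # red    - e.g. unmanaged strong robot arm near dancers
print(classify(5, 1))  # orange - severe but unlikely, like crossing the road
print(classify(2, 4))  # green  - minor knocks, likely: sport, rehearsal
```

The "adventure matrix" reframing in the conversation doesn't change this arithmetic; it changes what you do with the orange and green bands, treating them as a design space to explore rather than a zone to minimise.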
Sean: Without turning this into a health and safety risk assessment podcast, of course it’s down to familiarity with a certain thing, and this is part of the problem: when you encounter robots you’re not familiar with and you don’t know how they move, it’s a learning experience, and as you mentioned before, you know, the dancers were learning the limits. They were learning how the robots move. And when you get to that point, yeah, it’s much easier to start taking risks. When I did some risk assessment training years ago, I remember the concern was people would say it’s just common sense, and the trainer said something like the problem with common sense is that it’s not common.
Sarah: Slightly sticking with that just briefly, because I think what Paul was saying and what you were saying, risk is context dependent as well. And it was very interesting going back to what Kate was saying earlier about creating what Kate described as this sort of safe space where we would all be moving together. But of course, as dancers we have an understanding about what those limits and risks are, and we have to similarly kind of discuss the potential risk, because if you do throw your body at somebody you’re likely to break yourself or them in the process. And one of the interesting parts, I think, of this project is we’ve said right from the beginning, this is about codesigning. This is about codesigning the project. And that sounds great, and it was absolutely the ethos and the principle that we were working to, but there are moments in that where we were needing to turn towards the expertise and the authority in order to then be able to make that codesign process real. So that everybody does have a stake in what is being developed and how that project is developing. But it’s not straightforward when you bring together very different disciplinary practices and expectations. And that’s what makes it exciting, but it’s also challenging in a way.
Kate: But just on risk, and I do, I think like Paul- Well, like we’ve all said, it is important that the dancers we were working with had lived experience of disability. It’s key, obviously, and I would argue there’s something quite specific about the relationship to risk that comes from many disabled people, because we are often, I think, perceived or framed by this idea of care. Of being cautious or not being quite, maybe- I’m just about to say not being quite as able, but I’m not sure that’s quite what I mean. So I think there’s often a temptation, for me anyway as a disabled person, to play with risk a little bit. But what was really fascinating about this project, for example, there was a moment with the Boston- What’s it called? The Boston- I was going to say Boston terrier, I know it’s not that.
Paul: Boston Dynamics, Spot.
Sean: I think it should be called the Boston terrier. But anyway, carry on.
Kate: Where we were having a conversation about health and safety and it was made incredibly clear that if you put your finger in the wrong place it would cause some significant damage, at which point that was a very important moment for me, because as someone with one hand, that changed how risky I was prepared to be around particularly that machine. And I think other dancers we were working with had similar- Yeah, there’s something about how, like Paul and Sarah both said, we play with risk in the room. And one other thing, Sarah, you just reminded me, and I think this was such a lovely thing about the project, that the more we worked together, the more I think we understood each other’s humanness in the space, which seems an odd thing to say given that we were sharing it with machines. But there was a definite moment, I think, when the dancers became very, very aware of the handlers, the wranglers in the room, and also their nervousness about- Their protectiveness of the robots. So there was this really lovely dynamic of people looking after each other, I suppose. I think we created that space really, really well.
Paul: I think on that relationship between the dancers and the wranglers of the robots, we saw this really lovely thing that happened where, because we played with lots of different robots, especially in the early workshops, we would see specific dancers start to form relationships with specific robots. And while we did ultimately settle on the Franka for a bunch of reasons, different dancers engaged with different robots and it was quite exciting to watch those relationships develop. I think Sarah talked about the other dancer getting to know the robot. That was really exciting. So watching- Was it Kat and Spot, I think, got on fantastically? Like cats and dogs perhaps. And there were some absolutely beautiful moments of them dancing on the stairs of one of the buildings as they came together and spent a lot of time with that robot. So seeing those individual bonds form was quite exciting. Even though we ultimately moved to a smaller set of robots for practical reasons of being able to do systematic experiments, we needed to limit the number of robots we were working with, that initial process of toying around with the different robots, seeing how they could move and what their potential was for dance, as we assessed their capabilities and their movement flexibility in the same way that you might consider what a dancer can do, was really valuable. The robots themselves have got these sets of limitations, and I thought it was really interesting to see these somatic relationships forming between the dancers and the robots.
Sarah: Yeah, I think that’s absolutely right, and again, it comes back to that thing that you start to sort of almost imbue the robot with a kind of human quality, or what you expect the robot to do, or what you desire the robot to do. And it made me reflect back on where we started thinking about the project right from the beginning, and Steve and I were having a conversation around the potential of collision, and that was such a rich idea immediately, and I know talking to Kate, we got really excited about this idea of collisions, because that means something very particular for dancers, taking into account what Kate was saying, that if you’re having a lived experience of disability it may mean different things. But it’s fundamentally quite exciting. It’s an exciting idea. And we were immediately drawn to this idea of collision, and then as we started to develop the project we thought actually, we can’t really talk about collision, because with collision, you think about autonomous cars and accidents and all of the negative side of collision. So we were sort of trying to mediate between- Again it comes back to language, you know? These words, simple words, single words, but they mean something quite different in different contexts and for different practitioners and for different disciplinary fields. And that gets exciting again. But I think there was something very interesting about how different dancers were drawn to the physicality or the exchange, the physical exchange, that they were having with particular robots, and how we started to see human qualities in that nature of the touch. And thinking about it, coming back to language again, what does it mean when you have a feathery touch with a robot? That means something very different, touching like a feather, than with a human body. And it’s fascinating then how you think more generally about the full potential of that human-robot interaction and what that robot can do for the human.
Sean: Like parts of the human body have nerves that feel things and that is touch, whereas a robot touch means perhaps something different. But these terminologies keep coming up over and over again, don’t they Paul?
Paul: Yeah, so I wanted to pick up on the terminology actually. One of the exercises we did in one of our workshops was to get a whole lot of Post-It notes and try to label the robots and what we thought the various robots’ body parts were. That was such an interesting experience because we labelled things very differently. So I’d look at what I thought was the shoulder of the Franka arm and somebody else had written that it’s its waist. Someone else had written that it’s its wobbler. You know? There’s a whole collection of different language we use to describe these bodies because it turns out, you know, that there is no kind of common term, and when you look up the robot’s instruction manual it says joint seven. Joint seven isn’t terribly descriptive or useful. It’s entirely useful when you want to address that joint in a programme and tell joint seven to move by thirty degrees, but it’s not especially helpful if you want to think about it in terms of an embodied thing in the world which has some anthropomorphic characteristics. And we needed that language, and we did that exercise partly to create some commonality in the language we were using but partly to help us to- Anthropomorphise isn’t quite the right word because I don’t want to suggest we’re trying to make robots feel human, but to make them feel like present things in the space that might have some agency. And even though we were explicitly not giving the robots agency- having them controlled by wranglers or pre-recorded programmes in some cases- that’s still helping us think about, well, in the future when we do want to make them autonomous and we do think about autonomy, what’s the language going to be like? And the language of touch, as Sarah brought up a minute ago, is really important.
With dancers- Again, going back to our improvisation exercises, we had some leading and following, where two people would touch each other and one person would be leading and the other person would have to follow that touch as it moved around. Doing that with a robot is suddenly very interesting. A robot’s got no way of sensing that- That’s not technically true: the joints sense a certain force, so with enough force on the touch you absolutely could, or you could put capacitive sensing around the robot. But these robots couldn’t sense touch, whereas the human absolutely can, so we could play leading and following between a human and a robot, and that got us thinking about those bodily cues, those bits of information that we confer to each other when dancing that are just not there with the robots. And that’s the trust point, I guess, with autonomy: we can’t trust the robot to tell us what it’s going to do. It’s a very unpredictable dance partner.
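As an aside, Paul’s distinction between the manual’s “joint seven” and the dancers’ body-part labels can be made concrete in a few lines of code. This is a purely hypothetical sketch- the label mapping and the `move_body_part` helper are illustrative inventions, not the project’s software or the Franka API:

```python
import math

# Hypothetical mapping from the workshop's Post-It labels to the arm's
# numbered joints; the robot itself only knows "joint 1" .. "joint 7".
BODY_PART_TO_JOINT = {
    "waist": 1,
    "shoulder": 2,
    "elbow": 4,
    "wrist": 6,
    "wobbler": 7,  # one dancer's name for the final joint
}

def move_body_part(joint_angles, part, degrees):
    """Rotate the joint behind a human-friendly body-part name.

    joint_angles: dict of joint number -> current angle in radians.
    Returns a new dict with that joint rotated by `degrees`.
    """
    joint = BODY_PART_TO_JOINT[part]
    updated = dict(joint_angles)
    updated[joint] = updated.get(joint, 0.0) + math.radians(degrees)
    return updated

# "Tell joint seven to move by thirty degrees" - but said in the
# shared, embodied language the workshop was trying to build.
angles = {j: 0.0 for j in range(1, 8)}
angles = move_body_part(angles, "wobbler", 30)
```

The point of the exercise survives the translation: the programme still addresses joint seven, but the people in the room can talk about the wobbler.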
Sean: You beat me to it, because I was going to come back to trust, because obviously the overarching theme of the podcast is trust and autonomous systems. I’m just wondering what it feels like as a dancer to trust one of these robots. Does it take a while to build that trust? Is it that you’re trusting the operator rather than the- Paul’s mentioned some of these were recorded sequences, so you are in the hands of the robot. Perhaps Kate, could you speak to that a little?
Kate: Yeah I think that’s a really interesting question. So there isn’t a hugely simple answer and I think it’s already been nodded to in this conversation that my perception of the dancers’ interaction with the robots in this project is that actually- It goes back to Sarah’s point about dancers being inherently risk taking, that I think the dancers probably start from a place of trust. I think they probably came into the project going yeah, I trust this, I trust that person to handle this robot. I don’t think there was suspicion or lack of trust in the initial thing. So I actually think the process, in my observation anyway, is we had to re-evaluate trust. The dancers, I think, came in very open, very ready, very trusting and then they had to relearn or reimagine, I suppose, what it is to trust the person you’re interacting with and I think that definitely did develop, the more and more that the dancers we were working with kept with the same- I mean, it’s interesting that- Not entirely, but Kat and Welli, the dancers, sort of had a preferred Franka arm. They did near the end move about and mix between, but they were quite bonded I think. That was my perception. But I think there was also an understanding, because of the conversations we’d had around risk, around health and safety, and because of the practical experimentation, I think the dancers were in those improvisatory tasks knowing that they couldn’t- That these were not fully trustworthy dance partners. There was an unpredictability which, of course, exists in all dance partners but the consequences are quite different. So I think- But I don’t want to take us off in a totally different direction, but when you were just talking there Paul it reminded me that for me there’s something so important about what gets read into the moments when we’re watching the dancers interacting with the robots. And that, for me, was such a beautiful part of the project and it’s dependent on so many things.
The music that we were using, the sun coming through the window, but there were definite moments when everybody’s attention would have been on one dancer with a Franka arm, and then the sense of connection, the atmosphere in the room was really- People’s reaction to it was really clear. There was a kind of holding of breath, a moment of silence, because we’re being moved by whatever it is we think we’re seeing, whatever we’re reading into that, whatever intimacy or- So I think we’re reading trust into that, into the watching of it, which is fascinating to me.
Sean: One thing I was wondering, having heard this conversation, is: is there another scenario in dance where something is unfamiliar- perhaps a novice comes in, or there are props you have to deal with, or ice dancing- where some elements are completely unknown to a dancer? How do you approach that?
Sarah: It may be worth just saying that whilst the dancers were given, if you like, open scores to work with- there was a sort of guide for what they might be exploring- all of the work was improvisational. So in a sense that’s constantly thinking in the moment about what might be the unexpected. And it may be the unexpected because something happens with the robot, something happens externally, and the same thing happens within dance performance where it’s essentially improvised: it’s always about dealing with the accident, dealing with the unexpected. Of course, accidents come in different severities, you know, the scenery falls down, or something really does challenge those in the moment- somebody falls down, the lift doesn’t work. I remember years ago watching a wonderful performance where one of the male dancers’ trousers split down the back and it did take attention, of course, away from the beauty of the dance, just watching how much of his anatomy was going to be revealed. But these are things that dancers deal with all the time. The potential for the unexpected, the potential for the accident, the potential for the unplanned. And so I think we were deliberately working within that space of improvisation so there was going to be the space for the unexpected, the unplanned to be part of that process, and as Kate was saying, I think that’s where you read trust perhaps most clearly because of that. And of course, you know, our dancers are experts. They’re experts in dealing with the unexpected. But there’s a lot of potential unexpected in those scenarios. So I think we do see trust emerging in really interesting and different ways, and I think also it tells us something about not only what we read as trust but how we experience trust in ourselves.
And it’s a little bit like the sensorial, you know, you learn something about your own sensorial experiences because of what’s being transmitted or what you feel is being transferred from what you’re seeing.
[00:40:02]
Kate: It’s reminding me, Sarah, of a moment that I think we’ll all re- Well, several moments, but with Kat in particular. The Franka arms- I think Paul mentioned earlier their sensitivity to a certain amount of weight, so that they would stop, they would shut down and have to reset- Paul, is that the right word? And that, I think, is a really- That’s an interesting example of thinking about trust, because in that moment, everybody in the room is observing that dancer and the Franka arm, and we’d set it up, rightly or wrongly, or whatever decision was being made, we’d set it up in a somewhat performative context. There’s a moment where we say go, we put the music on, and we know there’ll be a moment of stopping. So at that moment, the Franka arm and the dancer are the performers and we are the audience, and I think there’s something that I read into the moment where the Franka arm stops: there is a breakdown in that trust between the dancer and the Franka arm, of like, everyone’s watching us and you’ve just stopped. As there would be in a live moment. Because actually, I suppose, if you were dancing with another human body, we have a fundamental sense of trust that they won’t just stop. But I think that- There’s something- Again I’m just reiterating Sarah’s point, I think, about how we read trust into what we observe.
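For readers curious about the shut-down behaviour Kate describes, here is a minimal sketch of that kind of force-limit safety cut-out. The threshold, the class, and the reset flow are all assumptions for illustration- the real Franka arm’s reflex and recovery system is configured through its own interface and works differently:

```python
# Illustrative only: a toy model of "too much weight -> stop and reset".
FORCE_LIMIT_N = 20.0  # made-up threshold, not the arm's real limit

class SafetyMonitor:
    def __init__(self, limit=FORCE_LIMIT_N):
        self.limit = limit
        self.stopped = False

    def update(self, sensed_force):
        """Call each control tick with the estimated external force (N).

        Returns True while the arm may keep moving; once the limit is
        exceeded it latches into a stopped state mid-dance.
        """
        if sensed_force > self.limit:
            self.stopped = True
        return not self.stopped

    def reset(self):
        """A human must explicitly reset before the dance can resume."""
        self.stopped = False

monitor = SafetyMonitor()
assert monitor.update(5.0)       # a gentle touch: keep dancing
assert not monitor.update(35.0)  # a lean with real weight: shut down
monitor.reset()
assert monitor.update(5.0)       # after the reset, the duet continues
```

The latching behaviour is what makes the moment Kate describes so visible to an audience: the arm does not slow down gracefully, it simply stops until someone intervenes.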
Paul: I thought it might be a good idea just to provide a little bit of context for what we mean when we’re talking about dancing with the Franka arms, and give an example of one of the scenarios we set up with them. So the Franka arms are a matched pair of seven-jointed robot arms with pincers on the end, and thanks to some work by a PhD student a couple of years ago, one of those Franka arms can mirror the movements of the other. So if I grab the gripper on the end of the Franka arm and I move it somewhere, the other arm can simultaneously move in exactly the same way. Now that’s really interesting because actually it’s not mirrored. We use this term mirrored but they’re not mirroring, they’re duplicating. So when you put them facing each other, if I move my arm to the left, the other one moves to the right. Immediately interesting. So what we did was we had one dancer who was the control dancer, the robot wrangler, really, taking the role of technician here, moving the first arm around. The second arm was duplicating that set of movements and then the second dancer was dancing with the second arm. So what would happen is you’d have the dancers effectively dancing with each other but through the medium of these remote arms, and it led to some extraordinarily beautiful moments. Of course, hovering over them were the two of us with the big red buttons to stop them, because, you know, terrifying. But we’d have the sun streaming in behind us- We’re very excited to be able to share those videos as and when we can.
And so you would get these lovely moments of communication. When I was talking about those cues not being there- because the robot would just move and it can’t tell you it’s going to move- well, when you’ve got a human doing that control, a human dancer, in fact, doing that control, they do look at each other and they do observe and they do make some cues, and one of the things we’re going to be doing in our next workshop activity is really looking in lots of detail into those videos and looking at how dancers might be communicating with each other. And there were some moments we’d see dancers with their eyes shut, dancing with the robots, totally in the moment. And some moments where they’re very much looking at each other. Some moments where the wrangler is very much looking at the performing dancer- even though they’re both performing, we don’t like to think about them in that way, of one person setting up the performance and the other person doing the performance. I’m not quite sure about that. Again, the language is really complicated here. But- So that’s been, I think, our primary set-up: after working with lots of different robots, this connected pair of robots is the one we settled on, and we’re looking at all the myriad complexities around that, because as academics you look at any tiny thing and it suddenly becomes unbelievably complicated and exciting and interesting, and that’s kind of where we’re going. I just wanted to flag one more point about trusting autonomous or incompetent systems. I like to think that we, as the computer science academics and engineers, joining in with the dancing at the beginning of the sessions was a good warm-up for the dancers in how to deal with slightly incompetent agents that will do things you don’t expect them to do.
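The “duplicating, not mirroring” distinction Paul draws can be sketched in a few lines. Everything here is illustrative- the joint values, and the choice of which joints a true mirror would negate (which depends on how the arms are mounted), are assumptions, not the project’s code:

```python
# Toy sketch of the paired-arm behaviour Paul describes: the follower
# arm copies the leader's seven joint angles verbatim, so when the two
# arms face each other, a leftward move on the leader appears as a
# rightward move on the follower - duplication, not mirroring.

NUM_JOINTS = 7

def duplicate(leader_joints):
    """Follower repeats the leader's joint angles exactly (what the
    project's setup does, per Paul's description)."""
    return list(leader_joints)

def mirror(leader_joints, negated_joints={1, 3, 5, 7}):
    """A true mirror would negate the rotation of some joints; which
    ones is a guess here, as it depends on the arms' mounting."""
    return [-a if j + 1 in negated_joints else a
            for j, a in enumerate(leader_joints)]

leader = [0.1, -0.4, 0.0, 0.8, 0.0, 1.2, 0.3]  # radians, invented
assert duplicate(leader) == leader   # identical motion, reversed face-to-face
assert mirror(leader)[0] == -0.1     # negated base rotation looks mirror-like
```

Run each control tick, `duplicate` is what lets one dancer move the wrangled arm and have the remote arm repeat it for their partner.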
Sean: Conscious as I am that we’re running a little bit low on time, if anybody wants to check out this project or is interested at all, in the show notes there’ll be a link to the project page and keep an eye out for those videos because I promise you, they’re well worth watching. It just remains for me to thank everybody for joining us today. So Kate, thank you very much for joining us today.
Kate: Thanks.
Sean: Sarah thank you for joining us.
Sarah: Thank you, it’s been a pleasure.
Sean: And Paul, thank you.
Paul: Thanks Sean.
Sean: If you want to get in touch with us here at the Living With AI podcast, you can visit the TAS Hub website at tas.ac.uk, where you can also find out more about the Trustworthy Autonomous Systems Hub. The Living With AI podcast is a production of the Trustworthy Autonomous Systems Hub, audio engineering was by Boardie Ltd and it was presented by me, Sean Riley.
[00:45:54]