Emma Thwaites’ view as told to Katrina Marshall.
Ethics in Artificial Intelligence appears to be having a coming-of-age moment, perhaps because it is so closely tied to the ‘come to Jesus moment’ being experienced by the rest of the world.
If we didn’t know it before, Covid-19 has taught us that human interaction is essential to wellbeing, but it has also provided many other valuable lessons. We are physically distanced but digitally more connected; we recognise who key workers truly are; and we understand more than ever what it truly means to have white privilege. These three can broadly be deemed the biggest takeaways, and the catalyst for real change beginning in 2020.
This is no less true for ethics in AI. As many commentators on the global upset sparked by the murder of George Floyd have noted, the issues being highlighted are not new. Jo Kingston said previously in her blog that empathy, compassion and social acuity are still uniquely human characteristics, and vital to the efficacy of tech. What appears to be the case now is that a perfect storm of factors has created an unprecedented awareness, one that puts a consideration of ethics right at the heart of AI; the lessons of Covid-19 and the global shift in race relations are in pole position to change our dominant paradigm.
At Allegory we are in the eye of that perfect storm, and do not take that position lightly. Consider where we have positioned ourselves both before and in the aftermath of CogX 2020. With 30,000 attendees and over 600 speakers, the conference was among the many (virtually) hosted in the white-hot aftermath of the Black Lives Matter protests. Social media’s ubiquitous, all-seeing eye was naturally trained on all-white male speaker line-ups, the low-hanging fruit; the most obvious manifestations of racist structures and white privilege. In this regard (and looking only at the vectors of diversity we can see or hear), the line-up appeared to be a much more balanced reflection of wider society than at many events. But CogX and its organisers were open to more scrutiny this year than ever before. Some participants took to social media suggesting that white presenters should give up their seats to drive the dismantling of all-white panels even further, in line with one of the oft-touted practical solutions to systemic discrimination: ‘give up your place, give a platform or make room for a marginalised group to have a seat at the table’. But in this instance that outrage lives best on social media. Our view is more nuanced.
Broadly, we’d agree, but hasten to add that we need a more finely tuned lens through which to assess so-called minorities. Without it there is a great risk that highly marginalised individuals will be excluded. On this matter we’ve said publicly that the decision is, and should be, highly personal. We cannot claim to be leaders of the majority of us if we don’t consider the plight of the least of us. There are people who have fought discrimination of other kinds to earn their place at the table: people with disabilities (seen and unseen), and with personal struggles that we can’t know. They should not be asked to step aside. That is why we have been deliberate and proactive in encouraging equality in all the work we do for our Allegory clients, from who is chosen to contribute to their events to broadening their networks. We are broadening our own recruitment processes so that we attract and hire as diverse a team as possible, checking our own privilege, building our understanding and learning from our black and ethnic minority colleagues. We recognise that the burden of learning and understanding their particular challenges is on us, not them, and we have built and accessed an ever-growing canon of resources to keep us in check. While we do not claim to have a single perfect answer to a problem that is literally hundreds of years in the making, we hope that our actions position us as trusted allies, committed to the work of diversifying tech long after the loud hashtags have stopped trending. To this end we hold ourselves publicly accountable to an ethical code of conduct.
Yet even amid this surge, we are mindful that progress has historically not been linear. It cannot be ignored that some of our industry’s influential thinkers are calling for what they refer to as ethics for urgency, especially in light of the race to find a vaccine for Covid-19. In an interview with the MIT Technology Review, Jess Whittlestone of the Leverhulme Centre for the Future of Intelligence at the University of Cambridge discusses the reactive nature of current AI ethics and the gaps in robustness its applications reveal: saving lives on the one hand, but opening the door to data breaches on the other. Elsewhere, stagnation has been cited as a problem. Tim Harford, writing in the FT, points to a stalling of innovation over the past 40 years. I find it noteworthy that even in the current climate (humanity over tech) he urges a return to the spirit of competition in innovation, which he believes will speed up the currently lethargic process of finding a vaccine for Covid-19.
It is clear that the forces that drive and evolve our industry do not operate in a vacuum, and so our outlook needs to be much broader, not least in terms of ethics. We need to be as agile in checking our ethics as we are in innovating. It goes back to the premise that great power comes with great responsibility. Professor Sir Nigel Shadbolt, interviewed at CogX, had encouraging words about keeping the data that the current pandemic has unlocked flowing and feeding into that cycle of innovation. His caution, however, was that while this shift in pace and momentum could no doubt become an ongoing force for good, we need to be checking the algorithms that then get to work on the data. This is where ethics come in. As the Open Data Institute has highlighted, tech giants have taken steps to limit the use of facial recognition software by law enforcement agencies, on the basis that the algorithms they use are biased. For an even greater emphasis on the human impact of flawed data, Rebecca Ghani and Anne Edimo have paid particular attention in their writing to the ‘countless less-visible, long-term, often-buried consequences of systemic racism in society. These impacts are felt from education and employment, through to the justice system and healthcare.’ Race disparity in data completes the loop of interconnectivity across all the areas mentioned above. It demonstrates that innovation and development cannot be advanced without the ethical checks and balances that preserve the very society AI is meant to serve and improve.
It is our belief that working in the data and technology space has never been more important, and within this sphere our roles as communicators are more critical than ever. As Nigel Shadbolt says, we need to keep the human ‘in the loop’ to tell clear, balanced stories and to ensure that everything we do truly reflects the society in which we live. As an agency, we are reminded to look deep into our business and ourselves to make sure that we are behaving ethically in everything we do, while understanding that this can sometimes present us with challenges and demand that we make difficult choices. We are confident we can make those choices and guide our clients to do the same.