Call for accountability of deployed AI services

Edit: I want to clarify that during my tenure at AWS, I did not work on face recognition, nor was I involved in any of the decisions to sell it to law enforcement. I added this note after a journalist asked me about it.
About two weeks ago, Timnit Gebru and Margaret (Meg) Mitchell approached me with a request to sign a letter outlining scientific arguments that counter the claims made by Amazon representatives about their face recognition service, and calling on the company to stop selling it to law enforcement.
I am one of 26 signatories. This includes many veteran leaders in the community, among them Yoshua Bengio, one of this year's Turing Award winners. So I am in good company 😉 In addition, numerous other groups have called for Amazon to stop selling the service to the police.
Joy Buolamwini and Inioluwa Deborah Raji have done impressively in-depth research on this topic, which you can check out at the Gender Shades website. All the credit goes to them for laying this strong foundation.
When I read the letter I was happy to see careful factual arguments being made that are grounded in science. My hope is that the letter opens up a public dialogue on how we can evaluate face recognition (and other AI services), both in terms of metrics, but also the social context in which it is being deployed.
I am a former member of the AWS AI group, and I want to clarify that I have the utmost admiration for how AWS has transformed the developer ecosystem. AWS services have removed a lot of "heavy lifting" in DevOps and democratized software development. I hope that this letter leads to productive dialogue and that we can collectively work towards enhancing the beneficial uses of AI.
Government regulation can only come about once we have laid out technical frameworks for evaluating these systems. The Gender Shades paper shows how our current evaluation metrics are broken, starting with imbalanced training data. We need a variety of ways to evaluate these systems, and we need accountability from currently deployed AI services. In short, regulation is only part of the answer, but it is badly needed.
Update: AWS released an FAQ outlining guidelines for how face recognition should be used. Unfortunately, this does not solve anything.
