    AI technologies — like police facial recognition — discriminate against people of colour

By The Conversation | August 30, 2020 | Updated: October 1, 2021

    Detroit police wrongfully arrested Robert Julian-Borchak Williams in January 2020 for a shoplifting incident that had taken place two years earlier. Even though Williams had nothing to do with the incident, facial recognition technology used by Michigan State Police “matched” his face with a grainy image obtained from an in-store surveillance video showing another African American man taking US$3,800 worth of watches.

    Two weeks later, the case was dismissed at the prosecution’s request. However, relying on the faulty match, police had already handcuffed and arrested Williams in front of his family, forced him to provide a mug shot, fingerprints and a sample of his DNA, interrogated him and imprisoned him overnight.

    Experts suggest that Williams is not alone, and that others have been subjected to similar injustices. The ongoing controversy about police use of Clearview AI certainly underscores the privacy risks posed by facial recognition technology. But it’s important to realize that not all of us bear those risks equally.

    Training racist algorithms

    Facial recognition technology that is trained on and tuned to Caucasian faces systematically misidentifies and mislabels racialized individuals: numerous studies report that facial recognition technology is “flawed and biased, with significantly higher error rates when used against people of colour.”
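Findings like these come from audits that disaggregate a system's error rates by demographic group rather than reporting a single overall accuracy figure. The sketch below shows the shape of such an audit in Python; the groups, records and resulting rates are entirely hypothetical.

```python
from collections import defaultdict

# Each record: (demographic group, system said "match", ground truth).
# Illustrative values only -- not real benchmark data.
results = [
    ("group_a", True,  True),  ("group_a", False, False),
    ("group_a", False, False), ("group_a", True,  False),
    ("group_b", True,  True),  ("group_b", True,  False),
    ("group_b", True,  False), ("group_b", False, False),
]

false_matches = defaultdict(int)    # predicted match, truly different people
non_match_pairs = defaultdict(int)  # pairs that are truly different people

for group, predicted, actual in results:
    if not actual:  # only non-matching pairs can yield a false match
        non_match_pairs[group] += 1
        if predicted:
            false_matches[group] += 1

for group in sorted(non_match_pairs):
    rate = false_matches[group] / non_match_pairs[group]
    print(f"{group}: false match rate = {rate:.0%}")
```

A single aggregate accuracy number would hide exactly the disparity this per-group breakdown exposes.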

    This undermines the individuality and humanity of racialized persons who are more likely to be misidentified as criminal. The technology — and the identification errors it makes — reflects and further entrenches long-standing social divisions that are deeply entangled with racism, sexism, homophobia, settler-colonialism and other intersecting oppressions.

Embedded video: a France24 investigation into racial bias in facial recognition technology.

    How technology categorizes users

    In his game-changing 1993 book, The Panoptic Sort, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like banking and employment.

    Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.

    Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams.

    Pre-existing bias

This algorithmic sorting infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.

    The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the structural violence perpetrated through facial recognition technology and other digital technologies that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns.

The harms of structural violence are less obvious and less direct: they injure equality-seeking groups through the systematic denial of power, resources and opportunity, while simultaneously increasing the direct risk of harm to individual members of those groups.

    Predictive policing uses algorithmic processing of historical data to predict when and where new crimes are likely to occur, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment.
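To see why this dynamic is self-reinforcing, consider a toy simulation (all numbers hypothetical, and far simpler than any real system): two neighbourhoods with identical underlying offence rates, where patrols are allocated in proportion to past recorded crime and every patrol has some chance of detecting an offence.

```python
import random

random.seed(0)

# Two neighbourhoods with the SAME underlying offence rate.
true_rate = {"A": 0.1, "B": 0.1}
# Historical records are skewed toward A (e.g. by past over-policing).
recorded = {"A": 10, "B": 5}
patrols_total = 100

for year in range(1, 6):
    total = recorded["A"] + recorded["B"]
    # Patrols are allocated in proportion to past recorded crime.
    patrols = {h: round(patrols_total * recorded[h] / total) for h in recorded}
    for hood, n in patrols.items():
        # More patrols detect more offences, although the underlying
        # offence rate is identical in both neighbourhoods.
        detected = sum(random.random() < true_rate[hood] for _ in range(n))
        recorded[hood] += detected
    print(f"year {year}: recorded={recorded} patrols={patrols}")
```

Because each detection feeds back into the historical record, the neighbourhood that starts with more recorded crime keeps attracting more patrols, and the recorded gap widens even though actual offending is identical.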

Embedded Instagram post: https://www.instagram.com/p/CDKKDxmlqEQ/

And the evidence of inequities in other sectors continues to mount. Hundreds of students in the United Kingdom protested on Aug. 16 against the disastrous results of a flawed algorithm that Ofqual, the U.K. exams regulator, used to determine which students would qualify for university. In 2019, Facebook's microtargeting ad service helped dozens of public and private sector employers exclude people from receiving job ads on the basis of age and gender. Research conducted by ProPublica has documented race-based price discrimination for online products. And search engines regularly produce racist and sexist results.

    Perpetuating oppression

    These outcomes matter because they perpetuate and deepen pre-existing inequalities based on characteristics like race, gender and age. They also matter because they deeply affect how we come to know ourselves and the world around us, sometimes by pre-selecting the information we receive in ways that reinforce stereotypical perceptions. Even technology companies themselves acknowledge the urgency of stopping algorithms from perpetuating discrimination.

To date, the success of ad hoc investigations conducted by the tech companies themselves has been inconsistent. Occasionally, corporations involved in producing discriminatory systems withdraw them from the market, such as when Clearview AI announced it would no longer offer facial recognition technology in Canada. But often such decisions result from regulatory scrutiny or public outcry only after members of equality-seeking communities have already been harmed.

    It’s time to give our regulatory institutions the tools they need to address the problem. Simple privacy protections that hinge on obtaining individual consent to enable data to be captured and repurposed by companies cannot be separated from the discriminatory outcomes of that use. This is especially true in an era when most of us (including technology companies themselves) cannot fully understand what algorithms do or why they produce specific results.

    Privacy is a human right

    Part of the solution entails breaking down the current regulatory silos that treat privacy and human rights as separate issues. Relying on a consent-based data protection model flies in the face of the basic principle that privacy and equality are both human rights that cannot be contracted away.

    Even Canada’s Digital Charter — the federal government’s latest attempt to respond to the shortcomings of the current state of the digital environment — maintains these conceptual distinctions. It treats hate and extremism, control and consent, and strong democracy as separate categories.

    To address algorithmic discrimination, we must recognize and frame both privacy and equality as human rights. And we must create an infrastructure that is equally attentive to and expert in both. Without such efforts, the glossy sheen of math and science will continue to camouflage AI’s discriminatory biases, and travesties such as that inflicted on Williams can be expected to multiply.

    • Jane Bailey is Professor of Law and Co-Leader of The eQuality Project, L’Université d’Ottawa/University of Ottawa
    • Jacquelyn Burkell is Associate Vice-President, Research, Western University
    • Valerie Steeves is Full Professor, L’Université d’Ottawa/University of Ottawa
    • This article first appeared on The Conversation
