• UnderpantsWeevil@lemmy.world
    10 months ago

    These algorithms already have a comical bias towards the folks contracting their use.

    Case in point, the UK Home Office recently contracted with an AI firm to rapidly parse through large backlogs of digital information.

    The Guardian has uncovered evidence that some of the tools in use have the potential to produce discriminatory results, such as:

    An algorithm used by the Department for Work and Pensions (DWP), which an MP believes mistakenly led to dozens of people having their benefits removed.

    A facial recognition tool used by the Metropolitan Police, which has been found to make more mistakes recognising black faces than white ones under certain settings.

    An algorithm used by the Home Office to flag sham marriages, which has been disproportionately selecting people of certain nationalities.

    Monopoly was a lie. You’re never going to get that Bank Error in Your Favor; it doesn’t happen. The House (or, in this case, the Home Office) always wins when these digital tools are employed, because the money for the tool is predicated on these agencies clipping benefits and extracting additional fines from the public at large.