"with no racial bias"

Discussion in 'Science and Nature' started by Vee, Jun 25, 2020.

  1.
    A US university's claim it can use facial recognition to "predict criminality" has renewed
    debate over racial bias in technology.


    Harrisburg University researchers said their software "can predict if someone is a criminal,
    based solely on a picture of their face".

    The software "is intended to help law enforcement prevent crime", it said.

    But 1,700 academics have signed an open letter demanding that the research remain unpublished.

    One Harrisburg research member, a former police officer, wrote: "Identifying the criminality
    of [a] person from their facial image will enable a significant advantage for
    law-enforcement agencies and other intelligence agencies to prevent crime from occurring."

    The researchers claimed their software operates "with no racial bias".

    But the organisers of the open letter, the Coalition for Critical Technology, said: "Such claims
    are based on unsound scientific premises, research, and methods, which numerous studies
    spanning our respective disciplines have debunked over the years."

    The group points to "countless studies" suggesting people belonging to some ethnic
    minorities are treated more harshly in the criminal justice system, distorting the
    data on what a criminal supposedly "looks like".

    University of Cambridge computer-science researcher Krittika D'Silva, commenting on the
    controversy, said: "It is irresponsible for anyone to think they can predict criminality
    based solely on a picture of a person's face.

    "The implications of this are that crime 'prediction' software can do serious harm
    - and it is important that researchers and policymakers take these issues seriously.

    "Numerous studies have shown that machine-learning algorithms, in particular
    face-recognition software, have racial, gendered, and age biases,"
    she said, such as a 2019 study indicating facial-recognition works poorly
    on women and older and black or Asian people.

    In the past week, one example of such a flaw went viral online,
    when an AI upscaler that "depixels" faces turned former
    US President Barack Obama white in the process.

    The upscaler itself simply invents a new face based on the initial pixelated photo
    - it does not aim for a faithful recreation of the real person.

    But the team behind the project, Pulse, have since amended their paper to say
    it may "illuminate some biases" in one of the tools they use to generate the faces.
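
    For the technically curious: Pulse works by searching a face generator's latent
    space for any high-resolution face that, once pixelated, matches the input - so
    the output reflects whatever faces the generator learned from its training data,
    not the actual person. Below is a rough sketch of that latent-search idea, using
    a tiny random stand-in network in place of the real StyleGAN generator Pulse
    relies on; the sizes, learning rate, and step count are all illustrative, not
    taken from the Pulse paper.

[CODE]
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in "generator": latent vector -> flat 32x32 grayscale face.
# Pulse uses StyleGAN here; this random net is purely illustrative.
generator = torch.nn.Sequential(
    torch.nn.Linear(64, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 32 * 32),
    torch.nn.Sigmoid(),
)

def pixelate(flat_img):
    """Average-pool a flat 32x32 image down to 8x8, mimicking pixelation."""
    return F.avg_pool2d(flat_img.view(1, 1, 32, 32), 4).flatten()

# Pretend this is the pixelated input photo (8x8 = 64 grey values).
target_lowres = torch.rand(64)

# Search the latent space for a face whose pixelated version matches
# the input. Nothing ties the full-resolution result to the real person.
z = torch.randn(64, requires_grad=True)
opt = torch.optim.Adam([z], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    loss = F.mse_loss(pixelate(generator(z)), target_lowres)
    loss.backward()
    opt.step()

upscaled = generator(z).detach().view(32, 32)  # plausible, not faithful
[/CODE]

    The point for the bias debate: the "recovered" face is just whichever plausible
    face the generator produces most readily, so skews in its training data surface
    directly in the output.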

    The New York Times has also this week reported on the case of a black man
    who became the first known person to be wrongfully arrested on the basis
    of a false facial-recognition match.

    In the Harrisburg case, the university
    had said the research would appear in a book published by Springer Nature,
    whose titles include the well-regarded academic journal Nature.

    Springer, however, said the paper was "at no time" accepted for publication.
    Instead, it had been submitted to a conference whose proceedings Springer will publish -
    and it had been rejected by the time the open letter was issued.

    "[It] went through a thorough peer review process. The series editor's decision to reject
    the final paper was made on Tuesday 16 June and was officially communicated
    to the authors on Monday 22 June," the company said in a statement.

    Harrisburg University, meanwhile, took down its own press release
    "at the request of the faculty involved".

    The paper was being updated "to address concerns", it said.

    And while it supported academic freedom, research from its staff
    "does not necessarily reflect the views and goals of this university".

    The Coalition for Critical Technology organisers, meanwhile, have demanded
    "all publishers must refrain from publishing similar studies in the future".

    Guilty Here Guilty as Charged ....lol
     
    • Funny x 1
  2. Somebody call Tom Cruise.
     
    • Funny x 1
    • Winner x 1
  3. Sounds similar to the phrenologists' claims of the 1800s that they could identify criminals by studying the bumps on their skulls.
     
    • Agree x 1
  4. :lmafoe:

     
    • Agree x 2

  5. lol
     
    • Like x 1
