New York – Google’s New App Blunders By Calling Black People ‘Gorillas’


    FILE - Anil Sabharwal, Director of Product Management, gestures during the Google I/O developers conference in San Francisco, California, May 28, 2015. REUTERS/Robert Galbraith

    New York – Google’s new image-recognition program misfired badly this week by identifying two black people as gorillas, delivering a mortifying reminder that even the most intelligent machines still have a lot to learn about human sensitivity.



    The blunder surfaced in a smartphone screenshot posted online Sunday by a New York man on his Twitter account, @jackyalcine. The images showed that the recently released Google Photos app had sorted a picture of two black people into a category labeled “gorillas.”

    The accountholder used a profanity while expressing his dismay about the app likening his friend to an ape, a comparison widely regarded as a racial slur when applied to a black person.

    “We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said. “We are taking immediate action to prevent this type of result from appearing.”

    A tweet to @jackyalcine requesting an interview hadn’t received a response several hours after it was sent Thursday.

    Despite Google’s apology, the gaffe threatens to cast the Internet company in an unflattering light at a time when it and its Silicon Valley peers have already been fending off accusations of discriminatory hiring practices. Those perceptions have been fed by the composition of most technology companies’ workforces, which mostly consist of whites and Asians with a paltry few blacks and Hispanics sprinkled in.

    The mix-up also surfaced amid rising U.S. racial tensions that have been fueled by recent police killings of blacks and last month’s murder of nine black churchgoers in Charleston, South Carolina.

    Google’s error underscores the pitfalls of relying on machines to handle tedious tasks that people have typically done themselves. In this case, the Google Photos app, released in late May, uses image-recognition software to analyze pictures and sort them into a variety of categories, including places, names, activities and animals.

    When the app came out, Google executives warned it probably wouldn’t get everything right — a point that has now been hammered home. Besides mistaking humans for gorillas, the app also has been mocked for labeling some people as seals and some dogs as horses.

    “There is still clearly a lot of work to do with automatic image labeling,” Watson conceded.

    Some commentators on social media, though, wondered if the flaws in Google’s automatic-recognition software may have stemmed from its reliance on white and Asian engineers who might not be sensitive to labels that would offend black people. About 94 percent of Google’s technology workers are white or Asian and just 1 percent are black, according to the company’s latest diversity disclosures.

    Google isn’t the only company still trying to work out the bugs in its image-recognition technology.

    Shortly after Yahoo’s Flickr introduced an automated service for tagging photos in May, it fielded complaints that the service had identified black people as “apes” and “animals.” Flickr also mistakenly labeled a Nazi concentration camp a “jungle gym.”

    Google reacted swiftly to the mess created by its machines, long before the media began writing about it.

    Less than two hours after @jackyalcine posted his outrage over the gorilla label, one of Google’s top engineers had posted a response seeking access to his account to determine what went wrong. Yonatan Zunger, chief architect of Google’s social products, later tweeted: “Sheesh. High on my list of bugs you never want to see happen. Shudder.”




    11 Comments
    8 years ago

    This is the problem; it’s not anything Google can do. The system, which is good enough to group a person’s pics from baby through old age, considered these pics gorillas. No one to be angry at other than G-d

    SandmanNY
    8 years ago

    Why are they upset? Don’t they believe that the gorillas were their alter zeidys?

    fat36
    8 years ago

    I hope all those companies that dropped Trump for his remarks will now drop and stop using Google.

    chassidisheyid
    8 years ago

    To #1 Sandman
    Don’t you believe your Elter Zaide was Lavan and Terach? You don’t seem too upset about having ovdei avoida zara in your yichus. That’s probably because your remarks are not representative of ‘Am chochom Venavon’.

    MrSmith
    8 years ago

    It was a computer, not a human, that decided to put them in that category

    Rafuel
    8 years ago

    Why such a big deal? It’s an honest easy to understand error. Google and Yahoo will probably be able to refine their image recognition software reasonably soon; if not, that’s not a big deal either.

    ConcernedMember
    8 years ago

    Folks who are upset over this really do not understand how these automated systems work. It’s the same thing as pasting a long message into Google Translate and then sending it to a Hispanic person. It’s not going to make any sense to them because the machine does not understand context. It translates word for word, not sentences, paragraphs, tone, intent…

    Similarly, the photo-analysis program is told to analyze a picture and work off what it seems similar to, based on specific points in other images. It’s a machine. A machine cannot be racist. I’m sure with more data input over time they’ll work it out.

    yaakov doe
    8 years ago

    Will their self-driving cars also have errors in technology?

    WakeUp2
    8 years ago

    I’m afraid that once this error is fixed they’ll start to identify Gorillas as Black people…