
One of the things that amuses me is trying to find racist/sexist Google search results. Here are a few:

I remember a while back Google got flak because the image search for "scientist" was almost entirely famous African American scientists. That's now changed and shows stock images of (mostly white) people in lab coats.

"Three black teenagers" shows mostly groups of mugshots.

The word "Brazilian" shows hot, almost nude women. "German" shows the flag. "Portuguese" shows maps, flags, and a lot of normal looking people. "Hispanic" all pictures are normal looking people.




Seeing images that would be 'racist' or 'sexist' is reflective of you, not the results. For instance, if you search for 'white man and white woman' you'll find almost exclusively pictures of interracial couples. Is it some conspiracy to push interracial relationships onto people? People of a different bias would say so, and it's equally ridiculous. In reality, the simple matter is that Google's search is still extremely primitive and the results are mediocre at best. So you can easily break the search when looking for anything that can't be trivially mapped to direct text, such as Justin Bieber or Abraham Lincoln.

For instance, search for 'green circle' - okay, you get mostly green circles. Now search for 'green circle with red line' and the results are completely nonsensical. The huge leap forward in search engines was being able to avoid returning hardcore porn when searching for Abraham Lincoln. But in spite of tens of thousands of engineers, hundreds of billions of dollars in revenue, and all sorts of fancy declarations of ultra-sophisticated AI solving every problem under the sun, we really haven't moved that far beyond that early milestone.


Yeah, I didn't mean to suggest I think the AI/search results are actually racist/sexist. If I really believed that, I wouldn't find it amusing. As you suggest, it's an amusing anecdote that shows how much further we have to go in getting ML/AI/search right.


'Brazilian' has other meanings you may be unaware of... namely, it's the name of a type of bikini wax. Almost-nude women are entirely expected in this case. Just another example of how complex these things are, linguistically and culturally.


I would recommend checking out the Google image search for "Brazilian wax" to see what comes up for that. It's not a bunch of hot models in bikinis.


I would recommend looking into why people do Brazilian waxing, and particularly how it relates to bikinis.


I'm not suggesting that models are an illegitimate way of representing the word "Brazilian." There are normal Brazilian people, Brazilian monuments, maybe the flag, the relatively "clinical" pictures that come up for "Brazilian wax", and of course models. The fact that all of the results for "Brazilian" fall into only one of those categories shows a bias that I find amusing.



