
> The CaaS form already asks applicants for some personal information, such as LinkedIn profiles and educational background. Carroll is resistant to the idea of using such data to glean insights about businesses. Because well-educated white men have the easiest time raising money today, any model using demographics to predict success would favor them—the opposite of her intention. Still, Social Capital is experimenting with building personalized models anyway, though it hasn’t implemented any yet.

They specifically talk about this risk: the current models only use business data, and the article discusses the dangers of using personal data. It seems to me that data like customer loyalty and cash on hand are fine to use and don't carry any direct gender or ethnicity bias.

My point is that a number of ostensibly neutral features can be strongly correlated with gender or other social factors, so training on them can still bake that bias into the algorithm.

The following article discusses this in the context of policing, for example:

https://boingboing.net/2015/12/02/racist-algorithms-how-big-...
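
A minimal sketch of the proxy problem (synthetic data and hypothetical feature names, not anything from Social Capital's actual model): even when the protected attribute is withheld from training, a model can recover it through a correlated "neutral" feature.

  # Minimal sketch, synthetic data: a model trained without a protected
  # attribute can still learn it through a correlated proxy feature.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(0)
  n = 10_000

  # Protected attribute (e.g. gender), never shown to the model.
  protected = rng.integers(0, 2, size=n)

  # A "neutral" feature that is in fact strongly correlated with the
  # protected attribute (a proxy), plus a genuinely independent feature.
  proxy = protected + rng.normal(0.0, 0.3, size=n)
  independent = rng.normal(0.0, 1.0, size=n)

  # Historical outcomes are biased: the protected group was favored.
  logits = 2.0 * protected + 0.5 * independent
  outcome = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

  # Train only on the "neutral" features.
  X = np.column_stack([proxy, independent])
  model = LogisticRegression().fit(X, outcome)

  # The model's scores still track the protected attribute via the proxy.
  scores = model.predict_proba(X)[:, 1]
  print("mean score, group 0:", scores[protected == 0].mean())
  print("mean score, group 1:", scores[protected == 1].mean())

With these made-up numbers the two groups end up with clearly different mean scores even though the protected attribute was never a model input; that is the kind of indirect bias being discussed.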


My point is that the article doesn't ignore it; they are specifically targeting data points that are less likely to carry gender/race bias. You presented your comment as though it were a point of view not covered in the article.
