Hacker News new | past | comments | ask | show | jobs | submit login

A) this is a buyer’s agent, not a seller’s agent.

B) I don’t think you tell the agent your race, and traditional bias protections (namely doing the whole process without meeting in person) would still be in place AFAIU.




A) Correct.

B) Yes, I think the AI agent actually can more easily ignore your race / age / gender because it is not programmed to go look up those things and people don't text their agent what they are - hard for the LLM to know these things and start to bias


> I think the AI agent actually can more easily ignore your race / age / gender

I challenge you to support that idea in any way.

> because it is not programmed to go look up those things

The concern is what it's trained on. How have you curated your data to avoid introducing biases based on race/gender/etc?

> people don't text their agent what they are - hard for the LLM to know these things and start to bias

The exact opposite. The way people communicate, especially via text, is heavily dependent on their background and is full of social signals. LLMs are trained on data sets that are often annotated with that kind of information.

How would you affirmatively prove that your LLM model wasn't making inferences about those categories that influence it's output?




Consider applying for YC's Spring batch! Applications are open till Feb 11.

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: