
I would say robots.txt is meant to gate access for interactions initiated by an automated process (i.e., automatic crawling). Since fetching a site through a language model is initiated manually (by a human request), it doesn't make sense to me to use robots.txt to block it.

If you want to keep the information you serve from passing through ClosedAI's servers, block their IPs instead of relying on robots.txt.
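For example, here's a minimal sketch of IP-based blocking as WSGI middleware in Python. The CIDR ranges below are placeholders (RFC 5737 test networks), not OpenAI's actual published ranges, which you'd have to look up and keep current:

    import ipaddress

    # Placeholder ranges, NOT OpenAI's real ones.
    BLOCKED_NETWORKS = [
        ipaddress.ip_network("203.0.113.0/24"),
        ipaddress.ip_network("198.51.100.0/24"),
    ]

    def is_blocked(remote_addr):
        # True if the client address falls inside any blocked range.
        try:
            addr = ipaddress.ip_address(remote_addr)
        except ValueError:
            return False
        return any(addr in net for net in BLOCKED_NETWORKS)

    class IPBlockMiddleware:
        # Rejects requests from blocked ranges before they reach the app.
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            if is_blocked(environ.get("REMOTE_ADDR", "")):
                start_response("403 Forbidden",
                               [("Content-Type", "text/plain")])
                return [b"Forbidden\n"]
            return self.app(environ, start_response)

In practice you'd do this at the firewall or reverse proxy rather than in application code, but the point is the same: the decision keys on who is actually connecting, not on a voluntary convention like robots.txt.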



