And what happens to the 1% of cases where this fails? At the moment the responsibility is on the person. If I incorrectly book my flight for date X, receive the itinerary, and realise I chose the wrong month - then damn, I made a mistake and will have to rectify it.
An LLM could organise flights with a lower error rate; however, when it goes wrong, what is the recourse? I imagine it's anger and a self-promise never to use AI for this again.
*If you're saying that the AI just supplies suggestions, then maybe it's useful. Though wouldn't people still be double-checking everything anyway? I'm not sure how much effort this actually saves.