To implement the synchronizer token pattern you usually store the randomly generated CSRF token in the server-side session so you can validate it on the subsequent request, even if you generate a fresh token for each form.
You can also handle this statelessly, without a session, using encryption or an HMAC, but then you have to manage secret keys and get the crypto right.
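As a rough illustration of the stateless variant, here's a minimal HMAC-based sketch. Everything here is illustrative: the key would really come from configuration, and binding the token to a session identifier (or user ID) is one common design choice, not the only one.

```python
import hashlib
import hmac
import secrets

# Illustrative only: in practice this would be a persistent, configured secret.
SECRET_KEY = secrets.token_bytes(32)

def generate_token(session_id: str) -> str:
    """Mint a stateless CSRF token bound to a session identifier."""
    nonce = secrets.token_hex(16)
    mac = hmac.new(SECRET_KEY, f"{session_id}!{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{nonce}.{mac}"

def validate_token(session_id: str, token: str) -> bool:
    """Recompute the MAC and compare in constant time; no server state needed."""
    try:
        nonce, mac = token.split(".", 1)
    except ValueError:
        return False
    expected = hmac.new(SECRET_KEY, f"{session_id}!{nonce}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)
```

The server stores nothing; validity follows entirely from the MAC, which is exactly why key management becomes the thing you must not screw up. A real version would typically also embed a timestamp so tokens expire.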
I think the parent was referring to the session cookie. The linked article mentions putting the generated token into the server-side user session and validating it on the next request; you'd need a session cookie for that.
Session cookies persist for the length of the session. That's still too long for a CSRF token: you should generate a new one on every request that needs a token in the response.
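A per-request, single-use synchronizer token might look like the sketch below. The `session` dict stands in for whatever server-side session store your framework gives you; the function names are made up for illustration.

```python
import secrets

def issue_token(session: dict) -> str:
    """Mint a fresh token for this response, overwriting any previous one."""
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token  # old token is invalidated by the overwrite
    return token  # embed this in the rendered form

def check_token(session: dict, submitted: str) -> bool:
    """Consume the stored token (single-use) and compare in constant time."""
    expected = session.pop("csrf_token", None)
    return expected is not None and secrets.compare_digest(expected, submitted)
```

Because `check_token` pops the stored value, a token can be used at most once, so its effective lifetime is one request/response round trip rather than the whole session.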
A good additional measure is a scheduled brownout. Turn off the API for a couple of hours or a day so consumers notice, then turn it back on for a few weeks to give them time to migrate.
Google did this with their old Helm chart repository.
One solution I’ve seen posted here (can’t remember the link) is to put a sleep in some call and step it up every day/week/month until retirement.
That way when the application slows down, people complain, a story is created to figure out why, and the answer will be the library is deprecated and needs to be migrated.
The calls can always be made, they just get more expensive.
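A ramping delay like that could be sketched as follows. The dates and step size here are made-up parameters; in a real service the sleep would wrap the deprecated handler.

```python
import time
from datetime import date

# Illustrative parameters: when deprecation started, and how much extra
# latency to add per elapsed week.
DEPRECATED_ON = date(2023, 1, 1)
STEP_SECONDS = 0.1

def deprecation_delay(today=None):
    """Delay grows by STEP_SECONDS for each full week since deprecation."""
    today = today or date.today()
    weeks_elapsed = max(0, (today - DEPRECATED_ON).days // 7)
    return weeks_elapsed * STEP_SECONDS

def legacy_endpoint(handler, request):
    # Calls always succeed -- they just get slower and slower over time.
    time.sleep(deprecation_delay())
    return handler(request)
```

Monotonically increasing latency shows up in consumers' dashboards long before a hard cutoff would, which is the whole point: the complaint arrives as a performance ticket, and the investigation surfaces the deprecation.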
Naturally, you'd combine any planned API delay or outage with conventional deprecation steps: updating documentation well in advance; posting to your blog, Twitter, and mailing list; e-mailing every identifiable user of the deprecated API; and having your account managers reach out to paying customers who use it.