There are some differences, though, beyond the name and the types: `timespec_get` is only required to support `TIME_UTC`, and its specification is not as detailed.
Not about C, but IMO the way that Go handles time is very convenient. Time zones are pretty well-encapsulated in the Time struct and that makes for really nice function interfaces. Also, there's this gem:
`
These three considerations—choose an epoch as early as possible, that
uses a year equal to 1 mod 400, and that is no more than 2^63 seconds
earlier than 1970—bring us to the year -292277022399. We refer to
this year as the absolute zero year, and to times measured as a uint64
seconds since this year as absolute times.
`
I like the higher resolution of Windows' FILETIME epoch better (100-nanosecond intervals since 1601; treated as signed, that's roughly 29,000 years in each direction). I'd rather have the resolution than be able to reference years the universe didn't even exist in and only be able to do it in seconds.
The problem with a timescale that incorporates leap seconds is that you can't compute timestamps in the future. Leap seconds happen haphazardly, announced only about six months in advance, and so can't be predicted.
POSIX defines a day as precisely 86400 "seconds". That makes date-time calculations incredibly easy. Then, using a database of historic leap seconds, you can convert to UTC, TAI, etc quite simply, albeit without being able to pinpoint the precise UTC second during days with a leap second. Also, your _future_ POSIX timestamps will always remain valid, regardless of how many leaps occur between now and the future timestamp.
Of course, POSIX timestamps are really only useful for civil date-time applications. (Which fortunately is pretty much all anybody cares about.) A POSIX "second" isn't the same as the SI second. If that doesn't cut it for you, then you really need to use TAI (UTC is kind of a half-measure). But then you effectively forgo easy date-time manipulation and display, particularly for future dates.
Admittedly, the Go code is a little misleading. Their calculation of UTC is imprecise, and can be off by a second here or there, although the day, month, and year will always be correct. But the same holds for Unix and, AFAIU, Windows as well.
Edit: I can see it matches the right value but I can't figure out why. There have been 25 leap seconds but there's a 35 second offset...
Edit 2: Apparently UTC was steered by rate adjustments rather than steps before 1972, so the two clocks drifted apart for a decade, and the accumulated difference was set to exactly 10 seconds when leap seconds were introduced in 1972. 25 leaps since then makes 35.