
@ David
2025-05-30 23:17:25
lexicographic ordering with monotonically decreasing precision (year, then month, day, hour, and so on) makes the most sense for dates
if you store timestamps the way Unix does (a plain count of seconds), you can literally turn them into human-readable form up to hours just by converting to base 60 (writing each base-60 digit as two decimal digits)
then it gets messy for days, weeks, months and years, but whatcha gonna do
also, the subdivided units of the second are based on thousands (milli, micro, nano), and that's ugly too
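to make the base-60 point concrete, here's a minimal Python sketch (mine, not anything from this thread): repeated division by 60 peels off seconds, minutes and hours directly, with each base-60 digit printed as two decimal digits, and the divisor stops being 60 once you hit days:

```python
import time

def split_base_60(unix_seconds: int):
    """Peel the base-60 'digits' off a Unix second count:
    seconds, minutes and hours fall out directly; the next divisor
    is 24 (days), where the nice pattern breaks."""
    seconds = unix_seconds % 60
    minutes = (unix_seconds // 60) % 60
    hours = (unix_seconds // 3600) % 24   # 24 here, not 60: the pattern already bends
    days = unix_seconds // 86400          # and weeks/months/years only get messier
    return days, hours, minutes, seconds

d, h, m, s = split_base_60(int(time.time()))
# each base-60 digit written as two decimal digits, as described above
print(f"{d} days since the epoch, {h:02d}:{m:02d}:{s:02d} UTC")
```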
anyway, days are not exactly 24 hours (right now, for the first time in a long while, they're actually running slightly under 24 hours), and years vary too: the calendar (tropical) year drifts against the sidereal year, the one measured against the star constellations, in a famous retrograde cycle known as the precession of the equinoxes
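rough numbers for that drift, using assumed textbook year lengths (not figures from this chat): the tropical year comes out about 20 minutes shorter than the sidereal year, which compounds into a full precession cycle of roughly 26,000 years:

```python
# assumed textbook values, not from the chat
SIDEREAL_YEAR = 365.25636   # days, Earth's orbit measured against the stars
TROPICAL_YEAR = 365.24219   # days, equinox to equinox (what calendars track)

diff_days = SIDEREAL_YEAR - TROPICAL_YEAR
print(f"tropical year is shorter by ~{diff_days * 24 * 60:.1f} minutes")
# the equinoxes slide backwards through the constellations one full circle in:
print(f"precession cycle: ~{SIDEREAL_YEAR / diff_days:,.0f} years")
```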
a random side fact about calendars: the book of Genesis declares that 360-day years were prescribed by the Lord, which is bullshit; the book of Enoch, meanwhile, goes into great detail specifying that seasons are exactly 91 days long and a year is 364
still wrong, but closer. I think the only archaic calendar that doesn't drift out of sync over several thousand years is the Chinese lunisolar calendar
and then... we could start talking about the problem of gravity and the speed of time, haha.
computer time representations, i.e. Unix time, are a pretty good universal standard though: just a count of seconds since the 1970 epoch in UTC, and UTC itself is kept in step (via leap seconds) with the Earth's rotation as measured against distant stars
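as a quick illustration of how that plays out in code (a hypothetical example, standard library only): Unix time is just the second count, and the big-endian ISO 8601 string sorts lexicographically into chronological order, which is the decreasing-precision point from the top of the message:

```python
from datetime import datetime, timezone

now = datetime.now(timezone.utc)
print(now.timestamp())   # Unix time: seconds since 1970-01-01T00:00:00Z, leap seconds ignored
print(now.isoformat())   # ISO 8601: year, month, day, hour... most significant unit first

# because the biggest unit comes first, a plain string sort of ISO 8601
# timestamps is already chronological order
stamps = [
    "2025-05-30T23:17:25+00:00",
    "2024-12-31T23:59:59+00:00",
    "2025-01-01T00:00:00+00:00",
]
assert sorted(stamps) == sorted(stamps, key=datetime.fromisoformat)
```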
in the actual real world though, time is relative to the local gravitational potential, so any imposed standard is really only as good as its utility for the physical region of space it governs... what time standard would you use on Mars, or in deep space? how do you even make an oscillator that's isolated from local gravity anyway?
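for a sense of scale, a back-of-the-envelope sketch using the weak-field approximation sqrt(1 - 2GM/(r c^2)) for how fast a surface clock ticks relative to one far away; the masses and radii are assumed textbook values, and it ignores the Sun's potential, orbital motion and planetary spin entirely:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def surface_clock_rate(mass_kg: float, radius_m: float) -> float:
    """Weak-field GR estimate: tick rate of a clock on a body's surface
    relative to a clock at rest far away from it."""
    return math.sqrt(1 - 2 * G * mass_kg / (radius_m * C ** 2))

YEAR = 365.25 * 86400   # seconds

for name, mass_kg, radius_m in [("Earth", 5.972e24, 6.371e6), ("Mars", 6.417e23, 3.390e6)]:
    lag_ms = (1 - surface_clock_rate(mass_kg, radius_m)) * YEAR * 1e3
    print(f"a clock on {name}'s surface lags a far-away clock by ~{lag_ms:.1f} ms per year")
```

by this crude estimate a Mars surface clock and an Earth surface clock drift apart by roughly 18 milliseconds per year from gravity alone, before you even argue about what "the" second should mean out there.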