• azertyfun@sh.itjust.works · 10 months ago

        EDIT: NVM I’m a goddamn idiot, Unix Time’s handling of leap seconds is moronic and makes everything I said below wrong.


        Unix Time is an appropriate tool for measuring time intervals, since it does not factor in leap seconds or any astronomical phenomenon and is therefore monotonically increasing… If T1 and/or T2 are given in another format, then it can get very hairy to do the conversion to an epoch time like Unix time, sure.
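
        For illustration, a minimal C sketch of that conversion, assuming the input is a UTC timestamp in one known format (the format string and values here are made up; strptime() is POSIX, and timegm() is a common glibc/BSD extension rather than standard C):

            #define _GNU_SOURCE            /* expose strptime() and timegm() on glibc */
            #include <stdio.h>
            #include <time.h>

            /* Parse a UTC timestamp like "2024-03-01 12:00:00" into epoch seconds.
             * Returns (time_t)-1 on parse failure. */
            static time_t parse_utc(const char *s) {
                struct tm tm = {0};
                if (strptime(s, "%Y-%m-%d %H:%M:%S", &tm) == NULL)
                    return (time_t)-1;
                return timegm(&tm);        /* like mktime(), but treats tm as UTC */
            }

            int main(void) {
                time_t t1 = parse_utc("2024-03-01 12:00:00");
                time_t t2 = parse_utc("2024-03-01 12:05:30");
                if (t1 == (time_t)-1 || t2 == (time_t)-1)
                    return 1;
                printf("interval: %.0f seconds\n", difftime(t2, t1));   /* 330 */
                return 0;
            }

        The hairy part is everything this sketch assumes away: local time zones, DST transitions, and ambiguous or non-UTC inputs.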

        The alt-text pokes fun at the fact that, due to relativity, time moves at different rates at astronomical scales. However, I would argue that this is irrelevant, since the comic itself talks about “Anyone who’s worked on datetime systems”, vanishingly few of whom ever have to account for relativity (the only non-research use case being GPS, AFAIK).
        While the comic is funny, if:

        • Your time source is NTP or GPS
        • “event 1” and “event 2” both happen on Earth
        • You’re reasonably confident that the system clock is functioning properly

        (All of which are reasonable assumptions for any real use case)
        Then ((time_t) t2) - ((time_t) t1) is accurate well within the error margin of the available tools. Expanding the problem space to account for relativistic phenomena would be a mistake in almost every case, and you’re not getting the job.
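
        As a minimal sketch of that subtraction (the sleep() is a placeholder for whatever the two hypothetical events actually are):

            #include <stdio.h>
            #include <time.h>
            #include <unistd.h>

            int main(void) {
                /* Assumes the system clock is NTP-disciplined and behaving. */
                time_t t1 = time(NULL);   /* event 1 */
                sleep(2);                 /* ...something happens on Earth... */
                time_t t2 = time(NULL);   /* event 2 */

                /* Plain subtraction of epoch seconds: no calendar math,
                 * no time zones, and certainly no relativity. */
                printf("elapsed: %.0f s\n", difftime(t2, t1));
                return 0;
            }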

    • AVincentInSpace@pawb.social · 8 months ago

      If the person who wrote all of these could provide examples of why literally any of them are wrong, instead of just resorting to the standard “falsehoods programmers believe” fare of “you believe this? Ha. It is wrong. Therefore I am smarter than you”, I would very much appreciate it.

    • CanadaPlus@futurology.today · 10 months ago

      1. Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.
      2. The system clock will never be set to a time that is in the distant past or the far future.

      Does this come up? I feel like if you’re doing retrocomputing you assume a certain level of responsibility for your software breaking.

      1. Ok, but the duration of one minute on the system clock will be pretty close to the duration of one minute on most other clocks.
      2. Fine, but the duration of one minute on the system clock would never be more than an hour.
      3. You can’t be serious.

      You can’t be, can you? Ditto on that being the user’s problem. My thing also isn’t portable onto a Zuse Z2 or a billiard ball computer you built in your garage.

      There’s some weird shit in the crowdsourced ones. I don’t even know where to start.

      • Redjard@lemmy.dbzer0.com · 10 months ago

        You’ve heard of standby and the like? What do you reckon that does to programs calculating with time at that exact moment?

        • CanadaPlus@futurology.today · 10 months ago

          I… Actually don’t know.

          The real time clock continues to move in real time under reasonable conditions. If it’s in a weird year it’s either because you’ve decided to run a disk you found in a cave, left by the Ancient Ones, or you’re cheating at Animal Crossing.

          I’m a little unclear on how the rest of the clocks typically work together. If your program is drawing from one that gets stopped for a while, I guess yeah, a minute could totally be weeks long, and I’m in the picture as a falsehood believer.
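
          For what it’s worth, on Linux the answer depends on which clock the program reads. A rough sketch (Linux-specific; CLOCK_MONOTONIC normally does not advance while the machine is suspended, CLOCK_BOOTTIME does, and CLOCK_REALTIME is wall-clock time):

              #define _GNU_SOURCE
              #include <stdio.h>
              #include <time.h>

              /* Seconds elapsed on a given clock since a starting timespec. */
              static double elapsed(clockid_t id, const struct timespec *start) {
                  struct timespec now;
                  clock_gettime(id, &now);
                  return (now.tv_sec - start->tv_sec) + (now.tv_nsec - start->tv_nsec) / 1e9;
              }

              int main(void) {
                  struct timespec real, boot, mono;
                  clock_gettime(CLOCK_REALTIME,  &real);
                  clock_gettime(CLOCK_BOOTTIME,  &boot);
                  clock_gettime(CLOCK_MONOTONIC, &mono);

                  puts("Suspend the machine for a bit, then press Enter...");
                  getchar();

                  /* After a suspend, REALTIME and BOOTTIME include the nap;
                   * MONOTONIC does not - so "one minute" depends on the clock. */
                  printf("REALTIME : %.1f s\n", elapsed(CLOCK_REALTIME,  &real));
                  printf("BOOTTIME : %.1f s\n", elapsed(CLOCK_BOOTTIME,  &boot));
                  printf("MONOTONIC: %.1f s\n", elapsed(CLOCK_MONOTONIC, &mono));
                  return 0;
              }

          So a program timing “one minute” against the wall clock can see it stretch across a suspend, while one on CLOCK_MONOTONIC simply never sees the suspended time at all.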