• FuglyDuck@lemmy.world · 21 hours ago

      Not when that definition of pi goes out to all 300 trillion decimal places we've resolved so far. (To be fair, I don't know of any implementation that actually does… but eh… yeah. And I'm pretty sure it was written by a masochist if one does.)

      That leads to unnecessary time spent calculating even simple equations. That level of precision is almost never actually needed.

      With Fermi problems, that level of precision is usually moot and potentially a waste of time. (Particularly when the math requires some kind of network cluster to run.)
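
      To make that concrete, here's a quick Python sketch (the asteroid numbers are made up for illustration): a Fermi estimate only promises the right order of magnitude, and pi-to-16-digits versus pi-rounded-to-3 land on the same one.

      ```python
      import math

      # Fermi-style estimate: volume of a roughly spherical asteroid ~2 km across.
      # The "2 km" input is itself only good to one significant figure.
      radius_m = 1_000

      v_full = (4 / 3) * math.pi * radius_m ** 3   # pi to ~16 digits
      v_rough = (4 / 3) * 3 * radius_m ** 3        # pi rounded to 3

      print(f"{v_full:.2e} m^3 vs {v_rough:.2e} m^3")  # ~4.19e+09 vs 4.00e+09
      # Same order of magnitude either way -- the input uncertainty dominates.
      ```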

      • mnemonicmonkeys@sh.itjust.works · 19 hours ago

        Pi has its own button on most graphing calculators, and the ones that don't usually only require 2 button presses to get it. Meanwhile, most programming languages have some variation of 'pi()', 'pi', etc.
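
        In Python, for instance, it's just math.pi (Java has Math.PI, most C compilers ship M_PI in math.h, and so on):

        ```python
        import math

        # Built-in double-precision constant, good to ~15-16 significant digits
        print(math.pi)             # 3.141592653589793

        # Plenty for everyday work, e.g. the area of a circle with r = 2.5
        print(math.pi * 2.5 ** 2)  # ≈ 19.63
        ```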

        • FuglyDuck@lemmy.world · 8 hours ago

          Sure.

          But sometimes the problems are complex enough that solve time becomes a concern. When they are, you start asking "is everything else precise enough to justify that?", and when the answer is "no", you don't do it, because runtime on networked clusters like AWS costs money.

          And when you’re talking about scales that encompass the galaxy…. Well. There’s just not a lot of precision there to begin with.
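
          As a rough illustration of the cost side (a Python sketch, assuming the third-party mpmath library is installed): grabbing the 16-digit double is effectively free, while asking for 100,000 digits takes real compute, and the gap only grows with the digit count.

          ```python
          import math
          import time

          from mpmath import mp, pi  # arbitrary-precision math library

          # Double precision: effectively free
          start = time.perf_counter()
          _ = math.pi
          print(f"math.pi            : {time.perf_counter() - start:.6f} s")

          # 100,000 decimal digits: measurable work, and it keeps growing
          mp.dps = 100_000
          start = time.perf_counter()
          _ = +pi  # unary + evaluates the constant at the current precision
          print(f"pi to 100k digits  : {time.perf_counter() - start:.6f} s")
          ```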

          • mnemonicmonkeys@sh.itjust.works · 2 hours ago

            The counterpoint to that is that including a term for pi (or even just rounding it to 3.14) would be insignificant to add and would look way more professional.
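
            For what it's worth, a quick check in Python: rounding pi to 3.14 is only off by about five hundredths of a percent, which is nothing next to the order-of-magnitude slop already baked into a Fermi estimate.

            ```python
            import math

            # Relative error from rounding pi to 3.14
            rel_error = abs(math.pi - 3.14) / math.pi
            print(f"{rel_error:.4%}")  # ≈ 0.0507%
            ```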