What concepts or facts do you know from math that is mind blowing, awesome, or simply fascinating?

Here are some I would like to share:

  • Gödel’s incompleteness theorems: Any consistent formal system powerful enough to describe basic arithmetic contains true statements that it can never prove, no matter how much time you put into it.
  • Halting problem: It is impossible to write a program that can decide, for every input program, whether that program loops forever or eventually finishes running. (Undecidability; see the sketch below.)
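
As a rough illustration of why this is undecidable, here is a minimal Python sketch of Turing's diagonal argument; the functions halts and paradox are hypothetical, and the whole point is that halts cannot actually exist:

```python
def halts(prog, arg):
    """Hypothetical decider: would return True iff prog(arg) eventually halts.
    No such program can exist; it is assumed here only to derive a contradiction."""
    raise NotImplementedError

def paradox(prog):
    """Does the opposite of whatever halts() predicts for prog run on its own source."""
    if halts(prog, prog):
        while True:   # halts() claims prog(prog) halts, so loop forever instead
            pass
    return            # halts() claims prog(prog) loops, so halt immediately

# Feeding paradox to itself: paradox(paradox) halts exactly when halts() says
# it doesn't -- a contradiction, so a correct halts() cannot be written.
```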

The Busy Beaver function

Now this is the mind-blowing one. What is the largest finite number you know of? Graham’s Number? TREE(3)? TREE(TREE(3))? This one beats them all easily.

  • The Busy Beaver function Σ(n) is the largest number of 1s an n-state, 2-symbol Turing machine can leave on a blank tape before halting. It grows faster than any computable function: whatever function a program can actually compute, Σ eventually overtakes it, which means Σ itself is provably uncomputable no matter how powerful your computer is. (See the small simulation sketch after this list.)
  • In fact, merely knowing certain values of Σ would settle some of the hardest open problems in mathematics, as the last two entries below show.
  • Σ(1) = 1
  • Σ(4) = 13
  • Σ(6) > 10^10^…^10 (a power tower of fifteen 10s)
  • Σ(17) > Graham’s Number
  • Σ(27): there is a 27-state Turing machine that halts if and only if the Goldbach conjecture is false, so knowing Σ(27) would settle the Goldbach conjecture.
  • Σ(744): there is a 744-state Turing machine that halts if and only if the Riemann hypothesis is false, so knowing Σ(744) would settle the Riemann hypothesis.
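
To make the definition concrete, here is a minimal Python sketch that simulates the standard 2-state champion machine and counts the 1s it leaves behind (the transition-table encoding used here is just an illustrative choice):

```python
def run_turing_machine(table, max_steps=10_000):
    """Simulate a 2-symbol Turing machine on a blank tape.
    table maps (state, read_symbol) -> (write_symbol, move, next_state).
    Returns the number of 1s left on the tape when the machine halts."""
    tape, pos, state = {}, 0, "A"
    for _ in range(max_steps):
        if state == "HALT":
            return sum(tape.values())
        write, move, state = table[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    raise RuntimeError("no halt within max_steps")

# The 2-state champion: halts after 6 steps leaving four 1s, so Σ(2) = 4.
champion_2 = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "HALT"),
}
print(run_turing_machine(champion_2))  # -> 4
```

Finding Σ(n) means doing this for every possible n-state machine and proving which ones never halt, and that is exactly the part that becomes impossible in general.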

Sources:

  • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

    The fact that complex numbers allow you to get a much more accurate approximation of the derivative than classical finite differences, at almost no extra cost under suitable conditions, while also suffering far less from roundoff errors when implemented in finite precision:

    \frac{1}{\varepsilon}\,\mathrm{Im}\left[ f(x+i\varepsilon) \right] = f'(x) + \mathcal{O}(\varepsilon^2)

    (x and ε are real numbers, and f is assumed to be an analytic extension of some real function)
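
    A minimal Python/NumPy sketch of the idea (the test function np.sin, the point x = 0.7, and the step sizes are arbitrary illustrative choices):

    ```python
    import numpy as np

    def complex_step(f, x, eps=1e-20):
        # f'(x) ≈ Im[f(x + i*eps)] / eps; there is no subtraction of nearly
        # equal numbers, so eps can be taken far below machine precision.
        return np.imag(f(x + 1j * eps)) / eps

    def central_diff(f, x, h=1e-5):
        # classical finite difference; shrinking h eventually hits roundoff
        return (f(x + h) - f(x - h)) / (2 * h)

    x = 0.7
    exact = np.cos(x)  # derivative of sin
    print(abs(complex_step(np.sin, x) - exact))  # error near machine epsilon
    print(abs(central_diff(np.sin, x) - exact))  # error around 1e-11 at best
    ```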

    Higher-order derivatives can also be obtained using hypercomplex numbers.

    Another related and similarly beautiful result is Cauchy’s integral formula which allows you to compute derivatives via integration.
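
    For reference, with γ a closed contour around a and f analytic on and inside it, that formula reads:

    f^{(n)}(a) = \frac{n!}{2\pi i} \oint_\gamma \frac{f(z)}{(z-a)^{n+1}} \, dz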

    • MBM@lemmings.world

      This is the first one that’s new to me! Really interesting. I guess the big problem is that you actually need to be able to evaluate the analytic extension.

    • Kogasa@programming.dev

      The fact that complex numbers allow you to get a much more accurate approximation of the derivative than classical finite differences, at almost no extra cost under suitable conditions, while also suffering far less from roundoff errors when implemented in finite precision:

      What?

      The formula you linked is wrong, it should be O(epsilon). It’s the same as for real numbers, f(x+h) = f(x) + hf’(x) + O(h^(2)). If we assume f(x) is real for real x, then taking imaginary parts, im(f(x+ih)) = 0 + im(ihf’(x)) + O(h^(2)) = hf’(x) + O(h^(2)).

      • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

        I can assure you that you’re the one who’s wrong: the order-2 term is a real number, which means it goes away when you take the imaginary part, leaving you with an O(epsilon^3) term, which then becomes O(epsilon^2) after dividing by epsilon. This is called complex-step differentiation.
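
        Spelling the expansion out (assuming f takes real values on the real line, so all of its Taylor coefficients at x are real):

        f(x+i\varepsilon) = f(x) + i\varepsilon f'(x) - \frac{\varepsilon^2}{2} f''(x) - \frac{i\varepsilon^3}{6} f'''(x) + \mathcal{O}(\varepsilon^4)

        \mathrm{Im}\left[ f(x+i\varepsilon) \right] = \varepsilon f'(x) - \frac{\varepsilon^3}{6} f'''(x) + \mathcal{O}(\varepsilon^5)

        so dividing by ε leaves f'(x) + O(ε^2), with no subtraction of nearly equal numbers, which is why ε can be taken absurdly small without roundoff blowing up.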