• 6 Posts
  • 530 Comments
Joined 11 months ago
Cake day: June 13th, 2023


    • Compression algorithms can reduce most written text to about 20–25% of its original size, implying that only that fraction is genuinely unique information, while the rest is predictable filler.

    • Empirical studies have found that chimps and human infants, when looking at test patterns, will ignore patterns that are too predictable or too unpredictable—with the sweet spot for maximizing attention being patterns that are about 80% predictable.

    • AI researchers have found that generating new text by always choosing the most likely continuation of the given input produces dull, obviously robotic prose. Through trial and error, they found that sampling instead from the smallest set of candidates whose combined probability reaches about 80% produces results judged most interesting and human-like.
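    The first bullet's figure is easy to sanity-check with a general-purpose compressor. A rough sketch using Python's standard zlib (the sample text here is made up; the exact ratio depends heavily on the text's length, repetitiveness, and the compressor used):

```python
import zlib

# Hypothetical sample text; any sizeable chunk of English prose works.
text = (
    "Compression works by finding and removing statistical redundancy. "
    "English prose is highly redundant: letter frequencies, common words, "
    "and grammatical patterns all make much of each sentence predictable. "
    "Whatever cannot be squeezed out is, roughly, the unique information."
).encode("utf-8")

compressed = zlib.compress(text, level=9)
ratio = len(compressed) / len(text)
print(f"compressed to {ratio:.0%} of original size")
```

    On short samples like this, zlib lands well above 20–25%; the lower figures come from stronger compressors run over large corpora.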

    The point being: AI has stumbled on a method of mimicking the presence of meaning by imitating the ratio of novelty to predictability that characterizes real human thought. But we know that the actual content of that novelty is randomly chosen, rather than being a deliberate message.
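    The sampling trick in the third bullet is commonly implemented as top-p ("nucleus") sampling. A minimal sketch in plain Python, assuming a toy list of logits rather than a real model:

```python
import math
import random

def nucleus_sample(logits, top_p=0.8, rng=random):
    """Sample a token index from the smallest set of tokens whose
    cumulative probability reaches top_p (top-p / nucleus sampling)."""
    # Softmax over the raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Rank token indices by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Keep tokens until their cumulative probability passes top_p.
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the kept set and draw one token.
    kept_probs = [probs[i] for i in kept]
    r = rng.random() * sum(kept_probs)
    acc = 0.0
    for i, p in zip(kept, kept_probs):
        acc += p
        if r <= acc:
            return i
    return kept[-1]

# Hypothetical scores for four candidate tokens.
token = nucleus_sample([2.0, 1.5, 0.2, -1.0])
print(token)
```

    With top_p near 1.0 this degenerates toward pure sampling (too unpredictable); with top_p near 0.0 it degenerates toward greedy decoding (too predictable), which is exactly the trade-off described above.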










  • If you’re trying to solve inequality through tax policy alone, I wonder what the effect would be of having the top tax rate vary with the Gini coefficient. The idea being that the wealthy can theoretically reduce their tax burden if (and only if) they figure out how to use their economic power to reduce inequality by other means.
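    As a rough sketch of how such a schedule might be computed: the Gini coefficient can be calculated from the mean absolute difference of incomes, and the top rate interpolated from it. The base and ceiling rates below are made-up parameters, not a policy proposal:

```python
def gini(incomes):
    """Gini coefficient via mean absolute difference:
    0 = perfect equality, 1 = maximal inequality."""
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of |a - b| over all ordered pairs of incomes.
    diff_sum = sum(abs(a - b) for a in incomes for b in incomes)
    return diff_sum / (2 * n * n * mean)

def top_rate(g, base=0.30, ceiling=0.70):
    """Hypothetical schedule: the top marginal rate slides linearly
    from `base` (at perfect equality) to `ceiling` (at maximal inequality)."""
    return base + (ceiling - base) * g

# Hypothetical income distribution.
incomes = [20, 30, 50, 200]
g = gini(incomes)
print(f"Gini = {g:.2f}, top marginal rate = {top_rate(g):.0%}")
```

    Under this toy schedule, reducing measured inequality directly lowers the top rate, which is the incentive the comment describes.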