November 20, 2020

People act like Gödel’s Incompleteness Theorem is all about how formal systems are limited in their level of insight about truth. But actually it’s just the same observation as the Halting Problem: you have a system that tries to reason about the behavior of other systems (like Turing Machines or axiomatic set theory), but it can’t possibly always introspect about its own behavior, because you can also configure it to behave differently based on its own conclusion about its behavior. What if I ask you to predict whether you’re going to go left or right, but I also tell you that I’m going to force you to go the opposite direction from whatever you predict? Then the set of things you can accurately predict is Incomplete. But that’s not a profound limitation of human consciousness; it’s just the familiar paradox of any fortune-teller or time-travel scenario.
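
For what it’s worth, here’s a minimal sketch of that diagonalization trick in Python. The names `predicts_halting` and `contrarian` are hypothetical stand-ins for the supposed prediction oracle and the self-referential program, not anything from a real library:

```python
def predicts_halting(program, argument):
    """Hypothetical oracle that claims to predict whether
    program(argument) eventually halts. Stubbed out here;
    the point is that no real implementation can get the
    case below right."""
    return True  # placeholder guess; either answer leads to trouble

def contrarian(program):
    """Do the opposite of whatever the oracle predicts about
    running `program` on its own description."""
    if predicts_halting(program, program):
        while True:   # oracle said "halts", so loop forever
            pass
    else:
        return        # oracle said "loops forever", so halt

# Asking the oracle about contrarian(contrarian) forces it to be wrong
# either way: if it says "halts", contrarian loops; if it says "loops",
# contrarian halts. That's the same diagonal move behind both the
# Halting Problem and (via arithmetization) Gödel's theorem.
```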


