It all started with a LinkedIn post. Nothing new: this week's mandatory opinion, recycled in different words. Typical social media noise. Someone disagreed. Strongly enough to reach for the heavy artillery and call the author a "flat-earther." Boom. And with the recoil, I got hit too.
The Earth is flat!
That rang a bell. I remembered an old, insightful, and funny conversation with AI about… something. The problem was, all I could recall was the conclusion: the Earth is flat.
Nothing to worry about. I had my notes. A small document where I saved AI output worth keeping. I found this:
“Turns out the Earth is flat after all.”
Helpful. Thank you, past me, for trusting future me’s memory so much. Present me now had to reconstruct an entire line of thought from a single sentence. Good luck with that. Spacetime? Pancakes? Nothing clicked.
Then it hit me: if AI was involved, the whole exchange would still be there. AI would remember. The search took longer than expected, but eventually, I found it.
It wasn’t about the Earth at all. It was about information gradients—and how social media flattens them. Original ideas create spikes that, over time, get spread, diluted, and leveled across platforms. Until everyone is repeating the same thing, convinced they’ve discovered something new—while collectively ensuring everything becomes flat.
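As a side note, the flattening dynamic maps neatly onto plain diffusion. Here's a minimal Python sketch, purely my own illustration and not anything from the original conversation: one node holds the original "idea spike," every repost averages a node with a neighbor, and after enough sharing the gradient is gone.

```python
import random

# Toy model of the "information gradient" analogy (illustrative only):
# one node starts with a sharp idea spike; each "share" blends two
# adjacent nodes toward their mean, diffusing the spike until the
# whole network sits at the same flat value.
values = [0.0] * 50
values[25] = 1.0  # the original idea: a sharp spike

for _ in range(10_000):
    i = random.randrange(len(values) - 1)
    # A repost: two neighbors converge on the average of what they held.
    mean = (values[i] + values[i + 1]) / 2
    values[i] = values[i + 1] = mean

spread = max(values) - min(values)
print(f"gradient remaining after sharing: {spread:.6f}")  # ~0: flat
```

The total "amount of idea" is conserved at every step; only the gradient disappears. Everyone ends up holding the same thing.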
Thanks to AI, I was able to rediscover a thought that would otherwise have been lost. A thought that taught me nothing new—yet somehow felt exactly right.