Yuval Harari’s book “Sapiens” is brilliant. I wish I’d written it. It’s stuffed full of great insights, large and small, and the writing style is crisp, clear and often witty. You can sample his thinking in this TED talk:
One of Harari’s most important insights is that humans rule the world, and the fate of every other species depends on us, because of our ability to believe things that aren’t true. Our ability to believe that money is valuable, and that nations and gods exist, is what enables us to organise flexibly in large numbers. That in turn enables us to hunt and kill mammoths, and to build skyscrapers and zoos.
(Social insects organise in large numbers, but not flexibly. If their current mode of organisation suddenly becomes ineffective, they’re done for.)
So fiction is valuable to us. Perhaps that is why JK Rowling’s net worth is greater than the price Nikkei is paying for the FT.
And here’s the link to AI – a tenuous one, I admit. If the ability to believe things that aren’t true is such a vital component of humanity’s use of intelligence to rule the world, will it also be important to a superintelligence? Will the first superintelligence turn out to be the best novelist (or haiku writer) the world has ever known?