Discussion about this post

Kevin Kelly

Fantastic notes. One possibility for resolving the Dwarkesh Dilemma of why vast knowledge does not equal vast reasoning: perhaps it is a trade-off. Perhaps advanced reasoning requires a certain kind of ignorance, so we forget what we learn in order to have novel ideas. Maybe our brains don't store all the knowledge we acquire so that they can perform novel thoughts.

Rohit Krishnan

Excellent list of questions. One underrated fact is that we know how to deal with the mistakes humans make; our entire society is built around them. But we don't know how to deal with the mistakes LLMs make, and we will need to build structures around those mistakes before LLMs can "take over".

To me that's an incredibly important part of the conversation, and a lot of the unknown unknowns you ask about lie on the other side of it.

