Bill Thompson, UC Berkeley: On how cultural evolution shapes knowledge acquisition

March 29, 2023


Bill Thompson is a cognitive scientist and an assistant professor at UC Berkeley. He runs an experimental cognition laboratory where he and his students conduct research on human language and cognition using large-scale behavioral experiments, computational modeling, and machine learning. In this episode, we explore the impact of cultural evolution on human knowledge acquisition, how pure biological evolution can lead to slow adaptation and overfitting, and much more.

Below are some highlights from our conversation as well as links to the papers, people, and groups referenced in the episode.

Some highlights from our conversation

“In order to understand the computational processes that give rise to things like complex learned algorithmic behaviors like driving or playing chess or solving a Rubik’s cube or even language and speaking to each other, we need to have some way of reasoning about how knowledge accumulates across people.”

“This mechanism that we call selective social learning provides a solution to those two problems. The problem is that complex stuff is difficult to discover and difficult to pass on. Selective social learning increases the fraction of people who are exposed to the rarer discoveries.”

“One of the things we’ve been working on is trying to integrate those two things and develop a way of thinking about cultural evolution as distributed algorithmic processes or distributed computation. Thinking about population-level processes as distributed computational processes gives you a way of viewing groups and multi-generational societies, in a sense, simple societies, in the same terms that you can think about learning by individuals.”

“If I want to look at how large language models learn to reason, something I would love to do is start to knock out parts of the training data set and say, okay, when you knock this part of the training data set out, suddenly the reasoning capabilities go away, or suddenly this aspect of your knowledge or this capacity to acquire structured algorithmic thinking disappears. Even simple stuff like that is not tractable at the moment.”

Referenced in this podcast

Thanks to Tessa Hall for editing the podcast.