One of the great challenges of mass-user products is predicting emergent behaviors, both good and bad.
A recent study by researchers at Stanford University provides a solution that takes advantage of large language models (LLMs) to predict how the members of a large population might behave and influence one another.
The study, titled “Generative Agents: Interactive Simulacra of Human Behavior,” explores the potential of generative models in creating an AI agent architecture that remembers its interactions, reflects on the information it receives, and plans long- and short-term goals based on an ever-expanding memory stream.
By interacting with each other, the agents can emulate the more intricate social behaviors that emerge from the interactions of a large population.
Key findings:
The study uses agents in Smallville, a sandbox game environment composed of various objects such as buffets, schools, bars, and more
Each agent is powered by an LLM (like GPT-4) and has a prompt that contains its character descriptions, memories, and goals
The agents can interact with their environment and each other and affect each other’s memories and goals
The researchers designed an intricate system that enables agents to store and retrieve relevant memories to include in the LLM prompt for each interaction (see the first sketch after this list)
The agents also have a layered planning function that allows them to plan for both the long and short term (a second sketch below illustrates this)
Their experiments show that the AI agents learn to coordinate among themselves without being explicitly instructed to do so
The researchers believe their work has far-reaching practical applications, such as prototyping the dynamics of mass-user products like social networks
The agents can help predict the effects of spreading misinformation or experiment with different counterfactual events
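To make the memory machinery concrete, here is a minimal Python sketch of the memory-stream idea the paper describes: observations are stored with a timestamp and an importance score, and retrieval ranks them by a combination of recency, importance, and relevance to the current situation before they are stitched into the agent's prompt. The class and function names (`MemoryStream`, `retrieve`, `build_agent_prompt`), the scoring weights, and the prompt template are my own illustrative assumptions, not the paper's exact implementation.

```python
import math
import time

def cosine_similarity(a, b):
    """Plain cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStream:
    def __init__(self, embed_fn, decay=0.995):
        self.embed_fn = embed_fn   # text -> vector; stand-in for whatever embedding model you use
        self.decay = decay         # exponential recency decay per hour (assumed value)
        self.records = []          # (text, embedding, importance, timestamp)

    def add(self, text, importance):
        """Store an observation with a 0-1 importance score."""
        self.records.append((text, self.embed_fn(text), importance, time.time()))

    def retrieve(self, query, k=5, now=None):
        """Return the top-k memories scored by recency + importance + relevance."""
        now = now or time.time()
        query_vec = self.embed_fn(query)
        scored = []
        for text, vec, importance, ts in self.records:
            hours = (now - ts) / 3600.0
            recency = self.decay ** hours
            relevance = cosine_similarity(query_vec, vec)
            scored.append((recency + importance + relevance, text))
        scored.sort(reverse=True)
        return [text for _, text in scored[:k]]

def build_agent_prompt(persona, memories, situation):
    """Assemble the LLM prompt from the character sketch, retrieved memories,
    and the current situation (structure is illustrative only)."""
    memory_block = "\n".join(f"- {m}" for m in memories)
    return (
        f"{persona}\n\n"
        f"Relevant memories:\n{memory_block}\n\n"
        f"Current situation: {situation}\n"
        f"What does the agent do next?"
    )
```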
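The layered planning function can be pictured the same way: the agent first drafts a coarse plan for the day and then decomposes each chunk into shorter actions. The sketch below is a hypothetical rendering of that decomposition loop, with `llm_complete` standing in for whatever model call you use; the prompts and function names are assumptions, not the paper's templates.

```python
def llm_complete(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM client here")

def plan_day(persona: str, date: str) -> list[str]:
    """Long-term layer: ask the model for a handful of broad activities for the day."""
    prompt = (
        f"{persona}\nToday is {date}. "
        "List 5-8 broad activities for the day, one per line."
    )
    return [line.strip("- ").strip() for line in llm_complete(prompt).splitlines() if line.strip()]

def decompose(persona: str, activity: str, minutes: int = 15) -> list[str]:
    """Short-term layer: break one broad activity into steps of roughly `minutes` each."""
    prompt = (
        f"{persona}\nBreak the activity '{activity}' into steps of about "
        f"{minutes} minutes each, one per line."
    )
    return [line.strip("- ").strip() for line in llm_complete(prompt).splitlines() if line.strip()]

def layered_plan(persona: str, date: str) -> dict[str, list[str]]:
    """Plan coarsely first, then fill in short-term detail for each chunk."""
    return {activity: decompose(persona, activity) for activity in plan_day(persona, date)}
```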
Read the full article on VentureBeat
For more on LLM research:
It's pretty wild (and cool) to see The Sims, video games, and the like becoming the subject of robust scientific study.
All that time nerding out on D&D, Warcraft 2, and Civ 2 has been vindicated! I knew it all along.