Simulating Olson's Bandits: An ABM Exploration of Government Decision Dynamics
Authors: Chasen Jeffries
Keywords: Agent-Based Modeling (ABM), Olson's Bandits, Complex Adaptive Systems, Government Formation, Decision Dynamics
Abstract:
This study employs agent-based modeling (ABM) to simulate Mancur Olson's theory of roving and stationary bandits, focusing on governance and economic performance. Although empirical research has examined Olson's theory, real-world case studies have offered few clean natural experiments for testing it thoroughly. This research investigates how bandits, under varying initial conditions, parameter values, and environments, govern and maximize their gains. Scenario analysis of the model shows that stationary bandits consistently perform well because their ability to invest in their subjects gives them a long-term advantage. Roving bandits, by contrast, are less stable: depending on conditions, they survive, die out, or transition into stationary bandits. Those that transition manage to survive and thrive without investing in their subjects, challenging Olson's assumption that public goods beyond peace and order are essential for societal stability. While these initial results require further validation, they may offer insights into governance in contexts of weak state capacity.
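To make the mechanism behind the abstract concrete, the sketch below is a minimal, hypothetical calculation of Olson's contrast, not the paper's model: a "roving-style" rule confiscates most of a territory's wealth each period and reinvests nothing, while a "stationary-style" rule taxes moderately and returns part of the take as public goods so production keeps growing. All function names, parameters, and values (tax_rate, reinvest_share, growth) are illustrative assumptions.

```python
def cumulative_gains(tax_rate, reinvest_share, periods=50,
                     wealth=100.0, growth=0.05):
    """Cumulative extraction from one territory under a fixed extraction rule.

    A high tax_rate with no reinvestment stands in for a roving bandit's
    confiscation; a moderate tax_rate with reinvestment stands in for a
    stationary bandit that supplies public goods. All names and values are
    illustrative assumptions, not parameters from the paper.
    """
    gains = 0.0
    for _ in range(periods):
        wealth *= 1 + growth                   # subjects produce each period
        take = tax_rate * wealth               # the bandit extracts a share
        wealth -= (1 - reinvest_share) * take  # reinvested share flows back to the territory
        gains += (1 - reinvest_share) * take   # the rest is kept by the bandit
    return gains, wealth


if __name__ == "__main__":
    scenarios = [
        ("roving-style (take 90%, reinvest nothing)", 0.9, 0.0),
        ("stationary-style (take 10%, reinvest half)", 0.1, 0.5),
    ]
    for label, tax, reinvest in scenarios:
        gains, remaining = cumulative_gains(tax, reinvest)
        print(f"{label}: bandit gains = {gains:.1f}, "
              f"territory wealth left = {remaining:.1f}")
```

Under these assumed values, the stationary-style rule accumulates more over the 50-period horizon and leaves the territory wealthy, while the roving-style rule exhausts its tax base quickly, which is the intuition behind the abstract's finding that stationary bandits hold a long-term advantage.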
Pages: 12–18
Copyright: Copyright (c) IARIA, 2024
Publication date: September 29, 2024
Published in: SIMUL 2024, The Sixteenth International Conference on Advances in System Modeling and Simulation
ISSN: 2308-4537
ISBN: 978-1-68558-197-8
Location: Venice, Italy
Dates: from September 29, 2024 to October 3, 2024