Video game designers and players have long been fascinated and inspired by historical narratives and figures. Delving into the history of how video games have utilized — and distorted — stories of the past reveals a persistent demand for historical education through entertainment, a reminder that people are constantly searching for new ways to engage and find meaning in the past.
Efforts to educate through video games began with computers. In 1973, for example, the Minnesota Educational Computing Consortium, which worked with the state university system and Minnesota’s Department of Education, started using computer technologies to improve student learning. A decade later, Minnesota boasted of having 10,000 computers in its public schools, with one computer for every 73 students, reportedly the best student-to-computer ratio in the country at the time.
The consortium also led the way in creating computer-based courseware, including its most famous release: “The Oregon Trail.” Originally a text-based game for school use, “The Oregon Trail” was released to students and teachers throughout Minnesota in 1975. The game, which later saw release through Apple, Microsoft and others, is a strategy video game where the user embodies a wagon leader shepherding settlers across the frontier during the 1840s. The player is tasked with making important decisions along the way, including choosing the best path, when to hunt and how to avoid illnesses such as dysentery. Designed to encourage skills such as planning, strategy and memory, the game was a success.
Such nostalgia sold well in the 1970s, when, in anticipation of the country’s bicentennial and amid social, economic and political unrest, many Americans looked to the past in new, engaging ways. As Malgorzata Rymsza-Pawlowska has argued, it was in the 1970s that history “became as much about feeling as about thinking, about being inside the past instead of looking upon it.” Immersive video games helped history come alive in the 1970s, much like new period-piece TV shows, historical reenactments, oral history projects and museum exhibitions did.
While Atari and its groundbreaking game “Pong,” a virtual simulation of table tennis released in arcades in 1972 and as a home console in 1975, showed the potential market for home video games, commercial sales were checkered over the next decade. Then, in 1985, Japan’s Nintendo released its Nintendo Entertainment System in the United States. The console soon brought the success of now-classic games such as “Super Mario Bros.” (1985) and “The Legend of Zelda” (1987). By 1990, Nintendo represented 90 percent of the United States’ $3 billion spending on video gaming, with a survey suggesting that its lead character Mario had become more recognizable to U.S. children than Mickey Mouse. These commercial successes also meant that educational gaming would, no doubt, take a back seat to entertainment.
As new consoles entered American homes, including 1989’s Sega Genesis, many new games emerged, some of which sought to link historical themes of conquest and empire-building to modern-day lessons about success and work ethic. For example, Nintendo released the military strategy game “Genghis Khan” in 1990. The game allowed up to four players to create a conquest strategy on behalf of England, the Byzantine Empire, the Mongol Empire or Japan, all while facing challenges along the way. As a 1989 review of the game’s computer release noted, “Conquerors need to be calculating, charismatic, and cunning, as well as courageous,” and this video game promised its players such lessons.
Similarly, players of 1991’s “Civilization” (originally on MS-DOS, but later released on several other platforms and consoles) were asked to build and grow an empire across thousands of years, seeing the civilization through military engagements, urban growth and settlement. Players-turned-imperialists sometimes met rival civilizations led by figures they might have read about in their history textbooks, from Alexander the Great to Napoleon Bonaparte. These games rewarded players for their persistence and determination to conquer and colonize.
By the 1990s, another video game genre was also firmly established: World War II. In the United States and elsewhere, such attention to war further entrenched a nationalist memory of the battlefield that emphasized individual combat and violence. Frequently, this occurred through the lens of the first-person shooter, a protagonist often divorced from the broader strategy of warfare. For many of these games, history served more as a backdrop or setting than a source of education. The focus on entertainment was similarly reflected in the original releases of the successful games “Medal of Honor” (1999) and “Call of Duty” (2003) and their many sequels and imitations.
The ability to weave a strong nationalist story in video games is by no means unique to the United States or limited to the historical backdrop of World War II. In 2012, the Cuban government released “Gesta Final” to help teach Cuban youths about the Cuban Revolution of 1959. This game also adopts the first-person shooter format to tell a state-approved story about the origins of the revolution and the successes of its people.
In short, video games reveal much about our culture, whether through educational initiatives or political agendas. They have also become a way for gamers to engage with important questions about public history and a shared past, albeit one created for the game itself. Game designers and programmers often use a generic or fictional museum or heritage site, for example, to give the player an opportunity to learn a particular piece of the past necessary to advance the character’s storyline. In this way, while primarily a form of escapism, these features can function much like museums in the non-virtual world, where visitors take away fragments of the past to shape ideas of who they are as a people.
Whether fictional or not, the inclusion of museums and historical and archaeological sites in video games can also tell us something important about changing attitudes toward the accessibility and the gatekeeping of history. While museums have often been spaces of reverence and exclusion, their virtual manifestations have offered players a different experience. Countless video games, including 2019’s “World War Z” about fighting zombies, stage standoffs at museums or cemeteries that can even end in the full destruction of those virtual settings.
Many museums today have taken a page from video games’ successes to better engage their audiences, especially younger crowds, with interactive and sensory experiences. In 2016, the American Museum of Natural History unveiled “MicroRangers,” an app game that interfaces with the museum’s exhibitions to help children learn.
Video games may make history more accessible, but there is a downside: these historical experiences are often presented without a critical or analytical lens or guidance. At the same time, we might ask whether the consumption of alternate and imaginary pasts in the virtual world serves, even inadvertently, as a distraction from facing and reckoning with the horrors and inequities cemented in our histories.
Some data suggests something more optimistic, however. A 2020 survey revealed that 93 percent of historical video game players had felt inspired to learn more about a particular event or person in history, while 90 percent thought that video games had the power to change people’s perspectives on a historical event.
As the market for video games continues to grow and advance, so too does its capacity to create and disseminate knowledge of the past. Those who study history would be wise to heed the call.