Somehow, this part of the game has escaped media attention. Serious Games has had to answer for only the "Slave Tetris" portion. According to a statement posted on the Steam forums from Serious Games Chief Executive Simon Egenfeldt-Nielsen, the slave-cramming mini-game was intended to be "insensitive and gruesome," in order to teach about slavery effectively. "The reactions people have to this game," he continued, are "something they will never forget, and they will remember just how inhumane slave trade was."
In a game that claims to portray the horrors of the transatlantic slave trade, the slave owners in "Slave Trade" are pretty nice. If you complete a simple task for even the meanest white person, they become kinder to you. In fact, some of the rudest characters in the game are not white slavers but black slave characters, who sometimes refer to white characters as "white devils."
Travel back in time and witness the horrors of the slave trade firsthand. You will be working as a young slave steward on a ship crossing the Atlantic. You are to serve the captain and be his eyes and ears. What do you do when you realize that your own sister has been captured by the slave traders?
The mode, which can still be seen in Let's Play videos captured before the game's update, flatly asked players to stack dead-eyed African bodies that had been squished into uncomfortable Tetris shapes into a slave ship. (The mode's instructions included an oddly rhetorical question: "How come the slave traders were so inhumane?") Players didn't try to "clear" the board by creating full lines; instead, they accumulated points for fitting more bodies onto the ship before reaching its top line. The mode concluded with an informational note about slaves being "packed to use every square millimeter."
In Slave Trade you "travel back in time and witness the horrors of slave trade firsthand". You work as a young slave steward on a ship crossing the Atlantic, and serve the captain as "his eyes and ears".
In a longer post called "The rationale behind the game slave trade," Egenfeldt-Nielsen also tried to defend the game on a point-by-point basis, comparing Slave Tetris to Schindler's List. We've included the post in full below.
You should not be able to play slave trade, where you act as a slave owner or slave trader. Actually you don't as such. In the game you are a slave on a slave ship stuck between a rock and a hard place. So as a slave you become pulled into some of these atrocities. You need to done what is said, help with getting slaves and load them.. until at the right time you can make your move.
Slave tetris is a mockery and insensitive. I definitely agree it is insensitive and gruesome. It has to be like this to show what was done to load slave ships. People treated human beings as pieces that just had to fitting into the cargo. The reactions people have to this game is something they will never forget, and they will remember just how inhumane slave trade was. If this is the case then we have accomplished what we set out to do. You may not like the way we do it, but I have seen enough school classes use this to know it has the intended effect - a lot people never think of slave trade as something that just happened in the past.
Some people enjoy playing with slaves, and we are an instrument for these. Honestly, play the game. If someone wants to help a slave escape for 2 hours, and then enjoy 15 secs of slave tetris I think I got a good chance of changing that person's perception of slavery.
Along with what the player has to do with 25 slaves when the sea rations run out, this pushed the game over the edge of controversy. Critics bombarded the edutainment studio with accusations that turning starving slaves into L-shapes "trivializes a serious time in history that shouldn't be fun." In response, Serious Games Interactive removed Slave Tetris from the game while its CEO made a passive-aggressive apology declaring: "The goal was to enlighten and educate people - not to get sidetracked discussing a small 15 secs part of the game." He then doubled down on his non-apology by stating that Playing History 2 - Slave Trade was never even intended for the American market and its PC sensibilities. A bluff that would've been more believable if they weren't talking about a game released on Steam. About a piece of American history. Which has a full voice cast of English actors.
The robust, sustained interest in the history of the transatlantic slave trade has been a defining feature of the intersection of African studies and digital scholarship since the advent of humanities computing in the 1960s. The pioneering work of the Trans-Atlantic Slave Trade Database, first made widely available on CD-ROM in 1999, is one of several major projects to use digital tools in the research and analysis of the Atlantic trade from the sixteenth through the mid-nineteenth century. Over the past two decades, computing technologies have also been applied to the exploration of African bondage outside the maritime Atlantic frame. In the 2010s, Slave Voyages (the online successor to the original Slave Trade Database compact disc) joined many other projects in and outside the academy that deploy digital tools in the reconstruction of the large-scale structural history of the trade as well as microhistorical understandings of individual lives, the biography of notables, and family ancestry.
In North American and British higher education, an expanding aggregation of quantified data gathered from public records, private archives, and print material developed alongside increased access to mainframes and related technologies of data processing.2 The resulting computer-assisted studies traced the broad geographic and temporal outlines of the transatlantic trade between Africa and the Americas, the demography of the Middle Passage (i.e., total numbers of trafficked slaves as well as sex ratios, age cohorts, mortality, and ethnic origins), and the profitability and economic risks of the Odious Trade. While the frame was not exclusively British, statistical records of English traders garnered sustained attention.3
Although early data-driven analysis of the Middle Passage was generally inattentive to the experiences of common individuals or slave families before or after a transatlantic voyage, it shared with early social-history adopters of statistical software (notably Statistical Package for the Social Sciences, or SPSS, first released in 1968) a strong interest in creating and manipulating raw data about people that could tell stories of historical change from the bottom up. In subsequent years, the statistical evidence and analysis would play an important role in the biographical turn.
Whereas Slave Voyages sprawls across the maritime Atlantic, Afro-Louisiana History and Genealogy dives deep into the interior of an American slave society organized around the port of New Orleans and the Lower Mississippi. Of the approximately 100,000 slave records included in a dataset of 114 fields, about 58 percent come from the records of Orleans parish, in urban New Orleans. The remainder are from rural Louisiana. Alongside the record of slave ship arrivals to Louisiana (both from African ports and via transshipment from the Caribbean and the eastern United States), the database covers slave sales, estate inventories, probate records, runaway advertisements, mortgages and liens against slave property, death certificates, marriage licenses, criminal and judicial proceedings, reports on slave resistance, and certificates of manumission. With the named person as the primary organizing datapoint, the dataset ranges across life events and intimate relationships that went far beyond the transatlantic trade. Making legible the lives of named individuals, and life incidents from birth to death, Afro-Louisiana History and Genealogy appealed equally to academics interested in the human saga of bondage from Africa to the Americas and to genealogists on the trail of a family past in slave Louisiana.
As noted previously, field research and encoding in the early days of digital work on the history of the slave trade were conducted in isolation. Research methods were often collaborative and team-based, as teams relied upon models of sponsored research and institutional resources unlike those of the traditional lone humanities scholar. Nonetheless, individual research teams operated independently, with different protocols, technologies, and languages. Machine-readable datasets and statistical software permitted the sharing and repurposing of raw data, but scholarly and popular audiences without adequate training and support in statistics remained marginal to the conversation. The lines of communication between academics and genealogists were weak. Isolation began to soften in the early 1990s, and diminished significantly after 1999 with the publication and commercialization of a multisource, multiuser dataset that ran on operating systems compatible with personal computing. The Trans-Atlantic Slave Trade Database has been especially influential in establishing source-based empirical evidence alongside standardized field variables, such as VoyageID, that structure encoding protocols for ongoing and new projects.
Yet, isolation continues to be a challenge for digital work on the history of enslavement. A rapid expansion in research activities in civil, ecclesiastical, government, genealogical, and private collections, many resulting in the autonomous development of datasets in productivity suites by Microsoft, Google, and Apple, has outpaced data standards. Across projects, field naming and metadata conventions have been especially idiosyncratic. As encoding remains uneven, the same individual can appear in more than one dataset, but the absence of protocols for datafields and orthography leaves the individual in digital isolation (or, more accurately, in unrecognized digital duplication). The challenges of cross-project search and analysis are amplified in statistical calculations for variables such as ethnicity, race, color, and occupation that have been encoded with widely varying practices ranging from strict standardization to fidelity to the original source. The encoding of place names has presented unique challenges, as geolocational names found in official reference sources like the Geographical Names Information System and the GEOnet Names Server (both having multiple applications in geoinformatics) may depart dramatically from the archival original. The flattening of multilingual original sources into English translations and the variety of languages used in field-variable titles have added new elements of uneasy intelligibility from one project to another. Whereas a tension between controlled and natural vocabulary has been part of slave trade research since the early encoding of machine-readable datasets, the struggle between clean and messy data is now systemic.
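The duplication problem described above can be made concrete with a small sketch. Everything in it is hypothetical: the two records, their field names, and the crosswalk are invented for illustration and do not reflect the schema of Slave Voyages, Afro-Louisiana History and Genealogy, or any other actual project. The sketch shows how a field-name crosswalk resolves some cross-project mismatches while value-level variation (here, orthographic variants of a port name) still defeats naive matching.

```python
# Hypothetical illustration of the cross-project encoding problem.
# Field names, values, and the crosswalk are invented for this example.

# Two records that may describe the same individual, encoded under
# different (idiosyncratic) field conventions.
project_a = {"VoyageID": "1234", "slave_name": "Maria",
             "colour": "negre", "origin_port": "Ouidah"}
project_b = {"voyage": "1234", "Name": "MARIA",
             "race": "negro", "embarkation": "Whydah"}

# A crosswalk maps each project's field names onto shared labels.
CROSSWALK = {
    "VoyageID": "voyage_id", "voyage": "voyage_id",
    "slave_name": "name", "Name": "name",
    "colour": "color_term", "race": "color_term",
    "origin_port": "embarkation_port", "embarkation": "embarkation_port",
}

def normalize(record):
    """Rename fields via the crosswalk and lower-case free-text values."""
    return {CROSSWALK.get(k, k): str(v).strip().lower()
            for k, v in record.items()}

a, b = normalize(project_a), normalize(project_b)

# Even after field alignment, value-level variation remains: "ouidah" and
# "whydah" name the same port under different orthographies, so naive
# equality matching misses the link.
shared = {k for k in a if k in b and a[k] == b[k]}
print(sorted(shared))  # → ['name', 'voyage_id']
```

The crosswalk resolves only the field-name layer; reconciling values such as place-name variants requires the kind of controlled-vocabulary or gazetteer work the paragraph above describes.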