From the digital landscapes designed for the video game “Metal Gear Solid 2,” the artist Lucas Lugarinho takes “ready-made truths” as a strategy for playing with images that, in the face of fake news and capitalizable data, revise the fictions presented to us via pixels.
When the PlayStation 2 tactical espionage game Metal Gear Solid 2: Sons of Liberty was released in 2001, the role of the Internet was still unclear. At that time, most users’ interaction with the early millennium’s global networks was limited to discussing fiction and speculation in forums specialized in various subjects. Given the image-making tools offered by the visual possibilities of pixels on a screen, one could argue that web 1.0 served primarily as a hub for the aesthetic and functional experimentation of images and narratives: a land where fiction reigned supreme. Since then, this “no man’s land” has undergone a fast process of colonization, occupied by multimillion-dollar corporations. As technology and accessibility began to redesign web interaction, this old free land of fiction was proclaimed a new kingdom of reality. Built atop its ruins of myths are monopolistic platforms that now host most of the world’s existing imaginaries through extractive exploitation.
The director of MGS2, Hideo Kojima, seemed conscious that virtual spaces would inevitably be instrumentalized: proliferating both truths and hoaxes while dictating reality through humanity’s increased dependence on biased algorithms hiding behind crystal-clear screens. Towards the end of the game, in a cutscene dialogue with his commandant, Raiden (the protagonist controlled by the player) finds himself being manipulated by Artificial Intelligences that have been giving him orders to accomplish various political and espionage tasks. To explain their motivations for controlling and censoring the information circulating on the Internet, the AIs in question comment on the abyssal amounts of disposable data generated by its users: single-use data that is accumulated and reorganized in various ways, always at the convenience of the spectator’s singular reality, crystallizing as truths for non-critical viewers. In the English version of the game, these are referred to as ready-made truths, a translation made years before the popularization of fake news, a term which captured public interest mostly after its impact on politics and recent elections. As part of the video game’s narrative, the AIs then proceed to explain how their interference in the ‘natural’ order of digital ecologies can reunite the entire country under the banner of progress, finally ending an era of moral decadence and decentralization of common beliefs.
The achievement of MGS2’s localization, handled mostly by the Anglo-Japanese translator Agness Kaku, was to frame an analogy to our current information problems through an unbiased lens and modern terminology. It comes as no surprise that, at the beginning of this new decade, we’re living in quite a state of emergency relative to disinformation: struggling to find consensus on real events while junk data accelerates the next destabilization of the social and political organizations of traditional capitalism, a disturbance that will likely be continuously countered with more austere policies that value matter over life. But to be fair, isn’t the very name fake news favorable to this process, as it celebrates the dichotomy between true and false in a space whose foundations were dedicated to harboring fictions? Especially when we take into consideration the exponential growth and sophistication of post-production techniques such as deepfakes?
With this issue in mind, the present text proposes adopting the terminology ready-made truths presented in MGS2: a name which I believe designates the nature of this data with more precision, and which evokes the avant-garde practice of the ready-made introduced by Marcel Duchamp over 100 years ago. To justify proposing this new nomenclature, we have to analyze the importance of ready-mades as means of resistance and of an individual’s agency over their imagination (to virally access others), as well as the recent weaponization of information by vastly decentralized news portals.
Misinformation has existed since the very dawn of societal life; it is nothing new. But looking at it from another perspective, we could say that the struggle between a homogenous reality, sustained by a centralized and designed belief, and a plethora of dissident cells of resistance knowledge labeled as fiction has had a central role in the history of organized societies. Religious persecutions of ‘other’ beliefs, medicines, and sciences; the categorization of people as sane or insane; the self-made divine birthright of kings; and contemporary political propaganda all have one thing in common: the instrumentalization of images by a status quo social organization (kingdoms, states, churches) in order to achieve a monopoly over a local imagination and proceed to manage its resources. In that regard, the concept of misinformation becomes dangerously malleable, allowing popular imaginaries to be exploited as natural resources under the same old-fashioned Cartesian logic that fragments our interdependence within the place we inhabit, be it virtual or physical. According to journalist Steven Poole,
“In his Novum Organum (1620), the natural philosopher Francis Bacon describes for the first time the psychological phenomenon that underlies so much of our modern worries about trust and truth—what would only much later be christened ‘confirmation bias’. Our minds, he notes, tend to lend more weight to ‘affirmative’ (or positive) than to negative results, so a person is likely to ‘seize eagerly on any fact, however slender, that supports his theory; but will question, or conveniently ignore, the far stronger facts that overthrow it’.”
On top of that, borderline or limit content (media that nearly violates policy lines, such as misinformation, hate speech, violence, bullying, and clickbait) became the most engaging kind of content on the Internet, providing an arguably unethical source of profit to social media. This strategy can be loosely compared to the corruption of ecosystems: an extraction of resources that degrades the links necessary for coexistence. The principle of limit content is often used in fictional media, such as the overexploited gimmick of plot twists planned to engage viewers. But in the case of MGS2, director Hideo Kojima utilizes borderline content as a critical tool.
The YouTuber ThorHighHeels has recently pointed out in his Metal Gear retrospective videos that Metal Gear Solid 2 is a mock sequel. In summary, Thor speculates that Kojima designed the game as a metacommentary on the fanboy/otaku persona: media enthusiasts who are fanatically attached to the images they consume. When the sequel went into production to capitalize directly on the success of the first MGS, arguably against Kojima’s intentions, he crafted its narrative to tease and deceive players, going as far as showcasing a nine-minute game trailer with fake footage at E3 in the year 2000.
It’s not hard to spot the tone of mockery in the actual game’s narrative footage, as Raiden (the unexpected, unknown protagonist who took the place of the previous one, the great Solid Snake) is constantly ridiculed throughout the playthrough: stepping in bird poop, forced to traverse segments of the map naked, even being misgendered. He’s a dislikable protagonist in the eyes of a regular fanboy. The story goes that, after the events of the first installment of the MGS franchise, (the great) Solid Snake made use of global networks to spread classified data about the Metal Gear project, a Cold-War-esque, Gundam-like war-machine research program that serves as the main plot device of the franchise. These leaks, compared by ThorHighHeels to Julian Assange’s work at Wikileaks, set the events of MGS2 in motion: Raiden is now tasked by the government to stop Snake, under the backstory that he’s a hero-turned-terrorist who has overrun an offshore cleanup facility. The story then unfolds with Raiden and the player both discovering the inconsistencies of the simulation built around them, culminating in the AI dialogue that precedes the game’s final climax.
Setting the arguably political naivety of the story aside, the game makes its point by exploiting the beliefs and expectations of its fans in order to comment on the manipulative nature of media. This approach to game design evokes certain parallels with Duchamp’s ready-mades, in which the artist developed pieces that were hard to digest at the time, taking into account their settings, their viewers, and the historical and poetical implications of their industrialized materials, generating a discussion around the established canon of the art world. While Duchamp’s non-retinal provocations served as critiques of the established legitimation process of artworks, in MGS2 Kojima questions the beliefs of game fandoms: people fanatically devoted to the media they consume, often given the freedom to exercise whatever methods are at their disposal with the excuse of incarnating a hero. In both cases, these works invite their spectators to gain agency over their own imaginations and inject a dose of critical fiction into the established realities of their mediums [(what if) art looked like this (?) / (what if) games were played like this (?)]. This issue, however, reaches a whole new dimension when, instead of assuming itself as openly fictional, a designed medium poses as a simulacrum of reality, as in the case of news portals.
What Kojima’s MGS2 and Duchamp’s ready-mades do is, to an extent, utilize their mediums to play with their users’ already highly biased imaginaries by betraying their ocularcentric desire. In contrast to that approach, most fake news agents design borderline content in order to appeal to, or spoil, pre-established beliefs. In MGS2’s story, the government’s self-born AI controls the information on the Internet to maintain a homogenous public opinion; in reality, what we observe nowadays is an ongoing, multifaceted image war in which old occidental monopolies fail to achieve a favorable consensus. Instead, micro-targeted individual truths incentivize digital disagreements while fragmenting notions of kinship and belonging, by exploiting an aspect inherent to images that is very well known to art history and to Duchamp himself: speculation.
The contemporary battleground of bot wars and media minions is a direct reflection of the initial difficulties that cultural industries faced while trying to profit from this NEW world of the Internet, mostly because of piracy. News channels specifically struggled against the phenomenon of the decentralization of the sources of information, as social media promised to be an effective way to (re)organize people without the need for opinionated offshore mediums. With the establishment of the first billionaire platform monopolies, which attracted most of the Internet’s traffic, news channels strategically tried to place themselves as arbiters of truthfulness, fighting against the enemy of unverified pirated truths: the “fake news.” But the tide had already turned on our relationship with reality for their efforts to really pay off.
Unlike other media giants, which have by now successfully redesigned the way we consume art and fiction (Sp*tify, N*tflix, Am*zon, to mention a few), news conglomerates still have to overcome the micro-realities of Internet communities: users who come together by spoiling each other’s confirmation biases, often organizing themselves as ready-made truth militias conspiring against established narratives. While larger mediums use their scope and the overabundance of data in favor of their own agendas and narratives of the world, nonconforming user bases began a similar process, but using images buried under the layers of imposed truth instead. More than pirating news, these guerrillas of fiction empower their worldviews with ready-made truths manufactured from within the datascape, both by people and by robots. In order to weaponize this data, agents inject speculation into memes, testimonials, and re-contextualized images, engaging in a Gramscian culture war for the resources of Internet users’ imaginations. This counter-production of information serves both as a means of resistance and as a propagation of ideologies, where the fight between anti-hegemonic and established truths is constantly colonizing and decolonizing digital landscapes and their attached political imaginaries.
In a way, Duchamp’s ready-mades approach the fake news phenomenon: both are radical montages of commodities (objects and data) that explicitly question hegemonic impositions and realities. This conjecture might make it sound like ready-made truths are effective weapons of revolution, but just as in the physical world, digital militias are often (and easily) co-opted by greater schemes, be they algorithmic shopping singularities, corporate spring riots, or political whitewashing. The same can be seen in how Duchamp’s ready-mades aged, themselves becoming canon and a representation of how unreachable the art world is.
Digital guerrillas, whether human or algorithmic bots (or both), often design and frame information in order to compete for social media metrics of productivity (likes, shares, subscriptions) that, in return, attest to the truthfulness of a proposed opinion, edited fact, or worldview in the eyes of its observers. The value of design over facts has always favored warfare, as discussed by Susan Sontag in her book Regarding the Pain of Others and attested by the fact that social media doesn’t curate its content based on legitimacy or truthfulness, but on the engagement it is expected to generate through popularity. This aspect has now permeated most of our experience of the web: from get-a-bigger-dick spam ads, to ten-year-old pictures of forest fires in recent international headlines, to teasing presidential tweets. What fake news underlined was how images are weaponized to fight not over the truth or over reality but over our attention, by becoming cleaner, sexier, more dangerous, more colorful: deceitful portals to either newer worlds or reminders of the ones we already hold in our minds. Just as with the name Fine Arts, or, in the case of games, the title AAA, within the name fake news resides a weapon of homogenization that needs to be discussed in favor of a long-term plan of coexistence in a world of constant (visual) fragmentation.
This text is by no means in favor of misinformation; it is instead a proposal for a new approach towards the screens that hold the links necessary to reach each other. Hidden behind the light that hits our eyelids, the bright images of digital screens also cast their shadows in the form of advertisement and speculation. Digital processes for verifying facts and information can barely make up for the impact of misinformation on biased users, and if they did, they would work as homogenization tools, as specific user bases often reject mediums contrary to their beliefs. I believe that within the name ready-made truths lies a strategy of playing with an image rather than weaponizing it. It’s a way of bonding rather than extracting and instrumentalizing.
What our contemporary times have successfully managed to stage is a world where humans are mediated through screens, where sovereignty and reality are discussed through a medium whose lights appeal to our individual desires rather than effectively link us as a community: a scenario where disagreement reigns. But perhaps we can begin to rethink this paradigm with an easy agreement, a historical and social concordance that dates back generations, beliefs, and cultures, regardless of their occidental technological standards or the complexity of their folklore. An iconoclastic strategy dressed up in ocularcentrism. The belief all users of the network assume to be true when looking at a digital screen, for example while reading this text: the idea that an image is real. With this thought in mind, we can presume that, to trace new links of coexistence for ourselves, we need to acknowledge the sovereignty of images as shape-shifting beings rather than resources. We have to bond with images in order to nurture different connections between ourselves in all the realms we inhabit.
This text is one of those selected through our open call.
Biased algorithms are the foundation of machine learning. They drive intelligent machines to make decisions by learning from the different types of biases most commonly found in datasets, reproducing, for example, historical data or stereotypes that already exist in society according to hegemony.
Deepfakes are synthetic media that replace a person’s likeness in a source image or video with someone else’s, for either deceitful or entertainment purposes. The term deepfake was coined in 2017 by Reddit users in subforums dedicated to implementing neural networks to enhance computer vision algorithms, capable of generating pornographic content using the likeness of famous persons, as well as memes and other forms of entertainment media. The highly deceptive nature of this technology poses a challenge to the future of digital communication and politics.
Steven Poole, “Before Trump: the real history of fake news,” The Guardian, November 22, 2019
The Electronic Entertainment Expo, also known as E3, is a three-day world-premiere event in the city of Los Angeles for computer and video games and related products.
The exploitation of the overabundance of data is a tactic widely employed by ready-made truth militias, but this strategy has actually long been in use by media monopolies; news conglomerates just redesigned anti-piracy technologies into social engineering algorithms. The music industry, for example, pioneered the method of content poisoning: the practice of seeding corrupted data onto file-sharing servers, polluting those networks in the hope of discouraging users from pirating files, and later preaching about the unsafe nature of these cyberspaces. Nowadays, news channels have a big interest in topics such as the regulation of fake news and every scandal concerning misinformation on the web, as social media became for news portals what The Pirate Bay is for the entertainment industries.
It is important to notice that confirmation biases are also responsible for the idea that only the ‘other’ employs such tactics of false information; for example, the flat-earther narrative that images of outer space are crafted by governmental space programs such as NASA. But regardless of political or religious inclination, ready-made truth militias operate within a wide range of socio-virtual groups, including the ones the reader might feel affiliated to: from teenagers indoctrinated by extremist right-wing Twitch streamers to leftist communist online book clubs. I believe the question lies more within a praxis of confirmation around biased information than in a moral or ethical dilemma. Take, for example, biased news from both left- and right-wing political camps in Brazil.
It is worth mentioning that, in the vast networks that leak ready-made truths, there are numerous agents, both physical and virtual, that have crucial roles in spreading (mis)information. What can be traced back to 4chan’s prank ready-made truth raids on Wikipedia pages has now become a whole subindustry of its own, with cases such as the Cambridge Analytica electoral scandal. Organizations specialized in farming data to spread ready-made truths have updated old political publicity strategies to better define their targets, but it’s important to notice the ongoing role of both younger and older user bases in these schemes, mostly because of both groups’ unawareness of how the technology they use harvests their imaginaries.
See more in Susan Sontag, Regarding The Pain Of Others (New York: Picador, 2003), p.43: “Not surprisingly, many of the canonical images of early war photography turn out to have been staged, or to have had their subjects tampered with. After reaching the much-shelled valley approaching Sebastopol in his horse-drawn darkroom, Fenton made two exposures from the same tripod position: in the first version of the celebrated photograph he was to call “The Valley of the Shadow of Death” (despite the title, it was not across this landscape that the Light Brigade made its doomed charge), the cannonballs are thick on the ground to the left of the road, but before taking the second picture — the one that is always reproduced — he oversaw the scattering of cannonballs on the road itself. A picture of a desolate site where a great deal of dying had indeed taken place, Beato’s image of the devastated Sikandarbagh Palace involved a more thorough arrangement of its subject, and was one of the first photographic depictions of the horrific in war.”