Hope, Change, and Disinformation: Some "Realistic" Concerns
Information and its Discontents: Part One
I
A panic over the enemies of information continues to escalate with the profusion of new technologies and their networking. One could assert that this panic emanates from concerns over how pervasive and damaging these enemies can become, whether with regard to hate, public health, the economy, electoral integrity, etc. While this can be the case, this thinking does not go deep enough. That is because those involved in this panic only look at the content and not the bias of communication.
It is not enough to speak of how the social transmission of content has changed. What we need to do is contemplate how those changes also engender the transformation of the social itself. This transformation has taken place through the dynamics of surveillance capitalism.1 Information, misinformation, disinformation, and malinformation are more valuable and embody more threat through the behavioral surplus, predictive knowledge, and prediction products that foster the guaranteed outcomes at the heart of surveillance capitalism than they do on their own. The panic over information and its enemies is misplaced when the total ubiquity of surveillance capitalism instrumentalizes panic, information, and its enemies.
This content-oriented perspective is best embodied by Barack Obama. While he is not the only player in this panic, nor the most important and powerful (since there are elected and unelected people attempting to deal with it while in possession of greater powers, not to mention the businesspeople leading the companies in question), he is nonetheless one of the major public figures investing a lot of energy in this fight. He says things like, “So much of the conversation around disinformation is focused on what people post; the bigger issue is what content these platforms promote,”2 that “one of the biggest reasons for democracies weakening is the profound change that’s taking place in how we communicate and consume information,”3 and that “disinformation is the single biggest threat to democracy.”4 Statements like these are misguided and hypocritical.5
II
Obama’s statements are misguided because they show concern only about the content that platforms promote, not about promotion itself. If he had instead said, “The bigger issue is not just how content is produced and promoted, but the fact that we have runaway promotion in the first place,” then we could take him seriously. We could also take him seriously if he had expanded his concerns about “the demand for crazy on the internet that we have to grapple with”6 to something like “grappling with what is fueling and shaping the supply and demand for crazy and non-crazy alike (along with what is considered crazy and non-crazy) amidst producers and consumers, how all are being shaped, and why.”
Why should we take these alternatives more seriously? Because the dynamics of promotion bias the (re)production, reach, receptivity, and seductiveness of content itself—information, its enemies, and users—not to mention how and why people act on it. However, Obama did none of these things. His discourse merely greases the slippery slope of control and censorship. Censorship is an act of force, no matter how direct or distant. It does everyone a disservice if folks would rather pursue that course instead of dealing with the monopoly of knowledge that grounds this problem in the first place.7
There used to be an art of promotion,8 which was problematic on its own, but it has now become a “science” fueled and propagated by the accumulation of predictive knowledge via the mechanisms of surveillance capitalism. It is this logic of the incessant pursuit of behavioral surplus—the accumulation of analyzable exhaust from all behaviors to predict, shape, and reinforce future behaviors—that has created a promotion machine, obsolescing the art and extending the science that grounds this panic.
Part of the ubiquity of the logic of surveillance capitalism is the incessant overreach and overextension of predictive knowledge and prediction products. However, “predict” in these cases is an incomplete term to throw around because knowledge itself has been intertwined with prediction for ages. “Predict” in terms like “predictive knowledge” and “prediction products” now means more than just really good guessing. What they now entail is making certain something takes place (and continues to), along with the data gathering that expands and continues those processes. Beyond a totalization of “what” happened, these prediction processes ceaselessly attempt to include the “why” of causal knowledge, for prediction is useless without it. Zuboff tells us:
In the absence of causal knowledge, even the best predictions are only extrapolated from the past.
The result of this conundrum is that the last crucial element in the construction of high-quality prediction products—i.e., those that approximate guaranteed outcomes—depends upon causal knowledge. As [Hal] Varian says, “If you really want to understand causality, you have to run experiments. And if you run experiments continuously, you can continuously improve your system.”
Because the “system” is intended to produce predictions, “continuously improving the system” means closing the gap between prediction and observation in order to approximate certainty. In an analog world, such ambitions would be far too expensive to be practical, but Varian observes that in the realm of the internet, “experimentation can be entirely automated.”9
This feeds into the instrumentarian power wrought from radical behaviorism, for it operates through intervention since it is defined as:
the instrumentation and instrumentalization of human behavior for the purposes of modification, prediction, monetization, and control. In this formulation, “instrumentation” refers to the ubiquitous, sensate, computational, actuating global architecture that renders, monitors, computes, and modifies, replacing the engineering of souls with the engineering of behavior.10
It is these dynamics—this science of promotion, this science of prediction—that bias the generation and propagation of information, its enemies, the reactions to them, the reactions to those reactions, and so on.
The exposition above is provided to help hammer home another point of misguidance, which centers on the bias of information itself: that information is supposed to carry all manner of positive connotations, like “good,” “accurate,” etc. While this can be true enough, through the logic of surveillance capitalism, information has nothing to do with these positive connotations anymore when it is used to create and fuel a system seeking to (re)produce guaranteed outcomes.
In the incessant pursuit of guaranteed outcomes, the system is indifferent to whatever desired outcomes are (re)produced, crazy and non-crazy alike, as long as it extends the system and its mechanisms of improvement. The outcomes and their guarantees take primacy over the information, disinformation, misinformation, and malinformation that can be inputs and outputs to these processes. These could be considered good, bad, “evil,” etc. Nevertheless, in themselves, they do not matter.11 They matter only in the creation, accumulation, and refinement of behavioral surplus, predictive knowledge, and the growing certainty of outcomes; nothing more, nothing less. Thus, while folks like Obama think the world can be navigated authentically through information management, the fact is that it has actually been replaced by outcome management.
Panic over information and its enemies has arisen not because of a faulty configuration/balance of information and its enemies, or their acceleration by AI. Panic is an outcome.12 What we deem to be a faulty configuration/balance is an outcome, as are the judgments of this by humans and AI, along with their subsequent measures. This panic and its contents drive engagement and seduce behaviors to be studied, calculated, and formulated into predictable behaviors that shape even more refined engagement, not to mention more of it. Whatever drives engagement will be promoted because it improves the promotion machine and the drive for total certainty on which this is all grounded.
No matter the rhetoric of fighting the enemies of information and “fixing the internet,” we are still caught up in the incessantly invasive pursuit of causal knowledge. This leads to another instance of misguidance from Obama: there has been zero talk about doing anything about the drive for total certainty that the promotion machine is based on. This drive denies a person’s right to privacy and their right to the future tense.
Recall Zuboff’s quoting of Hal Varian above: there must always be ongoing experimentation in order to go beyond the past when attempting to approximate guaranteed outcomes—closing the gap between prediction and observation in the approximation of certainty—not to mention spurring its growth and further refinement. The extraction and prediction imperatives of surveillance capitalism—the pursuit of new sources of behavioral surplus along with the push for greater scale in supply operations and the expansion of economies of scope and action13 through ubiquitous computing—are impossible without this drive for total certainty and vice versa. These imperatives comb the desert of existence, accumulating everything hidden beneath (yourself and/or others) and protruding out of the sand.
Just because some safety measures and accountability might get included through regulation and industry standards does not mean that these dynamics will cease, especially for the sake of privacy. After all, won’t the higher-quality datasets fed into these systems require better surveillance, better methods of extraction? I mean, where and how are they going to get it otherwise? Asking nicely? No, because these companies have always shrouded their methods in secrecy from the people and government because of how creepily invasive their practices have always been (not to mention how jealous the state would get), and thus want as little transparency as possible. Financial compensation? No, because the surveillance capitalist model is impossible without theft and, on its own, your data is not worth much until used to build something of greater value.14 And just because we have scandal after scandal over leaks, sharing, theft, and backroom deals does not mean data and information are going to be treated like oil where we can worry about drainage.15 That is because drainage is obsolete. The economies of scale, scope, and action that surveillance capitalism propagates are far beyond any comparison that could be made with oil.
Apropos one’s right to the future tense, its commandeering is itself a kind of censorship and is far more insidious, not to mention it is only possible through abolishing the right to privacy. What is this right to the future tense? Zuboff says, “it accounts for the individual’s ability to imagine, intend, promise, and construct a future. It is an essential condition of free will and, more poignantly, of the inner resources from which we draw the will to will.”16 Also:
Our freedom flourishes only as we steadily will ourselves to close the gap between making promises and keeping them. Implicit in this action is an assertion that through my will I can influence the future. It does not imply total authority over the future, of course, only over my piece of it. In this way, the assertion of freedom of will also asserts the right to the future tense as a condition of a fully human life.17
Uncertainty is the mark of freedom. Yet, through the drive for total certainty, your thoughts, words, abilities, and rights to overcome uncertainty have been commandeered, whatever they might be.
If you believe in free will and those “inner resources,” they have been commandeered. We used to have those things we called promises that were willed attempts to turn predictions into facts, to transform uncertainty into a completed project.18 However, surveillance capitalism pulls the rug out from under this transformative work because its predictive processes are actually fortune-telling processes in the pursuit of total certainty. Here, there is no influencing of the future, only the playing out of programs. You have no authority, merely the certainty of interfacing. It is not you who have the piece; you are merely given the piece you were highly likely to interface with. It is not the assertion of freedom of the will; it is the foretold operation of an instrument. It is not the condition of a fully human life; it is the human and its life fully conditioned, fully predictable, and totally certain.
If you do not believe in free will, whatever makes the will unfree has been commandeered. Those promises that are not really our own (whatever they are) are unfreely willed attempts to turn predictions into facts, to transform uncertainties into a completed project, where surveillance capitalism still performs the same work for the same result, that being the human and its life fully conditioned, fully predictable, and totally certain.19
In the commandeering of the future tense, no matter its manifestations, other futures are censored along with the abilities to imagine, intend, promise, and construct them (which the surveillance capitalists fall victim to themselves). This does not mean belief has been extinguished, that we are witless automatons and should thus ignore people’s lived experiences, but that belief is simply out of our hands. Whether or not you are walking hand in hand with what has commandeered belief, these dynamics entail the diversion and warping of thought just as much as they entail the crystallization and extension of it.
People want to believe in their abilities to make promises to old ways of life, to make the world a better place, to mind their own business and do their own thing, to cause a fuss, etc. With regard to the culture wars, people want to believe in their abilities to make promises to retrieve, revamp, and restore archaically decadent hierarchies (most notably the theocratic and fascist ones), to combat these hierarchies, to build up and/or exit the vampire castle, to lay siege to its fortifications, or to attempt to stay neutral. And I want to believe in my abilities to fulfill my promise of writing this series. But these promises are no longer their own, no matter how strong the beliefs behind them are, and we can say the same about the very keeping of those promises. Indeed, they have become integrated into choice architectures.20 They all operate in an instrumentarian dream so effective that it comes off as a lucid one. That is the magic of the promotion machine.
Because of this, we can forget about those imagined communities that Benedict Anderson talked about regarding nationalism and the sovereignty of its spirit, where the community “is imagined because the members of even the smallest nation will never know most of their fellow-members, meet them, or even hear of them, yet in the minds of each lives the image of their communion,”21 and the community is imagined as such because “regardless of the actual inequality and exploitation that may prevail in each, the nation is always conceived as a deep, horizontal comradeship.”22 These are now promoted/guaranteed communities, for the media forms that have biased communities and imaginations alike have been commandeered by surveillance capitalism.
The members of whatever kind of community mediated by the logic of surveillance capitalism live through dynamics where not only will they not truly know each other, meet each other, or hear of each other, but what they even know has been commandeered, and so they will not know the algorithmic workings of the community and the imagination. Indeed, the image of “their” “communion” is not their own. The imagination of community, regardless of the actual inequality, exploitation, toxicity, indifference, parasociality, and so on, that may prevail in each, is always engineered as a deep, guaranteed coherence, no matter the different variables integrated.23 Anderson expertly put it regarding nationalism: “Ultimately it is this fraternity that makes it possible, over the past two centuries, for so many millions of people, not so much to kill, as willingly to die for such limited imaginings.”24 We can thus say that the guaranteed coherence that has replaced fraternity makes possible not only all of the above but also so much more and less, through such engineered, such instrumentarian imaginings, whether limited or not, big-tent or small-tent, and so on. This is what now biases imagination and community. What becomes of these promoted/guaranteed communities? Distraction, maintenance, extension, and, if necessary, neutralization. And the sovereignty originally derived from the imagination of nationalism via the Enlightenment and Revolution?25 It is now the sovereignty of a future tense that belongs only to the logic of surveillance capitalism.
Maybe we end up accepting the automated hand of the behavioral futures market26 because our beliefs, imagination, wills, and the ideas behind them tend to wane for a litany of reasons we already have little control over. After all, in peak moments of doubt, trauma, exhaustion, and so on, people have fallen on their knees to ask God for a sign, any sign. Though, in this case, God has been replaced by the Big Other of instrumentarian power—“the sensate, computational, connected puppet that renders, monitors, computes, and modifies human behavior”27—that predicts a will, any will, that wills a guaranteed outcome.
These are some of the dynamics fueling and shaping the supply and demand for crazy and non-crazy alike amidst producers and consumers, how all are being shaped, and why.
In focusing on the enemies of information, we end up overlooking what grounds the problem: “reality” playing out through instrumentarian power. It does not matter who you are or what you are a part of; everything is a part of it, including attempts at absence/evasion. When W.H. Auden told Marshall McLuhan, “I don’t have a TV and wouldn’t dream of having one,” McLuhan replied, “You merely suffer the consequences of TV without enjoying it.”28 This is no different with surveillance capitalism and the types of media biased by it. You might not participate in all of this, but the world does, and you suffer the consequences all the same.
Feel no shame if you do not notice these things, for McLuhan tells us, “One thing about which fish know exactly nothing is water, since they have no anti-environment which would enable them to perceive the element they live in.”29 Indeed, it is an arduous struggle, especially if the anti-environment you find refuge in was recommended by the environment itself.
The war on the enemies of information is merely an attack on symptoms. Again, this is not to deny the problems of these symptoms and the damages they incur, but to say that the instrumentalization of the world is the greater danger. To assume otherwise is to be misguided, or, better yet, to be guided, nudged, herded, and so on, to assume otherwise….
III
The hypocrisy of Obama’s statements lies in the fact that he has been another helping hand in bringing these problems concerning disinformation and the like upon himself, the nation, and the world. He was right there in the middle of things with surveillance capitalists helping him achieve victory for the presidency; just as it was later with Trump and Biden; just as it was in propelling most of their respective opponents into presidential relevancy; just as it was with everybody else everywhere else in electoral politics who hopped on the bandwagon; and just as it was regarding the kinds of content and the metrics behind them that political and social media platforms would seize onto for dear life and propagate.
It is not much of a stretch to assume why he did not see disinformation as a huge problem30 before and during most of his time as president. The content he now complains about emerged through tools that built, curated, tested, and distributed it for “good” reasons: the “good” of efficiency, the “good” of the nation, the “good” of the market, his campaigns, all the while growing the revolving door between government and Big Tech.31 Indeed, Obama might have been speaking tongue-in-cheek to the students of Stanford University when he said:
I might never have been elected president if it hadn’t been for websites like, and I’m dating myself, MySpace, MeetUp and Facebook that allowed an army of young volunteers to organize, raise money, spread our message. That’s what elected me.32
That is because he’s leaving out the fact that these companies not only helped him find volunteers but, most importantly, the voters themselves. By Obama’s own reasoning, it was the logic of surveillance capitalism that actually got him elected. On their own, volunteers, raised money, voters, and messages pale in relevance compared to the form of media that ends up biasing them all. Zuboff tells us:
First, Google demonstrated that the same predictive knowledge derived from behavioral surplus that had made the surveillance capitalists wealthy could also help candidates win elections. To make the point, Google was ready to apply its magic to the red-hot core of twenty-first-century campaigning, beginning with the 2008 Obama presidential campaign. [Eric] Schmidt had a leading role in organizing teams and guiding the implementation of cutting-edge data strategies that would eclipse the traditional political arts with the science of behavioral prediction. Indeed, “At Obama’s Chicago headquarters... they remodeled the electorate in every battleground state each weekend... Field staff could see the events’ impact on the projected behaviors and beliefs of every voter nationwide.”
Research by media scholars Daniel Kreiss and Philip Howard indicates that the 2008 Obama campaign compiled significant data on more than 250 million Americans, including “a vast array of online behavioral and relational data collected from use of the campaign’s web site and third-party social media sites such as Facebook...” Journalist Sasha Issenberg, who documented these developments in his book The Victory Lab, quotes one of Obama’s 2008 political consultants who likened predictive modelling to the tools of a fortune-teller: “We know who... people were going to vote for before they decided.”
…
Schmidt’s role in President Obama’s election was but one chapter in a long, and by now fabled, relationship that some have described as a “love affair.” Not surprisingly, Schmidt took on an even more prominent role in the 2012 reelection campaign. He led in fundraising and in breaking new technical ground, and he “personally oversaw the voter-turnout system on election night.”
Political correspondent Jim Rutenberg’s New York Times account of the data scientists’ seminal role in the 2012 Obama victory offers a vivid picture of the capture and analysis of behavioral surplus as a political methodology. The campaign knew “every single wavering voter in the country it needed to persuade to vote for Obama, by name, address, race, sex, and income,” and it had figured out how to target its television ads to these individuals. One breakthrough was the “persuasion score” that identified how easily each undecided voter could be persuaded to vote for the Democratic candidate.33
The facts of behavioral surplus and its predictive power were kept top secret in the Obama campaigns, just as they are in Google, Facebook, and other domains of information dominance. As Rutenberg observed, “The extent to which the campaign used the newest tech tools to look into people’s lives and the sheer amount of personal data its vast servers were crunching remained largely shrouded. The secrecy … was partly … to maintain their competitive edge. But it was also no doubt because they worried that practices like ‘data mining’ and ‘analytics’ could make voters uncomfortable.”34
Yes, forget about platforms providing a space so like-minded folks could find each other on their own, connect, and help spread the message of “Hope” and “Change” (which, like “Make America Great Again,” might as well be considered misinformation and disinformation). The point is that these folks knew from the start whom to spread that message to—those predicted to make a specific choice in line with their goals—before their targets had any idea about anything, all while those targets were algorithmically encouraged along, and where every significant moment observed helps regenerate the predictive models toward a greater degree of refinement, a greater certainty of a particular outcome. Obviously, this is not mind control, yet its efficacy has been nothing short of substantial, especially when we consider that the competition of Obama and the Democrats, domestic and foreign alike, has done the same with these tools, with great, if not greater, success to boot.
Obama was most certainly correct when he spoke of how “Change does not come from Washington. Change comes to Washington.” That is because it came through companies like Google. It was never the outsider, the person “shaking things up,” as Obama was billed to be.35 Nor was it you, the voter. Apropos Obama’s connection with voters in 2008 and 2012, he didn’t build that.
If you take the time to go through Obama’s speech at Stanford beyond what has been reproduced here, you will find zero talk about the problem of prediction itself apropos the pursuits of behavioral surplus and guaranteed outcomes. In fact, the word “predict” only comes up once, when he says:
Algorithms have evolved to the point where nobody on the outside of these companies can accurately predict what they’ll do, unless they’re really sophisticated and spend a lot of time tracking it. And sometimes, even the people who build them aren’t sure. That’s a problem.36
But what has been characterized here is only a problem of folks not knowing how algorithms do what they do. While it is a problem, it is not the real problem. The real problem is that they are doing it in the first place. Companies and governments can know what different people like and do not like, thus being able to predict what content they will or won’t like, not to mention what people will and won’t do and can be nudged, tuned, herded, manipulated, and modified to do or not do, regardless of their interests, problematic or not. Is this itself not problematic?
However the algorithms work matters less than the fact that they do something, whether that be what they are supposed to actually do (which is the supposed desire of these players in this panic, which is sinister37 on its own), or that they perform operations that seem to be “good enough” but take place through discriminatory and inefficient biases in the reproduction and extension of decadent and problematic social relations.38 When Obama and others pursue knowledge about the black box of algorithms and their predictive abilities, it is also their pursuit to use it themselves (directly or by proxy), where the selective censoring of its products is only an extension of the central concern that is (the illusion of) being in control.
Obama’s concerns for protecting democracy are milquetoast because, again, he does not want to touch the promotion machine, merely some of its outputs. He told CNN’s Christiane Amanpour that democracy:
depends on the engagement of citizens and an active mobilization of people around the belief, not just in any particular issue, but the belief in self-governance and rule of law and independent judiciary and a free press. All of the civic institutions that go into making a democracy work. And it is indisputable that a combination of forces had put enormous strains on democracy and that we've seen a backlash against democratic ideals around the world. It’s not unique to any one place.39
However, he failed to tell her and the audience that the instrumentarian power of surveillance capitalism he invited to run amok is the force biasing the combinations straining democracy, not to mention promoting the backlash against it and its ideals. Indeed, it annexes any notion of mobilization since people and issues engage through its algorithmic auspices. It annexes the notions of “self” and “governance” in self-governance, since the radical behaviorists of surveillance capitalism want to erase the risk, messiness, and friction that come with both, thereby essentially avoiding politics. It annexes the rule of law because the law is not recognized by a new territory and its mapping, something akin to digital colonialism. It annexes the notion of a free press, since that had first been commandeered and stripped down by the profit motive through neoliberalism, and even more so now through the click motive, whose grounding is doubled down upon through algorithmic validation that subordinates the reporting of facts and the investigation of the world at large to it. And it annexes the institutions it deems slow, cumbersome, ineffective, and essentially obsolete.40 Indisputable indeed.
Obama told us that in order to strengthen democracy:
we’ll have to come up with new models for a more inclusive, equitable capitalism. We’ll have to reform our political institutions in ways that allow people to be heard and give them real agency. We’ll have to tell better stories about ourselves and how we can live together, despite our differences.41
Yet he does not realize that our political institutions have already been reformed through his own open arms to the logic of surveillance capitalism. There is no more “we” when “we” have been commandeered by a system that finds and makes us to be predictable, all too predictable. “We” are equitable as human capital, but now, more importantly, as behavioral surplus. Yes, no more “differences,” only calculable variables. No more “living together,” only the flows of social physics. No more “real agency” and “stories about ourselves,” only the destinies of operations.
IV
Is free speech even free anymore when people and platforms are governed through predictability and instrumentality rather than what (they think) they think themselves, however skewed and biased that may be? Doesn’t this predictability and instrumentality take priority over the combinations and contexts of phonemes, whether truthful or dishonest, benign or toxic, harmless or hateful? Moreover, doesn’t this predictability and instrumentality take priority over the forms of media in which the above takes place? And, even if one feels immune to these dynamics, is one not still contending with them?42 We must remember Harold Innis when he told us:
Freedom of the press and freedom of speech have been possible largely because they have permitted the production of words on an unprecedented scale and have made them powerless. Oral and printed words have been harnessed to the enormous demands of modern industrialism and in advertising have been made to find markets for goods.43
Whether they seduce censorship or not, all words have been harnessed to the enormous demands of (re)producing guaranteed outcomes. The (re)generation of all words via the promotion machine, in which their original powers (whatever they are) are liquidated and instrumentalized by it, has them operate to find behavioral futures markets, to find kernels of engagement, to find futures and work towards their taking place, along with the continued refining of certainty of their taking place at the expense of others and their right to the future tense.
This is exactly what Innis did not want to happen in the world. Innis scholar John Bonnett tells us that the great concern for Innis during his media studies was the problem of information management and that cultures that failed at this task would themselves fail.44 Bonnett echoes Innis, saying:
When cultures lost control of the information circulating within them, they invariably became rigid or unstable in their thinking, and often turned to violence prior to their collapse.45
This loss of control of information—associated with positive feedback and increasing returns46—pertains to information overload and its deleterious effects. Ironically enough, however, figures like Obama do not recognize this problem and wish to ramp up the production of information even more in order to subdue its enemies, or simply to censor them. The use of force via censorship will not be an effective method, for, even if deemed necessary, it will only be a sign of rigidity and instability.
While we must still pay heed to Innis’s concerns about information overload, another, and greater, problem has emerged: the responsibility of managing/overcoming overload is not even in our hands anymore. Sure, our hands are involved, but this is merely in the sense of efficient cause along with the absence of any environmental awareness (formal cause) and human purpose (final cause). All of this now belongs to the logic of surveillance capitalism. Indeed, even the all-important oral tradition has been commandeered. The oral tradition being:
that human expressive instruments could be used to conceive, and then conceive anew. It was a doctrine that suggested that expressive instruments and expressive forms were entities that – when properly applied – could serve to enhance the cognitive flexibility and agency of their users. When it came to the assumption of new forms of thought, Innis argued that it was humans who were in charge.47
Of course, we are no longer in charge, no matter how involved we are, since cognitive flexibility and agency have been obsolesced for the sake of guaranteed outcomes. The irony here is that surveillance capitalism will suffer similar consequences of information glut coupled with rigidity and instability, through its monopoly of knowledge for guaranteed outcomes, both for itself and for those objects of the Internet of Things we call humans.
This is the “free,”48 robust, programmed, and polarized exchange of ideas that Obama and many others have helped usher in. This is “the profound change that’s taking place in how we communicate and consume information.”49 It has replaced the remnants of a democratic society in favor of an instrumentarian one, where the only “better outcomes” for an instrumentarian society are those that extend its dynamics.
Because people focus on the user and not the medium, we fail to ask why speech of all kinds has been usurped and how the power of words and the agencies of their utterers have been commandeered. After all, harmful and hateful perspectives and content have always existed, but no one is asking how we are to combat a system that is unprecedented in its efficacy at shaping them, curating them, fomenting them, distributing them, and, most of all, making them predictably effective, a system that includes our very selves. Good speech,50 bad speech, and their speakers are only content biased by a form that works through instrumentality and predictability, and the same goes for the money funding everything.
Some critical concerns about the discourse in this panic have centered on the fear of interventions that will impinge on legality. We should take heed of them. The problem of legality comes into play with the desire for a state of emergency, where legality ends up being thrown to the wayside for the sake of security. Whatever threatens the state tends to threaten security (which typically includes whatever threatens the accumulation of capital), and when segments of authority come into conflict with one another and/or themselves, major problems proliferate.51 Regarding information and its enemies, if these segments of authority cannot communicate internally and with other segments because of the signal interference produced by contrary content and/or information overload, then their abilities become neutralized, ineffective, inefficient, and so on, as does security.
This is why this panic is so trendy when discussing situations ripe for a state of exception, like the pandemic, January 6th, and the Freedom Convoy: they are case studies in how such situations, and the enemies of information, were originally handled, and in how similar situations “ought” to be handled in the future. Of course, the concern over the enemies of information matters little in the face of the promotion machine itself, because that is the all-important factor. People doing problematic things based on falsehoods is nothing new; what is novel is that the promotion machine can accelerate and expand those actions and their effects. This has a destabilizing effect on security structures and is deeply problematic because it also interferes with dealing with the respective issues people were exacerbating, thereby exacerbating things even more. It will not be the profusion of information and its enemies that inspires a state of exception, but the promotion machine of surveillance capitalism.
The pandemic (whatever its origin story) saw incessant, unconstructive bickering, piss-poor dispersal of funds, flimsy implementation of many measures alongside the overkill of others, and the outright refusal of effective measures at the cost of thousands upon thousands of lives. By promoting nonsense and shaping discourse away from the safety of the state, the promotion machine jeopardized security for those who took the pandemic seriously, since they wished to avert disaster and ensure public safety, combat opposing views, and get back to normal in order to get back to business. Yet, in combating opposing views, the forces for public safety did not help their cause by telling white lies and dismissing and attacking legitimate concerns, which fueled greater opposition and distrust, thereby promoting more nonsense. At the same time, those who did not take the pandemic seriously wanted to go on with business as usual, felt their particular kind of security was jeopardized by opposing views, and went out of their way to dismiss the facts and punish those who stood up for them, while needless deaths and preventable long-term health complications took place. Whether one wanted to prevent needless death and overall destabilization or not wreck the vibe of freedom, the hammer came down on the opposition. Will this not inspire more severe measures if something similar happens, whoever happens to be in power?
January 6th was a harvest of misinformation and disinformation from The Big Lie. It had insurrectionist law enforcement officers and politicians leaking information to other insurrectionists (not to mention insurrectionists being given tours by politicians beforehand and being let in on the day of), while information that poured into various law enforcement agencies about the impending date did not flow adequately and/or was outright ignored. This is not just about run-of-the-mill bureaucratic neglect and stiffness, which did take place, but the fact that these agencies themselves have members who ended up sympathetic enough to the opposition to forego their mandates due to the long-accumulating effects of the promotion machine. This continues to jeopardize security as justice and denial continue to butt heads. Not only this, but there are consequences regarding capital: January 6th was cited as a significant influence on the downgrading of US debt.52 Because of problems like these, will we not have more issues regarding security before, during, and after elections, along with power in America?
The Freedom Convoy gathered steam through the promotion machine and, like January 6th, had many levels of law enforcement unable to communicate because of similar effects of contrariness, overload, lethargy, mishandling, and sympathy in their ranks. The slow mobilization and action of law enforcement led to the first-ever invocation of Canada’s Emergencies Act, which probably would not have been necessary had they simply done their jobs in the first place, even with reduced numbers and leadership. Beyond the actions of the Convoy, which was left to do its thing and subsequently make the case for the Act’s invocation, the real emergency was the decrepit organization and syncing of the security structure. Security was jeopardized (and, more importantly, billions of dollars were lost); what else did you expect to happen?
These examples display the consequences of out-of-sync security structures, whether or not you like who attempts to control them, and why people fantasize about or fear a state of exception. While the logic of surveillance capitalism did not plot and organize these examples, obviously, it did make everything about them accessible, organizable, realizable, plottable, fundable, and predictable, with a reach that affects all, including security structures. The runaway effects of the promotion machine desynced security in recommending unsafe health strategies; in recommending insurrection (i.e. fascism); in recommending the Freedom Convoy; in recommending the discourses that promote them and lend sympathy and ideological “coherence” to them; in recommending different supportive attitudes towards anti-democratic measures in combating them (like attacking freedom of speech); in recommending folks who are already on the respective algorithmic hooks for the above and below; in recommending apathetic distractions; and in recommending the liquidation of the world and society via neoliberalism that primes people to eventually despise the system (in whatever recommended ways) and become susceptible to the sorts of examples found above in the first place. All of this is going to keep happening, and we will be recommended to do things that subsequently recommend embracing a state of exception.
V
The deluge of problematic content is not the scandal (though please continue to have concerns over hateful/harmful content, invasions of privacy, the evisceration of your civil liberties, and no doubt many more). The real scandal is that the unimpeded logic of surveillance capitalism and its technological crusade of inevitability created and fueled the promotion machine—an automated prediction machine—that Obama and every other player in this panic do not seem to wish to do anything about.
When McLuhan told us that we shape our tools and then those tools shape us, he pointed to where our concerns must lie: the tool that is the promotion machine and the tools that built it. These are what shape us and the content that we and these tools both consume. We operate in feedback loops that shape not only us but also virality and our penchant for virality, and the same goes for toxic content, counter-content, and so much more.
If a realistic perspective is to be pushed, then surveillance capitalism is the biggest threat to democracy, not the enemies of information.
Surveillance capitalism is “A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales,” where a new global architecture of behavioral modification takes the stage from the production of goods and services, and whose new collective order hinges upon total certainty.
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (The Definition). PublicAffairs.
This is all the more ironic because he has read about surveillance capitalism, having put Zuboff’s book on his list of favorite books in 2019. I will not pretend he did not read it, but, based on his discourse, I am of the opinion that he has not read it very well.
A part of this monopoly of knowledge is the set of insane knowledge asymmetries detailed by Zuboff. An initial description is below:
Surveillance capitalism operates through unprecedented asymmetries in knowledge and the power that accrues to knowledge. Surveillance capitalists know everything about us, whereas their operations are designed to be unknowable to us. They accumulate vast domains of new knowledge from us, but not for us. They predict our futures for the sake of others’ gain, not ours. As long as surveillance capitalism and its behavioral futures markets are allowed to thrive, ownership of the new means of behavioral modification eclipses ownership of the means of production as the fountainhead of capitalist wealth and power in the twenty-first century.
The Age of Surveillance Capitalism, p. 11.
Think of Edward Bernays, the golden age of Madison Avenue, what Chomsky and Herman cover in Manufacturing Consent, and more.
The Age of Surveillance Capitalism, p. 298.
The algorithms will end up predicting what will be perceived as good, bad, “evil,” and the like anyway. And let us not forget to mention what those concepts “are” as well.
Besides, real panic would not be called panic; it would be called terror.
Economies of scope involving:
a new set of aims: behavioral surplus must be vast, but it must also be varied. These variations are developed along two dimensions. The first is the extension of extraction operations from the virtual world into the “real” world, where we actually live our actual lives. Surveillance capitalists understood that their future wealth would depend upon new supply routes that extend to real life on the roads, among the trees, throughout the cities. Extension wants your bloodstream and your bed, your breakfast conversation, your commute, your run, your refrigerator, your parking space, your living room.
Economies of scope also proceed along a second dimension: depth. The drive for economies of scope in the depth dimension is even more audacious. The idea here is that highly predictive, and therefore highly lucrative, behavioral surplus would be plumbed from intimate patterns of the self. These supply operations are aimed at your personality, moods, and emotions, your lies and vulnerabilities. Every level of intimacy would have to be automatically captured and flattened into a tidal flow of data points for the factory conveyor belts that proceed towards manufactured certainty.
The Age of Surveillance Capitalism, p. 201.
Economies of action where:
Behavioral surplus must be vast and varied, but the surest way to predict behavior is to intervene at its source and shape it. The processes invented to achieve this goal are what I call economies of action. In order to achieve these economies, machine processes are configured to intervene in the state of play in the real world among real people and things. These interventions are designed to enhance certainty by doing things: they nudge, tune, herd, manipulate, and modify behavior in specific directions by executing actions as subtle as inserting a specific phrase into your Facebook news feed, timing the appearance of a BUY button on your phone, or shutting down your car engine when an insurance payment is late.
Ibid., p. 203.
And if there were to be a movement towards the eminent domain of data, we would then have to ask about what these public purposes are and how they will be managed. We must also express skepticism about whether the just compensation will even be fair. If there is a public-private partnership, will there be adequate oversight and communication? Also, even with these sorts of arrangements and their attempts at oversight, what ensures that either of them will handle data less like drunken jugglers running up and down escalators than they already do? Of course, all of these concerns still gloss over the primary problem, which is the drive for total certainty.
Below is an excerpt from the Wikipedia entry for Paul Thomas Anderson’s film, There Will Be Blood, regarding oil drainage. Or, if one prefers a video, here is the scene at the end of the film that explains it.
[Paul Thomas] Anderson said that the line in the final scene, "I drink your milkshake!", was paraphrased from a quote by former Secretary of the Interior and U.S. Senator from New Mexico, Albert Fall, speaking before a Congressional investigation into the 1920s oil-related Teapot Dome scandal. Anderson said he was fascinated "to see that word [milkshake] among all this official testimony and terminology" to explain the complicated process of oil drainage. In 2013, an independent attempt to locate the statement in Fall's testimony proved unsuccessful—an article published in the Case Western Reserve Law Review suggested that the actual source of the paraphrased quote may instead have been remarks in 2003 by Sen. Pete Domenici of New Mexico during a debate over drilling in the Arctic National Wildlife Refuge. In those remarks, Domenici stated:
The oil is underground, and it is going to be drilled and come up. Here is a giant reservoir underground. Just like a curved straw, you put it underground and maneuver it, and the 'milk shake' is way over there, and your little child wants the milk shake, and they sit over here in their bedroom where they are feeling ill, and they just gobble it up from way down in the kitchen, where you don't even have to move the Mix Master that made the ice cream for them. You don't have to take it up to the bedroom. This describes the actual drilling that is taking place.
The Age of Surveillance Capitalism, p. 20.
Ibid., p. 332.
Ibid., p. 331.
And if you believe in neither, then necessity has been commandeered.
The term choice architecture refers to the ways in which situations are already structured to channel attention and shape action. In some cases these architectures are intentionally designed to elicit specific behavior, such as a classroom in which all the seats face the teacher or an online business that requires you to click through many obscure pages in order to opt out of its tracking cookies. The use of this term is another way of saying in behaviorist language that social situations are always already thick with tuning interventions, most of which operate outside our awareness.
The Age of Surveillance Capitalism, p. 294.
Anderson, B. (2016). Imagined Communities: Reflections on the Origin and Spread of Nationalism (p. 6). Verso Books.
Ibid., p. 7.
One should recall the total confidence Steve Bannon had regarding “Bernie Bros” helping get Trump elected. His work through Cambridge Analytica is what grounded this confidence.
Steve Bannon: Bernie Sanders Fans Made Trump President ‘and They’ll Make Him President Again’
Imagined Communities, p. 7.
Ibid.
Which is “a new kind of market that trades exclusively in future behavior.”
The Age of Surveillance Capitalism, p. 96.
Ibid., p. 376.
McLuhan, M., & Fiore, Q. (1968). War and Peace in the Global Village: An Inventory of Some of the Current Spastic Situations that Could be Eliminated by More Feedforward (p. 175). McGraw-Hill.
Since:
In [Eric] Schmidt’s 2014 book, coauthored with longtime Google executive Jonathan Rosenberg, the CEO aggressively developed the theme of government as the shill of incumbents colluding to inhibit change, with Google on the outside: an upstart and a disrupter. The authors voiced their disdain for politicians and lobbyists, writing, “This is the natural path of politicians since incumbents tend to have a lot more money than disrupters and are quite expert in using it to bend the political will of any democratic government.”
The Age of Surveillance Capitalism, p. 124.
Schmidt, E., & Rosenberg, J. (2014). How Google Works (p. 255). Grand Central Publishing.
I wonder what kind of persuasion scores were calculated and extended for the guy who participated in the insurrection and was arrested in Obama’s neighborhood after his address was leaked by Trump…
Jan. 6 defendant arrested near Obama’s home had guns and 400 rounds of ammunition in his van
Trump posted what he said was Obama's address, prosecutors say. An armed man was soon arrested there
The Age of Surveillance Capitalism, pp. 122-124.
Which goes the same for fascists like Trump.
Imagine an alternative to the scene from the “Argestes” episode of HBO’s Succession, where Tom is informed by Greg of the negatives behind a new slogan of “ATN: We’re Listening,” because they literally are, and so ends up changing it to “ATN: We Here for You,” after messing up the desired play on the word “hear.” In the alternative, instead of a massive company like the fictitious Waystar-Royco, you have a very successful online therapy company (whose success is evident in actual positive outcomes, not just profit) telling you, “We Know What You’re Going Through.” Of course, you would love to hear this from empathetic therapists, but it is the company telling you that they actually know what you’re going through via the whole gamut of data-mining and prediction methods.
Ironically enough, when hypothetically inclined to change the slogan, they could use the alternative slogan from Succession as well. After all, the mask of therapeutic care covers over the algorithmic “we,” the “here” of the company’s all-knowing presence, the methods through which they are “for” “you,” where “you” are the predicted, if not guaranteed, therapeutic double.
See The Blueprint for an AI Bill of Rights and its examples of negative outcomes in policing, health, education, labor, etc.
Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People
Of course, if one is cynical/realistic, these notions were always corrupt and immune to any idealistic transformations. What differs now is that this corruption has become corrupted faster than it can be reined in by those very forces of corruption.
After all, everything spoken of here does not have to work on 100%, 50%, or even 25% of people. It can work on even fewer, and its effects would still be adequately exerted. If the “immune” are still interacting with these affected elements, no matter how small the percentage, then they are still a part of the process.
Innis, H. A. (2018). Political Economy in the Modern State (p. vii). University of Toronto Press.
Bonnett, J. (2013). Emergence and Empire: Innis, Complexity, and the Trajectory of History (p. 7). McGill-Queen’s University Press.
Ibid.
Ibid., p. 12.
Ibid., p. 187.
Free in the sense of being given away by the user and/or taken and shared by the companies (usually unbeknownst to the user, whether in toto or through the cynicism/indifference of not even bothering to care or know).
Where I will make the assumption that “good speech” can reflect a blend of truth and/or parrhesia: that is, speaking truth in the sense of Cartesian evidential experience and/or speaking truth at risk to oneself, where one must worry about good speech being drowned out by arbitrary chatter, falsehood, and dangerous forces.
The Meaning and Evolution of the Word “Parrhesia”: Discourse & Truth, Problematization of Parrhesia - Six lectures given by Michel Foucault at the University of California at Berkeley, Oct-Nov. 1983
Strange, S. (2015). States and Markets (pp. 51-53). Bloomsbury Publishing.