Communication as a toy.
Communication as a tool.
Communication as a weapon.
It can be difficult to use a medium that has not only presented and promoted stereotypes, but has for decades successfully sold stereotype as product. In our society, identity is a strong intoxicant. People wish to be given a program, a set of instructions with titles that tell them how they should behave.
Even those of us within more radical behavior spaces must guard against buying into programming that suggests how we should act and speak. The United States of America is replete with rebels without a cause. We applaud passionate blindness that speaks in Marx but steps in Rockefeller.
Given that our present media landscape is predominantly one of networked and programmed interfaces, I thought it necessary to return to some of Douglas Rushkoff’s writing. In his manifesto, “PROGRAM OR BE PROGRAMMED: Ten Commands for a Digital Age”, Rushkoff outlines ten rules to guide how we place trust in media in online spaces. Each rule in his outline is fleshed out with practical reasoning.
I quote at length here:
Do Not Be Always On
The spirit of the digital age still finds its expression in this reappropriation of time. Our cutting and pasting, mash-ups and remixes, satires and send-ups all originate in this ability to pause, reflect, and rework.
We work against the powerful bias of a timeless technology, and create a situation in which it is impossible to keep up. And so we sacrifice the thoughtfulness and deliberateness our digital media once offered for the false goal of immediacy—as if we really can exist in a state of perpetual standby.
The results aren’t pretty. Instead of becoming empowered and aware, we become frazzled and exhausted. We have no time to make considered responses, feeling instead obligated to reply to every incoming message on impulse. We reduce the length and complexity of our responses from paragraphs to sentences to txts, making almost everything we transmit sound like orders barked over a walkie-talkie in a war zone. Everything must happen right away or, better, now. There is no later. This works against the no-time bias of digital media, and so it works against us, even though it might work for the phone company programming the device and inducing our dependence and compliance. (Yes, each variety of beep is studied and tested for its ability to entrain our behavior.)
Yet this very discomfort and anxiety compels us to seek still more: The possibility of one great email from a friend, or one good contract offer somewhere down in that list of unanswered messages keeps us compulsively checking our inboxes, iPhones and BlackBerrys like classically conditioned gamblers at the slot machines.
It’s not the networking of the dendrites in our skulls that matters so much as how effective and happy we are living that way and, in the case of digital media, how purposefully we get ourselves there. Recognizing the biases of the technologies we bring into our lives is really the only way to stay aware of the ways we are changing in order to accommodate them, and to gauge whether we are happy with that arrangement. Rather than accepting each tool’s needs as a necessary compromise in our passively technologized lifestyles, we can instead exploit those very same leanings to make ourselves more human.
Our computers live in the ticks of the clock. We live in the big spaces between those ticks, when the time actually passes. By becoming “always on,” we surrender time to a technology that knows and needs no such thing.
Live in Person
While the intent of digital networks was not to disconnect a high school girl from her real world friendships, the bias of the networks was absolutely intended to favor decentralized activity. After all, the net was developed as a communications platform capable of withstanding nuclear attack. Messages—whether text, audio, or video—move through the network as “packets,” each taking different routes from node to node until they find their destination. The network is still controlled centrally by an authority (we’ll get to this later), but it functions in a decentralized way.
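Rushkoff's description of packets taking different routes and arriving out of order, yet still being reassembled at the destination, can be sketched in a few lines of Python. The function names here are my own illustration, not anything from the text:

```python
import random

def send(message, packet_size=4):
    """Split a message into numbered packets, as a packet-switched net would."""
    return [(i, message[i:i + packet_size])
            for i in range(0, len(message), packet_size)]

def network(packets):
    """Packets may take different routes, so they can arrive in any order."""
    shuffled = packets[:]
    random.shuffle(shuffled)
    return shuffled

def receive(packets):
    """Reassemble by sequence number, regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

msg = "no single route, no single point of failure"
assert receive(network(send(msg))) == msg
```

No node on the path needs the whole message, which is exactly the decentralized bias Rushkoff describes: delivery depends on the sequence numbers carried by the packets themselves, not on any one route.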
As a result, digital media are biased away from the local, and toward dislocation. Just as television is better at broadcasting a soccer game occurring on the other side of the world than it is at broadcasting the pillow talk of the person next to you in bed, the net is better at creating simulations and approximations of human interaction from a great distance than it is at fostering interactions between people in the same place.
Technology and media traditionally worked to make commerce more global, favoring big business over local interests. Mass production distanced workers from the value they were creating. Instead of making a product from beginning to end, each worker on an assembly line completed one small task in the overall process. The product moves from person to person—or even nation to nation—as it is assembled. Each person means less to the production cycle. One’s skill level becomes less important as repeatable processes replace craftsmanship and expertise. Workers become cheaper and replaceable, while corporate pricing power puts local companies out of business. Towns become ever more dependent on foreign-owned factories for employment.
Mass-produced products require mass marketing to sell them. Instead of buying oats from Bob the miller, people—now “consumers”—were to purchase them from a big company a thousand miles away in Ohio. The face of a Quaker on the package helped to re-create the kind of bond people used to enjoy with the fellow community members with whom they previously exchanged goods. Finally, a mass media arose to promote these long-distance brand images to an entire nation. Through radio and television, non-local companies could seed a market with their brands and mythologies before the package even made it to the shelf.
While cable television and, now, Internet marketing give smaller businesses a way to peddle their wares in the same media as their corporate counterparts, it may actually work against their real strength as real world, local companies. The power of a local business—or any local enterprise—is its connection to a particular region and people. Its locality is its strength. By turning to a decentralized medium to engage with people right around the corner, a local business loses its home field advantage. Its banner ads will never look as good as those coming out of a marketing agency anyway.
This misguided tendency to depend on long-distance technology to enhance up-close encounters is completely understandable and forgivable. The more connected we feel in digital spaces, the less securely connected many of us feel in real ones. After days or weeks connecting with people through video chats, the sensation of someone’s eyes actually looking into our own in real life can be overwhelming and disorienting.
By recognizing digital media’s bias for dislocation, we are enabled to exploit its strength delivering interactivity over long distances, while preserving our ability to engage without its interference when we want to connect locally. Many businesses—particularly the biggest ones—already exist in a non-local reality. The entire history of industrial corporatism, from colonial empires to the railroad barons of the nineteenth century, depended on disconnecting people from their local strength and commanding them from afar. For them, it is just as ridiculous to use the net to feign that they are local enterprises as it is for local enterprises to use it to act in the manner of national brands. Powerful global companies become weak local ones, while promising local companies become weak global players.
The digital age offers us all the opportunity to recognize the dislocating bias of our interactive media. With that knowledge, we may choose when we wish to live and work in real places, with one another and—unique to living humans—in person.
You May Always Choose None of the Above
We all want the freedom to choose, and the history of technology can easily be told as the story of how human beings gave themselves more choices: the choice to live in different climates, to spend our time doing things other than hunting for food, to read at night, and so on. Still, there’s a value set attending all this choice, and the one choice we’re not getting to make is whether or not to deal with all this choice.
Whether it’s an online bookstore suggesting books based on our previous selections (and those of thousands of other consumers with similar choice histories), or a consumer research firm using kids’ social networking behaviors to predict which ones will someday self-identify as gay (yes, they can do that now), choice is less about giving people what they want than getting them to take what the choice-giver has to sell.
Withholding choice is not death. Quite the contrary: it is one of the few things distinguishing life from its digital imitators.
You Are Never Completely Right
Thanks to its first three biases, digital technology encourages us to make decisions, make them in a hurry, and make them about things we’ve never seen for ourselves up close. Furthermore, because these choices must all be expressed in numbers, they are only accurate to the nearest decimal place. They are approximations by necessity. But they are also absolute: At the end of the day, digital technologies are saying either yes or no.
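Rushkoff's paradox, that digital values are approximations by necessity yet absolute in every bit, can be demonstrated in any language. A minimal Python sketch (my illustration, not Rushkoff's):

```python
import struct

# Each digital value is an approximation: 0.1 has no exact binary
# representation, so the machine stores the nearest value it can.
total = 0.1 + 0.2
print(total)         # 0.30000000000000004, not 0.3
print(total == 0.3)  # False — the comparison itself is absolute, yes or no

# The stored bits, by contrast, are exact: every one of them is a 0 or a 1.
bits = format(struct.unpack("<Q", struct.pack("<d", 0.1))[0], "064b")
print(bits)  # the 64 bits actually stored for "0.1"
```

The number is only approximately right, but the machine's answer to "are these equal?" is an unqualified no: approximation in the value, absolutism in the judgment.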
Yes, thanks to the digital archive we can retrieve any piece of data on our own terms, but we do so at the risk of losing its context. Our knee-jerk, digital age reaction against academic disciplines finds its footing in our resentment for centuries of repressive hierarchies. Professors, gurus, and pundits made us pay for access to their knowledge, in one way or another. Still, although they may have abused their monopolies, some of the journeys on which they sent us were valid. The academic disciplines were developed over centuries, as each new era of experts added to and edited the body of what they considered to be essential knowledge. By abandoning the disciplines—however arbitrarily they may have been formulated—we disconnect ourselves from the multigenerational journey leading up to this moment. We are no longer part of that bigger project, nor do we even know what it is we are rejecting.
This makes digital technology—and those of us using it—biased toward a reduction of complexity.
In the more immediate sense, facts devoid of context are almost impossible to apply sensibly. They become the fodder for falsely constructed arguments on one side or the other of the social or political spectrum. The single vote of a politician is used to describe his entire record, a single positive attribute of caffeine or tobacco receives attention thanks to public relations funding, and a picture of a single wounded child turns public opinion against one side in a conflict rather than against war itself.
Both sides in a debate can cherry-pick the facts that suit them—enraging their constituencies and polarizing everybody. In a digital culture that values data points over context, everyone comes to believe they have the real answer and that the other side is crazy or evil. Once they reach this point, it no longer matters that the opposing side’s facts contradict one’s own: True believers push through to a new level of cynicism where if the facts are contradictory, it means they are all irrelevant. The abundance of facts ends up reducing their value to us.
As a result, we tend to retreat into tribes, guided primarily by our uninformed rage. And we naturally hunger for reinforcement. Television news shows rise to the occasion, offering shouting matches between caricatured opposites competing for ratings. Elected officials are ridiculed as “wonks” for sharing or even understanding multiple viewpoints, the history of an issue, or its greater context. We forget that these are the people we’re paying to learn about these issues on our behalf. Instead, we overvalue our own opinions on issues about which we are ill informed, and undervalue those who are telling us things that are actually more complex than they look on the surface. They become the despised “elite.”
Models are necessarily reductive. They are limited by design. This does not negate their usefulness; it merely qualifies it. Digital reduction yields maps. These maps are great for charting a course, but they are not capable of providing the journey. No matter how detailed or interactive the map gets, it cannot replace the territory.
One Size Does Not Fit All
On the net, everything scales—or at least it’s supposed to. Digital technologies are biased toward abstraction, bringing everything up and out to the same universal level. People, ideas, and businesses that don’t function on that level are disadvantaged, while those committed to increasing levels of abstraction tend to dominate. By remembering that one size does not fit all, we can preserve local and particular activities in the face of demands to scale up.
On the net, everything is occurring on the same abstracted and universal level. Survival in a purely digital realm—particularly in business—means being able to scale, and winning means being able to move up one level of abstraction beyond everyone else.
Because the net is occurring on a single, oversimplified and generic level, success has less to do with finding a niche than establishing a “vertical” or a “horizontal.” Going vertical means establishing oneself as the place to do everything in a particular industry: the one-stop place for hardware, or cycling needs, or home electronics. Going horizontal means offering a service that applies to every category’s transactions…. In either case, “scaling up” means cutting through the entire cloud in one direction or another: becoming all things to some people, or some things to all people.
What all this abstraction does accomplish here on earth, however, is make everyone and everything more dependent on highly centralized standards. Instead of granting power to small businesses on the periphery, the net ends up granting even more authority to the central authorities, indexers, aggregators, and currencies through which all activity must pass. Without the search engine, we are lost. Without centrally directed domain name servers, the search engines are lost. Further, since digital content itself needs to be coded and decoded, it requires tremendous standardization from the outset. Far from liberating people and their ideas from hierarchies, the digital realm enforces central control on an entirely new level.
Activism means finding a website, joining a movement, or “liking” a cause—all of which exist on a plane above and beyond their human members. Learning, orienting, and belonging online depend on universally accepted symbols or generically accessible institutions.
Abstraction has been around since language, perhaps even before. Money, math, theology, and games would all be impossible without abstracted symbol systems, accepted standards, and some measure of central authority. The digital realm is no different in that regard.
Be Yourself
Our digital experiences are out-of-body. This biases us toward depersonalized behavior in an environment where one’s identity can be a liability. But the more anonymously we engage with others, the less we experience the human repercussions of what we say and do. By resisting the temptation to engage from the apparent safety of anonymity, we remain accountable and present—and much more likely to bring our humanity with us into the digital realm.
In a hostile, depersonalized net environment, identity is one’s liability.
The less we take responsibility for what we say and do online, the more likely we are to behave in ways that reflect our worst natures—or even the worst natures of others. Because digital technology is biased toward depersonalization, we must make an effort not to operate anonymously, unless absolutely necessary. We must be ourselves.
We don’t put words into the digital realm unless we are willing to own them.
Do Not Sell Your Friends
In spite of its many dehumanizing tendencies, digital media is still biased toward the social. In the ongoing coevolution between people and technologies, tools that connect us thrive—and tools that don’t connect us soon learn to. We must remember that the bias of digital media is toward contact with other people, not with their content or, worse, their cash. If we don’t, we risk robbing ourselves of the main gift digital technology has to offer us in return for our having created it.
Almost immediately after the first computer networks were developed for Defense Department use, their system operators noticed something strange: the scientists who had accounts were spending more time and bandwidth talking about their personal research interests and favorite science fiction novels than official business.
While the Internet—then Arpanet—was a technological success, it had become overwhelmed by social use. The government decided to give it away. AT&T turned down the offer to take it over. In what may have ultimately been a kind of wisdom, they couldn’t see a business application for what appeared to be an academic social scene. The government ended up setting the net free, to a large extent, with the proviso that it only be used for research purposes.
No one thought the net would end up going anywhere—not least because people conversing with one another over networks seemed to be a financial dead end. The net was compared to Citizens Band radio—a two-year fad that faded even before a movie about the truck drivers’ lingo and culture could be shot and released. My own first Internet book was actually canceled by the publisher in 1992 because they thought the net would be “over” by 1993, when the book would finally hit the shelves.
The social, noncommercial net continued to grow and grow. By 1994, studies showed that families owning a computer modem were watching an average of nine hours less television per week. Worse, at least to marketers, they were spending those hours in a completely commercial-free medium. Finally, after a series of violations by small businesses looking to promote their services online, the net was opened for commercial use. Legislators used the argument that it couldn’t be held back effectively anyway.
Friendships, both digital and incarnate, do create value. But this doesn’t mean the people in our lives can be understood as commodities to be collected and counted. People are not things to be sold piecemeal, but living members of a network whose value can only be realized in a free-flowing and social context. We have yet to find out what that value might be.
Content was never king, contact is. Yet the possibilities for new levels of human connectedness and collaboration offered by networking technologies have hardly been tapped. We are too slow to realize that people are not a form of content—a resource to be bought and sold; they are fellow cells in the greater organism of which we are all a part but are barely aware. We value our increased contacts for what they might provide and miss the greater value of the contact itself.
But it is this contact, this desire to construct a social organism together, that has been the driving force of digital technology all along. The instinct for increased contact is the evolutionary imperative we feel to become something greater than ourselves. Just as atoms combined into molecules, molecules clustered into cells, and cells collected into organisms, we organisms are networking into greater levels of organization.
This is the real draw of the many interactive devices we have brought into our lives. In a sense, the people dismissing the net as another form of CB radio had it right: We are still just finding novel ways of talking to one another. From the ridiculous faxes people used to send to each other containing lists of bad jokes to the tweets we now transmit by cell phone, each new communications technology provides a new excuse to forge connections.
The content is not the message, the contact is. The ping itself. It’s the synaptic transmission of an organism trying to wake itself up.
Tell the Truth
The network is like a truth serum: Put something false online and it will eventually be revealed as a lie. Digital technology is biased against fiction and toward facts, against story and toward reality. This means the only option for those communicating in these spaces is to tell the truth.
We all have relatives who mistakenly pass on ridiculous viral emails about corporations that will give a donation of a million dollars if you pass the email to others, or a kid in a hospital who needs a blood transfusion, or a threatening virus that will eat your data if you don’t shut down your computer immediately. It’s sweet that they want to share with us; it’s just a shame they don’t have anything real to share. Viral media fills this need for them, giving them fake facts with which to feed digital media’s bias for nonfiction contact.
Those who succeed as communicators in the new bazaar will be the ones who can quickly evaluate what they’re hearing, and learn to pass on only the stuff that matters. These are the people who create more signal and less noise, and become the most valued authorities in digital media. But the real winners will once again be those who actually discover and innovate—the people who do and find things worthy of everyone else’s attention. They’re the ones who give us not only good excuses to send messages to one another, but also real ways for us all to create more value for one another.
The way to flourish in a mediaspace biased toward nonfiction is to tell the truth. This means having a truth to tell.
Share, Don’t Steal
Digital networks were built for the purpose of sharing computing resources by people who were themselves sharing resources, technologies, and credit in order to create them. This is why digital technology is biased in favor of openness and sharing. Because we are not used to operating in a realm with these biases, however, we often exploit the openness of others or end up exploited ourselves. By learning the difference between sharing and stealing, we can promote openness without succumbing to selfishness.
Program or Be Programmed
Digital technology is programmed. This makes it biased toward those with the capacity to write the code. In a digital age, we must learn how to make the software, or risk becoming the software. It is not too difficult or too late to learn the code behind the things we use—or at least to understand that there is code behind their interfaces. Otherwise, we are at the mercy of those who do the programming, the people paying them, or even the technology itself.
Instead of teaching programming, most schools with computer literacy curriculums teach programs. Kids learn how to use popular spreadsheet, word processing, and browsing software so that they can operate effectively in the high-tech workplace. These basic skills may make them more employable for the entry-level cubicle jobs of today, but they will not help them adapt to the technologies of tomorrow.
Their bigger problem is that their entire orientation to computing will be from the perspective of users. When a kid is taught a piece of software as a subject, she’ll tend to think of it like any other thing she has to learn. Success means learning how to behave in the way the program needs her to. Digital technology becomes the immutable thing, while the student is the movable part, conforming to the needs of the program in order to get a good grade on the test.
Programming is the sweet spot, the high leverage point in a digital society. If we don’t learn to program, we risk being programmed ourselves.
Early computers were built by hackers, whose own biases ended up being embedded in their technologies. Computers naturally encouraged a hacker’s approach to media and technology. They made people less interested in buying media and a bit more interested in making and breaking it. They also turned people’s attention away from sponsored shows and toward communicating and sharing with one another. The problem was that all this communicating and sharing was bad for business.
So the people investing in software and hardware development sought to discourage this hacker’s bias by making interfaces more complex.
Fully open and customizable operating systems, like Linux, are much more secure than closed ones such as Microsoft Windows. In fact, the back doors that commercial operating systems leave for potential vendors and consumer research have made them more vulnerable to attack than their open source counterparts. This threat is compounded by the way commercial vendors keep their source code a secret. We aren’t even to know the ways we are vulnerable. We are but to trust. Even the Pentagon is discouraged from developing its own security protocols through the Linux platform, by a Congress heavily lobbied to promote Windows.
Like the military, we are to think of our technologies in terms of the applications they offer right out of the box instead of how we might change them or write our own. We learn what our computers already do instead of what we can make them do. This isn’t even the way a kid naturally approaches a video game. Sure, a child may play the video game as it’s supposed to be played for a few dozen or hundred hours. When he gets stuck, what does he do? He goes online to find the “cheat codes” for the game. Now, with infinite ammunition or extra-strength armor, he can get through the entire game. Is he still playing the game? Yes, but from outside the confines of the original rules. He’s gone from player to cheater.
Until push comes to shove and geopolitics force us to program or perish, however, we will likely content ourselves with the phone apps and social networks on offer.
Those who do learn to program see the rest of the world differently as well.
“PROGRAM OR BE PROGRAMMED: Ten Commands for a Digital Age”, Douglas Rushkoff (2010)