Users for Sale: Has Digital Illiteracy Turned Us Into Social Commodities?


The web-based email, chat and message boards that came online in the ’90s were poised to revolutionize interpersonal communication. Then companies tried to monetize the Internet.

“The dot com boom failed because people didn’t want to buy shit online. They were just talking to each other,” said Douglas Rushkoff in a recent keynote speech at the WebVisions conference in Portland. “Content was never king. Contact was always king.”

Today, the dorm-room experiments of the digital generation have become billion-dollar corporations. We’ve finally figured out how to monetize social interaction, and Rushkoff, an award-winning author and media theorist who writes and speaks regularly on these topics, has reservations.

He holds no grudge against technology itself, but rather our widespread programming illiteracy - that is, our fundamental ignorance of how these web platforms operate, share our information, and profit from our online relationships. Whereas the social web was once a Wild West of BBS chatter, many of us now experience it through discrete corporate channels like Google and Facebook - platforms that may not always have our best interests (or the interests of healthy online discourse) at heart. Rushkoff poses the question: Are we ceding mankind’s last best hope of free and open information to marketers and corporations?

At the end of the day, is marketing around content (be that a TV show or a status update) really any different now than it was 50 years ago? Or have we entered into a kind of hypercapitalism that overlooks the web’s greatest promise in favor of the next investment-baiting app? We spoke to Rushkoff about the current state of web culture and his crusade to encourage programming literacy.

Q&A With Douglas Rushkoff


Author of Program or Be Programmed: Ten Commands for a Digital Age

You argue that users are not the true customers of social networks like Facebook. What are the ramifications of this?

Well, on the very simplest level I’m concerned about most people’s total lack of awareness. We move through online spaces with little knowledge of what they are for. I think when we walk into a store in the real world, most of us are aware that the rent is being paid by a person or company that wants to sell us goods. I think we have awareness (I could be overestimating us here, but I really don’t think I am) that when we cross the threshold from the street into the store, that we have moved from a public place into a private place. We understand that the job of the person working in the Gap is to sell us clothes.

“Usually, the people paying are the customers. So on Facebook, the people paying are marketers.”

But we don’t apply this same very basic logic to online spaces. The easiest way to figure out who the customer is in an online space is to figure out who is paying for the thing. Usually, the people paying are the customers. So on Facebook, the people paying are marketers. That makes them the customers. And it means we are the product being delivered to those customers.

Do we get something in return? Sure. We get a certain kind of communications tool. We get a way of describing who we are to the rest of the world - in the terms that the marketers are paying for us to use. It’s not some weird conspiracy or anything. It’s just business. But if we’re not aware of the business, of what the tool is for, of the fact that Facebook’s job is to sell our social graphs to companies, and to get us experiencing ourselves in terms of our social graphs, then we are much more susceptible to changing the way we think of ourselves and our relationships. We are more likely to use our Facebook profile as a mirror, chalking up its deficiencies to the technology itself. We don’t consider that the way Facebook screws with how we see ourselves is its function, rather than some random artifact of social networking.

Is this different from TV networks selling commercials against popular shows that they deliver over the airwaves for free?

In many ways, not at all. And I do remember the moment when, as a child, I realized that the things we call “TV shows” are really just the stuff that gets put between commercials. Later, I came to see that the kinds of things that get on “free” TV are shows that help sell products. That’s why most mainstream TV sucks. And it’s why pay-TV like HBO and Showtime, and even some of the cable channels that have subscriber-based revenue sources, are capable of producing much better content. In those cases, we viewers are the customers.

But imagine what it would be like if you didn’t know that the evening news was funded primarily by Big Pharma. You would actually believe the stuff that they’re saying. You might even think those are the stories that matter.

When (if ever) are these free technologies worth trading a bit of privacy for?

You don’t have any privacy to trade, anyway. You think they don’t already know everything? The only thing standing between you and total surveillance is the fact that they don’t yet have the processing capability to mine their data effectively.

And computers just create a bigger trail. Digital technology allows the content of your phone calls and letters (emails) to be collected and stored. But data mining companies have had the goods on us for many decades. They used to use notecards in giant file drawers, and then employ factor analysis to model behaviors, discern likely customers, and help direct-mail marketers save on postage. And that was using housing records, driver’s licenses, medical records and credit card bills. Stuff that was public, but also things they bought.

The net just amplifies this ability. But it also makes the process more transparent to us. Yes, this has always been going on, but now in some ways we are becoming more aware of it.

In answer to your question, engaging with people costs us privacy. It always has. I think the only way to behave is as if nothing is private. And then fight to make what you care about legal and acceptable.

You warn against the dangers of “selling our friends” by connecting our social graphs to various networks and apps. How does this damage our relationships, even if we’re doing it unwittingly?

Well, it’s certainly worse to do it wittingly. I get a kick out of companies that offer us the ability to get a cash kickback for selling our social graphs. As if cutting us in on the value proposition makes it OK to turn our friendships into marketing contacts. They act as if it is a way of fighting back against corporate power, when it’s really just turning social behavior into corporate behavior. We get to be as dehumanizing and desocializing as our marketers. Revolution!

Unwittingly, well, it’s more like when your friends keep inviting you to FarmVille or LinkedIn. When they unwittingly turn over their address book to one of these companies that’s really just in the business of swelling their subscriptions so that they can go have an IPO. And after a while, you want to shut down your networks and make yourself less available because the bandwidth and time are being abused.

It’s kind of like a Tupperware party, where our social bonds are being exploited to sell plastic. Eventually, you stop answering that person’s calls because it means being subjected to the Tupperware sales.

You advocate “programming literacy” in the online platforms we use every day. How much can the average web user be expected to understand?

I don’t think the average web users of this century will achieve basic programming literacy. They will be more like the people of those first five or six centuries after the alphabet was invented, who just couldn’t or wouldn’t learn those 22 letters. It just seemed too hard to them. In their defense, though, books were really expensive and rare. They weren’t interacting with text every day, so it didn’t really matter that they couldn’t read it.

We are interacting with programs all the time, so I think there’s a legitimate argument in asking that people know something about those programs. If they don’t know how to make the programs, then I’d at least want them to know what the programs they are using are for. It makes it so much more purposeful. You get much more predictable results using the right technologies for the right jobs.

But if people can’t learn programming, I just want them to know what it is. That it exists. I want people to be able to read the programs and online environments in which they spend so much time. I want people to be able to ask themselves, “What does this website want me to do? Who owns it? What is it for?” [It’s] really simple stuff like that, which doesn’t occur to people if they think of the net as a natural space. It’s not. It is a created space.

I think people can understand this much.

You note how our traditional social contracts (e.g. I can steal anything I want, but I won’t do it out of shame, fear, etc.) break down due to the anonymity and distance of the web. How can we change this and still maintain an open online culture?

The way to change this is to rehumanize. People have to become aware that there are other humans here. Once they do that, they can create a new social contract that respects the humanity of the other people.

People have a natural desire to share, and the main reason we don’t do that today is because we’ve accepted the notion that things are scarce. We have an economic operating system based in scarcity - that’s how we create markets - so we don’t have a great way yet of sharing abundant resources. Ironically, we are making things like air and water scarce largely because of an economic system that requires growth and hoarding and scarcity. We’ll get a physical environment that matches our economic ground rules, and that will be a real pity. It’s a problem of imagination, not reality. We have imaginary boundaries.

“We have the opportunity and obligation to build technologies that are intrinsically liberating - programs that reveal their intentions, and that submit to the intentions of their users.”

The job of changing these perceptions, though, is that of the programmers. We can build technologies and networks that are based on economic and social possibility rather than on maintaining the status quo. It’s a lot harder to get paid for building these sorts of things, which is why I’ve convened the Contact Summit this October, where I’ll be giving away $30,000 in awards to some of the best ideas, hooking people up with the best CEOs and inventors out there, from Dennis Crowley and Scott Heiferman to Clay Shirky and Bre Pettis - as well as artists and thinkers like RU Sirius, Astra Taylor, Arthur Brock, and Caroline Woolard.

There are many, many great people out there who have dedicated their lives to seeing the net restore the social fabric that top-down media, banking, and politics have destroyed. It’s just a matter of bringing them together and announcing that we’re all here to do the same thing.

I see two groups that present educational challenges ahead: digital natives who are more susceptible to marketing and faulty online information (as you have cited), and adults who are still trying to navigate a digital world they didn’t grow up in. How can we ensure these two groups are using the web responsibly?

We can’t! And they aren’t!

But rather than getting people to use the web responsibly and intelligently, it may be easier to build networks that treat the humans more responsibly and intelligently. Those of us who do build stuff, those of us who are responsible for how these technologies are deployed, we have the opportunity and obligation to build technologies that are intrinsically liberating - programs that reveal their intentions, and that submit to the intentions of their users.