The rise of populism in the attention economy

We only have so much attention to give, and that makes it a valuable resource. Everyone wants our attention: social media, advertisers, politicians, family and friends. Technology gobbles up a lot of it; just look at the number of people glued to their screens on any street or in any cafe.

Herbert Simon (image: Wikipedia)

Nobel Prize-winning political scientist Herbert A. Simon described the concept of the attention economy in 1971: the growth of information dilutes our attention. As Simon put it:

“What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

More recently, James Williams has researched how technology absorbs our attention. Williams, a doctoral researcher at Oxford University, previously spent 10 years working for Google. He believes that the liberation of human attention may be the defining moral and political struggle of our time.

Williams spoke to CBC’s Spark about the misalignment between the goals that we have for ourselves and the goals that our technologies would impose on us. Technology attracts attention that we would really like to apply elsewhere. He told host Nora Young:

“The things that we want to do with our lives, the things that we’ll regret not having done, the things that I think technology exists to help us do aren’t really represented in the system and aren’t really the sort of incentives that are driving the design of most of these technologies of our attention today (June 1, 2018).”

Judged by the goal of getting attention, U.S. President Trump makes a lot of sense. He does whatever it takes to get our attention because he understands the impact it has on his ratings. The content of his tweets may be sheer fabrication, but that’s not the point. His years as a TV showman taught him the effect that outrage has on tribalism. What is factually true is irrelevant.

“This is what people didn’t realize about him [Trump] during the election, just the degree to which he just understood the way the media works and orchestrated it,” says Williams. “But I don’t think there is going back, as long as these media dynamics remain as they are. In a way, I think we have to be more concerned about what comes after Trump than what we have with him.”

Trump is not interested in unifying the country; he wants to divide it so that the largest tribe is his.

Research published in the February issue of American Sociological Review reveals the way Trump supporters view his acknowledged dishonesty. Participants in a study were told that one of Trump’s tweets about global warming being a hoax had been definitively debunked: global warming is real. Trump supporters saw the tweet not as literal fact but as a challenge to the elite (Scientific American, September 2018).

Canadian philosopher and public intellectual, Marshall McLuhan, foresaw the impact of technology:

“We shape our tools and thereafter our tools shape us,” and “The new electronic independence re-creates the world in the image of a global village.”

Four decades later, McLuhan might have added: “Populism is the politics of the global village.”


Anonymity is not enough in apps

You can set your privacy settings on apps so that personal data is not shared. But even anonymous data can threaten security.

Take the case of the fitness tracking app Strava. Its website tracks the exercise routes of users and plots them on a map of the world. The routes show up as bright lines; the brighter they are, the more they are used. You can’t pick out individuals on the map because the data is shared only anonymously. Yet users are revealing themselves in ways that were never intended.
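The mechanism behind a heatmap like Strava’s is simple aggregation, and that is exactly why anonymity isn’t enough. Here is a minimal sketch (the function, coordinates and cell size are my own illustration, not Strava’s code): bin anonymized GPS points into grid cells and count hits, and well-used routes light up even though no point carries a user ID.

```python
from collections import Counter

def heatmap(points, cell=0.001):
    """Bin anonymized (lat, lon) points into grid cells and count
    hits per cell. No user IDs are involved, yet heavily used
    routes still glow: the brightest cells trace where people run."""
    return Counter(
        (int(lat / cell), int(lon / cell)) for lat, lon in points
    )

# Two anonymous "runners" sharing the same jogging route
# (hypothetical coordinates, roughly near Kamloops):
route = [(50.6745 + i * 0.001, -120.3273) for i in range(5)] * 2
hot = heatmap(route)
# Every cell on the shared route is twice as "bright" as a
# single runner would make it.
```

The point of the sketch is that the privacy leak lives in the counts, not in any individual record: stripping names changes nothing about what the bright line reveals.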

In this Strava map of Kamloops, you can see familiar areas of the city where people have been exercising. There’s the downtown grid, Rayleigh, and Sun Peaks on the upper right. Some areas are a bit mysterious, like in the lower left. I went to Google Maps to see if there is a community there but couldn’t find any. Someone, or some group, exercises near Chuwhels Mountain above New Gold’s Afton Mine. Is there a camp that I don’t know of?

 Strava map

Australian student Nathan Ruser was doing some similar browsing, comparing exercise routes on Strava to Google Maps, when he came across exercise routes around U.S. military bases in Iraq, Syria and Afghanistan. The Strava map revealed much more than the Google map did: it exposed troop movements. It probably never occurred to soldiers how much they were lighting up the base.

While the locations of the military bases are not exactly top secret, the movements of soldiers could compromise operational security. The fitness app could highlight sensitive outposts and troops’ habitual routes during military drills and patrols. Ruser, who is also an analyst for the Institute for United Conflict Analysts, tweeted:

“If soldiers use the app like normal people do, by turning it on tracking when they go to do exercise, it could be especially dangerous. This particular track looks like it logs a regular jogging route. I shouldn’t be able to establish any Pattern of life info from this far away (January 27, 2018)”

Air Force Colonel John Thomas, a spokesman for U.S. Central Command, told the Washington Post that the military was looking into the implications of the Strava map.

It probably didn’t occur to soldiers that they were compromising base security by simply turning on the fitness tracker. After all, none of their personal information was being shared.

This way of thinking ignores the greater good, according to Arvind Narayanan, a computer scientist at Princeton University.

“This assumes that my behaviour affects my privacy,” Narayanan told CBC Radio’s Spark, “but really I think what [the] Strava story has shown is that it’s more than that. That’s when privacy becomes a collective issue (February 2, 2018).”

The privacy settings can be confusing. Someone going out for a run doesn’t want to spend time trying to figure out which boxes to check.

Beyond the actions of individuals and their privacy settings, there is the vulnerability of big corporations.

“Strava has been in the news but there are dozens of companies sitting on sensitive data. There’s not a lot of public oversight around these super sensitive databases about billions of people,” adds Narayanan.

The future of smart radios

I imagine that the future of radio will combine traditional FM with the technology of smartphones.

I’m not talking about the distant future: the FM broadcast protocols already exist, and most cell phones already have an FM radio chip, although you’d never know it. Chris Burns wonders why. In his article for SlashGear.com, he explains how you can find out if your phone has the chip:

“A whole bunch of smartphones out on the market today have FM radio capabilities – but their owners don’t know it. There’s no real good reason for this lack of knowledge save the lack of advertising on the part of phone makers. . . Today we’re listing the whole lot of phone devices that can run FM Radio right out the box.”

I first heard about the FM chip in cell phones last year on CBC Radio’s Spark, where Barry Rooke explained how useful it could be. The chip could be used where no cell service exists, and in an emergency when cell towers are down, as in the Fort McMurray wildfire of 2016.

Rooke is the executive director of the National Campus and Community Radio Association and he’s formed a consortium of broadcasters, including CBC, and radio listeners who would like to see the FM radio chip activated.

It doesn’t even have to be a smartphone to receive FM. A friend bought a simple cell phone in Mexico with the FM chip activated for $22, and that included free calls for eight days, no contract (it galls me how much more Canadians pay for cell phones, but that’s another column). You can hardly buy a standalone FM radio for that amount.

The innovation I imagine is the use of graphics in smartphones. Some of the FM audio spectrum would be partitioned off for text and low-res graphics. The text could include the lyrics of the song being played and a picture of the artist, along with news, weather, sports, traffic and stock reports. In poor countries where phones are more common than radios, it could carry voting information and crop and commodity reports. Text and graphics could be saved for future reference.

The graphics would be stacked on the original signal with a subcarrier, much the way left and right channels are now carried on regular FM, as described in Wikipedia. The protocol already exists for car radios and would need to be adjusted for smartphones.
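The subcarrier trick is easiest to see in FM stereo itself, which this column uses as the model. A rough sketch, assuming the standard FM stereo layout (a 19 kHz pilot tone and the left-minus-right difference riding a 38 kHz subcarrier; the sample rate and amplitudes here are my own illustrative choices):

```python
import math

FS = 192_000      # sample rate in Hz (illustrative choice)
PILOT = 19_000    # stereo pilot tone, Hz
SUB = 38_000      # L-R subcarrier, twice the pilot

def multiplex(left, right):
    """Compose an FM-stereo-style baseband signal. The (L+R) sum
    stays where a mono radio expects audio, while the (L-R)
    difference rides a 38 kHz subcarrier that mono receivers
    simply ignore -- the same backward-compatible trick a text
    or graphics sidechannel could use."""
    out = []
    for n, (l, r) in enumerate(zip(left, right)):
        t = n / FS
        mono = (l + r) / 2
        pilot = 0.1 * math.sin(2 * math.pi * PILOT * t)
        diff = ((l - r) / 2) * math.sin(2 * math.pi * SUB * t)
        out.append(mono + pilot + diff)
    return out

# With identical left and right channels the difference term
# vanishes: a mono receiver hears exactly the mono signal.
mpx = multiplex([0.5] * 10, [0.5] * 10)
```

The design point is backward compatibility: old receivers decode only the part of the spectrum they understand, which is why extra services can be stacked on without breaking anything.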

The best system would be a digital overhaul of the FM signal. But that won’t happen, because radio stations must be receivable by regular receivers as well as by the new smart radios.

Broadcasters would never transmit a signal that only a relative few can receive. That’s why, when stereo radio was introduced, the new stereo signal had to be receivable by old mono radios as well as new ones until the technology was widely adopted.

The push for smart radios won’t come from cell phone service providers; they would prefer that you pay for data. It must come from broadcasters and listeners.

Facebook knows you best

Does Facebook know you better than your friends do, or even better than you know yourself? Lily Ames conducted an experiment to find out and reported the results to CBC Radio’s technology program Spark.


The personality program she tried is called Apply Magic Sauce, developed by the University of Cambridge Psychometrics Centre. It takes your Facebook “likes” and scores you against a database of six million social media profiles.
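The Centre’s actual model isn’t described in this column, so here is only a toy stand-in for the general idea of likes-based prediction: estimate someone’s traits from the profiles in a database that share their likes. The function, the mini-database and the scores are all hypothetical.

```python
def predict_traits(user_likes, database):
    """Toy likes-based psychometrics: average the trait scores of
    database profiles that share at least one like with the user.
    A hypothetical stand-in, not the Apply Magic Sauce model,
    which is trained on millions of real profiles."""
    matches = [traits for likes, traits in database if likes & user_likes]
    if not matches:
        return None  # "Sorry, we are unable to generate a prediction."
    keys = matches[0].keys()
    return {k: sum(m[k] for m in matches) / len(matches) for k in keys}

# Hypothetical mini-database: (set of likes, trait scores 0-100)
db = [
    ({"jazz", "museums"}, {"openness": 80, "extraversion": 40}),
    ({"football"},        {"openness": 45, "extraversion": 70}),
]

profile = predict_traits({"museums"}, db)   # matches the first profile
no_match = predict_traits({"knitting"}, db)  # too few matching likes
```

The `None` branch mirrors my own experience below: with too few matching likes, an honest system declines to guess rather than fabricate a score.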

When Lily ran her Facebook likes through the program, she was surprised at how well it scored on most of the 20 measures. It nailed her age within two years, along with her religion, gender and education in journalism.

Then she compared those results with a standard test from Cambridge. It categorized her personality in five areas: openness, conscientiousness, extraversion, agreeableness, and neuroticism. She agreed with the test results except for extraversion, which she thought scored low, especially since she considers extraversion one of her defining traits.

Then five of Lily’s friends filled out a questionnaire on her personality. Surprisingly, the Facebook-likes prediction corresponded more closely to the standard test than either her own opinion or her friends’ assessments.

Not so surprising, says David Stillwell, a researcher at the Psychometrics Centre. Who we are is a philosophical question. Perhaps we are not just one personality but several: our self-impression, the digital projection of ourselves online, and our personality as perceived by friends.

I was curious about what my Facebook likes would reveal about me, so I tried the Apply Magic Sauce algorithm, only to find that I didn’t have enough likes on Facebook for an assessment. “Sorry, we are unable to generate a prediction,” came the reply. “An insufficient number of your Likes match with those in our database, and we don’t believe in guesswork. Please take our full personality test, if you would still like to receive scientific feedback on your traits. Thanks!”

So I did. I took the full personality test, and here are the results. I scored highest on openness, 73%, which reflects intellectual curiosity. Next was agreeableness, 69%, which suggests that I’m easy to get along with. Then conscientiousness, 66%, a measure of how organized I am. Extraversion, 54%, a gauge of social interaction. Finally, neuroticism, 24%, my response to life’s demands. “Based on your responses, you come across as someone who is rarely bothered by things, and when they do get you down the feeling does not persist for very long,” the assessment elaborated.

It seemed fairly accurate, but then, why wouldn’t it when I’m the one who answered the questions?

Social media such as Facebook contain a wealth of data about us that we may not intentionally reveal. Lily couldn’t even remember liking the Saskatchewan Roughriders. And she was only being ironic when she “liked” a new fashion trend.

No problem, says Stillwell. “From a prediction perspective, it doesn’t matter, as long as there is a link between people liking something and their personality. If everyone likes it because they are being ironic, then maybe it would be related to low agreeableness. But it doesn’t matter because the prediction still works.”

Get the message out: call your kid “Bud Light”

I naively thought that my Facebook posts appeared in the order sent. Not so. Look closely and you’ll notice that some posts hang around forever and others you never see at all.

It’s because an algorithm controls them, tailored to you: your “likes” and the postings you respond to.


When you think about it, it’s a bizarre communication system. “It seems like a science fiction dystopia,” in which Big Brother controls our perception, warns Christian Sandvig on CBC Radio’s Spark.

Yet many Facebook users aren’t even aware of the algorithms. They wonder why some friends never post anything, oblivious to the machinations.

Others know how to “game” the system by using keywords that boost their postings, words such as “congratulations” even when it’s inappropriate. For example: “Congratulations, Canada is at war in Iraq.”

Or they use brand names to punch through the happy filter, says Sandvig. “Isn’t my baby, Bud Light, cute,” jokes the professor of communications at the University of Michigan.
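Facebook’s ranking is proprietary, but the gaming described above only works if engagement-linked words carry extra weight. A toy sketch of that dynamic (the boost words, weights and scoring rule are entirely made up for illustration):

```python
# Hypothetical engagement-boost weights, not Facebook's real ones.
BOOST_WORDS = {"congratulations": 3.0, "wedding": 2.0}

def rank_feed(posts):
    """Order posts by a toy engagement score: raw likes plus a
    bonus for trigger words. Illustrates why 'Congratulations,
    Canada is at war in Iraq' can outrank a plainly worded post
    in a keyword-sensitive ranker."""
    def score(post):
        text = post["text"].lower()
        bonus = sum(w for word, w in BOOST_WORDS.items() if word in text)
        return post["likes"] + bonus
    return sorted(posts, key=score, reverse=True)

feed = [
    {"text": "Canada is at war in Iraq", "likes": 2},
    {"text": "Congratulations, Canada is at war in Iraq", "likes": 1},
]
top = rank_feed(feed)[0]  # the keyword-stuffed post wins despite fewer likes
```

The sketch also shows why such gaming is self-defeating at scale: once everyone stuffs in “congratulations,” the word stops signalling anything and the ranker has to be retuned.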

Zeynep Tufekci has similar concerns. While Twitter lit up with postings over the racial tension in Ferguson, Missouri, where a young black man was shot by a white police officer, Facebook was strangely silent. It wasn’t for lack of interest: Tufekci found that her Facebook friends were very active with postings about the explosive atmosphere.

Tufekci realized that Facebook was deciding what should be of interest to her. “There was this disquieting moment, because I really don’t want a world in which Facebook decides which of my friends’ postings I am going to see.”

Algorithms are not necessarily bad. But in the happy Facebook world of congratulations, wedding announcements and baby pictures, algorithms are crafted with a certain motive in mind. After all, Facebook is a commercial enterprise; it is selling your eyeballs to advertisers. While its algorithms are not transparent, its motives are. “I wouldn’t be surprised if Facebook algorithms are designed around likes and purchase behaviour.”

Social media have a moral obligation to users, not just commercial obligations to advertisers. “They are not just selling shoes. We made them successful through use of our friends.”

We need to know things that may not be delightful. “What if a friend is contemplating suicide, and Facebook decides I don’t want gloomy thoughts?”

Not only does Facebook have a moral obligation to users as a clear channel of communication, it risks financial decline by being boring. In comparison, the drama of the unfiltered world of Twitter makes it interesting.

However, even Twitter is thinking of tweaking its chronological stream through algorithms to stem the torrent.

Well-designed algorithms are useful. Google does a good job of anticipating what I’m looking for. Apparently, so does Netflix. My car’s braking algorithm does a better job of stopping under difficult conditions than I could.

Facebook should give us more access to our algorithm so that we can customize it. The tweaking allowed now is laughable. You can adjust the news feed from “top stories” to “most recent,” but, as my niece tells me, Facebook switched her back three times in the last few weeks. Choice is a temporary option, it seems.