Taking back control from online technologies
Julita Vassileva wants to liberate us from the siren call of technologies such as smartphones that have some of us missing assignments, souring relationships, walking into walls and worse.
By Michael Robin

Vassileva, a professor of computer science and expert in personalization and persuasion technology at the University of Saskatchewan, wants to help us live healthier and more fulfilled lives. Right now, that power often rests with online social media giants who want your attention—and they’re very, very good at getting it.
“They have a very accurate model of your interests, based on where you are, what you have been searching for and they correlate it with models of millions of other people,” she said. “They know how to show you exactly what you want to see, what will please you, and what will keep you coming back for more.”
Online companies, big or small, depend on catching people’s attention. Attention and action are their main sources of revenue, whether it’s downloading a new app or simply staying on Facebook longer to be exposed to more ads. People obsessively check their online status, even when they know it hurts their personal relationships and work performance.
“People know it is bad; they feel guilty about this behaviour and still they do it,” Vassileva said. “It’s like an addiction, like alcohol or cigarettes. For employers, it is like having an employee that goes every half hour to smoke for 10 minutes. With Facebook or game addiction, it may become hard to keep a job.”
But what if we had the power to use our computing devices to help achieve our goals rather than those of outside interests? Vassileva and her team aim to do exactly this, with strategies and software tools that allow users to decide what is important to them.
Such tools already exist, particularly in the area of fitness. There are apps that count steps, monitor heart rate, and track calorie intake and sleep quality. Other apps track the use of time-hungry social media sites and issue warnings, or lock those sites out during business hours. Vassileva herself tried an app that would alert her to how much time she was spending each day on social media.
“In the beginning it was cool, but in the end, you switch it off,” she said. “It makes you feel guilty, but it really doesn’t change your behaviour.”
This underlines how hard it is to design tools that try to change behaviour: people must want to use them. Backed by an NSERC Discovery grant, graduate students Shi Shi, Sayooran Nagulendra and Wesley Waldner developed different ways to give users control over the filtering of the personal newsfeed on Facebook, Twitter, and other similar sites.
A program that blocks certain sites can elicit annoyance and resentment, or it can be seen as a partner that helps the user. Vassileva likens it to being an effective parent or teacher. It’s not enough to tell users what is in their best interests; they must be persuaded to act on it.
“Persuasive technologies inform, explain, persuade, monitor and cue the user to engage in desired behaviours,” she said. “They use models of the user to personalize the appearance, tone and content of their messages to maximize effect. They often have elements of games to make them more engaging, and use incentive methods based on principles from social psychology and behavioural economics.”
Persuasion also has an ethical aspect. The tools Vassileva and her team are developing are intended to help people change their own behaviour for good. But she explained that companies are already mining user data and using behaviour modification tools for their own ends. For now, their purpose seems to be mostly commercial, but these tools could also be used by political interests, governments and special interest groups.
“These behaviour change techniques, they’re dangerous if they’re in the wrong hands,” Vassileva explained. “They can make us purchasing zombies, voting zombies. It’s a huge ethical issue with this kind of technology. For what purpose is it used? Very intimate motivations and patterns of individual behaviour can be discovered through data mining and manipulated to the great harm of individuals and of societies as a whole.”
By understanding how to persuade people effectively to take action, Vassileva hopes to put this power back into people’s hands so that they can use it for their own ends. And perhaps those same people will be willing to pay for that power.
“If we want our software to be commercialized, there will be companies that will make money with it,” she said. “But maybe users will be willing to pay for honest software that allows them to achieve their own goals, realize their potential and do good, rather than use free software that exploits them.”