By: Diane Stirling
(315) 443-8975

Online behavioral advertising may seem like a smart way for advertisers to reach customers and a useful way for consumers to learn about new products and services, but the underlying online processes involved are fraught with privacy concerns, aren’t as transparent as privacy advocates would prefer, and can strike Web users as “creepy.”

Those are key findings of a paper authored by researchers at Carnegie Mellon University’s (CMU) CyLab, a group that includes a new assistant professor at Syracuse University’s School of Information Studies (iSchool).

Assistant Professor Yang Wang, formerly of CMU, who joined the iSchool in the fall, is one of the authors of “Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising.” Last fall, the Advisory Board of the Washington, D.C.-based Future of Privacy Forum (FPF) selected the paper from 35 submissions as one of eight leading studies and voted it among the most useful to policy makers. Each year, FPF compiles the winning privacy papers into an accessible digest and showcases it to policy makers, privacy professionals, and the public.

The paper examines how average Americans understand online behavioral advertising: online ads that advertisers target to specific consumers based on a person’s web browsing history.

According to Wang, the findings revealed several notable perceptions and realities:

* Study subjects had very strong concerns about the privacy issues raised by behavioral ads.

* Most subjects either rejected the practice of behavioral advertising altogether or believed it to be permissible only in limited situations.

* Although many subjects understood conceptually that web sites can use a person’s browsing history to target ads to online users, many were also surprised to discover that the practice actually takes place.

* Part of that surprise may stem from the fact that behavioral advertising tends to be hidden, and is not as transparent or recognizable to consumers as it could be.

A second component of the study examined users’ responses to, and success in using, online tools intended to boost their online privacy or help them opt out of behavioral ads. The findings included:

* Although users were motivated to alter their privacy settings and attempted to do so, they were not always successful.

* Part of that failure was due to a lack of transparency in the tools and processes for altering privacy settings.

* There was a gap between knowing how to use a tool to alter privacy settings and successfully reconfiguring those settings.

* Users either did not understand the tool, received a false sense of security about their privacy settings from it, or believed that simply installing the tool was enough to achieve increased privacy.

* Many users were unaware that, beyond installing a privacy tool, they also needed to configure it.

* Even those who recognized that they had to configure the tool to increase their privacy settings often had a hard time accomplishing that goal.

The study is useful in the policy realm because of its far-reaching implications for regulation and policy change, Wang said. “This is one of the first empirical studies that really demonstrate that the current industry self-regulation for behavioral advertising is not working, and that the notice-and-choice approach in its current implementation is not working.” He added, “The question for policy makers and privacy advocates then becomes, ‘If our protection of ordinary American users is based primarily on self-regulation, and that’s not working, what should we do?’”

“The industry is saying that they can do a better job to improve their self-regulation, and that should solve the issues,” Wang added. “While I’m sure they can improve it to some extent, that remains to be seen.”

Still, the Federal Trade Commission (FTC) is considering whether to establish new regulations, Wang said, and the issue has set off pushback from the advertising industry. “It’s a very interesting area for public policy on technology, as are the ‘Do Not Track’ definitions being developed by the W3C web standards group,” he added.

Co-authors on the paper with Wang are: Blase Ur, Pedro G. Leon, Lorrie Faith Cranor, and Richard Shay.

Another paper submitted by Wang and the CMU group was one of four nominated by the Future of Privacy Forum for honorable mention.