Social Media Advertising Tools And User Consent: What Are The Requirements?

Perhaps you’ve seen them: those television and radio ads that talk about the “creepy” nature of some advertising on the Internet that follows consumers across their social media. According to Pew Research, most Americans believe their online activities are being tracked and monitored.

The fact is, most companies can and do share data with social media platforms to ensure targeted advertising reaches receptive audiences. As more tools become available and the variety of data sources grows globally, platforms and advertisers are re-examining their rights and obligations when it comes to something as simple as matching customers’ email addresses with their Facebook accounts. 

Facebook’s Customer List Custom Audiences (“Custom Audiences”) tool is one example with the potential to expand an advertiser’s liability for unauthorized use of customer data. For EU customers, a German Data Protection Authority ruling requires an individual’s explicit consent to such sharing.

The Facebook Custom Audiences tool enables advertisers to deliver targeted advertisements to Facebook users by combining Facebook’s data with the advertiser’s own data, such as email addresses and phone numbers. To use this marketing tool, the advertiser must comply with the consent and privacy expectations of the individuals who provided those email addresses.
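By way of illustration, the customer identifiers are not shared with Facebook in the clear: the tool expects them to be normalized and hashed with SHA-256 before the list is uploaded. The short Python sketch below shows that preparation step only; the file and column names are hypothetical, and this is not an implementation of Facebook’s API.

    import csv
    import hashlib

    def normalize_email(email):
        """Normalize an email address before hashing: trim whitespace, lowercase."""
        return email.strip().lower()

    def hash_identifier(value):
        """SHA-256 hash a normalized identifier for inclusion in a customer list."""
        return hashlib.sha256(value.encode("utf-8")).hexdigest()

    def prepare_customer_list(input_path, output_path):
        """Read raw customer emails (hypothetical 'email' column) and write hashed values.

        Only the hashed values, not the raw addresses, end up in the upload file.
        """
        with open(input_path, newline="") as infile, open(output_path, "w", newline="") as outfile:
            reader = csv.DictReader(infile)
            writer = csv.writer(outfile)
            writer.writerow(["email_sha256"])
            for row in reader:
                email = row.get("email", "")
                if email:
                    writer.writerow([hash_identifier(normalize_email(email))])

    if __name__ == "__main__":
        # Hypothetical file names, for illustration only.
        prepare_customer_list("customers.csv", "custom_audience_upload.csv")

Note that hashing limits what raw data leaves the advertiser’s systems, but it does not by itself satisfy the consent obligations discussed below.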

Consent to Use Email Addresses

While the use and disclosure of email addresses is regulated in some countries, the U.S. does not have a uniform data privacy protection scheme. U.S. privacy rights are protected through a patchwork of laws addressed to specific types of harm, such as unauthorized access to and disclosure of financial data (Fair Credit Reporting Act (FCRA), 15 U.S.C. § 1681) or healthcare-related data (Health Insurance Portability and Accountability Act of 1996 (HIPAA), 42 U.S.C. § 1320d–2). While the CAN-SPAM Act (15 U.S.C. § 7701 et seq.) specifically regulates email, the Act excludes communications based on a previously existing relationship.

Importantly, for most purposes, permission of the email recipient is not required. However, messages must contain a mechanism allowing recipients to opt out of future email messages. If email addresses are acquired from third-party sources, such as marketing databases or social media, ensure users are given reasonable notice and choice about the use of such data.
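In practice, the opt-out requirement implies that a sender maintains a suppression list and filters it out of every mailing. The minimal Python sketch below illustrates that idea under those assumptions; the file name and helper functions are hypothetical and are not drawn from any statute or library.

    def load_suppression_list(path):
        """Load previously opted-out addresses (one lowercase address per line)."""
        try:
            with open(path) as f:
                return {line.strip().lower() for line in f if line.strip()}
        except FileNotFoundError:
            return set()

    def record_opt_out(path, email):
        """Append an address to the suppression list when a recipient opts out."""
        with open(path, "a") as f:
            f.write(email.strip().lower() + "\n")

    def filter_recipients(recipients, suppressed):
        """Drop any recipient who has previously opted out."""
        return [r for r in recipients if r.strip().lower() not in suppressed]

    if __name__ == "__main__":
        # Hypothetical suppression file and recipient list, for illustration only.
        record_opt_out("opt_outs.txt", "bob@example.com")
        suppressed = load_suppression_list("opt_outs.txt")
        to_send = filter_recipients(["alice@example.com", "bob@example.com"], suppressed)
        print(to_send)  # ['alice@example.com']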

The Federal Trade Commission endorses a market-style model for ensuring the fair use of information, one that allows individuals to participate in decisions about the disclosure and use of their personal information. As articulated by the FTC, the elements of this approach are notice, choice, access, security and enforcement.

Contractual Requirements of Facebook Custom Audiences 

In order to use the Custom Audiences tool, the advertiser must agree to additional terms and conditions. Facebook’s Custom Audiences terms require that the advertiser have both “all necessary rights and permissions” and a lawful basis to disclose and use the email addresses “in compliance with all applicable laws, regulations, and industry guidelines.”

Recommendations

Review your Privacy Policy, Website Terms & Conditions, and membership/subscription applications to confirm that a clear mechanism exists for recipients to opt out of future email messages. If email addresses are acquired from third-party sources, such as marketing databases or social media, review your data gathering practices and the scope of permissions granted by those sources, and ensure users are given reasonable notice and choice about the use of such data.

Your Money or Your Life: Mobile Marketing & Privacy (Part 1 of 3)

Free content is not without a cost.

As our lives have become more digitally enmeshed with content, immersive entertainment and devices, the economic bargain that makes it possible has gone largely unnoticed. Simply put, the collection, analysis and sharing of personal data is driving the digital economy. Mobile applications (Apps), digital content and entertainment – from TV shows to games – are available for “free” but subsidized by income from online ads that are customized using data about customers. Vendors, advertisers and platforms compete for “eyeballs” based, in part, on the quality of the information they possess about users to whom the ads are targeted.

Across this interconnected landscape of users, content providers and devices, the issue of online privacy has become a major talking point for app developers, marketers, consumers and legislators. Recently, a wide range of stakeholders, from large institutions to smaller developers, have been accused of mishandling personal data. As the volume of public debate has increased, legislators have introduced a raft of privacy initiatives. The Obama administration has called for a Privacy Bill of Rights, an industry consortium of leading web sites and search engines has proposed its own privacy best practices, and the Electronic Frontier Foundation has published a consumer-oriented Mobile User Privacy Bill of Rights.

Part 1 of this article looks at several recent and high-profile revelations about how personal information is collected and used, often without the user’s knowledge and consent. Part 2 discusses the legal risks faced by vendors that don’t take adequate precautions to protect consumer privacy, and Part 3 concludes with strategies and tactics that help leverage the power of personalization while avoiding the pitfalls of privacy and data security.

1. The current state of information gathering

The scope of personal information gathered is unprecedented and largely unknown. For years, “free” web-based content has been available because of the implicit compromise between content providers and content consumers. Advances in technology have made it easier to track a user’s web browsing habits, mobile browsing habits, and even real-time geospatial location (check-in apps and GPS). In the last few months, we have learned that some apps not only gather this mostly non-personally-identifiable data, but also upload a user’s address book contacts and even photos.

In February 2012, software developer Arun Thampi “outed” Path, the purveyor of a self-titled journaling app, for sending users’ address book contents to the company. Path lets users share what they’re doing with a select group of friends and gives users the option to find friends on the app through their contacts or other social networks. Thampi disclosed the clandestine data transfer in a blog post after discovering that his phone’s entire address book, including full names and e-mail addresses, was being sent to Path without his explicit consent. According to Path, this data was necessary in order to quickly notify users when people they know join Path.

Not too long ago, Google earned itself a similar PR (and legal) black eye when it launched its social network, Google Buzz, in 2010 through its Gmail web-based email product. At launch, users were not informed that the identities of the individuals they emailed most frequently would be made public by default; Google Buzz automatically disclosed those contacts’ email addresses. Google settled with the FTC over allegations that it used deceptive practices and violated its own privacy policies.

On February 17, 2012, The Wall Street Journal reported that Google Inc. and other advertising companies had been bypassing the privacy settings of millions of people using Apple Inc.’s Web browser on their iPhones and computers, tracking the Web-browsing habits of people who intended for that kind of monitoring to be blocked. The companies used special computer code that tricked Apple’s Safari Web-browsing software into letting them monitor many users. Safari, the most widely used browser on mobile devices, is designed to block such tracking by default.

A major topic of discussion just this week is the “Target snafu.” As originally reported in The New York Times, Target used customer data and predictive analytics to determine that one of its customers, a teenage girl, was pregnant, and even to estimate her trimester. The girl’s father learned of the pregnancy when the retailer sent her promotional material and coupons.

It used to take days or even weeks to gather, synthesize and extrapolate data about a customer’s buying habits and receptiveness to particular products or services. Now it takes milliseconds. A targeted ad can be sourced and served in the time it takes to hit “refresh” on a web browser. Companies are using massive amounts of data to predict what their customers are going to want next. More importantly, gathering that data is getting easier, cheaper and more ubiquitous as the source of that data moves from the desktop to mobile devices.

So where is the middle ground between privacy and targeted advertising? Is it spying simply because the user doesn’t know what data is being collected, even though the user accepted a broad and ambiguous Terms of Use agreement? Is knowingly contributing data, with no boundaries on its use, sufficiently transparent?