What Is Privacy Forward and Why Does it Matter for Marketers?

By John Francis, DBA

Original Publication: Marketo

Date of Publication: November 12, 2018

When I worked as a webmaster at AOL in the early days, every employee had visibility into every member’s account. Then in 1999, a naval officer who had created an alternate profile under a different screen name was outed by one of those employees. AOL’s course of action was to immediately shut down access to all members’ data and hire an integrity assurance officer. It was a moment of reckoning for online privacy.

I’ve been thinking about that anecdote a lot recently following the revelations about how Cambridge Analytica accessed and deployed Facebook data to impact the U.S. election. There’s been a lot of discussion about how much Mark Zuckerberg knew and whether Facebook should have done more to prevent or stop the data theft. Those are absolutely questions worth considering, but the issue of data theft is nothing new. Tech companies have always created platforms with a certain degree of naivete about the possibility that user data could be exposed or exploited.

As a marketer, it’s important to understand privacy because it is part of the customer experience. In this blog, I’ll explain what a walled garden is and why it’s a myth, how to engage with customers, and what a privacy-forward strategy looks like for marketing teams and their customers.

The Myth of the Walled Garden
At a recent Videonomics symposium, I heard representatives from many tech companies discuss how advertisers couldn’t get into their “walled gardens.” A walled garden is a closed ecosystem whose operations are controlled by a single operator. The term is frequently used, but based on decades of experience with dot-coms and digital advertising, I consider it a myth.

The fact is that it’s very simple for someone to take first- and third-party data, link it up, and retarget consumers with ads. That information, combined with a user’s history, can help build a persona around them. Even though Facebook shut down the ability to pull data from third-party data brokers, companies can still place cookies on other websites that collect activity from users. They may not know who the person is, but if they have an IP address and can link those two data sets together with Facebook, they get a full 360-degree view. The data that is already out there, whether it’s been released or stolen, can then be correlated and shared.
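To make the mechanics concrete, here is a minimal, hypothetical Python sketch of how two separately collected data sets can be stitched into a single profile once they share a common key such as an IP address. Every name, field, and value below is invented for illustration; no real platform’s data model is implied.

```python
# Hypothetical sketch: joining two separately collected data sets on a shared
# identifier (here, an IP address). All names and values are invented.

first_party = {  # activity a brand logs on its own site
    "203.0.113.7": {"pages_viewed": ["/pricing", "/checkout"], "cookie_id": "abc123"},
}

third_party = {  # segments acquired from a data broker
    "203.0.113.7": {"interests": ["fitness", "travel"], "household_income": "75-100k"},
}

def build_profile(ip: str) -> dict:
    """Merge whatever each data set knows about the same IP into one record."""
    profile = {"ip": ip}
    profile.update(first_party.get(ip, {}))
    profile.update(third_party.get(ip, {}))
    return profile

print(build_profile("203.0.113.7"))
```

The point is not the code itself but how little of it is needed: once a shared identifier exists, the “wall” between data sets is gone.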

Moreover, data breaches appear to be accelerating in severity and scale. Breaches at Yahoo, Sony PlayStation, and Alteryx, for example, compromised data belonging to hundreds of millions of people, and much of that information is now available to anyone who goes looking for it. We live in an age of “data promiscuity.” Walled gardens and online privacy are nice to think about, but privacy could soon become a relic of the past, which is why a new crop of data privacy regulations and guidelines is emerging to create a privacy-forward landscape.

Data Targeting
Questions around data privacy have particular relevance for marketers and advertisers, who rely on data to improve their targeting capabilities. First, robust data allows them to put their ads in front of people and create brand awareness, which helps sell products. Second, marketers can use data to put targeted messages in front of people who actually want the product, instead of people who don’t.

However, participants in the advertising ecosystem need data integrity assurance built into online platforms, actively working to protect private information. Our industry keeps getting better at distributing information freely, and the platforms built to spread that information are very user-friendly. I’m not a programmer by profession, but off-the-shelf analytics tools can easily be configured to exploit private data that is left conspicuously available. Cambridge Analytica’s brazen use of a Facebook application to gather insights on millions of users is a prime example of this dynamic at work.

Engaging with Relevant Content
Facebook also uses a process called content-based targeting, whereby related content and ads are delivered to members based on their likes, shares, and follows. Facebook collects far more data about members’ engagement than it makes available to advertisers.
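As a toy illustration only (not Facebook’s actual system), the sketch below ranks candidate ads by how many of a member’s liked, shared, or followed topics they touch. All names and topics are invented.

```python
# Toy illustration of content-based targeting (not any platform's real system):
# rank candidate ads by overlap with the topics a member has engaged with.

member_signals = {
    "likes": {"hiking", "coffee"},
    "shares": {"travel"},
    "follows": {"coffee"},
}

candidate_ads = {
    "espresso_machine_ad": {"coffee", "kitchen"},
    "tent_sale_ad": {"hiking", "travel"},
    "car_insurance_ad": {"insurance"},
}

def score(ad_topics: set) -> int:
    """Count how many of the member's engagement topics the ad touches."""
    engaged_topics = set().union(*member_signals.values())
    return len(ad_topics & engaged_topics)

ranked = sorted(candidate_ads.items(), key=lambda item: score(item[1]), reverse=True)
print(ranked[0][0])  # tent_sale_ad: it overlaps both hiking and travel
```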

Targeting the right audience doesn’t (and shouldn’t) require theft and privacy violations. Data privacy and marketing do not have to be mutually exclusive. Marketers care that an action was taken, not who took it. All that matters is what the consumer did and why.

Digital analytics and web traffic tools like Google Analytics and Matomo place pixels on a website. The pixel records a timestamp whenever an action is taken. Say a commercial aired on Lifetime for a Gerber product. If somebody sees the call to action and types in the URL on their computer or mobile device, then we know what ad they saw, where they were located, and the time and device they used. We also know that the commercial cost a certain amount of money on a cost-per-click or cost-per-action basis, which is useful for reviewing a marketing budget and figuring out where the best media spends are.
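For instance, here is a rough, hypothetical sketch of how pixel-recorded actions and spend figures could be rolled up into a cost-per-action comparison. The campaign names, spends, and hit counts are made up for illustration.

```python
# Hypothetical sketch: turning pixel-recorded actions into cost-per-action
# figures so media spends can be compared. All numbers are invented.

from datetime import datetime

# Each pixel hit carries a timestamp plus which campaign drove the action.
pixel_hits = [
    {"campaign": "lifetime_spot", "timestamp": datetime(2018, 11, 1, 20, 14), "device": "mobile"},
    {"campaign": "lifetime_spot", "timestamp": datetime(2018, 11, 1, 20, 16), "device": "desktop"},
    {"campaign": "daytime_spot",  "timestamp": datetime(2018, 11, 2, 13, 5),  "device": "mobile"},
]

media_spend = {"lifetime_spot": 5000.00, "daytime_spot": 3500.00}

def cost_per_action(campaign: str) -> float:
    """Spend divided by the number of pixel-recorded actions for a campaign."""
    actions = sum(1 for hit in pixel_hits if hit["campaign"] == campaign)
    return media_spend[campaign] / actions if actions else float("inf")

for name in media_spend:
    print(f"{name}: ${cost_per_action(name):.2f} per action")
```

Notice that nothing in this rollup needs to know who the individual viewer was, only that an action happened and which media placement drove it.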

A Privacy Forward Approach
In June 2018, California passed the California Consumer Privacy Act (CCPA) of 2018. The policy grants consumers the right to request the data that businesses collect on them and to ask companies not to sell their data. The law imposes strict rules about how businesses disclose data collected from consumers. It also empowers the state Attorney General to fine companies for noncompliance. Needless to say, it was opposed by major media, telecom and tech companies, including Amazon, Google, Microsoft, Comcast, AT&T, and Verizon. Facebook initially opposed it but eased off after the Cambridge Analytica scandal broke.

The CCPA was inspired by what is happening in Europe with the General Data Protection Regulation (GDPR), which imposed new rules on controlling and processing personally identifiable information, or PII. There was skepticism that the privacy-forward principles of the GDPR would catch on in the U.S., but they have, starting with California, which is setting a standard other states will likely soon follow. Dot-coms are following suit as well, as evidenced by the pop-ups about policy changes on what feels like every ecommerce and news site.

Transparency = Trust
These initiatives have brought the term “privacy forward” into the modern lexicon. A privacy-forward approach is best described as a set of guidelines for identifying data that should be considered classified. Classified information includes IP addresses, contact information, and genetic and biometric data. The approach also encourages organizations that collect personal data to conduct data mapping and maintain a 360-degree view of the data they hold. Customer information is not a commodity, but rather a personal bond of trust between an organization and its customers. This also extends to what is shared with outside vendors and third-party data analytics tools and their associated platforms. Transparency is paramount.
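As a rough sketch of what that data mapping might look like in practice, the example below flags which fields in a customer record should be treated as classified. The field list and record are assumptions for illustration, not a compliance checklist.

```python
# Hypothetical sketch of a data-mapping pass: flag which fields in a customer
# record should be treated as classified under a privacy-forward policy.
# The field list and record are illustrative only.

CLASSIFIED_FIELDS = {"ip_address", "email", "phone", "genetic_data", "biometric_data"}

customer_record = {
    "customer_id": "C-1001",
    "email": "jane@example.com",
    "ip_address": "203.0.113.7",
    "favorite_category": "outdoor gear",
}

def map_record(record: dict) -> dict:
    """Split a record into classified fields (handled with care, never shared
    without consent) and fields that carry no personal information."""
    classified = {k: v for k, v in record.items() if k in CLASSIFIED_FIELDS}
    non_sensitive = {k: v for k, v in record.items() if k not in CLASSIFIED_FIELDS}
    return {"classified": classified, "non_sensitive": non_sensitive}

print(map_record(customer_record))
```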

In 2001, as a condition of the Time Warner and AOL merger, the FCC ordered AIM, which had over 90% of the market (and thus the user data), to become interoperable with other chat platforms. Today, Facebook is participating in the Data Transfer Project, a collaboration of organizations, including Google, Microsoft, and Twitter, committed to building a common way for people to transfer data into and out of online services. It’s a big and exciting step towards making privacy forward the norm.

Digital Advertising and Consumer Privacy: Three Trends to Watch

Author: Jessica Hawthorne-Castro, CEO

Original Publication: MarTech Advisor

Date Published: March 9, 2018

Digital Privacy
Digital advertising has exploded in recent years, with the latest eMarketer data forecasting $83 billion in revenue this year and continued growth on the horizon. According to the eMarketer report, digital comprised more than 36 percent of total media ad spend in 2016, and the forecast predicts digital will account for half of ad spend by 2021, with mobile driving the majority of growth. Amidst all that growth is a burning issue: consumer privacy.

More and more advertisers are using consumer data and advanced digital technologies to deliver campaigns personalized to individual consumers. But with these developments come concerns. How can advertisers deliver compelling, targeted campaigns, and still honor consumer privacy?

The opportunity to deliver exactly the information consumers are looking for while promoting our clients’ businesses and products seems like a win-win situation. And because of the immense opportunity, companies are jockeying for ownership of consumer data. As of November 2016, Facebook, Google and Verizon were boosting their consumer data tracking efforts to consolidate personal information, browsing history, app usage, offline information like addresses and phone numbers, browser cookies, device data and more into a cocktail of targeting data that would make any advertiser’s mouth water.

And there’s no question that digital advertising is on the rise. Digital advertising earnings hit $19.6 billion in the first quarter of 2017, according to the latest IAB Internet Advertising Revenue Report. There are also more platforms, like the growing Internet of Things (IoT), on which to deliver digital ads. This year, we’ll see a whopping 8.4 billion connected things in use worldwide, according to research firm Gartner. Throw in the adoption of digital assistants like Amazon Echo and Google Home, and the virtual reality shopping opportunities on the horizon, and the possibilities for targeted advertising are endless.

But consumers want their personal data and privacy protected, and they’re wary of industry efforts to capitalize on their data. More than two-thirds of respondents in a recent Gigya survey said they don’t trust brands to handle their personal information appropriately. Regulators are taking note and taking action. In late 2016, Federal Communications Commission (FCC) privacy rules placed restrictions on how consumer data could be lawfully used and sold.

As such, the industry is coming to a crossroads. Questions remain around the role of regulators, the feasibility of self-monitoring, protections consumers should have vs. what information is acceptable for businesses to track, and the appropriate guardrails for data collection and monitoring.

To navigate this changing landscape, I’ve identified three key consumer privacy trends that advertisers and their clients should watch:

Increased FTC monitoring and intervention

The Federal Trade Commission (FTC) enforces federal laws related to protecting consumer privacy and security and pursues violations of Section 5 of the FTC Act barring unfair and deceptive practices affecting commerce. When industry self-monitoring fails, the commission will take action.

In August 2017, Uber agreed to implement a privacy program and obtain regular, independent audits to settle FTC charges that the ride-sharing company “failed to monitor access to, and provide reasonable security for, consumer data.” The FTC has also backed Do Not Track proposals since 2010, which, if enacted, would give consumers the power to opt out of the collection of data about their internet activity, data that is frequently used to deliver targeted advertisements.

If companies continue to cross the line from fair to deceptive, we can expect to see more actions and possible legislation from the FTC.

Improved internet content monitoring

Before the internet, “fake news” and propaganda certainly existed. But the internet has created opportunities for people to post whatever they want without accountability for accuracy, fact-checking and credible sources. We’re at a point in the internet’s maturity where it really needs to be policed for truth and accuracy. This July, the News Media Alliance, representing nearly 2,000 news organizations in the United States and Canada, requested congressional permission to negotiate jointly with Google and Facebook as a strategy to battle the rise of commoditized fake news.

It’s vital that advertising agencies, marketers, the media, companies and other entities take responsibility for the information they’re publishing in the public arena, and social and search clearly have a responsibility to help frame content as well.

Determining ISP and website accountability for digital consumer privacy

There has been a lot of thrash in this space over the past year, and rules and responsibilities don’t seem to be fully determined yet. In October 2016, the FCC enacted privacy rules requiring internet service providers (ISPs) to ask consumers for opt-in permission to gather and share data deemed sensitive. These rules differed from the requirements imposed on website operators like Google and Facebook, spurring much discussion about regulatory clarity and consistency. Then, on April 3, President Trump signed a resolution nullifying the FCC’s privacy rule for ISPs altogether.

There are still a lot of questions about which consumer privacy responsibilities rest with ISPs, which rest with website operators and which rest elsewhere. As the conversation continues, expect to see more shifts in these areas, particularly as digital ad revenues continue to grow.

Privacy rules will continue to mature; however, one constant remains: Ethical, consumer-focused business practices will weather the privacy storm. Advertisers that keep the customer top-of-mind, minimize the spread of misinformation online, and infuse their campaigns with accurate, truthful information will continue to amplify their success.