SRCCON 2019 Session Transcript — Let’s build some ethical advertising


Let’s build some ethical advertising

Session facilitator(s): Ian Carrico, Amanda Hicks

Day & Time: Thursday, 11:15am-12:30pm

Room: Ski-U-Mah

AMANDA: Hey, everybody! Welcome. I'm Amanda. This is Ian. We are talking about building ethical advertising. So we wanted to start by giving you a little background about us. I'll let you start. With background about you, actually.

IAN: Oh, fantastic. My name is Ian Carrico. I work for Vox Media as the revenue engineering manager. We build various forms of ads. All of the code that puts ads on the page. All of the ads we do in-house, and I am leaving in about 28 days to go to law school, because I feel that the entire industry around advertising and technology needs more people who understand the law and tech to fix bad things going on.

(applause)

AMANDA: Yeah! I'm Amanda. I am the director of Ad Product at the Washington Post for our RED team, which stands for Research, Experimentation, and Development. I was in the newsroom for quite a while, moved to the ads side about a year and a half ago, still kind of have imposter syndrome on the ad side. And as I've learned more about ad tech, I've started to realize that it is broken, it is icky, there are very unethical practices. So when Ian asked for help with this session, I was gung ho about it. The reason we wanted to do this was because we wanted to bring some smart people into the room who are targeted by unethical advertising, have firsthand experience flagging bad ads, and are journalists who can get the word out about what's really going on in the industry. So that's what we're gonna talk through. So we just got through the intro. We're five minutes in. We're right on schedule. We would like to break you guys out into small groups. How many groups, do you think? Tables? I don't know. Four to a table. Yeah. To talk through what an unethical ad looks like to you. Some examples you've seen in the industry, if you want to pull up examples to share with people, if you have laptops, that's cool. Then we're gonna share those stories around the room. Then Ian has a great presentation about unethical ads that he's seen and why things are broken. And then we're gonna dig in, come up with a framework for more ethical advertising, try to figure out who's responsible for fixing this. Is it advertisers? Is it us? Is it the government? And then we're gonna wrap up and talk about next steps.

IAN: So... What we first want to do is we're mostly in groups. If we want to just... You two go with those two and kind of make slightly larger groups. Write down both examples of bad ads that you've possibly seen, things that you find abusive, creepy, what have you. Or in general, behavior that you have seen in advertising that you believe is unethical in some way. What we really want is to come up with as many different ideas so we can kind of group together what we see as unethical in the industry right now, to see if there's any commonalities or strings that we can pull together at the end. So what ads you've seen and kind of what unethical behavior you have seen online. You have some Post-It Notes in front of you, if you'd like to write down different ideas as a group. Go and chat amongst yourselves. Let's talk about ethical ads.

All right, guys. Time to wrap up. Okay. If you could take a mobile device or a computer, if you have one at your table, and we would like you to actually do a quick experiment. We were gonna do this later, but... We have bit.ly URLs that you can hit to look at what Google and Facebook think they know about you. This is Ian's profile. It's pretty funny. But go ahead and take a look at your personal profile. This is what Google thinks it knows about Ian, based on what he's clicked on, what he's liked. What he's searched for.

So in Facebook in particular you can go to different tabs.

AMANDA: So I think looking at this tells you that they know a little more about you than you thought they did, but also they probably didn't peg you the right way. Probably. So now we'd like to go around the room. So five minutes apiece, we'd like to hear from each group about what you discussed, starting with that table right there.

AUDIENCE: All right. Oh, man, we could read you a manifesto of bad advertising. So under unethical advertising, we talked about... Every couple months, we get a note from a user that said some ad loaded malware onto their computer, which may or may not be true.

AMANDA: Who do you work for?

AUDIENCE: That was me. Star Tribune. We'll get a complaint from someone.

AUDIENCE: A lot of objection to third party trackers. As a person who sees advertising, you have no control over what data they collect. But more specifically what happens to that data once it goes into the whole advertising ecosystem. And I think that's sort of the root of a lot of the objections that we have. And as a user, you're sort of unable to know the trade-offs of all of that. And... We noted that sometimes tracking can follow you from a physical presence to online. So you might be standing in front of a doughnut display in a store and then the next day you're on Facebook or something and you see ads for doughnuts. Is that just a coincidence? Chances are it's not a coincidence. That's definitely in the creepy category. Hard to say if it's unethical, but definitely heavily creepy. In fact, we said that any targeted advertising is unethical. That was a bold statement put out there. And that the GDPR has actually been pretty good, because it severely limits, within certain geographic parameters, the ability to do targeted advertising. Which is better for those users. At least, from what we've experienced. And we talked a little bit about the ecosystem that a lot of the big tech companies are pushing onto publishers, namely like AMP and Apple News, which kind of don't directly relate to advertising, but get into the tracking aspect of it, where you basically have to give your content to those big companies, and then you as both a user of that, a reader, or as the news publisher, you don't have a lot of control or no control over what those tech companies kind of do with the data they collect on the users who are reading that content. And that is bad.

AMANDA: Yep.

AUDIENCE: Yep. We all run ad blockers, too.

AMANDA: Everybody? So what came up when you pulled up your Google stuff and your Facebook stuff?

AUDIENCE: I had it all turned off. My Facebook stuff is pretty innocuous. Quicken Loans, apparently. How about you guys?

AUDIENCE: Home and Garden ads, I guess, because I have to Google how to do things.

AMANDA: All right. Let's get a table in the back.

AUDIENCE: So we identified three main things in the unethical ads. One is bad experiences that the ads cause. Things like close buttons that actually cause you to go to the ad, or redirects. So you go to a website and you get an ad that says you just won an iPad.

IAN: Did you get the iPad?

AUDIENCE: I've never gotten that iPad!

AUDIENCE: We all got the iPad!

AUDIENCE: Ads that are really fake. So things like fake localized content and the Taboola-style...

AMANDA: Fake news? A headline and an image --

AUDIENCE: That looks like it's supposed to be related content. And is very clearly fake. But then also things like -- on Facebook, where you could -- for a while, ads would have "recommended by this person". And it would have a picture of someone that you're connected with in your network. That didn't actually ever recommend it. Would never share something like that. And then digital redlining. Any sort of ads that are really targeting in very racist, sexist ways. And we also all run ad blockers. And the primary way that ads -- the communication between journalists and AdOps is Slack and email. Same as everything else.

AMANDA: Can I get a show of hands of who runs an ad blocker in here? That's like 80%. Okay. Cool. Table in the middle?

AUDIENCE: I think we kind of came up with more questions than answers, unlike the last two tables. Some of the things that we talked about were like: Expectations around tracking. Knowing that okay, maybe the sites I visit or what I Google is gonna be tracked, but the difference between that and the doughnut display example, or being on a Wi-Fi network and you're like... Wait a second. Someone else on that Wi-Fi network searched for something. That led to what I've heard from friends like... I said something to my Alexa or Echo, and two days later, I saw an ad for that. Is that clearly the case? Or does it just seem like that?

AUDIENCE: Even without the Alexa or Echo. We talked about who's really responsible for this, and we talked about some companies, but overall, it seems to be... We seem to have a consensus that having one company do things is not really that great. We should have specific standards and rules to follow. Yeah.

AUDIENCE: One of the big ones was like: So many things are now like -- yes, you are giving opt-out opportunities. When we signed into the Google ads thing... But by default, you are opted in. It creates this opt-out burden across all these different companies, all these different services. Also all these different accounts. Like on my Google personal I was not being tracked, but on my Google company stuff I was being tracked and wasn't aware of it. So it just multiplies like crazy.

AUDIENCE: What are you even opting out from? We don't know what we're opting in for.

AMANDA: Right. Table in the front?

AUDIENCE: We had a lot of things. There's a lot of ways to be unethical in ads. So deceptive native advertisements where it looks like it's an article by the publication but actually it's an advertisement. Deceptive chumboxes in general. Cross site tracking. There's a lot of bot traffic happening, where it seems like people are looking at ads, but really they're just bots that are getting the money. Some of the ads themselves have images or other IP that is stolen and not actually owned by the people running the advertisement. So Aaron's example is: You know that ad for signs you're gonna have a heart attack and it has a picture of somebody's leg and it's kind of indented? They do not own that image and it's not of someone who had a heart attack.

AUDIENCE: They ripped it off a 2002 blog post. Somebody was blogging about their recovery after shooting themselves in the foot.

AUDIENCE: Another deceptive thing is to have ads that you know will have poor visibility, either a postroll autoplay or putting the ads so far down the page they can't possibly be seen. The visibility tracking itself is done unethically, because they lie about how well they're getting the visibility metrics in. Yeah. Problems with labeling of ads. Problems with Bitcoin and the malware and the redirects and the pop-ups. And going back to the bot issue, just in general, the fact that nobody really wants to know exactly how many people they have. They want to have the number they have now and they want to have a bigger number. But if you were like... Oh, let us tell you the real number, which is smaller... Whoa. We don't want to know that number.

AMANDA: All right. Table in the back?

AUDIENCE: Yeah. We talked about different ads that we see, that are problematic. You can often see ads for... Oh, do you want an engagement ring? Do you want to get married? Do you want to freeze your eggs now? Often at a much younger age than you might even be thinking about those things. And that is obviously heteronormative in a lot of ways and enforces gendered views on society on people who may not even be thinking about that, or ever think about that. And also with news coverage, when there's a tragedy or a disaster, the adjacency problem. What's appropriate when. And obviously readers feeling like... Oh my God, this is incredibly jarring. And how to talk to different people in your newsroom about that. When you're in the newsroom, sometimes the advertising team can seem very far away.

And a tragedy is happening in the middle of the night. The journalists are covering it. But is advertising around at that hour? What else? Small publications not necessarily having the literacy to understand what they're signing up for. The local TV stations and newspapers obviously... And needing money, right? And not necessarily having the infrastructure to deal with those kinds of things. And obviously reader literacy. About ads. Targeting vulnerable groups. People who are struggling with addiction, pharmaceuticals, people struggling with gambling, people already in debt, and like... This is just like a game. But it's actually gambling. That's most of what we're talking about.

AMANDA: Okay. So Ian has a couple of those examples incorporated into his presentation. He's really excited. So we're gonna run through some good examples of bad ads.

IAN: Yeah. That'll be the phrasing I use now. That last one you just said hits me hard. Hopefully we'll get to this. So like everyone knows, this was brought up -- that you're being tracked on Google. Google is tracking you on... Of the Top 10,000 sites, 70% of them have Google Analytics, and you're being tracked on all your actions on those sites by Google. Of the Alexa top 500 sites, 65% have trackers. You're not just being tracked by Google and Facebook but all these small data providers. So you might turn it off on Google with one account, but in your other account, you have literally 100+ other data providers who have data on you in some shape or form. On the Verge, there are 15 trackers. I'll shame my own company on this one.

But we don't put those on that site. We have four data trackers we put on our site. The rest are added by third party ads that are constantly being loaded. It's fun. Here's a fun one. Amazon. Amazon not only leaves a cookie on your computer when you click one of those affiliate links. Most affiliate cookies will be there 15 days; Amazon holds onto it for 90 days. And when you buy something with an affiliate link, not only do they tell the person... Let's say publication B puts an affiliate tracker on there. Publication B finds out not only that you bought something, but how many items you bought from that affiliate link. So they give more data than they have to. A lot of physical places too. Target being a really good example. Tracks everything you do.

Everything you buy. You have an individual customer ID linked to you, with your phone number, credit card number, email address, take your pick. They have it. CVS does a similar thing. All those rewards programs. Yeah. They're selling that data! It's great. For the doughnut? Was it the doughnut shop commercial? There are two different ways I've seen that happen. Geofencing, where you've been in a certain location. They do that. One I find even more disturbing are beacons, which I believe Aaron said, where there are ultrasonic signals being played in different locations that your phone or other devices you have on you can listen to, using just small signals, and through that, they can determine exactly where you are. So it's used heavily in music festivals and stuff like that, to know which stage you're at, so they can get analytics of who is where. It's very strange.

AUDIENCE: Some stores are using it now.

IAN: Yeah. There's a great project at Yale Law School, where they're starting to document where people hear this. It's crazy. Absolutely insane. Actually, I don't know if you want to say this is fun... This was also in television commercials for a bit, where they were using devices to listen to television commercials to see if you were actually paying attention. It's terrible. Fitness trackers. Fitness trackers track. They sell that data. This is a personal privacy concern: you can find out someone's home address very easily if you look at their profile, and health information can be sold in weird ways. There are national security concerns too. This was an issue with Strava, where they inadvertently revealed the locations of black sites across the globe. Woo! Even worse, John Hancock will only sell life insurance policies to people if they have a fitness tracker. This goes into weird capitalism areas that I do not feel comfortable with. Health care. HIPAA protects you, right? No!

Pharmacies can sell your information to health care giants for marketing purposes. So they can't sell it to just anyone, but to health care companies, for marketing purposes, your prescription history can be sold. Other smart devices. This is one I saw recently. A smart thermometer that tracks your temperature... I don't know why you would need a smart thermometer. But they sold that data to Clorox, so Clorox knew where to market: places that had high rates of the flu. So they could do targeted marketing based on neighborhoods that were more likely to have sickness of some kind. This is my most horrifying one. This was a while back. But Mike Seay got a letter in the mail from OfficeMax addressed to "Mike Seay, Daughter Killed in Car Crash." Yes, his daughter was killed in a car crash. Some data provider gave them all this information. It's just terrible. There's one more that you said at the very end.

AUDIENCE: Gambling?

IAN: Gambling. This was a terrible one. I don't have a slide for this. But there has been in the past... Scammers use data providers. There are hundreds and hundreds of different data providers, different companies buy from, where you can get lists of various kinds for marketing purposes. They got elderly and most likely with Alzheimer's. And targeted them for various scams. And like, these are not data lists that are hard to get. They cost less than $100. You can probably go get them yourself. And just the amount of abuse that can come from all of this microdata on people... That list might have had 10,000 people on it. And it only needs to be accurate for half of that list for it to be just a wildly dangerous set of tools. I do this, because there's nothing else I can do besides smile loudly, or I just get sad.

So I wanted to share this as... All these different pieces of abuse... I was making a list of everything that people were saying... Are like... Known. They are seen. There are reasons behind them. Oh, there was one other piece. Is my device listening to you? Alexa, obviously, yes. But you used to get fears of people who like... Oh, I said this one thing in a conversation and I got Instagram ads for it. Is my device listening to me? Chances are, no. Because it just takes a lot of processing power. But the truth is actually to me a lot worse. Where you're buying purchases from CVS, and that data is being compiled with who you're hanging out with, and your friends and what your friends are looking at, and all these different pieces coming together that... Yeah, you can accurately predict whether someone is going to be looking for a very specific item. One in particular I read a while back was... It was like flu medicine or something. They gathered from CVS, they got tissues and a couple other things. And then they put together... Oh, you probably have the flu. So here's a pseudoephedrine ad or something like that. Weird and creepy. So we looked at our profiles. Do you want to start this bit?

AMANDA: Yeah. So we've got a lot going on here. So we know that we need to make money. And the way that we make money is through advertising. And we're now in this situation where to make money through advertising, we have to have third party trackers. We don't have to, but we would be shortchanging ourselves if we didn't allow third party tracking on our sites and if we didn't allow programmatic ads on our sites. Ian and I were talking about how much money Outbrain and Taboola bring in. We get that complaint from the newsroom all the time. These ads are terrible. Why can't we kill those? That's 15 or 20 reporter salaries. It's a really tough problem to solve. And we as publications, as people in the industry, are somewhat responsible for trying to solve this problem. And it's hard.

Browsers are trying to stop the tracking. GDPR is in effect, but that only covers the European Union. The California privacy law is coming into effect in 2020; that helps California, and might help the rest of the country at some point, but still, technology is going to try to find a way around this issue. So that leaves the consumer... The consumer can try to... You guys all have ad blockers. But it is a tough problem to solve. So if we could come up with a list of best practices, that would be helpful. Starting with how we use third party data. Like, how the third party data that is collected might be used.

IAN: I actually have one question here. How many people are on the tech half of this world? How many are in the journalism half? Fantastic. I actually find... So being the tech person myself, I find y'all's perspective most interesting, because there is this really weird issue here. We all know there's all this abusive behavior. There is only so many things a tech team can do to fight it. But what is that ability that we can pull of... As a journalist, what is the expectation for the reader? Of how do we address concerns? How do we fulfill this tension between what would be the ideal, which is... I don't know.

AMANDA: No ads?

IAN: No ads, or ads from a very different perspective, versus... We do need a way to keep journalism alive. And sadly, this is it at the moment. And how do we create a framework to... I don't want to say center users' ideas. But be able to push feedback in a direction and help make the site and ads better, while also understanding and listening to people of these behaviors. So that's...

AMANDA: That is the question.

IAN: That is the question. So let's start with this. We talked about all this abusive behavior. What are some things that y'all's newsrooms have either done or are thinking about doing or would like to do to either combat or go against some... Protect against some of these possible abusive behaviors on our own sites?

AUDIENCE: One thing we discussed is that we have a flag in our CMS called the tragedy tag. And when the tragedy tag is true, I thought it was just programmatic advertising that gets turned off, but maybe it's that no advertising will be displayed for that piece.

AMANDA: Just none at all. So it's not negative keyword. It's none? Okay.

IAN: We have something similar at Vox for big... We can turn off advertising for any sort of... Shooting or massacre or war thing. It wouldn't look right for Crest Whitestrips to be next to that kind of coverage.

AUDIENCE: We can't turn off the programmatic advertising on AMP.

IAN: Interesting.
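The tragedy-tag behavior described above amounts to a simple gate in the ad-rendering path. A minimal sketch, with hypothetical field and slot names (no real CMS's API is implied):

```python
# Sketch of a "tragedy tag" gate: when an article is flagged,
# suppress all ad slots, not just programmatic ones.
# Field names ("tragedy_tag") and slot names are illustrative.

def should_serve_ads(article: dict) -> bool:
    """Return False for stories flagged as tragedy coverage."""
    return not article.get("tragedy_tag", False)

def render_ad_slots(article: dict) -> list:
    if not should_serve_ads(article):
        return []  # no ads at all on flagged coverage
    return ["leaderboard", "right_rail", "in_article"]

breaking = {"headline": "Storm disaster coverage", "tragedy_tag": True}
feature = {"headline": "Restaurant review", "tragedy_tag": False}
print(render_ad_slots(breaking))  # []
```

As the next comment notes, a gate like this only works where the publisher controls rendering; on platforms like AMP the programmatic slots may not be suppressible the same way.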

AUDIENCE: At the Times, where we both work... Not we, but other people at the organization have been doing this thing they call the Privacy Project, where they investigate a lot of this, and that internally prompted a review of all the analytics and all the junk we put on our site. Which isn't all directly related to advertising, but it's very advertising-similar, in terms of the analytics collection. Which sort of prompted us to realize... Oh, hey. We don't need half of this shit. There's a lot of overlapping stuff from various trackers that just kind of accumulates over time. And we eliminated some of it. And so that's doing a lot of this. And I think as a developer -- and this isn't necessarily where we are now, but I've been asked to put advertising or tracking or stuff on things, and having a responsibility to push back and make sure that the powers that be know the implications of all that stuff, because it's not always clear to the people... What they are.

IAN: And it's fuzzy. We were talking yesterday... Two years ago, we got a request from our sales team, like... Sales team was talking to the other sales team. The other sales team was like... You know what would be really great? Why don't y'all send us a unique identifier for all of your users so we can make sure we're giving them the right ads, and we'll increase your guarantee by such and such if you do that, and he came to me like... No. There's no way we're going to... Because that would defeat the purpose of a lot of ad blockers. A lot of people using Incognito mode. It would totally ruin that bit. But the sales guys had no idea. None whatsoever this was a thing. Yeah?

AUDIENCE: Piggybacking on there, I feel like you have to be internally consistent around your own tracking. We do a lot of ad insertion for podcasts, and we anonymize that data, but we don't just anonymize what we show advertisers for reporting stuff. We anonymize our own records. We don't trust ourselves to always be better. We make ourselves better. We actually take that step. You know?
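Anonymizing your own records, as described above, can be as simple as replacing the raw identifier with an opaque hash and coarsening anything geographic before storage. A hypothetical sketch; the salt, the truncation choice, and the field names are illustrative, not the speaker's actual pipeline:

```python
import hashlib

# Illustrative salt; a real pipeline would manage and rotate this secret.
SALT = b"rotate-me-regularly"

def anonymize_record(record: dict) -> dict:
    """Strip the raw IP from a podcast-ad log line before it is stored."""
    ip = record["ip"]
    # Coarsen geography: drop the last octet of an IPv4 address.
    network = ".".join(ip.split(".")[:3]) + ".0"
    # Stable-but-opaque listener id, usable for dedup but not lookup.
    listener = hashlib.sha256(SALT + ip.encode()).hexdigest()[:16]
    return {
        "listener": listener,
        "network": network,
        "episode": record["episode"],
        "ad_slot": record["ad_slot"],
    }

raw = {"ip": "203.0.113.42", "episode": "ep-101", "ad_slot": "midroll-1"}
clean = anonymize_record(raw)
print(clean["network"])  # 203.0.113.0
```

The point of doing this on your own records, not just advertiser-facing reports, is that data you never retain cannot later be misused.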

IAN: Yeah.

AUDIENCE: A soapbox that I always drum about is that if you think back before computers, newspapers, magazines, television, billboards, all of those worked without direct user tracking. Of course, they tried to get demographics, and they tried to estimate how many people there were. But they had no direct tracking, because it wasn't possible. With computers, it is possible, and so we do it, but I believe that actually it is against the interests of publishers to do it, because if you are a publisher, your goal is to make sure the advertisers overspend on advertising. Right? And if you're an advertiser, your goal is to spend as little as possible and get as high of an ROI. But now what's happening is that the publishers are getting cut out. That they are essentially becoming middlemen, in a way. They help you find out that this person has an interest in whatever. But to actually show them the ad, you don't have to show it on the publisher who reveals the interests. You don't have to show it on their site. You can show it on some other random site. So I think for publishers, it is in their interests to, as much as possible, push for either legal or at least industry-wide standards that are against tracking, because we've seen that for newspapers, for magazines, for television, for billboards, for radio, that tracking was not necessary to have a healthy publishing ecosystem.

IAN: Yeah. Right over here.

AUDIENCE: You know what? You kind of said what I...

IAN: Yeah?

AUDIENCE: On iPhone, we actively intercept malperformant ad scripts and rewrite their JavaScript for them, to make them better performing.

IAN: I'm very interested in that later.

AUDIENCE: A battle that we've won in our newsroom the last couple years was... It used to be that we just had to have ads on everything and had very little say on to what the ads were, what the shape was, where they would get inserted into an article. That was particularly problematic where we were trying to do special projects that were supposed to be given a higher grade sort of display. So we were able to make it so that we have templates now where we basically get to decide where the ads go. And how many. Usually it will take the form of just kind of one thin ad up here on the top, one thin ad here at the bottom. Nothing intrusive in the middle. Nothing crazy on the sides. And that's made our user experience a lot better. People really like those article pages. We really like those article pages too, because we don't have to design around... Intrusive ads popping up all over the place. That's kind of been a step towards mitigating the disaster that our website can be, considering how many ads there are for non-subscribers.

IAN: I'm gonna go back to you.

AUDIENCE: Thank you. This resonates with me. I'm at the Times too, and I've been working on the project to replatform our Apple News coverage. And it was actually, not surprisingly, difficult to account for the ad placement and styling around it, because Apple News is a proprietary system, and if you're developing locally, you cannot display ads. This is not just an ad problem. This is not just a tech problem. This is not just a publishing problem. It's an economic problem. That the force is to just drive the most revenue, and there are all these other environments that we have to play with, like AMP, and the tragedy tag or whatever, and the Apple News, and the more of those we have -- that do not have the publishers' interests at heart... My point is that this is ads, but this is also way bigger.

IAN: Yeah, yeah. Here?

AUDIENCE: I have a question. Obviously since... In the last 10, 15 years, publishers, news organizations, have obviously gotten very interested in audience strategy and tracking their readers and moving to paywalls and who is a subscriber, who is a registered user, who is coming to us anonymously from wherever, search... It's all person tracking in some fashion. Do you feel the news organizations are squaring those things at all, trying to square them, thinking about... Here are all the issues with advertisers tracking. But for our audience strategy, we want to track to some level.

AUDIENCE: I actually have an answer for you, since we work at the same place.

(laughter)

AUDIENCE: Maybe this is a dumb question to ask out loud.

AUDIENCE: And I worked on the project for audience tracking, as you know. I was on audience tracking for a year and a half. And we have our own internal tracking system that only tracks whatever we want and fires its own events and does not depend on any external sources. I kind of think that that's good. And we could have just... The team that works on that, that was adjacent to me, could just define whatever they want, and they were also in the same physical space as the data governance team. So if there were any concerns, they were so physically close that it was very easy to have this internal consistency.

IAN: And there's a piece of that -- that I find really fascinating. You work for the Times, right? The Times, Washington Post, Vox Media, we're gonna be fine. At the end of the day, what is hurting is... The Times can do that.

AUDIENCE: We have the people.

IAN: The American Statesmen, Star Tribune, all these local papers, which are first off getting decimated anyway, but they don't have the infrastructure of... Someone said this earlier... Of selling their own ads. Because it's such a crazy different world than they're used to. They can't set up their own data system, have a data governance team. These are all things that work for us. But it makes me question other pieces. Right here?

AUDIENCE: Yeah, so I have a fully different perspective on this.

IAN: Fantastic.

AUDIENCE: I most recently worked at a technology and media company with local newsletters in five cities right now. So we're very much focused on local and we built our own technology to load newsletter ads into our newsletters, so that's fully customizable within a certain number of characters. For advertisers, it's a self-serve platform. And we've seen a lot of success with that. Our measure of success is for it to be a fully sustainable product on its own. And I actually think... We've been experimenting with other ways of doing better advertising, and I actually think local is really well positioned to do a good job with that, because going back to your question... I think advertising can be a very good thing. It can be a useful service. It can allow people to understand what products and services exist in their communities that they should take advantage of, and local is in a very good position to provide that to their users, and that's what we're trying to do.

AUDIENCE: I would say along those lines... If you are in this, as a journalist... One of the experiences I highly recommend -- I did this before I was an engineer, on the journalism side -- was do a ridealong with local salespeople. Because you'll see local sales is asking a lot of the same questions for advertisers about the community that local journalists do. Like, the intent of these local sales folks is to provide a service to readers that is not entirely dissimilar from what journalism does. It's to provide a resource, to provide something useful. And that's why local journalism and local advertising have traditionally been so strongly affiliated. The idea of giving away journalism on the internet for free being a bad thing... Is ridiculous when you look at all of the local journalism outlets that gave their stuff away -- that were not subscription-based. Right? There are tons of them. And it's because of this. And the problem of data is why they've hurt so much. Because it ends up being... They don't have the resources to provide the service of sales that they used to in the digital age.

IAN: Yeah. Back here.

AUDIENCE: I just wanted to touch on the data governance that was mentioned earlier. That's part of the solution to this problem. You mentioned you kind of get pixel creep. Because some sales or marketing guy is like... Hey. No offense to sales or marketing. Hey, some vendor that I'm working with wants us to throw in a pixel. Can you help us out? They'll talk directly to the developer. It goes on forever. I've worked at another organization where there was a problem. But now it's centralized. We have a team that conducted an audit of all our pixels and got rid of a whole bunch of them, and there's a review process if you want to add any pixel or connect with any vendor. They check off the boxes to make sure they're doing the right stuff with the data. And this is totally unrelated, but I wanted to sort of second what a table over there was saying. That I really... I suspect that targeted advertising is financially bad for news publishers, and that if you got rid of it -- even though we obviously generate a lot of money via targeted advertising -- in a universe with no targeted advertising, you would generate more revenue. Because then we own the platform, and that would be the new best way to target. So I'm curious about other people. That might be... I've always thought that. But that might not actually be true.

IAN: Does anyone else have a thought?

AUDIENCE: Yeah.

(laughter)

And there's actually math. Programmatic ad stacks extract 70% of the value. Right? But your direct sales ad stack is going to extract substantially less. So even just on that basis alone, if you can eliminate all of those middlemen who are doing all of that user tracking, all of that value goes to publishers. Right? There's this idea that the tracking is necessary. But it's not like marketing budgets will disappear tomorrow. They were there before user tracking. They'll be there after user tracking. It's just the percentage of it that goes to publishers that has disappeared. But if you can sell $100,000 of direct sales and only have 10% of your value extracted, you've made more money than if you sold $200,000 worth of programmatic that's had 70% of its value extracted, right? And that's a big difference.
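The arithmetic behind this comparison, using the speaker's illustrative extraction rates (not measured figures), can be sketched as:

```python
# Sketch of the speaker's comparison. The extraction rates (10% for
# direct sales, 70% for programmatic) are illustrative estimates from
# the discussion, not measured industry figures.
def publisher_net(gross_dollars: int, extraction_pct: int) -> int:
    """Revenue left for the publisher after the ad stack takes its cut."""
    return gross_dollars * (100 - extraction_pct) // 100

direct = publisher_net(100_000, 10)        # direct-sold campaign
programmatic = publisher_net(200_000, 70)  # programmatic campaign

print(direct, programmatic)  # 90000 60000
```

With these numbers, $100,000 of direct sales nets the publisher $90,000, while twice the gross spend in programmatic nets only $60,000.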

AUDIENCE: Haven't we seen some empirical evidence for this with the GDPR? That revenues have actually gone up?

IAN: Yeah, but there's this giant boulder of inertia. I don't know if anyone has ever looked at the number of tracking companies out there or data providers and stuff. It's just like...

AUDIENCE: Over 7,000.

IAN: Yeah, it's insane the amount of Silicon Valley startups that have been built on this premise.

AUDIENCE: And the growth has been... It's over 7,000 now, and in 2011, it was under 200.

IAN: Yeah, and that's where your 70% of the money is going. To them. Yeah. Right here.

AUDIENCE: This is actually one of my questions and one of the things... I don't work in a newsroom, but I work in technology. And that is: Creating cultural pressure against this kind of work. If you are a talented engineer, beyond those 7,000 companies, there are 800,000 other places that would love to hire you. I'm really curious... I don't have the answer here... But who are the people who opt into a job based around tracking individual users? It just seems like...

AMANDA: It's people like me and Ian and Aaron. It's people who think we can fix it. There are people who think they can fix it. There are people who want the money. There are people who just see it as a challenge and it's a job.

AUDIENCE: I mean, especially in American ad tech companies, there's an immense engineer churn rate, where they go through engineers who stay there for a very brief time and then move on. And the reason usually is because they can pay a lot more than standard entry-level engineering jobs. And part of the reason is because they actually fund themselves by raising VC money off of hiring engineers from prestigious institutions. So that's part of it.

AUDIENCE: Why is the churn rate so high?

AUDIENCE: Because I don't know if you've ever tried to get engineers to work on ad tech, but engineers don't want to work on ad tech.

AUDIENCE: So they take the job they know they don't want and they leave six months later?

AUDIENCE: Once their stocks vest and they have...

AUDIENCE: Or they just get paid out enough or they're more senior. There's actually a really great article that Wired did in 2016, called The Benefits are Great, but Don't Ask Us What We Do, which interviews a whole bunch of people who worked at a Philly-based company that produced those hijack toolbars. They don't anymore, but it's basically like... A lot of these people didn't even realize that that was what they were doing. They interviewed one of the engineers and he was like... I worked here for a year and somebody explained to me that we worked on those things that hijacked your grandmother's computer to show a whole bunch of popover ads, and it's terrible. They didn't even realize it. And they interviewed people who did realize it and they were like... I don't give a fuck. It's just an interesting problem. That's a direct quote from the article.

AUDIENCE: On the other side, not the abusive ads that take over, but on the targeted side, there's something for the people that would really be interested in it. It helps the other side, like... The small businesses that can't really pay $10,000 to be on the front page of your publishing website. Targeting allows them to exist and pay less money to reach their communities. So that's the "good side" of it. Not taking over the whole website and doing that. But that's the one positive side, and something to keep in mind, when you're looking to build advertising that's not that bad.

IAN: There's a flip side to that. It gives small businesses that ability, but also, both Facebook and Google, almost in a predatory way, go after small businesses. I registered a small DBA, just as a business that was just me, and the minute you do that, you get a mailer from Google saying... Here's $50 of free AdWords. You should check it out. 5,000 people looked at my ad. That's great. And they start pouring money into Google. It gives them the ability to look at their market, but I question how much of that is actually helping their marketing budget.

AUDIENCE: And whether those ads are even being seen by humans. Estimates put digital ad fraud at 30% to 90% of revenue.

AUDIENCE: In many other fields, to kind of get back to the ethical aspect of it, the people who work in those fields are licensed or have a certification in some way, and a lot of that licensing and certification centers on ethics. Making sure that you're working in the best interest of the people that you're serving. And that doesn't exist in engineering. Or tech. We call ourselves engineers, but a real engineer actually has to have that licensure, which ensures some of that.

IAN: Yeah, don't call yourself an engineer in Canada. It's illegal. Over here. And then I'll do a wrap-up.

AUDIENCE: I want to say: One of the things is to realize that it is a cultural battle that's being fought. And it doesn't have as much visibility as it should. And there's a lot of marketing around how it's talked about that goes unchecked. I think one of the really great examples to me is always -- because I used to work in game development -- they would be like: Yeah, you want whales for a lot of these gambling mechanics. And you're like... What's a whale? When they talk about it in a public setting, they're like... Oh yeah, it's just somebody who has a lot of money and spends a lot of it on a stupid game. But the reality is a whale is somebody who has financial issues and is willing to get into huge amounts of debt. And so suddenly, it doesn't sound so cool when you're like... That's who they're really going after with those things. I think knowing... Not being naive about it, knowing the language, and being willing to explain that language. Like, our CTO will talk to our CEO. They'll have meetings where he just explains... This is why what this ad company is asking for is unethical. Here are the talking points you need to know. To argue with them. And really building that language and that literacy within your company and that knowledge within your organization.

AMANDA: That is good.

AUDIENCE: It's also helpful to have an engineer in the room on some of the sales calls.

AMANDA: Yeah, we get a lot of RFPs that are just bogus, and things that we could never do, and would never ethically do, that we have to push back on.

IAN: Yeah. So... When we put this concept for a talk together... I was very fascinated by where people saw the issues. And I'm not gonna lie. This entire thing was an excuse to have other people tell me what their issues are. And I am so glad you all came. I was never quite certain how we could actually wrap this up. Could we actually walk away with some sort of framework or understanding? Could we come up with some ideas of how to do this better? From what I'm hearing, there are a bunch of really great ideas, but it's almost that... We know the industry has an issue, and all of us are... We're slightly confused as to what to do next. Do you have any other thoughts in that direction? The slides actually... I did a lightning talk at Vox earlier this year, and the last slide I had was... I was told to remove it for this. Call your Congressperson. Because as much as I hate to admit it, at this point, there needs to either be some sort of large reckoning, where media companies come together writ large, and we put a firm drop-dead date in a GDPR-esque way, that after this point, we're going to change things ourselves... Barring that, the only body in this country that can really make a difference is Congress. I don't know if I can say any other positive happy... This is gonna be great. But what I would love is, as all y'all are moving forward in journalism, and as ad tech people and as engineers, I would love to hear stories over the next few months, years, et cetera, of what you're doing. If you'd like to come up, I would love to give you all my email address and have you talk to me at some point in the future, even if it's just a note that's like... Oh my God, you would not believe what I just saw.

AMANDA: Or send him the bad ads!

IAN: Yeah, I'll create a bad ad email. This is a piece I find fascinating, and looking at the point that I realized I needed to go to law school... Was the Congressional hearing where Chuck Grassley asked Zuckerberg: Hey, Facebook, how do you make money? To which you had android Zuckerberg say advertising. With a smile. And it slowly dissipated, even for him, to a look of despair. This is the chairman of a Senate Committee who does not understand the core basis of how the internet works. How the internet makes money. And we need more people, more people writing letters, giving testimony, pushing forward ideas, that understand that, and are willing to go and make change. Yes?

AUDIENCE: How can you do that with being mindful of folks from different classes and different literacy levels around technology? Because a lot of these things sound like, to me, middle to upper class issues, with like... I make all these purchases, and now the internet is selling me stuff. But a lot of these things are targeted towards more vulnerable populations who sometimes don't speak English.

IAN: I think what you hit right there is the worst part. I was talking about some app, and it showed up on my Instagram feed. People complaining that way. But someone earlier said... Digital redlining. Which is a huge concern of the folks who are being targeted because they have some sort of neurodiversity disorder, which happens all the time. Or you have people who are older, in poor communities, who are getting targeted for loans, predatory loans. It happens in terrible ways, in all these different communities. Does that partially address...

AUDIENCE: But I'm asking you: How can you make a plan to include... To mobilize those folks and to also make sure that you pay attention to their issues as well? Their specific experience of this issue.

IAN: Yeah. I don't know. But thank you, actually. Truly.

AUDIENCE: May I make a suggestion? We are in a room of journalists. Right? Ad tech, I think, is going to be the big journalistic topic, and explaining it to people of all classes and how it affects them and impacts them and what they can do about it is going to be a big journalistic imperative, especially with the upcoming election. There was just a report in Adweek that the Democratic candidates are starting to hire in-house programmatic teams. This is going to help decide the next election. And as people working in journalistic organizations, we have an opportunity to say: Tracking. Here is what is going wrong. Here's how to talk to people, help them understand. Let's do events in low-income communities to explain what is happening and what their options are. There's a really interesting Mozilla experiment that just got released this week or last week, where it turns your browser cookies into the cookies of a person of a different class. So you can turn your browsing experience and what ads you receive into the ads of somebody who is very wealthy, for example. And using tools like that to explain what the impact is of something like digital redlining I think is a big imperative that journalism organizations should be taking on. The Times has the Privacy Project right now. But there are big opportunities to do it in all of our journalistic institutions. Because it impacts everyone. Everything is ad tech. Ad tech is everywhere. It's on our buses. It's on our taxis. It's out there beaconing you in grocery stores. This is impacting everyone, everywhere, in all sorts of unexpected ways. And explaining it to all communities -- but especially those communities without the resources to necessarily understand it right off -- is a specific journalistic function.

AUDIENCE: I would say that... I'm kind of in this in-between space of tech and journalism. I work as a digital security trainer for Freedom of the Press Foundation. And a lot of what we do is helping journalists and others use their computers more safely. And in this case, a lot of what we do is talk to people who are using things and being confused by them. And we don't just talk to journalists. We kind of in many ways also offer a lot of things to the general public in terms of digital security training on really introductory stuff. Things like new internet users, the elderly, folks from all over the diverse Borough of Brooklyn, who are basically coming into the world of technology because they have no choice, because this is what they have to do to be able to participate in the economy. And in this case, we have learned a lot about how to make our own education around security more accessible by talking to users who have a smartphone and who are asking questions that we never thought to ask. And something I wish technology companies would do more of: Just watch people who are not them use their stuff. And I think it would be great to have that kind of UX-testing or visibility, user-interviews process that we've seen in some places like that. Or informally, we do it in newsrooms as well. Show people who are not reporters... This is our website. This is what we do. Watch them scroll down, click on the fake news ad, and start believing that Hillary Clinton was a lizard person or whatever.

IAN: That was wonderful. Looking at how other people use...

AMANDA: We've got to wrap up, because we're three minutes over. Thank you all.

IAN: Thank you all so much.

(applause)
