(Intro music: Electro Swing)
0:00:14.1 **Adam Garrett-Harris** Welcome to BookBytes, this week we’re talking about “Technically Wrong.” I’m Adam Garrett-Harris.
0:00:19.1 **Jen Luker** I’m Jennifer Luker.
0:00:20.5 **Safia Abdalla** I’m Safia.
0:00:21.5 **Jason Staten** I’m Jason Staten.
0:00:22.9 **Adam Garrett-Harris** So, we were going to split this up into 2 weeks by talking about the apps and websites and things like that, the technology this week and then next time talking about the company culture. Alright, so what stood out?
0:00:39.7 **Jason Staten** One particular thing that stood out to me was the nature of our apps to always be celebrating or always be, sometimes excessively excited, and cheer for some of the smallest things. For example, say you go and you make your first post on something and you get confetti that’s bursting on the screen saying “Hooray!” and it may be a good thing, but perhaps your first post is not something that is so exciting or shouldn’t be celebrated. That is something that I feel like is really systemic in software that I encounter today. There’s always lots of cheering, sometimes maybe even because it’s gamified to give you that little dopamine hit or something, but that is one issue that I think that the book brings up about apps.
0:01:32.5 **Adam Garrett-Harris** Yeah, I actually saw that at work recently with a website that would have confetti falling across the screen and it was every time you went to that page, not just the first time, but if you hit refresh it would fall again. And we killed it this week, it’s gone.
0:01:50.1 **Jen Luker** Yay! I think the interesting aspect going along with that wasn’t just the fact that there was a lot of celebration and a lot of, you know, cheering of all of the things, but it was literally all of the things, even the awful things like someone died, or there was a divorce, and because of all the cheering and of all of the sharing and all of the “Remember when it was great last year?”, you know, you end up sometimes being reminded of all of the horrible things that happened as opposed to just all the happy things, and that can end up being really traumatic for those people.
0:02:31.3 **Adam Garrett-Harris** Yeah, I remember seeing the video, and this wasn’t a traumatic example, this was a silly example, but the example of the guy who was in a car wreck and so all of the photos were like, the same photo of his car being wrecked over and over and over in this Facebook video.
0:02:48.4 **Jen Luker** Yeah, and he only had one friend.
0:02:49.5 **Adam Garrett-Harris** Yeah.
0:02:50.3 **Jen Luker** It was his friend that was in the car crash with him.
0:02:52.5 **Adam Garrett-Harris** I remember when that video came out and I saw it and I thought it was just funny. Like, I didn’t even consider all the situations where that would be devastating.
0:03:02.2 **Jen Luker** I actually have a personal example. We had a friend who died at the age of 30 out of the blue, and I recently got a huge cheer notification that we’ve been friends for 5 years.
0:03:15.5 **Adam Garrett-Harris** Oooh.
0:03:17.2 **Jen Luker** It’s like, almost to the day of the anniversary of her death.
0:03:19.7 **Adam Garrett-Harris** Ow.
0:03:21.1 **Jen Luker** So, that was not pleasant, and it was in the middle of, you know, bowling night and my phone buzzes and I pick it up and it’s the reminder that I’ve been friends with someone for 5 years who is no longer around, so that was fun in a very un-fun sort of way. So, it really hit home that concept of not all things are necessarily meant to be celebrated, and not all things are right to be celebrated. And yes it’s great that I got to spend the time with her that I did, however I didn’t necessarily need to be reminded on almost the anniversary of her death that she’s not here anymore. So, that, you know, was one of the semi-examples that they used, but to have personally experienced that and kind of had that like, “Gee, that was really cruel.” So…
0:04:08.1 **Safia Abdalla** Yeah, and I think it conveys the extent to which corporations that are often behind these products value profits over people, because the notion is always that if you keep somebody happy and in kind of a state of complacency, they’re more likely to engage with your content, and then click ads, and you make money, and all of that fun stuff. And I think it’s one of those situations where the near-blind pursuit of clicks and engagement can cloud the fact that the internet and digital experiences aren’t always rosy places for people and that your motivation shouldn’t be, “How do we get somebody to click on an ad and put them in an emotional state where they’re more likely to click on ads?” Because sometimes it can backfire on you. I think that kind of was the- seemed to be the underlying thread behind all of these stories was how companies make choices that are designed to increase how much money they earn, not necessarily how much joy or health their customers derive from a product.
0:05:18.6 **Jason Staten** I think the book brings up a good example of when you don’t only want to look at just engagement, in the particular story of Nextdoor.
0:05:29.5 **Jen Luker** Yeah.
0:05:31.5 **Jason Staten** Where they have suspicious activity reports where they could say, you know, somebody happens to be breaking into cars or something like that. But generally what was happening was people would go and put in, simply a… Maybe a racial profile and that’s about it, and because it was just a big blank form they had a certain engagement ratio, and Nextdoor actually took some steps to go and counter that by first when you are filling out the form, rather than making it just be a big blank text area, instead asking specific questions to encourage you to include other details other than those that could be racist or sexist, and then their engagement with that feature of the application actually fell, which is actually a good thing. So, simply just measuring engagement and clicks can actually be a negative thing at times, and keeping that in mind is important.
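The structured-form approach Jason describes could be sketched roughly as follows. This is a minimal, hypothetical sketch of the idea, not Nextdoor’s actual implementation: the field names and the require-specifics-before-submit rule are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ActivityReport:
    """Hypothetical structured report: specific, behavior-focused
    questions replace one big blank text area."""
    what_happened: str      # the specific behavior observed
    when_observed: str      # time or time range
    where_observed: str     # location, e.g. a street or block
    description: str = ""   # optional free text, asked last

def is_submittable(report: ActivityReport) -> bool:
    # Require the behavior-focused fields before allowing submission;
    # a report with nothing but a description cannot be filed.
    required = (report.what_happened, report.when_observed,
                report.where_observed)
    return all(field.strip() for field in required)
```

The design point is that the form itself nudges reporters toward concrete details, so lower raw engagement with the feature can actually signal it is working as intended.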
0:06:40.4 **Jen Luker** I really appreciated the fact that when developing those products, as opposed to saying that “there are edge cases”, really focusing on the “there are stress cases”. That completely changed the paradigm in my mind for how products are developed. It’s like, what are the stressors that these people may be under at any given time, because the products are meant for people, they’re not meant for the AI’s that are solving whatever solutions that the companies want, you know? So, you know, if we have these happy cheerful things, what are the stress cases? Well, there could be a death, there could be divorce, there could be a house burned down, there could be any number of things that go wrong and then how do we prevent cheering for those awful things?
0:07:31.8 **Adam Garrett-Harris** Yeah, like the Twitter feature that shows balloons.
0:07:36.6 **Jen Luker** Mm-hmm (Affirmative).
0:07:37.5 **Adam Garrett-Harris** I’m sure the designs for that, because we often design for the happy path, and we don’t think about these stress cases, so the designs for this probably just have a normal person’s profile, maybe it’s got some birthday tweets on there and then the balloons. They look great. But you gotta consider real people and any number of tweets that could be on their page when those balloons are happening.
0:07:59.5 **Jen Luker** Especially when it’s something, you know, like the example in the book where it was protests and being arrested for being part of those protests on someone’s birthday. So there’s all these balloons but you’re looking at riot gear and prison cells and angry people, really, really contrasting there.
0:08:24.8 **Adam Garrett-Harris** I like the example where it’s talking about Glow, which is a fertility/pregnancy tracking app.
0:08:32.1 **Jen Luker** It was a period tracker.
0:08:33.7 **Adam Garrett-Harris** Yeah, but they were focusing only on getting pregnant as the goal.
0:08:38.8 **Jen Luker** Hrm. It’s like they had started out with just it being a period tracker but as they added new and more features they ended up focusing much more on pregnancy and preventing pregnancy and, you’re right, the fertility aspect of it which can be very alienating for those that are not trying to get pregnant, not trying to prevent from being pregnant, or aren’t in a relationship that could result in a pregnancy. That was a good example.
0:09:05.4 **Adam Garrett-Harris** Now, I thought it was especially interesting, like even though they kept making it more and more inclusive, it still… They still kept missing the mark. So, it’s not necessarily a- It’s kind of used as a bad example, but I think it’s also showing that this is kind of a work in progress where you’re not gonna get it perfect the first time and you need to keep making progress to include more and more people.
0:09:29.0 **Safia Abdalla** One of the things that was mentioned in the book that I thought was interesting and it’s similarly around kind of the health space, was they cited the smart scale that had a partner app that automatically assumed that you were trying to lose weight. So, when it would send you notifications about your weight it would often say things like, “Congrats! You’ve dropped X number of pounds!” or “Hey, you only weigh this much!”, and I thought it was interesting because right after, or a couple of days after I read that section of the book my new Fitbit had arrived that I’d just purchased and I kind of started to place a really sharp focus on the experience of setting up the Fitbit and what kind of information it was requesting about myself and my body, and one of the things that I noticed and I think was a really good design decision on the part of Fitbit that contradicts with what the smart scale did, I’m not sure what company or brand that was, was that Fitbit kind of took a neutral position to how they asked you about your weight goals. So, when you set it up it asks you, “Do you want to lose, gain, or maintain weight?” and then it asks you to, you know, set the goal weight that you have, and I’ve been using it for a couple… About a little over a week now, and one thing I’ve noticed is I haven’t gotten any notification that have text that kind of insinuated or assumed I had a particular goal of losing weight, and the entire experience of the product was kind of like, very neutral. It wasn’t making any assumptions about what I wanted to do with my body or what my goals were. So, that was kind of a cool thing to see and be on the lookout for after I’d read that section of the book, was how were a lot of the health apps that I’m using making assumptions about my body and what I want, and whether or not they’re harmful. So, good job Fitbit! If anyone from Fitbit is listening.
0:11:29.5 **Adam Garrett-Harris** Nice, yeah. Your body is like, a very personal subject so it’s…
0:11:35.3 **Jason Staten** And unique.
0:11:36.4 **Adam Garrett-Harris** Yeah! Talking about edge cases, right?
0:11:39.9 **Safia Abdalla** Yeah.
0:11:40.1 **Adam Garrett-Harris** Like, if everyone is different, everyone’s body is different and it’s very personal, and so for a company to say something about your body or make assumptions about what you want to do with it, it’s just very personal.
0:11:52.4 **Jen Luker** I remember way back in the day when Michael Jordan was amazing and on the front page of every sports thing, there was the argument that BMI was not very accurate because Michael Jordan, in all of his full glory, was considered morbidly obese because he was 300 lbs, and it turns out, you know, he’s 300 lbs of muscle and he’s very tall, but he was considered morbidly obese based on BMIs. And then there was the example in the book that says, you know, “Oh! You lost a baby. Hurray! You’re at your lowest weight!” or “Oh, you’re anorexic. Hurray! You’re at your lowest weight!” You know? Or, “Oh you’re 29 lbs because you’re 2 years old. Oh, I’m sorry that you’ve gained a couple of pounds, but here’s some things that you can do to get them off!” You know? So, you know, it’s really… It is very true that we need to take into account each individual body and I really, really enjoyed the example where they were trying to find the average size of a pilot and therefore they were making the cockpit fit that size. So what they ended up doing was taking every pilot in a 300 group… Was it a 100 or a 300 group set? And they took measurements from every single pilot from height to wrist circumference to chest circumference and head circumference and measurements in between trying to determine what size this was, and they found the average. And then they tried to find the pilot that would fit that average, and they figured out that there was not a single pilot in the set who actually qualified for more than 2 of the measurements. So, the reality of “average”, the “average” person, doesn’t exist. There is no such thing as the average person. What they ended up doing is making the cockpit fit both the largest and the smallest person with adaptations in between in order to make each pilot comfortable, because there is no average.
0:14:01.8 **Adam Garrett-Harris** Yeah. I love that story, and there's a really great episode of “99% Invisible”, which is a podcast, about that, and if you want to look it up, it’s episode 226.
0:14:13.6 **Jen Luker** You’re not going to go into more detail about that?
0:14:13.6 **Jason Staten** I remember listening to that one.
0:14:15.4 **Adam Garrett-Harris** Oh, well I mean, it’s the same story that you just said, but it’s just a great episode that talks about it.
0:14:19.8 **Jen Luker** Ah.
0:14:20.3 **Adam Garrett-Harris** What about the default settings?
0:14:22.2 **Jen Luker** Yeah! The default settings were really interesting.
0:14:25.2 **Safia Abdalla** Mm-hmm. I was, as a current college student, soon to not be college student, I was completely shook as the kids say these days, by the origin of the standard Times New Roman 12-point font that professors or teachers will often require for their essays and the fact that that actually originated from the defaults that were set in early word processors, and that was just a technical decision that kinda carried on and became a cultural and educational habit, too. So, it’s so kind of strange to see how that decision made a crossover and kind of became a bit of a cliché in every paper requirement that you read. That was the one that really, really got me about defaults.
0:15:18.0 **Jen Luker** I found the example of gender diversity an interesting- In that once you enter like Facebook for instance, you have the options of having different pronouns assigned and different gender selections assigned so it’s not just the binary male and female, you have other options. However, when you sign up the very first time, you only have that male and female choice, and that allowed the company to say that they had that gender diversity and the freedom to be able to make yourself who you are, but they got the male or female binary from you at the very beginning on sign up in order to appease the advertisers who wanted to use those metrics as focus groups in order to advertise to you. So, they started out with the binary and once you got in you got to change the defaults, and since most people won’t change the defaults once they get past that point, you’re still qualifying for a lot of that advertising.
0:16:29.1 **Adam Garrett-Harris** Yeah, you know, and the book’s very careful to point out that a lot of these examples are not done on purpose or with malicious intent. A lot of these people have good intent, they’re just not thinking about people, I have to wonder about something like that, if they’re really trying to do this to appease the advertisers or someone just missed or overlooked it, because the gender selection is in 2 different places and someone didn’t search all the code enough or it was in 2 different places, you know? They just didn’t realize it was there…
0:17:00.5 **Jen Luker** Yeah, but you have to figure that by the time this book kind of came around that they would have had enough time to-
0:17:06.6 **Adam Garrett-Harris** Yeah.
0:17:06.6 **Jen Luker** Not only figure out that they were different, that someone would have complained, someone would have asked, and they probably would have said, “Well once you’re past that part then you can make your choice, just pick one for now.”
0:17:18.6 **Adam Garrett-Harris** Yeah.
0:17:19.2 **Jen Luker** And someone knew. Like, I understand that, “Oh, you have a mistake.” You know working in the tech industry some things are easy, some things are hard, but it’s a form field, and if you can change it somewhere else, you can change it there. And if you haven’t done it after that much time, then do you really intend to? It’s just it’s… Especially when you’ve changed it somewhere else already. And I mean-
0:17:40.5 **Adam Garrett-Harris** And another thing about that, I feel like, is even if you change it later, they may keep that data of your original selection, you know? Just because you delete something on a site, or change something on a site, doesn’t mean they don’t retain it. So, they might still be using that for advertising data even if you change it later.
0:17:56.7 **Jen Luker** Yeah, and I agree that-
0:17:58.1 **Adam Garrett-Harris** It’s like, “Oh this person says they’re binary or whatever, but-”
0:18:00.5 **Jen Luker** Yeah.
0:18:01.2 **Adam Garrett-Harris** I’m sorry, not binary.
0:18:02.6 **Jen Luker** NON-binary.
0:18:04.8 **Adam Garrett-Harris** Non-binary. I don’t know… It’s something I need to get better about is I don’t know a lot of the terminology in what’s accepted and what people prefer these days.
0:18:15.8 **Safia Abdalla** Yeah, I generally just ask if I feel comfortable.
0:18:20.0 **Adam Garrett-Harris** Yeah.
0:18:20.2 **Safia Abdalla** Or avoid the person forever and never talk to anybody. Um, that was a joke. But yeah, I find asking the best thing to do, which I think kind of segues into something that I learned a while back from somebody who was doing an accessibility talk and relates really well to the points mentioned in the book about how you design forms that ask for somebody’s gender identity, which is the best option is to just like put in a free-fill form and have people tell you who they are instead of making them pick who they are, and you know I’m sure every developer listening to this kind of squirmed and recoiled at the thought of dealing with all of that messy data, but I think it’s one of the situations where you might have to pick something that is technically more difficult for you as a developer so that more people can feel comfortable. One example of this for me, is on Zarf, which is the startup product I built, there is a registration form that asks people for their first and last name and there’s like virtually no checks on what you type besides like, you know, don’t like, try and hack my app with some sort of evil script or anything.
0:19:43.1 **Jason Staten** Drop table.
0:19:45.2 **Safia Abdalla** (laughs) Yeah, don’t say your name is drop table. But there’s-
0:19:50.9 **Jason Staten** Unless that’s your name.
0:19:54.1 **Safia Abdalla** (laughs) I will have to talk to your programmer parents about that. There’s like, very little checks on what the person puts in. Like, for example, there’s no requirement on what the minimum length for a name is or what the maximum length for a name is, and I learned that from a tech talk that I had gone to earlier. There’s no limitations on whether you want like, to put hyphens or apostrophes or things like that in it, and I haven’t had any problems with it so far, like, I also haven’t gotten to situations where I have a large enough user base where those problems might occur, but it was something that was kind of trivial for me to do and take into account, but has improved the experience for customers or users with, you know, I don’t want to say “non-traditional” but just names that might be considered stress cases or edge cases.
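The lenient name validation Safia describes might look something like this minimal sketch. The length cap and function names here are hypothetical, not Zarf’s actual code; the “evil script” concern is handled by escaping at render time rather than by restricting what people may call themselves.

```python
import html

# Illustrative limit: a generous maximum protects storage,
# while there is deliberately no minimum length, no character
# whitelist, and no ban on hyphens, apostrophes, or spaces.
MAX_NAME_LENGTH = 300

def is_acceptable_name(name: str) -> bool:
    """Accept hyphenated names, apostrophes, accents, and
    non-Latin scripts; reject only blank or oversized input."""
    stripped = name.strip()
    return 0 < len(stripped) <= MAX_NAME_LENGTH

def render_name(name: str) -> str:
    # Defend against script injection when displaying the name,
    # by escaping HTML-special characters at output time.
    return html.escape(name.strip())
```

With this approach “Adam Garrett-Harris” and “O’Brien” pass untouched, while a name containing markup is neutralized only at the point where it is displayed.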
0:20:52.8 **Jason Staten** I had a question specifically for Adam. Being a more recently hyphenated person, have you run into any issues where hyphens have been rejected when you’re signing up for something or filling out a form field?
0:21:10.7 **Adam Garrett-Harris** Yeah, I actually worked on a startup about 2 years ago called Beeline and we recently started working on it again and since then I’ve changed my name to be hyphenated, and I realized when I signed up for it recently that I could not type my name in there because it didn’t allow for hyphens or spaces. So, after trying to type with a hyphen it didn’t work, I tried typing with a space and that didn’t work, so I just squished it together and my wife has run into all sorts of issues because her name is now 22 characters long and some places just cut you off at 20 characters. So, she got a new job and her email address was RebeccaGarrettHarr. And having a hyphen can be weird too, because you… A lot of email addresses, and her work at least, they put a dash between the first and last name, and then there’s an actual dash in her last name. So they fixed her email, but now it’s GarrettHarris, with no dash. It’s Rebecca-GarrettHarris. So, yes, I’m much more aware of this problem now.
0:22:12.1 **Jason Staten** I like that the book pointed out, too, that it’s not even just that there’s a problem in one place, but it’s more the thousand cuts scenario where every place that you go handles that thing differently, and so your identity in one place may be different than somewhere else because of where truncation happened or where character rejection happened.
0:22:36.1 **Adam Garrett-Harris** Yeah, and there may be some forms in the future where I might have to write down Adam Garrett-Harr as one of my aliases or Adam GarrettHarris without a dash as one of my aliases.
0:22:45.6 **Jen Luker** This also comes very deeply into play when talking about ethnicity, or nationality. When you are of mixed races as well, which I found was really, really fascinating that because of the fact that so many of those form fields are radio buttons, that you only get to choose one, and at that point you’re kind of having to choose which part of yourself you’re accepting today. That was really… I think that was the biggest eye-opening portion is the fact that I, you know, I choose caucasian, I choose white on those things, but I don’t have any other reason not to. So, it just… It hadn’t quite reached my thought process that that was a thing, you know? That when it came down to, you know, identifying who you are or where you came from or, you know, your nationalities or ethnicities, that you were literally forced to kind of choose one and deny the other because of a radio button.
0:23:46.7 **Safia Abdalla** I think it’s real interesting that you mentioned that, you know, this isn’t something you thought of as often because it was actually a source of great anxiety for me growing up, because I was not born in the states. I was born in Sudan and my family moved here when I was pretty young and I kind of grew up as the eldest daughter who was responsible for filling out a lot of the forms and things like that and one of the big sources of anxiety for me was that often, less so now, I think people are more aware of it, but often in forms you might see that there’s a radio button associated with Black/African American, and then in some forms there’s just a radio button that’s African American. I personally don’t identify as African American, I identify as black, because I wasn’t born here and I don’t have like, American heritage, or you know, family who was born here or anything like that. And that was something that caused me a lot of anxiety when I was growing up because there’d be these forms and it’d be like, you know, “White, Hispanic, African-American, Asian, so on and so forth…” I was like, well I’m not really any of these things, I’m something that I’m like, I belong to an identity that I guess people aren’t thinking about. And like I said, it happens less so now, I think most forms have the Black/African American, which kind of accepts individuals of African heritage who aren’t born or from American families, which is cool. But yeah, totally different experience. Those forms wrecked me growing up because I was like, “Where do I belong? Who am I? What is going on?” So yeah.
0:25:32.2 **Jen Luker** You know, but that was kind of the point, was the fact that even in the book itself it said specifically that if you were Middle Eastern, you should have selected White. So a black, Sudanese person, it’s like, well I’m both Black and White?
0:25:47.8 **Safia Abdalla** Yeah, that’s a whole other dimension because I’m technically I’m Afro-Arab. I’m African and also Arab. So then I guess I’m like mixed race, but like where does that identi- Oh my goodness. You can see where I’m going with this. This is the kind of emotional anguish or just anxiety that happens whenever I have to pick a radio button to describe like, who I am, and generations of my family and identity down into like, one button.
0:26:17.4 **Adam Garrett-Harris** And some forms do a weird thing where they ask about if you’re Hispanic or not first, and then they ask your race later. I was always confused by those.
0:26:27.6 **Jen Luker** Again, apparently according to the book, you’re supposed to choose Hispanic, and then you’re supposed to choose White.
0:26:32.6 **Safia Abdalla** Yeah.
0:26:32.9 **Jen Luker** Which is, again, confusing.
0:26:34.5 **Adam Garrett-Harris** Yeah.
0:26:35.4 **Safia Abdalla** Yeah, because I think there’s… I had someone who was of Hispanic heritage explain this to me, and I might be totally botching this explanation, so don’t take it from me, but I think it relates to the fact that Hispanic means individuals who speak Spanish, which means you can be someone from Central America or South America, but you can also be someone from Spain, and a couple of other countries in Europe. So, if you are from Spain, you are Hispanic, but you’re white and there’s another category for if you are from Central America or South America or other regions.
0:27:17.3 **Jen Luker** So, that would be Latino at that point.
0:27:19.5 **Safia Abdalla** Yeah.
0:27:20.5 **Jen Luker** So, you’re actually from the Latin-American continents. Essentially speaking.
0:27:23.5 **Safia Abdalla** Yeah, the Latin-American, the Spanish-speaking Latin countries instead of the Spanish-speaking European countries.
0:27:30.6 **Jen Luker** Which, again, then falls into… Brazil is Portuguese-speaking, so they don’t even fall within that.
0:27:37.8 **Safia Abdalla** It’s like nothing makes sense when you try to put people into buckets or something.
0:27:41.9 **Adam Garrett-Harris** Mmm.
0:27:42.6 **Jen Luker** Imagine that!
0:27:43.7 **Adam Garrett-Harris** And another question is, why do they need to know? I think a lot of times they don’t need to know so much information.
0:27:49.4 **Safia Abdalla** Yeah. I think the only time I’ve had to fill my race and ethnicity out was in job forms, which I guess they need to know for government reasons I think, but I can’t recall having to fill it out for a case that wasn’t medical or related to some government statistics, which begs the question, why does the government need to know your identity, or why do corporations need to know?
0:28:15.2 **Jason Staten** That was going to my first thought, was something medical that could be genetically related but…
0:28:22.1 **Jen Luker** Like, if you are African and Mediterranean you have higher chances of having Sickle cell anemia. However, if you are Asian you have higher chances of having a dry ear wax which causes different problems than a wet ear wax as far as hearing and ear health, and there’s just a lot of different things that can be taken into account based on those genetic factors, but what if you are, you know, something that falls halfway into a bucket, and halfway not, but they force you into a bucket because it’s close, you know? And therefore you end up falling within genetic factors for like, health insurance reasons that you don’t qualify for. You know? You end up not being covered for this thing because you wouldn’t qualify for it because you’re Hispanic even though you are a Black Hispanic person.
0:29:17.3 **Adam Garrett-Harris** Yeah and this also reminds me too of how there were American, I forget what they called them, but basically concentration camps for Japanese-Americans.
0:29:26.8 **Jen Luker & Safia Abdalla** Yeah.
0:29:27.3 **Adam Garrett-Harris** And they used data from the census to find these people. Otherwise they wouldn’t have been able to find a lot of these people.
0:29:33.6 **Safia Abdalla** Yeah, and I’ve seen some, like, good discussion about this from folks who build technology for the government, or consult on building tech for it, which is: as an engineer, and especially in political situations that are kind of hostile toward minorities or certain individuals, do you have a responsibility to refuse to collect data that could potentially later be used to harm individuals? I’m not sure if the book mentioned it, I might be confusing it with something I read alongside the book, but there’s the, I think, famous story of IBM and their involvement during the Holocaust, and like, the alliances they made with the Nazis with respect to, like, data collection and stuff. I’m not well informed on this so I probably shouldn’t speak about it at the moment, but I know there were instances in the past where technology was used to make it easier for individuals to harm people based on the data they knew about them. So yeah, it’s… I think it’s something we should all be cautious about as technologists: once you collect it, it’s in a database somewhere forever. There are backups and it’s never gonna leave, and you have this potentially risky information about a person that could be used to harm them.
0:30:57.4 **Adam Garrett-Harris** Yeah, and you never know what that data might be. It may seem completely benign to you, and you would never imagine how someone could use it against somebody, but there’s all sorts of ways to do that.
0:31:08.7 **Jason Staten** There’s a specific scenario, too, about collecting too much data that’s brought up with the Uber app, and asking to collect your data or- Sorry, to collect your location during the time that the app is open, or being able to collect it in the background so that way you can have an improved riding experience. They make the claim that it’s to make sure the 5 minutes before and 5 minutes after drop off are correct, or to analyze that a little bit, but-
0:31:44.8 **Adam Garrett-Harris** Yeah, one example was they want to see if you have to cross the road when you get out.
0:31:49.4 **Jason Staten** Yeah...
0:31:50.7 **Adam Garrett-Harris** Which seems like a lofty goal to make sure your customers are being safe and being dropped off on the right side of the street.
0:31:55.9 **Jason Staten** On the iPhone, the setting for collecting in the background doesn’t work like that. It’s always collecting, and so even if they may not be using it at the moment, as was brought up, it’s sitting in a database somewhere and could be used later.
0:32:10.2 **Safia Abdalla** I think another interesting kind of connection between ethics and product development is not only just not collecting, or determining whether or not you really need to collect sensitive data, but also the way you communicate it to a potentially unwitting audience. So, as like someone who doesn’t know what Uber is and isn’t in tech, if I got a message that was like, “Hey, do you want to enable background location selection so we can help you do this?” I’m tempted to say yes just because it seems like it’s good for me and maybe I don’t necessarily understand what the technology is doing or what kind of information it’s collecting, but because they weren’t completely transparent in their messaging, I’m incentivized to choose a certain option, and you know, I’m not trying to critique Uber too harshly. I understand why they painted it as something positive and helpful to their customers, but it is not transparent and it does not tell the full story to people, and might kind of end up guiding them towards making decisions that are harmful for them.
0:33:18.9 **Jen Luker** You know, on a slightly lighter note, to change it up and possibly wrap up eventually, there’s always the ones that seem much more benign, like the famous example of the fact that a soap dispenser could only be used by someone of lighter skin color. Or the fact that in Asia, the facial recognition for unlocking your phone could be unlocked by your friend based on the similarities in facial structure, and how it’s not just data collection that’s the problem, it’s also a lack of diverse QA, even. The fact that these things went from idea to being installed in bathrooms as soap dispensers, and a dark-skinned person can’t get soap out of it, just baffles me. The fact that it went all the way to production, and those companies have phones that are openable by a large swath of people, your coworker, and they call them secure. And Asia, specifically, has a really huge technology market. So, to release something to such a large technology market that’s supposed to be even more secure than a fingerprint, and then to have it be the exact opposite, that half the people that you know can unlock your phone, is just confounding. How did it make it this far? And the only thing I can think of is it does come around to a distinct lack of diverse product development and QA. There was one example that I like to tell of, you know, this company had this idea where they were going to be able to have an app on your phone that all you’d have to do is tap it and it would start audio recording. So in the event that you needed to kind of inconspicuously start recording, like, “Oh, this is a terrible situation, please record.” And it was meant more specifically for, like, teachers. And this woman came into it and she says, “Okay, you realize that women’s pants don’t have pockets, right? And that a lot of teachers are women.” So there’s a couple different places that women tend to put their phones if they’re carrying them.
One of them is in their back pocket, and one of them is in their bra. So, you’re literally asking a woman to spank her butt, or her boobs, to start an audio recording to protect her, and it was killed at that point because it wasn’t feasible. But it took a woman to come in there and say, “Well that’s not gonna work” for that to happen. So, for these other things to happen that are so big and so public, it’s like, how did you get that far? And the only thing I can think of is that there’s just a distinct lack of QA. Someone didn’t walk in there and say, “I put my hand underneath your soap dispenser and it didn’t work.” Or-
0:36:12.7 **Adam Garrett-Harris** Yeah.
0:36:13.4 **Jen Luker** “My friends can open my phone.”
0:36:14.8 **Jason Staten** My name was rejected by your real name algorithm.
0:36:18.9 **Jen Luker** Ugh, that was terrible.
0:36:20.0 **Safia Abdalla** Yeah, that was a rough one to read.
0:36:22.0 **Adam Garrett-Harris** Oh, Facebook trying to determine what is a real name?
0:36:25.4 **Safia Abdalla** Mm-hmm (Affirmative).
0:36:26.3 **Adam Garrett-Harris** Yeah.
0:36:26.8 **Jen Luker** And whitewashing Native American names.
0:36:29.2 **Safia Abdalla** Yeah.
0:36:29.8 **Adam Garrett-Harris** Yeah. How did that work again? The whitewashing?
0:36:31.9 **Jen Luker & Safia Abdalla** Uh…
0:36:33.2 **Jen Luker** In the end they would only accept “Brown” instead of, it was like “Brown Bear”.
0:36:37.9 **Safia Abdalla** I think it was “Creeping Bear” or something.
0:36:39.9 **Jen Luker** Well there was “Creeping Bear” but there was also “Brown something else.” I think it was near the end of chapter 3.
0:36:46.1 **Adam Garrett-Harris** Yeah, and one of the things you could do was send in official documentation to Facebook, but you may not feel comfortable sending in that kind of information and trusting they’re actually going to delete it.
0:36:58.8 **Jen Luker** You know, why should you have to send in official documentation to a social network site?
0:37:04.1 **Safia Abdalla** (laughs)
0:37:04.6 **Adam Garrett-Harris** It’s like they’re becoming the government.
0:37:07.1 **Safia Abdalla** Oh no.
0:37:07.5 **Jen Luker** It’s like, I feel mad enough that I have to send official documentation in to PayPal to change my name when I got married.
0:37:13.5 **Adam Garrett-Harris** Oh, you have to do that for PayPal?
0:37:15.2 **Jen Luker** Yeah. When your name changes, in order to change your name you have to send in official legal documentation. For a marriage…
0:37:22.5 **Adam Garrett-Harris** I guess I’m about to go through that.
0:37:23.7 **Jen Luker** For divorce, you know, for any of those. So I essentially just left my name as it was and changed it on the profiles. So, I still have a name on my PayPal account from years and years ago that isn’t even mine, and hasn’t been mine, and hasn’t been mine for a long time, because I didn’t want to send legal documentation in to PayPal.
0:37:44.0 **Adam Garrett-Harris** That’s so weird.
0:37:44.9 **Jen Luker** So yeah, you can’t even… You know, stuff like that, it’s like, I recognize that at least they have a little bit more clout and that they are more of a financial institution, but the reason why they’re using my name is to just pass it on as convenient shipping information. When it comes down to actually registering financial information regarding, like, your bank account, they actually ask for your name again. So the only reason this name exists is to just say, “Hey, this person set their name as this, so we’re gonna pass it on.” But I can’t change it without sending in official documentation.
0:38:19.9 **Safia Abdalla** I think that’s an interesting note for us as developers: at what point can we put in processes that accommodate change? Technical processes, as in, you know, a form or something like that, instead of forcing people to go through… I assume, more human processes. I don’t know if it’s automated when you send in your identification card. Do they, like, have some sort of computer vision that looks to see that it’s you? I assume it’s just, like, a person on the other end, manually checking.
0:38:51.7 **Jen Luker** And in this case, it does seem like it’s a person on the other end, but who’s checking the person who’s checking my information?
0:38:57.3 **Safia Abdalla** Yeah. Yeah.
0:38:59.3 **Jen Luker** Like, if it’s not a person and it’s an AI, then an AI has got the ability to save my information, all of my legal information. And if it’s not, who’s validating that the person who is now officially holding my identity is not going to sell it to someone else?
0:39:18.1 **Safia Abdalla** This is one of the things that I experienced a ton with Zarf, and I’m still trying to figure out the best way to do it. Since Zarf is a marketplace product and writers can sell their work on the platform, one of the things that I need to do is collect, like, the last 4 digits of their social security number if they’re based in the United States, their address, their name, their birthday, and all of that is required to verify their identity. And that’s specifically in part to be in compliance with some federal regulations that came out of the Patriot Act, which just require you to confirm the identity of anyone who is going to be making money on your platform; but one of the things I’ve consistently run into is a lot of my customers get really nervous, and rightfully so, when they realize they have to provide all of this sensitive information in order to use the platform, and I’m kinda stuck between a rock and a hard place. Because on the one hand it’s something that I do have to do to, like, be in federal compliance, but on the other hand I have to find a way to empathetically and effectively communicate why that information is being collected, where it’s stored, and how it’s being used to customers so that they’re not frightened or anxious when they use the application. So it’s like a tough line to cross.
0:40:47.9 **Jen Luker** And I think the education is a really important way of doing that, you know? Putting it up front, right on top, “I have to collect this information for federal regulations. If you would like to read more, click here.” You know? Read the documentation for that here, would be really helpful for me.
0:41:06.0 **Safia Abdalla** Mm-hmm (Affirmative).
0:41:06.7 **Jen Luker** I’d be more willing to do it if that were the case. Like, I had no idea that PayPal, for instance, would, you know, need that information in that way. However, the fact that it never really gets checked after it’s set also means that my name is still wrong and has been wrong for over a decade, because I haven’t been willing to send in information and change it. So, they’ve got an identity that was active, official, legal… 10 years ago. 12 years ago. Not right now, though.
0:41:36.2 **Jason Staten** Safia, I think a key part there is that you’re actively thinking about that, which is leaps and bounds beyond many places that don’t. Whether it be responding to feedback that people have brought up to you, or thinking of it just because you’re of a more open mind, thinking of these diverse situations, and then approaching that problem, versus just saying, “Eh, if they don’t fill it out, they don’t deserve to be on the platform anyways.” Like-
0:42:11.0 **Safia Abdalla** Yeah, I’ve-
0:42:12.1 **Jason Staten** Instead… Yeah.
0:42:13.4 **Safia Abdalla** Yeah, I’ve tried my best to stray away from that attitude. There have been, in fact, some situations with some of my customers who are a bit on the older side, again you can tell that I’m a youngin’ speaking here, where I’ll actually like, get on the phone and explain to them like, what Stripe is, what it’s used for, like how apps are built and everything, just so they have a better idea of like, what’s going on and why I need to do this and how it’s being done. And yeah, I don’t think there’s a lot of startups that are of my size that are getting on the phone with customers to explain things, but it goes a long way. So, you’re definitely right on that, Jason.
0:42:52.0 **Adam Garrett-Harris** And I think that the last 4 numbers of the social security number is interesting because I guess that’s considered more secure, that you’re not giving away your full number, but the last 4 digits of your social security number are used in so many places to just verify your identity.
0:43:08.0 **Jen Luker & Safia Abdalla** Yeah.
0:43:08.2 **Adam Garrett-Harris** A lot of times that’s what they ask for. Again, because I think it’s considered more safe to just say 4 numbers.
0:43:14.7 **Safia Abdalla** Yeah, and the really interesting thing about these kind of verification methodologies or this identity verification thing is that it’s actually not that effective. Like, your social security number is not the best way of securely verifying your identity. It’s pretty easy to guess a social security number if you like, know roughly where somebody was born and during what year, because there’s sort of a slightly deterministic algorithm that used to distribute out social security numbers.
0:43:46.5 **Adam Garrett-Harris** Yeah.
0:43:46.9 **Safia Abdalla** So, it’s like, it’s not even like that secure or good or anything but we’re still collecting it and I don’t know why. It feels like some more thought should be put into that process of whether we really need to collect that information to verify people’s identities.
0:44:00.9 **Adam Garrett-Harris** There’s a really good video by CGP Grey, about why the social security number is terrible for identification. It has no checksum, easy to guess like you said, and then there’s no picture or anything to actually verify that that’s you.
0:44:14.6 **Jason Staten** And not to mention, with the Equifax breach, I mean, so many millions and millions of people have had theirs leaked anyways. So to pretend like it’s this secret password that only you and this company that you’re interacting with know is a total falsehood anyways.
0:44:35.1 **Adam Garrett-Harris** Okay, that’s probably good for this week.
0:44:38.2 **Jen Luker** So we’ll continue this conversation next week where we focus more on company culture than technology.
0:44:44.5 **Adam Garrett-Harris** Yep. See you next week. Next time.
0:44:46.0 **Safia Abdalla** See ya.
0:44:46.7 **Jen Luker** Bye.
0:44:46.7 **Jason Staten** Bye.
0:44:46.7 **Adam Garrett-Harris** See ya.
(Exit music: Electro Swing)