KOSTA: Hello, everyone, and welcome to Undesign. I’m your host, Kosta Lucas. Thank you so much for joining me on this mammoth task to untangle the world’s wicked problems and redesign new futures. I know firsthand that we all have so much we can bring to these big challenges, so listen in and see where you fit in as we undesign the topic of gender justice and the role of data. Many would agree that gender justice is a valuable aim in any context: work, home, society more generally. And while we know that true gender justice is more than just equality of numbers or equity in access to opportunities, we still have a hard time telling the story of what gender justice should actually look like.
Why do we have such a hard time telling a richer story about the influence of gender on how we experience the world? And is data the key to unlocking that storytelling potential? Helping us untangle this wicked problem is our latest special guest, Dr. Anisha Asundi. Anisha is a research fellow and a gender specialist at the Women and Public Policy Program at the Harvard Kennedy School. She manages the Gender Action Portal at Harvard and works on the gender and technology portfolio. This portfolio brings evidence-based research to the tech sector to close the tech gender gap and increase diversity, equity, and inclusion in that area.
Currently, Anisha is also the co-chair of Harvard’s Women in Technology taskforce, which is a volunteer organization that aims to advance gender representation, engagement, and support in STEM fields. Her research focuses on inclusion and health equity, especially amongst marginalized populations, such as women, people of color, queer communities, and low-income populations. In this conversation, Anisha eloquently challenges us to reflect on the meaning of gender equality, gender equity, and gender justice, and how they are all interconnected in our daily lives. We then get to discuss why it is really hard to get diversity in workplaces, especially STEM industries, and how using gender data can help us improve that. Are our biases so ingrained that we can’t tell a deep enough story about gender justice? Or is it simply time to reckon with the way systems perpetuate the biases and the worldviews of their creators?
KOSTA: Anisha, thank you so much for joining us today. How are you going, otherwise?
ANISHA: I’m good. Yeah, doing well. Yeah.
KOSTA: Yeah. Where are you joining us from?
ANISHA: I’m currently in Boston, Massachusetts.
KOSTA: Nice, nice. And how’s the situation over there where you are?
ANISHA: It’s all right. Yeah, it’s interesting being back at Harvard and seeing students on campus after being gone for so long, but it is great to see folks again. I mean, of course everyone’s really cautious still, but it is nice to be back in community with people.
KOSTA: Yeah, that’s good. And I’m glad to hear it. Well, obviously we are here today to talk about gender equality at its essence, right? But we wanted to speak to you specifically because of the particular lens that you bring to these sorts of issues. And having familiarized myself with some of the work that you’re doing, I must say this was a learning curve for me, this topic, only because it shattered the lens that I used to look at these issues through, which is incredible. So I feel really privileged to be able to explore that with you today. So let’s just start with like a broad based question so we can invite everyone onto the same page. What does gender equality mean to you, Anisha?
ANISHA: Yeah, well, that is a huge question, but obviously one I think about a lot. So I want to start by taking a step back and defining what we mean by gender equality versus gender equity, which is also used in this space, and then take it even a step further to address gender justice, which is where my passion really lies. So gender equality at its core to me really means equal treatment. Under policies and educational systems, in the workplace, people of all genders would be treated the same. An example of what equality would look like, because I do a lot of work in the gender and tech space: gender equality in the tech space could look like having 50-50 representation for women and men in senior leadership in a tech company. And that’s what we’re striving for when we say equality, equal treatment.
And then there’s gender equity, which ensures that everyone has equal access to the resources and support in order to succeed. So gender equity is part of the process towards equality, but also thinks a little bit more critically about this starting point for marginalized folks and specifically for women and gender expansive folks. So for example, going back to that tech company, you might be thinking for gender equity, maybe we need targeted options for women to ensure they’re in leadership positions. That could look like a quota, that could look like changing your promotion process, but really looking at how we ensure people have equal access to those resources to succeed, whether it’s leadership or in another gender equity space.
And then the third part of this conversation is gender justice. And that’s the phrase I really like, because justice recognizes that the issue at hand is systemic and historical, and it works to break down those barriers and tackle the root cause of the inequity. So instead of making accommodations or adjustments for marginalized folks, we’re also addressing how systems, institutions, and organizations have historically marginalized people, and understanding that we need to change those systems and procedures in order to achieve liberation and justice. So going back to my organization example, perhaps that means taking a look at the career trajectory of women and understanding why they’re not advancing into senior leadership positions. What is the barrier and how can we remove it? Perhaps it is your recruitment systems, perhaps it is how you promote people, perhaps it goes all the way back to STEM classrooms in high school and how women and girls are treated in those situations. So this issue of gender justice really means taking a good, hard look at how we came to this place of inequity, and tackling those root causes as carefully as possible.
KOSTA: So if I just reflect back my understanding of what you’ve just said: when we’re talking about equality, we’re talking about a more quantitative kind of measure, where it’s like bringing things to the same sort of numerical level? And you’ve got equity, which is more about access, to sort of bridge those gaps, and justice, which is more about a systems look that says, actually, things have been shaped in quite an unfair way against particular groups of people over periods of time. Is that a fair summation?
ANISHA: Yeah, yeah, right, exactly. And I think it’s not an either/or in this situation, too. I think all three of these components work together, which is something we didn’t talk about, but I do think this issue of justice and looking at systems and procedures and histories is really key.
KOSTA: Interesting. So I guess whether we’re talking gender equality, equity, or gender justice, why does it matter so much, I guess? Again, a very simple sounding question, but I don’t know, I would like to invite you to sort of remind us of why these things are really important and maybe shed a new perspective on that.
ANISHA: Yeah. So I think there’s two arguments that we hear in gender equity work and I know your podcast listeners are really interested in tech, so maybe I’ll start there, if that sounds good.
KOSTA: Yeah, please.
ANISHA: So technology is a sector that really thrives on creative thinking and innovation, and what the research suggests is that limited diversity produces a limited talent pool and impedes the development of new technology. Research has also shown us that diverse workplace teams, in tech and beyond, generate more innovation, better products, and better teams. So that is one reason that I think gender equality is really important. But then I also think there’s this even more important issue of the moral and human rights perspective. In tech, especially, we want to make sure our teams are diverse, our boardrooms are diverse, but we all use tech in our daily lives, whether it is our phones, whether it is our laptops, whether it is being spoken over by a male colleague in a Zoom meeting. We are all using tech, and it’s really critically important, because that sexism and patriarchy and lack of representation of women trickles down into things like seat belts not really being made for women, or apps not really being made for women, leading to more gender-based harassment, for example.
So I think when you primarily have white men designing technology, especially technology that is important for our lives, without considering diversity, inclusion, and gender equity, you can largely disadvantage a lot of populations. And I think it’s really key to make sure not only that our teams are diverse, but that our products actually reflect the diverse communities we’re serving.
KOSTA: Okay. Actually, you’ve almost answered half of a question that I was holding in my head, which is great, which is just around how new the current technology space is in the scheme of industry generally. I was holding that thought at the same time as reflecting on what gender justice means. If we’re talking about systems that are built into industry, I wanted to hear from you particularly, because this is your lens, around how that looks in tech. We think of it as quite new, but over the last 20 years at least, I guess that’s enough time for a culture to form, and how that actually plays out for women or non-binary folks in tech spaces. What is the current state of play for gender equity, justice, equality, all of it, from your point of view?
ANISHA: Yeah. So I think we’re moving in a really interesting direction in the past couple of years, in that we’ve seen more positions for diversity, equity, and inclusion come up in tech organizations, but we’ve also tried to reframe diversity, equity, and inclusion a little bit and approach it from a place of data. And that is so central to the tech industry as well. Tech companies use data for everything else, for finance, for so many other departments, like product innovation, but they don’t always use data for diversity, equity, and inclusion. And this is a really interesting phenomenon for me to see, because every other aspect of the organization is so rigorous and so based in data, except for diversity, equity, and inclusion. And we actually now have a pretty strong evidence base of what works and what might not work in diversity, equity, and inclusion.
So it is really interesting to me to see how tech has stalled a little bit, and how the history of tech is actually really unique in terms of gender equity. The history of computer science and coding in the U.S. is really fascinating. Computing saw a massive influx of women during the Second World War in the U.S., where women were used to fill the void left by men who had to go to war to fight. And many of these women actually entered the field of computing. A lot of them had degrees in math and physics. Computing and computer science and coding was a job that could be done at home, it was flexible, and it was really seen as a feminized profession back in the ’70s. And then as tech became a more integral part of our lives and associated with higher salaries and more prestige, it became a more male-dominated role.
So I think that cultural shift is really, really interesting, and how we’ve viewed this profession societally has changed a lot. So I think it’s interesting now that we’re trying to bring women back into tech, or bring more women into STEM. That is a huge initiative that we hear about all the time, when at its core, the profession of coding and computer science, specifically in the U.S., was very much a women-dominated field. So I think it’s just interesting how our culture, and who we see in specific roles, has shifted over the years.
KOSTA: And I guess how revisionist some of our cultural narratives are in that. Like that point that you raised, I have heard before, but I feel like it’s quite a de-emphasized part of our history, where it’s like, actually, there is a hugely gendered history of computer science and the technology industry, which has changed over time. I’m curious to hear a bit more about why you think technology, for lack of a better term, has masculinized in some ways. Can you point to any specific touchstones or turning points about where that shift happened and why you think that happened? You alluded to it before with prestige and things like that, but yeah.
ANISHA: Yeah. I think it was the advent of the internet, technology becoming a really close part of our lives, coding becoming a really valued skill in higher education, and classrooms changing into STEM classrooms, with STEM fields really targeted towards men, and specifically white men. As for the ways in which our culture has shifted, I don’t know if I can pinpoint anything specific, but I do think it is just an interesting case study of how fields that are becoming more economically viable and more profitable are traditionally male fields, even though historically that has been different. And I think that really goes in line with the culture of tech nowadays, which is very much focused on white men in hoodies coding on their computers, when in the past, it was housewives taking care of their kids and coding on the side when they could. So I think that, again, it’s a cultural shift of who the ideal worker in this role has been, and historically it has been completely different to what it is now.
KOSTA: Actually, can I throw out a hypothesis to you? And you shoot it down if you think it’s nonsense, right?
ANISHA: Yeah, sure.
KOSTA: I was just reflecting again on just it’s impossible not to look at this issue in the context of like a patriarchal system, right? Even the way that technology ended up being feminized, to begin with, is probably shaped by sort of the patriarchal forces that govern a lot of our international relations or our nation states.
KOSTA: Do you think part of the reason, I think the term I learned was like “the programmer” or “the tech bro” or whatever it is-
ANISHA: Right. Exactly.
KOSTA: … which I love, which I love that term. Just looking at it through a pop culture lens, I feel like technology has been a way for nerd culture to sort of reassert some sort of dominance, right? And if we look at nerd culture as a reaction to mainstream masculinity as well, does it kind of show you what a sexist system actually looks like, in that people who are marginalized in a certain part of the social hierarchy end up claiming space in another part of the hierarchy? It kind of shows you where people fall in a systemic pecking order. So if we’re looking at this purely as archetypes and stereotypes, you’ve got the alpha kind of construct: hyper-aggressive, hyper-confident, whatever. And you’ve got the nerd construct: people who are a bit more introverted, a bit more cerebral. And in that picture, women and non-binary folk don’t even exist at the best of times. Right?
KOSTA: Then you’ve got that sort of nerd culture moving into tech, which in a capitalist system has been a way for this to sort of rise to the top in some ways. Does that resonate at all? That’s just what my mind went.
ANISHA: No, absolutely. I mean, I think a lot of this comes down to culture and inclusion in the tech space too, right? Like, not only does tech always say it’s meritocratic, which is just a lie, like, not true at all, right? Because looking at representation, that’s not true. But there’s also this value of culture fit, like the hustle culture: wanting to be coding all night long, really being invested in the product, and not having flexible work schedules. I think the culture of tech is so masculinized nowadays that women don’t even feel like they have an entry point anymore. And it’s really interesting if you think about, going back to the STEM classrooms example, it’s the subtle cues that really exclude women and girls and non-binary folks from the STEM fields.
There’s a really great study that was done out of the University of Washington that looked at what you put up on the walls of your STEM classrooms. And to your point about nerd culture, it found that when you put up images from Star Wars versus a gender-neutral image of a garden, girls actually felt like they didn’t really belong in the classroom when the Star Wars images were up, versus the neutral images. And so it’s the subtle cues too. And this is where unconscious bias really, really comes in, because I don’t think folks in the tech sphere are trying to intentionally discriminate against folks. It’s the unconscious bias that is built into our systems, that is built into our procedures, that is also built into our classrooms, where we wouldn’t really think twice about putting up a picture of Luke Skywalker, but also realizing that that cue can really signal who belongs and who doesn’t in a classroom.
KOSTA: Yeah, that’s really interesting. And then is it your view, Anisha, I guess that unconscious bias is one of the biggest sort of barriers to true gender equality?
ANISHA: Yeah, I think unconscious bias is interesting because the point of unconscious bias, and I say this a lot to different audiences, is that we all have it, right? And that is the real issue. So unconscious biases are social stereotypes about certain groups of people that individuals form outside of their own consciousness. So everyone holds unconscious beliefs about various social and identity groups, and these biases stem from our human tendency to organize our social worlds by categorizing. And it shows up in our workplaces or in our organizations or in our schools or in our day-to-day interactions, but it also shows up in our processes and structures. Now, I know I’ve been saying that a lot, but it’s the systemic perspective, right?
And there’s countless examples of this. One example that I find really interesting is from the Stanford Business School, about Heidi Roizen. It was a case study written for an MBA class about Heidi Roizen, a very successful Silicon Valley venture capitalist, and her amazing successes. And what the business professor did when he was teaching this class is present half of the class of MBAs with the original case study with Heidi’s name on it, talking about Heidi and her successes. And then the second half of the class received an altered case study in which Heidi’s name was changed to Howard. So that was the only difference between the two cases. And then they asked the students to rate the competence and likability of these two leaders. So the students rated Howard and Heidi based on how competent or likable they perceived them. They rated Howard and Heidi as equally competent, but they found Howard more likable than Heidi. And again, completely the same case study. The only change-
KOSTA: So nothing changed.
ANISHA: Nothing changed, but their names, which was the indicator of their gender. So this is a phenomenon of unconscious bias, right? I don’t think the students were trying to discriminate. It is just how our brains are categorizing who is the ideal likable leader in our minds. And Heidi violated our expectations of women. Our unconscious bias perceives women to be either likable or competent. They cannot be both. And so I think this is a really stark and kind of horrifying example of how unconscious bias exists.
KOSTA: That’s a really confronting sentence actually about being either likable or competent, not both. That’s quite disturbing actually.
ANISHA: Yeah. Yeah.
KOSTA: Yeah. Just kind of sitting with that, because it’s like far out. It seems like a very simple example that kind of encourages a lot of projection, I guess. And then in the tech space specifically, or just technology impacting gender bias, you alluded to the way that there’s the cultural aspects and the recruitment aspects and also the way things are designed that could be exclusionary of gendered concerns. Can you speak more on that?
ANISHA: Yeah. I think the question you’re asking, too, at its root, is how do we overcome unconscious biases, right? So how do we overcome unconscious biases in tech, across these different systems? And the way we talk about it at the Women and Public Policy Program is we suggest using behavioral design, which is very apt for this Undesign podcast.
ANISHA: So behavioral design really focuses on how we can change our environments to make them more inclusive, rather than changing people’s minds, as we see that’s harder to do and also has limited effectiveness. And there are many research studies now on how diversity trainings actually have very limited effectiveness in changing people’s behaviors and advancing diversity and inclusion at organizations. I think a really interesting part of behavioral design is coming up with these structures and processes based on what we see from the data in terms of inequities.
I’m happy to share a really interesting example of overcoming gender bias with behavioral design. In the late 1970s, only 5% of musicians in the top five orchestras in the U.S. were women. And the orchestras were thinking, we need to obviously fix this gender representation issue. So what the Boston Symphony Orchestra decided to do was install curtains, so that when people were auditioning for the orchestra, all you heard was their music, all you heard was the quality of their playing. You didn’t see their faces or identities. They even installed a carpet so you didn’t hear their high heels. So what they did is they de-biased-
KOSTA: Oh, wow.
ANISHA: … their audition process. Yeah. And what they found, because they were collecting data on this, and Claudia Goldin has done a really great study on this, is that these anonymous auditions reduced gender-based hiring and improved female musicians’ likelihood of advancing out of preliminary rounds, which often leads to tenured employment in orchestras. So the Boston Symphony Orchestra instituted this in the 1970s, and in the late ’70s and ’80s, a lot of other major orchestras also followed suit. So I think this is a really interesting example. And bringing this to tech and bringing this to the corporate world: what do we look at when we are assessing people for their fit in the company? We look at their resumes, we look at their cover letters.
ANISHA: So ways in which you can take inspiration from the orchestras is by anonymizing your resumes. So why aren’t we taking the names and identities of people off our resumes so that unconscious bias doesn’t come into the process? And we’re just evaluating people based on the quality of their work. And I would argue, to take this even a step further, that resumes and cover letters, and even interviews are very poor predictors of job success in the company. So what people really should be focusing on is work sample tests, and this can happen-
ANISHA: … not only in tech roles or in coding roles, but for administrative roles as well, for research roles. And so I think kind of revolutionizing the hiring process so it isn’t so outdated can really remove a lot of the unconscious biases that come up. I hope that answers your question.
KOSTA: Wow. Absolutely.
KOSTA: It’s just given me 10 more, to be honest, but in a really good way. So it sounds like technology has a big role to play potentially in that aspect of sort of shortening the gap in gender equity or achieving gender justice. One thing you mentioned before that I’d love to circle back to is this idea of data and diversity and inclusion.
KOSTA: What can you tell us about why we don’t necessarily talk about those things in the same sentence sometimes?
ANISHA: Yeah, that’s a great point. I think gender data in general gives us insights into the current state of gender equity or gender justice. If I can talk specifically in terms of tech, there is that reluctance to collect data, because when you find a problem, you have to address it.
KOSTA: I can relate to that.
ANISHA: Right. If you find a gender gap in your promotions, you’re going to have to address it. But I also think, just taking it back to your gender data question and kind of connecting it to other forms of gender equity that I think this issue comes up in as well, the issue with gender data, and any data you’re collecting on marginalized populations, is we really need to pay careful consideration to how we are collecting this data and the research methodologies we’re using to uncover the gender issues from this data. And bringing this to a different issue, this really came up at the beginning of the COVID-19 pandemic, where many countries and governments did not report their COVID-19 cases and deaths disaggregated by gender.
A lot of countries did not report their COVID-19 data separately for women and men, and many more did not report their data disaggregated by gender and age. Especially in Asia, the region where I’m from and the region where the coronavirus emerged, only three countries at the beginning of the pandemic were collecting and sharing data disaggregated by gender.
KOSTA: Right. And what are… Oh, sorry.
ANISHA: Oh, go ahead.
KOSTA: I was just going to ask you your thoughts, so what effect that actually had on people’s outcomes, right? Because if we’re using that data to make public health decisions, for example, and if we’re not taking gender data into consideration, what were some of those outcomes that flowed from that?
ANISHA: Yeah. Yes, I think you’re absolutely right that the lack of information led to exacerbating existing health disparities and the loss of lives in vulnerable communities, especially for women and marginalized genders, who are primary caregivers in a lot of families, so they’re the people who have the information on the healthcare and health status of other folks in their family. In the U.S., especially, women were the frontline workers of the COVID-19 pandemic. They were the grocery store workers, they were frontline healthcare workers. I would say women, but primarily Black and Latina women in the U.S. And we weren’t collecting data on what structures would exist to best support them. And I think another hard part of this work as well is you don’t want to collect data that might stigmatize communities. And so this is where the kind of dilemma in gender data-
KOSTA: So this is the gray area. Got it.
ANISHA: Exactly. And how you present this data is very important, because there are always biases in research. I know we try and say research is the most objective field there is, but there are biases based on who is collecting the data, who is disaggregating the data, and how you’re presenting it. And going back to another pandemic, which feels like forever ago, this came up with Ebola in 2013 to 2016, where the ways in which countries and organizations presented the data on Ebola led to xenophobic rhetoric against Black and African folks in Western countries. We saw it here in the U.S., where COVID-19 led to an uptick of hate incidents against Asian American folks.
KOSTA: Yeah. Likewise over here.
ANISHA: Right. So gender data is so important, but I really can’t stress this enough that it needs to be done carefully, which is why I think there’s really interesting intersectional feminist research frameworks that you can build on to make sure that you’re collecting this data correctly and in a sensitive way.
KOSTA: Yeah. I mean, intersectionality being a very key concept there too, because we can talk about gender as a very sort of flat binary, right? And a lot of issues to do with identity can really flatten the landscape or the realities that people exist with and in between.
ANISHA: Yeah, absolutely.
KOSTA: I was doing a bit of reading in anticipation of this because I was really drawn to the idea of gender data, because it’s not a term I’d come across before. And I was reading about some of the applications of that in certain countries, like, I think it was a report by Data2X. I think actually I’ve got it here. And this example just totally blew my mind, right? So I might just read a bit of an extract for you because-
ANISHA: Yeah, sounds good.
KOSTA: … I think this is worth sharing.
KOSTA: It’s in reference to a case study where the key message was that big data yields the most powerful gender insights when combined with traditional data sets. For example, case study 4 out of these 10 looks at how women’s educational choices are affected by the risk of street harassment in New Delhi in India. The research is built on various types of big data, including safety incident reports crowdsourced from mobile phone apps, and Google Maps geospatial data, but the work is centered on survey responses from students about their educational preferences. Then the next figure shows the college choice sets and the various transport options available to reach them. Then the next figure shows a safety surface of the entire city created from big data, and shows that the safety of transport routes constrains women’s educational choices and imposes economic costs, which would not be possible without both types of information.
That totally blew my mind where it’s like, wow. And maybe kind of reflects back some of my own privilege here where it’s like, it wouldn’t even occur to me to make a decision about where I would choose to be educated based on my likelihood of being harassed. But obviously that is an insight that was able to be gleaned by mapping one set of data on top of another set of data. So my big preamble there was really what does gender data tell us that traditional data doesn’t, if we can even draw that line?
ANISHA: Yeah, no, I think that’s a good question, and I think it is really important to talk about the overlapping data sets that you mentioned. So the work we do at the Gender Action Portal is focused on experimental work, specifically randomized control trials, if we can, but including natural experiments, field experiments, because we’re really interested in drawing causal inferences and understanding what works and what doesn’t work in gender equity. But I also think because we’re doing work on the issues of gender and marginalization, that we have to address power structures and imbalances that have existed throughout history. And as you were talking about gender data, like I said before, there’s still bias embedded in that in some ways.
KOSTA: Yeah, sure.
ANISHA: So I think a question that I have a lot on my mind, as someone who summarizes a lot of experimental work, is: what are the limitations of doing randomized control trials or experimental work and collecting that kind of gender data in these kinds of contexts? Because sometimes we lose personal narratives and nuance. We can lose engagement from folks who are directly impacted by the research, right? So using a research design and collecting gender data in a way that works for the community or population you are working with is more likely to gain buy-in, participation, and trust from the folks who are most directly impacted, especially if they already have a distrust of research or academic institutions, like Black folks in the American healthcare system, or Latin American communities and U.S.-funded aid initiatives, when doing research in those spheres.
So I think there is a need for community voices to be upheld in addition to the research. And I think gender data is so important and we need to collect gender data in a more strategic and inclusive way, but I also think community voices and incorporating some of that qualitative research in this conversation is really key as well.
KOSTA: Yeah. And I guess that kind of summarizes my feeling on that case study I shared, where we’re talking about quantitative and qualitative research methods together: you’re looking at the numbers and the locations, the physically identifiable attributes of a particular phenomenon, and then you’ve got people’s experiences of those phenomena. And putting those together tells quite a rich story, which gives us a deeper insight into the world than we would have had with only one set of data to begin with. So is this maybe gender data’s biggest benefit, keeping in mind all those biases that are still inbuilt (it’s not perfect, but it’s certainly progress in the right direction): that it gives us enough information to tell more nuanced stories about people that these systems are not necessarily built for, or that we might not necessarily understand?
KOSTA: Yeah. Is that a fair assessment?
ANISHA: Yeah, absolutely. And one thing our conversation is making me think of is making sure you’re not collecting data for the sake of collecting data.
KOSTA: Yeah, sure.
ANISHA: And this comes up a lot when you’re collecting data on gender-based violence and survivors of gender-based violence, right? Like, are you retraumatizing folks? Or are you putting folks in more danger than they need to be as you collect this data? So I think the really interesting thing about the intersectional feminist framework on research that I really like is that it asks what it is about societies and economies and political systems and institutions that creates excluded groups, what forces maintain those excluded groups, and what we can do. But it also, as I said before, centers marginalized groups from the get-go. Marginalized communities, women, folks of color, whoever the participants are in your research study help frame the research question, provide background information, help recruit participants, give feedback on participants’ experience in the research, and help collect, analyze, and translate the research. So I think it’s really important to make sure that the process we use to even get that data to begin with is really careful, and that we’re not just collecting it to have more data on gender-based violence, because that can be really problematic.
KOSTA: Of course. And then, following that, it highlights to me again what we were talking about earlier, which is the overall benefits of a genuinely diverse, inclusive workplace or industry, where you actually do yield better long-term outcomes on quite a few bottom lines, right? And it’s as simple as just having people in the room… Particularly in a very politicized way, there’s a lot of back and forth about quotas and all these sorts of things that people have problems with. But even just the very act of having some difference in the room unlocks huge potential for better outcomes for everyone, right? And that seems to be a reflection of that, where even having the folks that you’re trying to center in a study, trying to uplift, co-designing with them essentially, or having them as very active participants with agency, is going to yield better outcomes generally.
ANISHA: Right. Definitely.
KOSTA: It’s a really interesting way of looking at it, I guess. Just as we progress towards the actions that we want to take and the things that we can do to improve gender justice… I like that term. I’m going to start using that.
KOSTA: What does gender justice look like to you? And what does it look like to you as a thriving concept? And what does it look like to you in a more realistic sense if we don’t do anything?
ANISHA: Yeah. A hard part of being in this field is it is hard to see change sometimes. There’s that statistic I’ve heard many times, which I’m going to misrepresent, but I think it was something like a couple hundred years until the gender pay gap closes. And so it is hard to be in this field of work of gender justice and even envision what that world would look like.
KOSTA: Yeah, true.
ANISHA: But I don’t think it is, for me, just 50-50 representation at the upper echelons of leadership. I think gender justice will occur when women and gender expansive folks have voices that are actually heard and recognized and amplified and appreciated. And that can be as simple as making sure our hiring processes are more inclusive and bringing more folks into these roles in technology organizations. That can be as much as ensuring our meetings are inclusive and that folks can be heard when they’re talking about issues that are important to them. And like I said, I’m an optimist in this work and I really take-
KOSTA: That’s great.
ANISHA: Yeah, I know it’s hard, but I do try and be an optimist.
KOSTA: It’s tough, but it’s great.
ANISHA: Yeah. And I really take inspiration from our youth and young leaders. And the way I see gender equity and gender justice really working out in the future is when leaders are held accountable, and I see that happening now. If you even look at the climate justice or climate crisis movement, that’s all young, Black, and Indigenous folks here in the U.S. with the Sunrise Movement, for example. And these are the people who are using tech and entering companies and trying to hold leaders more accountable because of the racial equity issues they’ve seen in their lifetime. So whether it’s been the pandemic, whether it’s been the Black Lives Matter movement that they’ve grown up with and seen. And obviously this is historical and folks know this has been happening for a long time, but it feels like a culture shift to me again, in seeing companies say, “Oh, I actually really need to do work on diversity, equity, and inclusion because my employees are telling me to.”
So I think gender justice for me is when that grassroots approach, that bottom-up approach, takes hold: folks on the ground organizing collective action to hold folks in leadership accountable, but also to make them better and more inclusive. And then again, at the end of the day, I feel like it’s about hearing women’s voices and hearing gender expansive folks and their voices, and making sure that they’re not just a check-the-box kind of initiative in your organization or in your institution. When you were talking about representation, I think it’s so key, but representation is just one step. It’s actually making sure people are heard and recognized and leading as well.
KOSTA: So it’s almost like fully committing rather than just taking the measures that kind of, yeah, like you say, tick the box, but to actually fully commit, which is just-
KOSTA: … when you take on the challenge to increase representation, also take on the challenge of highlighting and taking seriously the voices and the experiences of those people that are bringing that diversity to this collective asset pool. If we’re going to get corporate, that means there’s more abundance in diversity. That’s what I’ve always said, right?
ANISHA: Right. Yeah.
KOSTA: Do you have any thoughts on what that looks like where these issues of gender justice and tech intersect specifically, whether that’s the corporate or venture capitalist side, recruitment, or the products themselves? In your view, what is technology’s greatest use in bringing about gender justice?
ANISHA: Yeah. I think I said this earlier, but I really do think the teams at tech companies who are creating these tools really have to reflect their consumers. And I think an interesting example of this is Pinterest, who have done a lot of diversity, equity, and inclusion work in the past few years because they realized that 70% of their leadership team were men, while a large share of their consumers, maybe 70%, I don’t remember the exact figure, are women. And so can you imagine creating tools without the perspective of your main consumer base on the team? So again, that representation point.
But I think in tech more broadly, there are so many interesting case studies, like virtual assistants, for example, and how that is, again, a form of unconscious bias. A lot of them are female voice assistants. Again, it’s that prototype of the ideal worker that comes up, and it gets embedded into our products in a really scary way. There are also cases of algorithmic bias, for example, folks of color not being able to use automatic soap dispensers because the people who invented them didn’t consider skin colors that were different than white. And that is just-
KOSTA: Oh my gosh.
ANISHA: … absolutely bizarre.
KOSTA: I didn’t even know that one.
ANISHA: Yeah, that one is scary and horrific, and there are even more damaging issues, especially for Black folks and people of color, in terms of algorithmic bias. And so there have been different solutions thrown around, of course, diversifying your team, but also, should there be some sort of algorithmic audit system, kind of like the FDA for algorithms, right? Where they’re put through a more rigorous process because-
KOSTA: Oh, wow.
ANISHA: Right. As we know, tech companies have a lot of control over the products, over the data they collect, over the products they’re releasing, but how can we actually hold these organizations more accountable? And again, I think these employee-led, bottom-up grassroots efforts are working really well. I’m even just thinking, and stop me if this is kind of off topic, but-
KOSTA: No, not at all.
ANISHA: … this past week at Netflix, Dave Chappelle’s recent… I don’t know if you heard about his recent transphobic-
KOSTA: I’ve heard. Yeah.
ANISHA: … comedy special, right? The folks who are organizing against that are employees within the company.
ANISHA: And tech employees are really empowered to be able to… They’re the people on the ground creating these products for you, coding for you, doing product development, doing communications, doing administration. They’re the people on the ground, obviously making these tech companies work, and they’re holding leadership accountable and saying, “This is not okay that we are part of this company that is providing a platform for a transphobic comedian.” And so yeah, again, this bottom-up grassroots approach even happens in tech companies, where you wouldn’t think that employees would start organizing against… Not against, but organizing to hold leadership more accountable, but it is happening. And I think a beautiful part of this tech and diversity, equity, and inclusion work is that it doesn’t always need to come from your HR department, it doesn’t always need to come from your chief diversity officer; it comes from folks who are just interested and want to make the organization better.
KOSTA: It almost sounds like, again, I’m hearing that idea of making it easy and natural for people to be able to have these conversations and start these movements through systems design and behavioral design and gender sensitive behavioral design, in particular, where it’s like, if you build a system that allows these voices to actually come to the surface and reach the people it needs to reach, it becomes easier for people to take on that responsibility, rather than forcing people to change their mindsets as individuals.
ANISHA: Yes, exactly.
KOSTA: So how do we design a system? I guess it’s a paradox, because systems want to keep their power, that’s my feeling anyway. Particularly in the current climate, systems are designed to self-sustain and maintain their status quo. So what interest do they have in building a system that allows them to be challenged? It’s almost like that’s where the key conundrum falls for me. I don’t know. Does that track for you as well?
ANISHA: Yeah, definitely. But I think the really important part of this is that collective action and accountability are really the way to go, in my opinion. Well, not the only way to go, but I think that is one way to target that, right? I mean, how can organizations function without their employee base? And when their employee base is trying to hold them accountable to being better on diversity, equity, and inclusion issues, to create better products, to create better processes, I think that is one way to tackle that accountability measure. And then there’s also an even more top-down approach: in a lot of different countries, and in a lot of different states in the U.S. right now, there are, for example, salary transparency laws.
ANISHA: So in terms of changing the system: in the state of Massachusetts, for example, you can’t forbid your employees from discussing their salaries with each other. And that’s a really key point for gender equity, because one of the issues with the gender pay gap is that there’s just a lack of transparency about what other people are making, and it’s hard to know how to negotiate for a higher salary when you don’t have information about pay grades and what other people in your office are making. So having that salary transparency law is another way, very much top-down. I know I talked about bottom-up a lot, but that top-down approach holds organizations accountable to specific laws that will advance gender equity.
KOSTA: I mean, I hear you in that it is top down, but that’s a very small but huge thing to do at the same time.
KOSTA: You could literally just have an Excel spreadsheet on your team’s intranet with everyone’s salaries on it. That’s literally all it could take if you really wanted to. And that would have huge ramifications. I can see that going either way. I don’t know what the research says about what impact that has on outcomes, but I could imagine that could make people feel all types of ways if you’re in that system. So my point is more that it might be top-down, but that is huge, if that’s a measure that’s adopted.
ANISHA: Yeah. This question of transparency is what really makes this revolutionary. We have never talked about negotiating salaries before. Well, I say we’ve never talked about it; what I mean is women typically aren’t told that they can negotiate salaries. There’s so much evidence to suggest that women don’t negotiate salaries because they’re worried about backlash, right? But if you have five women employees, and they’re all coming together and saying, “Hey, we’re making different salaries, even though we’re all doing the same work, should we try negotiating?”, there are tactics that women and other marginalized folks can use to really negotiate for better salaries when they have that data. And something that Hannah Riley Bowles, one of our co-directors, talks about is reducing ambiguity around negotiation, around salary transparency, and around the gender pay gap. And I think that transparency is so important in that.
KOSTA: Yeah. And one thing, again, just reflecting, oh gosh, this has been a really reflective episode, is looking at it from a cisgender, white, heterosexual man’s perspective: how freeing this can be for men who feel like they have to be a certain way. We are told we have to be a certain way sometimes, or that we have access to certain things that are pretty unfair. And to be able to have the permission to say, actually, that’s a privilege I’m not going to exercise because that doesn’t feel right, is actually a really powerful thing. So I can see a benefit for a lot of people, for a lot of folks who don’t necessarily subscribe to a particular social construct. It’s interesting. We’re talking technology, and yet we still come down to these fundamental questions of how societies form, how social constructs form, technology being just an expression, an enabler, an accelerator of all of those things. Yeah. So here’s a provocation for you.
KOSTA: Is the idea of gender equality and justice that gender no longer will matter?
ANISHA: I don’t think so. And I don’t think so for a lot of people with lived experiences or identities that may be important to them, right? The history of women and feminism and gender is really important to a lot of people, and I would hate to take that away. At the same time, of course gender is a social construct in a lot of ways. Of course talking about gender in terms of just men and women is really problematic. And for a lot of folks who subscribe to or have identities outside of that gender binary, it can be really problematic, and abolishing gender might be great. But I think it’s really important to think about how gender affects each of us in our daily lives right now.
ANISHA: I don’t want to say it won’t matter, because personally my gender identity means a lot to me. I’ve been identifying as a feminist for so long, and I think it’s really important that just because you hold a marginalized identity, that identity doesn’t completely go away when we achieve gender justice, when we achieve whatever we want to achieve. Yeah.
KOSTA: That’s great, because again, my field is extremism and polarization, and I’m always thinking about some of the arguments I hear. And whether it’s terrorism, whether it’s gender neutral bathrooms, whether it’s being accepting of transgender identity generally, one of those common catch cries is like, oh, there’s going to be no differences between men and women soon, which is loaded with assumptions. So there seems to be this feeling among some of the folks who push that point of view that the idea is for gender not to exist. But really what we’re trying to say is, actually, these things are important: people have to have the choice in how important they are, and we have to understand what these things actually mean to other people.
ANISHA: Yeah. Yeah. And I think it’s a really interesting point that you bring up around non-binary and trans folks, and I really want to address this because it comes up a lot in gender work: how do we account for non-binary folks? Which I think personally is a problematic question. We shouldn’t be asking how we account for them. We should be asking ourselves why we aren’t including them from the get-go.
KOSTA: Sure. Yeah.
ANISHA: And the pushback I really hear from folks in the gender space is that including gender expansive folks or trans and non-binary folks will “dilute” the message for women, which I find frankly offensive.
KOSTA: Yeah, sure.
ANISHA: Because I think it’s actually really important to collect data on any groups that may be affected by patriarchal systems, and gender inclusion does not, and has never, occurred at the expense of cisgender women, right?
KOSTA: Of course. Yeah.
ANISHA: So going back to that research question: non-binary folks and trans folks, even though it’s sometimes difficult to gain these pieces of data, deserve to be part of our research, part of our spaces, our initiatives, our programs that are focused on gender identity. And their experiences are central to us collectively overcoming patriarchy and achieving gender justice. No one is saying that trans men experience patriarchy in the same way that cis women do, but if you aren’t collecting that data to begin with, because you think the population isn’t in your sample group, or you think our experiences don’t overlap enough, or you think there’s no sample size, we’ll never be able to answer those questions.
KOSTA: Yeah, that’s right.
ANISHA: Obviously this isn’t just true for trans and non-binary folks, it’s for women of color, for indigenous women, for LGBTQ folks, working class women.
KOSTA: People with disability as well.
ANISHA: Exactly. So going back to the importance of data disaggregation and centering folks who are most impacted, so we can understand how our gender interventions specifically can move away from just centering cisgender, rich, white women, which is what they have done in the past, and move towards liberation for all marginalized folks.
KOSTA: Yeah. That’s amazing. Anisha, just as a concluding question, let’s think about the person listening to this who’s not part of some huge tech company, who is again more in that grassroots setting, but has an interest. What’s your advice for someone like that, who wants to at least start doing one intentional thing to advance gender justice?
ANISHA: Yeah. I would suggest they think about their spheres of influence. This comes up a lot in diversity, equity, and inclusion work, but especially racial equity work, where folks are saying, “I don’t even know where to start. How do I address structural racism? I’m just one person.” But we all have spheres of influence in some sort of way, whether it is in a classroom, going back to my example of changing the walls of your classroom. Maybe look around at your office, or wherever you go to school or where you work, and make sure that the walls are diverse. So make sure there are women leaders on the walls of your office and not all white men. At Harvard a couple of years ago, I think there were around 60 portraits at the Harvard Kennedy School, and most of them were of white men. And that is just crazy to me. But one of our faculty members, Jenny Mansbridge, recognized this and said, “We need more diverse walls,” and commissioned specific portraits of women leaders.
And that is one simple thing that you can do to make your environments more inclusive. We’ve seen that, and the research supports it. So behavioral design feels like it’s about big systems and all these big issues, but a lot of it is setting norms, setting agreements, shaping our environments in a way that makes them inclusive, because, as mentioned, that unconscious bias comes up in our individual and interpersonal experiences as well, not just our systems.
ANISHA: So I would say, think about your spheres of influence and where you can have impact. And if you are in the upper leadership positions, think about those systems and processes that you actually can change.
KOSTA: I love that. I love that idea that we all have spheres of influence, and we have control over how inclusive and how welcoming those environments are for all people. That’s a really lovely, empowering thought to conclude on. Anisha, thank you so much for your time. That was a delight for me, and I really appreciate the generosity with which you’ve shared all your experiences with me and our listeners today. Where can people find your work or more about you?
ANISHA: Yeah. I work at the Women and Public Policy Program at Harvard Kennedy School. You can find more about our center’s work at wappp.hks.harvard.edu. I run our Gender Action Portal, which is our online collection of experimental research on closing gender gaps. We summarize long academic articles into short one-page summaries. So if you’re interested in learning more about how we can close gender gaps, feel free to check out that website at gap.hks.harvard.edu.
KOSTA: That’s a fantastic resource. I really got lost in that yesterday as I was looking through it.
ANISHA: Awesome. I’m glad.
KOSTA: Anisha, thank you so much.
ANISHA: Yeah. Thank you, Kosta.
KOSTA: It’s been a delight, we’ll talk again soon, I hope.
ANISHA: Perfect. Yeah. Thank you so much.
KOSTA: You have been listening to Undesign, a series of conversations about the big issues that matter to all of us. Undesign is made possible by the wonderful team at DrawHistory, and if you want to learn more about each guest or each topic, we have curated a suite of resources and reflections for you on our Undesign page at www.drawhistory.com. Thank you to the talented Jimmy Linville for editing and mixing our audio, special thank you to our guest for joining us and showing us how important we all are in redesigning our world’s futures, and last but not least, a huge thank you to you, our dear listeners, for joining us on this journey of discovery and hope. The future needs you. Make sure you stay on the journey with us by subscribing to Undesign on Apple, Spotify, and wherever else podcasts are available.