Accountability. The reason why I say that is because we can cause more harm than good in this work, whether it’s providing the wrong information to a civil society or a community organization, or whether it’s us thinking that we know more. I think accountability is a huge, huge, huge aspect and a huge value. I think understanding, well it’s not a value, but I think having a strong understanding of liberation and what that means to a person, but also to the communities you work for. I think it’s really important because we’re all engaging with technology in a different way and I think that we should have a baseline understanding of what the role of technology is and how our morals and values play a role in that, but also the organizations we work with. Being mission-driven is an important part too. Like nothing is apolitical. Like technology’s not apolitical, so being mission-driven in the work that we do is important. And respect for the work that’s been done by other people. What I’m also noticing is that because people are in this environment, intersection of technology and social justice, they sometimes think that they still have more expertise than community organizations that have been doing this for a very long time. […] I think that we could recreate a lot of the power structures that we’re seeing in our real life within a digital space. So if we’re not talking about it all the time, having an understanding of it all the time, and understanding the power dynamics, we’re just gonna recreate everything and it’s not gonna be helpful. — JAYLEN, TECH CONSULTANT FOR NONPROFITS

Our fourth research goal is to capture practitioner visions of what is needed to transform and build the field(s) in ways that are inclusive and aligned with their values (social justice, social good, public interest, etc., as articulated by practitioners), as well as how to mitigate threats.

A summary of our key Visions & Values findings is available in the Executive Summary.

VALUES & PRINCIPLES

We asked practitioners what values and principles guide their work, what they see as some of the biggest threats to their vision, and what changes are needed to fulfill their visions.

Accountability and integrity are fundamental values for building trust among individuals, organizations, and communities

Practitioners spoke about the importance of integrity and the need for individuals and organizations to be trustworthy, authentic, reliable, accountable, and respectful in their interactions and in the work they produce. One practitioner working for a city in the Midwest shared how exceptionally important integrity is within the “technology infrastructure side of the house. A lot of the work that is done is still done on a handshake. We repair things, fix things, invoice each other and there are no contracts in sight. It’s very refreshing and enjoyable. People really are interested in making it work and that’s really important” (Richard, Broadband Expansion Manager for a Rural City).

Another individual described integrity in terms of the quality and reliability of work: “When you can actually stand up behind the facts that you’re presenting and not put fake data out in the world. Reliability and integrity, those are the values and principles I care about” (Dana, Tech Fellow in Federal Government). In their view, practitioners must be accountable to the communities they work in, and this means respecting the knowledge, resources, and experiences that community members bring to the development of new technology tools.

The capacity to empathize with others’ experiences and needs leads to more meaningful relationships and more useful technologies

Practitioners felt that, to best serve communities, communication conveying empathy, understanding, compassion, patience, and thoughtfulness can improve not only relationships, but also the technologies that are developed and produced in the process. A practitioner working within a foundation shared their perspective on what it means to practice these values: “I’d say empathy, compassion, understanding, respect […] To sit back and listen and understand, even if it takes a while for someone to get there, what their inherent motivations and demands and needs and realities are. That patience that’s needed for those conversations, and also willingness to listen and also to learn, we’re not going to move forward because those conversations are hard” (Erica, Fundraiser at a Foundation).

A government practitioner also expressed patience and empathy as crucial values in their work, and mentioned that these are often lacking among developers. “More often than not, the technological recipe for the way forward isn’t that complicated. It’s not the toughest part, and it’s a common thing to see arrogant developers just arguing with each other about the best way to do things and that kind of stuff, and to them it’s sort of about winning an argument. That’s not the way you get things done in the government, or really, in those places that aren’t just a pure development shop” (Tom, Developer at a Federal Government Office).

Practicing empathy also calls for a willingness to understand others’ experiences and needs. It means being accountable and aware of the responsibility you hold as a practitioner.

A Co-Director at an organization that works with nonprofits and students noted that they have often seen tech projects that are disconnected from community realities. “One big example that is easy to think of is … there’s a one laptop per child thing that was happening where they’re trying to, basically, give kids in underdeveloped countries computers. First, the intention is there, but they’re so disconnected from the community that they’re trying to serve, they didn’t even realize they don’t even have outlets to charge these computers and so that blatant disconnect means that you were not effective at all” (Barbara, Co-Director of Nonprofit).

This sentiment was echoed by a for-profit technology practitioner who stated, “the empathy part of technology is huge. It’s really important, and Silicon Valley tech falls pretty short of this, considering the negative externalities of what you’re doing. Not to say that you’re responsible single-handedly for solving them, but being aware of them” (Elioenai, Civic Tech Head at a Tech Corporation).

Practitioners who take the time to listen, understand, and meet communities where they are can both form meaningful lasting relationships and develop useful technologies that help communities prosper.

Openness and transparency in technology development are integral to growing and sustaining the field

Practitioners expressed the importance of openness and transparency when engaging inside and outside their own relationships, organizations, and communities. They described this in terms of a range of skills, actions, and behaviors during interactions with others.

One practitioner shared that at their Foundation, “I think openness and transparency and communication is really important in terms of ‘this is what you said, this is what I’m saying, this is what I’m hearing.’ Sometimes you’ve got to repeat yourself, and sometimes you have to repeat what the other person is saying to make sure real communication is happening, but I think that allows for much better work in the long-term. Being open to a conversation is very important” (Julia, Program Manager at a Foundation).

Others described openness and transparency in the context of how they work, in terms of sharing their knowledge, skills, and resources with others. “I mean, openness, transparency, those are the two biggest things. I feel like giving back is a strong core value, because I’ve definitely had a lot of opportunities, both earned and not earned, so I feel like I have to have a commitment to work in the open, share back to other communities, and try to stay mindful of being equitable, down to hiring, for example” (Lou, Senior Technologist at a National Think Tank).

Practitioners also see value in creating an ecosystem where openness and transparency are the norm, since this leads to more effective resource utilization. For example, “I would say open source is probably the most valuable. I think there’s a lot of work being done by, even the meetup I’m in, where you’re trying to define a culture that’s more respectful, that’s healthy for learning, but also healthy for being productive. I think we already have a pretty good open source ecosystem, but I feel like there needs to be more. It’s hard to know what other people have already done, and what you can go off of, and what you need to do from scratch […] As a non-profit, I have problem X, and I know that these people over here, or these set of people over there have been working on it, I can either take what they’re using or build onto what they’re using” (Joss, Developer at a National Think Tank).

Truly innovative spaces are collaborative, inclusive, and diverse, and creating such spaces takes a lot of work

Collaboration amongst practitioners is critical to the success of the work. Building and facilitating collaborative spaces is a skill that not all practitioners have. As one practitioner stated, “I definitely believe that characteristic of being able to bring people together and get them working together toward a solution is critical. I think to me definitely an innate sense of the need to collaborate. I don’t feel like anybody can do this well by themselves. So you must be a collaborator and must understand how to bring people together” (Polya, Tech Program Manager at a Lab).

Innovation can also be supported by focusing more on building long-term relationships, contributing to ongoing shared efforts, and favoring collaboration over competition. “For instance, what you think is a brand new idea, but then five years later, someone else comes on the scene and they’re intent on rehashing ideas that either succeeded or failed in the past. Not to say they shouldn’t do that, because ideas are great and when you get a great idea, even if it’s been done to death or whatever, you can innovate on it. But if there was more of a focus on collaboration, then they would know that instead of building your own X, you could be a better collaborator in the open source ecosystem by contributing to X. That makes everything a lot healthier in the long run. I understand that competition exists and it’s a real thing. I get it, but I would like to encourage more collaboration across our different silos” (Marie, Digital Security Expert at a Foundation).

Even within collaborative environments, it can be difficult to create safe and innovative spaces that are truly inclusive and diverse. As one practitioner pointed out, “Do No Harm is a value that I hold, but it’s got to be more than Do No Harm. It has to be thoughtful and inclusive. I think inclusivity is a big one, because continuing to think about who potentially is left out from some process or something that you’re building is important. There need to be more people who aren’t necessarily technologists or know how to code, who are invested in understanding and advocating for different communities and groups who are impacted by technology, or who use it” (Alda, Community Organizer and Consultant at a National Newspaper).

Practitioners who practice this value believe that creating more diverse, inclusive spaces is “a way to remove oppression, racism, classism, ‘all those isms’” (Charley, Executive Director at a Technology Nonprofit). Collaboration and inclusiveness also create opportunities for practitioners to learn and grow professionally. In one practitioner’s experience, the spaces they see that are good working models are meetups they participate in that are led by Black people, or are diverse in terms of gender and race: “The people are respectful, people are able to talk without feeling insecure about what they do and do not know. The vibe is completely different than a place like a U.S. Civic Tech Nonprofit, where we have like 3 or 4 white dudes who do not realize the time. They’re very charismatic, it’s great, but it’s like at the expense of people’s learning and ability to build together” (Joss, Developer at a National Think Tank).

Center community expertise, priorities, and solutions in tech development and implementation

Similar to building collaborative spaces, technology that is developed with and by communities in an inclusive, participatory process ensures greater access and use for addressing community needs. One participant described why it’s important to create a participatory process in the field of social justice and digital technology: “I think the priority has to be on anyone that is marginalized, or being shut out. Two key guiding principles, I think, are seeking out those populations, people or perspectives; and then really trying to organize them centrally into the project. Really trying to make sure that’s where the knowledge base is, and trying to make sure that what you’re coming up with is something that they find useful, or that might ameliorate the problem” (Emanuel, Assistant Professor of Communication at an East Coast University).

It’s also important to consider who in the community might be left out of these conversations and participatory spaces, and to seek to actively mitigate any barriers: “We need to center particularly People of Color, low income communities in the work. The work should be centered over the tools itself. Because I think what happens is that people are so quick, ‘oh I got a tool for that.’ That’s not what we do. We should be listening to the needs of the community. We should be centering the needs of the community over everything else, as our vision. That’s sort of basic” (Charley, Executive Director at a Technology Nonprofit).

When communities are not included in the design and development of technological tools, not only do practitioners risk harming the communities they are trying to serve, but this may also create mistrust of other practitioners. “I think there’s a lot of danger with people being, ‘I’m doing this design for good,’ if it’s actually a Band-Aid solution or it’s not understanding the bigger picture. I think, also, designing locally or with local people is really important, so parachuting into a context you’re not familiar with and stemming it briefly and coming up with solutions for that community, I think is really dangerous” (Tivoli, Freelancer and UX Researcher at a Tech Corporation).

Many practitioners are driven by the pursuit of justice and equity

For many practitioners, structural and institutional inequalities underlie the problems they are seeking to solve. Challenging injustice is at the forefront of their work: “Everything that I’m doing with trying to teach people about technology, it’s only meaningful if it is part of a larger narrative about oppression and injustice; that it recognizes what the sources of those are. And for me, it’s capitalism and white supremacy. And so, what I value is understanding and rejecting what I think is a more dominant narrative, which is that technology provides some kind of potential utopian future, which I don’t think it does. I think it can be just oppressive. And it especially can be if you don’t recognize its potential to be” (Vishnu, Founder of a Nonprofit).

Practitioners understand that technology is only one of many tools that can be leveraged to address injustice. One participant shared their conceptualization and practice of this as follows: “There is no technology for justice, there’s only justice. What I try to keep in mind and try to instill in my work is to put technology in its proper place. Give it the attention and value that it deserves, and no more and no less. Don’t make something tech centric just because you’re the technologist and that’s what you bring to the table” (Stevie, Tech Fellow at a Foundation).

A nonprofit practitioner working in the Southwest described how justice and equity are also central to their organizational work and practices: “We have a set of core values that include serving community, empowering youth, equity, and that is of course racial equity and gender equity, and serving the LGBTQ community. […] The last one is love, that we do this work from a place of love, and we help young people to fall in love with the community” (Nessa, Journalist and Founder of a Nonprofit).

THREATS TO PRACTITIONERS’ VALUES & VISION

One lesson that I have learned from engaging with technologists who want to make themselves useful to social justice, political policy work, is that oftentimes people who work with technology, especially coders, think that there are easy solutions for a lot of problems that there just really aren’t easy solutions for, and get really frustrated with the political process because the political process and policy work are not remotely mathematical. It is not a bunch of zeros and ones, it is a bunch of human beings who have their traumas and biases and personal histories and personal interests. I think that sometimes technologists view themselves as superior to other people because they have skills that put them in a high income bracket, and they understand the technologies that facilitate modern commerce and communications in ways that ordinary people don’t. I think some humility on their part, when we start to engage in conversations about how to use technology in the service of liberation, would be really useful. Because people who don’t know about how technology works know other things that are really necessary and important to protect liberatory technology. — RASHMI, DIRECTOR OF TECH AT A CIVIL RIGHTS ORGANIZATION

Practitioners saw the following three themes as some of the biggest threats to realizing their values and principles. All are rooted in values and principles associated with justice, equity, and inclusion.

Replicating the inequities we are fighting against

Practitioners were self-reflexive about how their own or their organization’s actions may perpetuate the very social inequities they seek to alter. As one put it: “We are not balanced in the representation between who works for an organization and who is served by an organization. All of the pieces that are fundamental to the structure of our organizations and our work is feeding into the inequities we think our missions are addressing” (Mel, Executive Director of a Nonprofit).

Integrity and accountability to our values and principles are at risk if we fail to recognize our own roles within an inequitable system. One practitioner posed the following questions to assess how equitable we are in our own practices: “I think about how tech is not inclusive right now; how so many communities are locked out. It’s not just, ‘Who has these skills,’ but it’s also, if you do have these skills, are you being listened to, are you being passed up for the work that’ll help advance your career? Are you being passed up for promotions? Are you not being hired at all?” (Tal, Founder/Director at an Education Nonprofit).

Capitalism, white supremacy, and heteropatriarchy

Many practitioners hold values centered in equity and justice, and have an analysis of structural and institutional inequality. Some participants specifically named the systems of oppression that they seek to transform, using terms of analysis from intersectional Black feminist thought,13 including capitalism, colonialism, white supremacy, and heteropatriarchy: “I’ve noticed there’s a lot of potential or actual harm being done. Particularly when it’s people with a lot of power and resources who are imposing solutions upon communities who lack those same kinds of power and resources. When design solutions or ideas don’t involve the participation of the people who are going to be affected by the design, or who are going to have the design imposed upon them, I feel like that’s very harmful. That goes with technology as well. I think throughout history, we’ve seen a lot of examples of things that are supposedly ‘for good,’ that are tied to colonialism, tied to capitalism, that are these supposedly benevolent kind of efforts. But that really reinforces white supremacy, heteropatriarchy” (Aston, Founder and Creative Director of a Design Collaborative).

Practitioners say that if we do not step back and examine the bigger picture, our vision for equity will be difficult to achieve: “If you think about [oppression and injustice] all stemming from white supremacy and capitalism […], like, who’s responsible for the loss of privacy? Who’s affected by it, why, and what does that look like, in an unchecked future? I mean, from law enforcement, and from digital capitalists, and the like? Teaching people about technology, it’s only meaningful if it is part of a larger narrative about oppression and injustice; that it recognizes what the sources of those are” (Vishnu, Founder of a Nonprofit).

Even organizations that share the same values and vision end up moving away from collaboration and openness, for fear of losing resources or recognition. “I’ve noticed a harmful trend of nonprofits adopting the competitive technology models of for-profit corporations, which involves hiding innovation rather than sharing, even though they are working on shared goals” (Joss, Developer at a National Think Tank).

Technologists and technology-centered solutions

One of the threats to centering the expertise and needs of communities in the development and implementation of technology is the attitude or approach technologists take when working with communities. Participants said that technologists often lack the patience or willingness to authentically engage with communities to develop a relationship and understanding of challenges, yet believe they know the “solutions” to community problems.

Putting technologists and technology first, in the absence of deep experience with community needs, knowledge, and experiences, further disenfranchises communities. To mitigate this risk, one practitioner explained, “It feels important that there is at least an attempt to build capacity instead of going into a different community and starting to do work” (Jay, Digital Security Trainer at a Nonprofit).

THREATS TO THE FIELD, COMMUNITIES, AND PRACTITIONERS

Communities of color and marginalized communities have always been oppressed with every advance in technology. Whether it’s developing photography and cameras, that correlated exactly with police using mugshots, or the development of fingerprint scanning, or things like that […] marginalized folks have been surveilled for decades, if not centuries. — HIBIKI, FREELANCE DIGITAL SECURITY TRAINER

Key Threats

Practitioners identified the following seven key threats to the communities they work with: state violence and surveillance; politically-motivated targeted digital attacks; marginalization based on race, class, gender identity, and sexual orientation; unequal access to digital technology; unaccountable corporate infrastructure; tech solutionism and savior-complex approaches; and limited resources. Practitioners discussed how these threats are currently being tackled, which ones they feel need more attention, and how they have seen these threats change over time. Additionally, practitioners pointed out that these threats, for the most part, are not new; they are longstanding systemic issues, amplified by new tools and platforms. For example, in the case of surveillance, practitioners noted that well-meaning white technologists have taken up most of the available resources with narratives about “new” threats, even though Black, Indigenous, Muslim, Latinx, and Queer/Trans communities have always faced state surveillance in the United States.

State violence and surveillance

This was one of the threats most frequently mentioned by practitioners. Many work in this area, but feel it needs still more attention. Technology is both a means to perpetuate state violence and surveillance and a tool for mitigating that violence. Practitioners spoke about how technology tools and platforms may be new and changing, but the threats are not new: “The threats that we’re talking about are old threats. They’re just digital, digitized. A lot of people have been surveilled this entire time in this country. Native people have been surveilled. Black people are always criminalized, none of this stuff is new. I think that’s the thing that is an error from the part of digital organizers sometimes. This idea that we’re presenting these new things, when in reality there are new tools, new platforms, new ways of talking about it, but the impacts have already been happening” (Amardeep, Developer/Coder/Artist at a Progressive Nonprofit).

One practitioner spoke about the relationship of surveillance to the prison system, and described how they tackle this threat through trainings: “Surveillance in this country, and others, but focusing on the U.S., is really tied to the prison pipeline and the various technologies that militarized police forces have at their disposal. Police departments with a lot of technology at their disposal are also some of the most corrupt. In our trainings, we talk about specifics, like Stingrays, and different technologies that we know police departments use, and we break that down for groups” (Nessa, Journalist and Founder of a Nonprofit).

One practitioner described efforts to mitigate these risks as “disruptive technology” (Pich, Web Developer at a National Think Tank). With respect to state violence, another practitioner spoke about how their organization is “building a better system for monitoring police, that’s independent of civil oversight agencies and the police department and DOJ, because none of those people are going to really do anything as far as we can tell; there’s not the political will to change” (Ruby, Co-founder of a Law Enforcement Accountability Nonprofit). Others mentioned being aware of “a number of apps coming out recently to deal with immigration raids or that the ACLU came up with to send videos of police misconduct securely” (Chandra, Research Associate at a National Think Tank).

Politically-motivated targeted digital attacks

Many practitioners and their organizations face targeted digital attacks, including distributed denial-of-service (DDoS) attacks, doxing (public exposure of personal information), coordinated harassment, and threats of physical and/or sexual violence. Many organizations are concerned about risk mitigation, and actively try to identify and implement the best digital security practices and tools.

Maggie, a Developer at a Foundation, expressed that “those that have access to the technology to attack have the power.” Ruby, who co-founded a Law Enforcement Accountability Nonprofit, referred to the tradeoff between running their own email services in-house and using services like Gmail: “We worry about being targeted by trolls and right-wing groups. We currently don’t have the capacity to fight against malware attacks, and from the services out there, it seems Google has the best mechanisms to fight against these kinds of attacks.” They described penetration testing, a way to test organizational security, as “basically a simulated attack that isn’t really simulated. You would hire attackers and they would try to break into the organization, maybe physically but usually this means electronically. They try to hack people in the organization and the infrastructure to demonstrate where the issues are so they can be fixed. It also demonstrates to the organization where they really need to improve their processes” (Ruby, Co-founder of a Law Enforcement Accountability Nonprofit).

Practitioners also emphasized the importance of personal physical safety as a priority area within the digital security space, especially in situations of intimate partner violence and sexual violence. When asked what urgent threats need attention, one participant noted “the lack of digital security experts mitigating the intricate digital security issues of domestic and sexual violence victims. And how common it is because of technology to hear in these situations, ‘this person is tracking them because of their bank account,’ or, ‘this person has this thing on their phone.’ That is so common that I expect it now” (Jay, Digital Security Trainer at a Nonprofit).

Knowing that their personal or organizational data could be attacked is worrisome for practitioners, and they feel that more needs to be done to prevent digital attacks. Suggestions included digital security literacy in general, as well as a focus on digital security needs in interpersonal relationships (Garnett, Tech Consultant for Nonprofits; Jay, Digital Security Trainer at a Nonprofit).

Discrimination

Many practitioners (about half of our study participants) face discrimination, and see it as a threat that needs to be addressed across the ecosystem. Practitioners who are marginalized based on race, class, gender identity, sexual orientation, and ability feel unsafe in some spaces, which makes it difficult for them to remain actively working in the field.

A manager of a nonprofit described feeling like they can’t continue to develop software and engage in this space for long, due to the transphobia they face at work. Just in the past two years, they said they have seen many women of color leave due to harassment and discrimination: “I have a hard time picturing myself continuing doing software development for much longer because most people I’m interacting with, in many ways, don’t really respect my existence as a marginalized person” (Barbara, Manager at a Nonprofit). This experience was echoed by others. A Freelance Digital Security Expert pointed out that “even tech spaces that call themselves radicals do not necessarily have conversations about privilege, and when they do, it is difficult to talk about diversity in the creators of technology. Radical and progressive spaces often fail to talk about ableism and classism within their ranks” (Brook, Digital Security Trainer).

One practitioner who is a consultant for nonprofits identified the stark differences in how she is treated in the social justice community versus in the tech community. In the social justice community, she says she is treated with respect and dignity, while in the tech community, which is mostly men, she says there is sexism and her request to collaborate in social justice work is seen as “cute” (Garnett, Tech Consultant for Nonprofits).

Unequal access to digital tools and resources

Despite assumptions that, in the United States, all residents have equal opportunity to access the internet and digital technology, digital inequality remains pervasive.14 “Half the world is not connected. We talk about techs for social justice or trying to leverage internet access to help people do whatever they want, but half the world cannot even consider that” (Nyx, Research Lead at an International Nonprofit).

Systemic inequities based on income, education, race/ethnicity, community disinvestment, and geography are all factors that both produce and are reproduced by access to technology. Unequal access means lost opportunities, less control of the narrative, and unequal power to shape the design and use of technological tools. “Usually, we would say, ‘Oh, the public space, the public sphere is where we are all equal.’ More and more, we understand that that’s not true. Like Ingress,15 Open 311,16 most of the stops, points, are in affluent neighborhoods. Similarly, Fix My Street, the people who have smartphones and other [devices] are building that map. Because if those maps are now what we base nearly everything else on, if we’re not paying attention to the inequalities in that, it’s going to be even more entrenched when we’re doing resource allocation. We have to pay attention in civic tech, in public interest tech, because who gets to build it? Who gets to critique it?” (Hardy, Technology Capacity Builder and Crisis Response Specialist).

One city government practitioner working in a rural community sees the rural lifestyle as desirable and worth maintaining, and to do that, believes that access to gigabit internet and the opportunity that comes with it is critical (Damodar, Director of Innovation and Citizen Engagement at a City Government).

Underscoring technological inequity, an Executive Director for a computer training institution noted how “technology plays an outsized role in our society, yet it is unsuccessful in terms of diversity. Additionally, the huge role that technology plays means that digital literacy and access are key to full participation in society” (Johanna, Executive Director of a Computer Training Institution).

Dependence on unaccountable corporate infrastructure

As public digital infrastructure withers, and our reliance on corporate-controlled infrastructure and services increases, we stand to lose our freedom and independence. The current battle over net neutrality has made this stark fact even more clear: “There’s less values-aligned host services, non-corporate web stores every day. We (progressive organizations) are putting all these digital assets into containers we don’t control, giving government authorities access to them, and what happens if we get cut off from them? We need to figure out what we can be doing to fortify what I call movement-facing infrastructure, hosting services, consulting services, capacity building services, and other things that allow us to stay vibrant when digital martial law is imposed. All of this to say we need alternative infrastructure. We need vegan, cruelty-free, fair trade, locally sourced infrastructure that is not annoying” (Arata, Technology Capacity Builder).

Another practitioner who works with an international nonprofit echoed this critique of dependence on corporations. They described the danger as “a concentration of wealth and what is essentially, these monopolies that have emerged in terms of the content that we use for every day (e.g. Facebook, Google, Microsoft, iOS, Android). The trend is that you have these monopoly patterns emerging or that exist now. You just have a few options and those dominate globally. Monopolies are never a good thing economic-wise or social-wise or in terms of power, or even political powers” (Nyx, Research Lead at an International Nonprofit).

Organizations’ work has also evolved over the years to address emerging needs and threats related to proprietary versus free software. One practitioner described how “we’ve seen our work shift from actually developing the software and philosophy behind freely licensed software, to enforcing the GNU General Public License, or threatening to enforce it, and help people come into compliance with it. Then over time as we got more resources it shifted to activism and advocacy and education about free software and about problems with proprietary software” (Baldev, Communications Manager at a Foundation).

Tech solutionism, top-down approaches, and the savior complex

Many participants noted that tech solutionism, top-down approaches, assumptions about the location of knowledge and expertise, and the “savior complex” are all persistent problems that plague this space. For example, Garnett, a Tech Consultant for Nonprofits, sees the biggest threat to the tech for social justice community as the lack of volunteers who want to work on “real issues that affect real people.”

Limited resources and investments threaten long-term sustainability

Government, nonprofits, and grassroots organizations often have fewer resources available than the private sector. This limits their capacity to compete for technology practitioners, who may not initially gravitate to working in social justice. Participants feel that funding is also concentrated in the hands of a relatively small group of organizations and focused on hot-button issue areas. Most organizations are left with shrinking opportunities to build, grow, and sustain their work.

One practitioner believes we need to “solve the money problem” as it’s “really hard in resource-constrained organizations, particularly organizations that are not technology organizations, to divert resources from their core mission to technology” (Raimo, Technologist at a National Legal Nonprofit). Solving the money problem also means examining how funders allocate resources, creating diverse funding streams, and being real about who is responsible for the work. Another practitioner noted that larger organizations dominate funding: “I think the one trend that I see that’s especially problematic is the existence of very large organizations that know how to get the money and to write for what the funder wants and have developed a pretty consistent supply chain of resources, based off of being able to cater specifically to funders. Those organizations often are able to use that as leverage to force smaller organizations to be dependent on them, or much worse, potentially take up a lot of the air in the room and choke out some of the smaller organizations” (Gertruda, Digital Security Researcher).

With respect to seeking funds, organizations need to “think critically about what funders are useful for what issues […] just because Google won’t fund certain topics doesn’t mean they won’t fund other topics that need to be funded. This is the difference between public funding and private funding for research. That it’s not one or the other, but actually trying to foster both, and leverage both, could be very effective and trying to coordinate that” (Emanuel, Assistant Professor of Communication at an East Coast University). One practitioner who self-identified as an “anti-institutionalist” questioned whether nonprofits should be responsible for sustaining “public interest” work. They feel that the responsibility should be shifted to governments: “If it’s public interest, I would argue that it should be a municipality and paid for through things like taxes” (Hardy, Technology Capacity Builder and Crisis Response Specialist).