The struggle is not ‘access to encryption tools.’ It is organizing day labor communities in order to protect against ICE raids, and things like that. We’re confusing means and ends. […] I think that’s the central problem that the technologists continually go through, is they pretend like technology is the thing that matters, when it’s actually people’s fight that matters and the outcome that matters. — GERTRUDA, DIGITAL SECURITY RESEARCHER

[These] platforms tried to get people engaged with civic planning without understanding that they had to be able to implement what people were talking about. You can’t just ask people for their opinion. You also have to act on their opinion. — HARDY, TECHNOLOGY CAPACITY BUILDER AND CRISIS RESPONSE SPECIALIST

Our fifth research goal is to document stories of success and failure, distinguish between approaches to technology for social justice work on the ground, and identify what works, what doesn’t, and why.

A summary of our key Stories of Success & Failure findings is available in the Executive Summary.

MODELS THAT WORK

Community-led design and participatory approaches work

Across every sector (government, for-profit, nonprofit, social movement), we heard from participants that the most successful projects involve people in the design of technology that is supposed to benefit them. Community-led design and participatory approaches work because they enable community members to bring extensive lived experience and tacit knowledge to bear on critical decisions at each stage of the design process, from framing and scoping to the selection of relevant approaches and tools. In addition, communities gain a deeper understanding of the tool and develop skills during the design process. For example, Heiner, an Executive Director of a Legal Services Organization, notes that the public interest law and legal services fields are deeply client-oriented; lawyers doing this work constantly interact with clients who must navigate larger, unequal systems. She would like to see this orientation carried into the tech space, and emphasizes the importance of involving people who are poor, are undocumented, are seeking housing, and/or have dealt with the criminal justice system in the creation of the apps and technology systems that are supposed to serve them. Hibiki, a Digital Security Trainer, amplifies this point: “[participatory design is] all about developing tools and technology along with the people that it’s meant to serve. Just, in general, I think adopting any type of participatory approach from the beginning is usually super helpful, and also enables people to actually want to use this technology.”

Partnerships and relationships help catalyze project success

Building relationships and partnerships between organizations, government agencies, and/or communities, as well as with those with technical knowledge, can help foster successful projects. For instance, residents and organizers from Red Hook Initiative led and developed a mesh network, a training program, and more, through partnership with outside techies. Partnerships are also essential to foster change in local governments: a former municipal IT department employee convinced multiple city departments to open their data by developing personal relationships, networks of mutual support, and interdepartmental partnerships. Developing partnerships around a shared issue enables actors to combine efforts toward a common goal. One participant described how Fight for the Future partnered with a wide range of actors, including private companies, nonprofits, policy advocacy groups, and informal networks, to fight for net neutrality in 2015.17 It is also essential to seek partners beyond the “usual suspects.” Innovative partnerships can yield successful projects, such as a national legal nonprofit establishing one-on-one relationships with attorneys for tech companies. Through these relationships, a tech company filed a pro-privacy amicus brief in a cell phone tracking case.

Public exposure can pressure large institutions to create change

Corporations and other large institutions at times fail to take security vulnerabilities seriously, even though these vulnerabilities may have real security and privacy implications for users. Security researchers find that publicly naming and shaming corporations can be an effective way to pressure them to fix vulnerabilities. Similarly, public campaigns in recent years have put pressure on companies to address racial and gender bias in interface design; in search, recommendation, and predictive algorithms;18 and in hiring, salary, and management demographics.19

Crisis response tasks can be crowdsourced using innovative tech approaches

New tools can enable effective crowdsourcing of certain tasks, such as in crisis response. For example, Hardy, a Technology Capacity Builder and Crisis Response Specialist, described a project that crowdsourced aerial damage assessment to reduce wait times for FEMA aid: “We did a thing called MapMill, which is like Hot or Not for damage assessment. Civil air patrol went up and took a bunch of aerial imagery and then people were able to click on, ‘Is it fine? Is it slightly damaged? Is it completely damaged?’ […] we ended up with a heat map of where the damage was so that people from FEMA were able to show up and be like, ‘You need to fill out these forms so we can be here and help,’ instead of waiting for someone in the [city, then county, then federal] government to fill out paperwork […] We were able to shortcut through that.”
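The interview does not describe how MapMill was actually built; as a rough illustration of the pattern Hardy describes, the sketch below (hypothetical data and field names, not MapMill’s code) averages volunteers’ per-image damage ratings and bins them into a coarse geographic grid that could drive a damage heat map for responders.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical volunteer votes: each vote tags one geotagged aerial image with a
# damage rating (0 = fine, 1 = slightly damaged, 2 = completely damaged).
votes = [
    {"image_id": "img-001", "lat": 40.700, "lon": -74.010, "rating": 2},
    {"image_id": "img-001", "lat": 40.700, "lon": -74.010, "rating": 1},
    {"image_id": "img-002", "lat": 40.721, "lon": -74.002, "rating": 0},
]

# Step 1: average the ratings volunteers assigned to each image.
by_image = defaultdict(list)
for vote in votes:
    by_image[vote["image_id"]].append(vote)

image_scores = [
    (vs[0]["lat"], vs[0]["lon"], mean(v["rating"] for v in vs))
    for vs in by_image.values()
]

# Step 2: bin averaged image scores into coarse grid cells so the result can be
# rendered as a heat map of where the damage is.
CELL = 0.01  # grid size in degrees (~1 km); an assumption for illustration only
heatmap = defaultdict(list)
for lat, lon, score in image_scores:
    cell = (round(lat / CELL) * CELL, round(lon / CELL) * CELL)
    heatmap[cell].append(score)

for cell, scores in sorted(heatmap.items()):
    print(cell, round(mean(scores), 2))
```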

In another example, during Superstorm Sandy, Occupy Sandy leveraged existing networks from the Occupy movement, new coordination tools such as the Interoccupy conference call system, and commercial platforms such as the Amazon Wedding Registry to mobilize and coordinate thousands of volunteers and deliver millions of dollars’ worth of aid, in a process so effective that it was studied and praised by researchers for the Department of Homeland Security.20

ICT infrastructure projects can be excellent opportunities to create citywide coalitions, connect diverse actors, and build community power

ICT infrastructure projects can have incredible power and leverage, and can tap significant sources of funds, especially when they draw together city governments, CBOs, policy folks, and technologists. For example, the Detroit Community Technology Project and the Detroit Digital Justice Coalition in Detroit, and the Media Mobilizing Project in Philadelphia, have sustained citywide coalitions with social justice organizations and relationships with diverse actors in the ICT infrastructure sector for many years. These coalitions used the Obama administration’s Broadband Technology Opportunities Program as a way to bring tech, telecommunications, community media, storytelling, and community organizing together. The coalitions they created have helped win important policy victories for low-income communities in their respective cities, including around internet access, education, and workers’ rights, and “continue to be the most cutting-edge work in this field” (Alun, Technology Advisor for a City Government on the East Coast). Other examples of successful ICT infrastructure projects include Red Hook Wifi and community-controlled broadband deployment in New York City Housing Authority buildings. This approach is not only effective in large cities: for example, Bartholomeus, an Economic Development Director in a small city government, is leveraging technology and innovation in his mostly rural community by organizing smart agriculture meetups, working toward municipal broadband, creating the broadband infrastructure necessary for telecommuting, and teaching technology and entrepreneurship in K-12 schools.

When movements and communities own their infrastructure, they can also own their data and draft the security and privacy protocols that they need. One model that has proved effective is the approach of the May First/People Link collective. When May First/People Link decided to build infrastructure and services for movements, it enabled its members to make political decisions that were usually made for them by third-party services like Amazon, Google, and others.

Another often-described model of success is leveraging technology to expand access to legal services. For example, Ivar, the Founder of a Tech-Legal Fellowship Program, said that technology can be a tool to provide underserved populations with greater access to legal services. Examples of this approach include platforms that help people expunge their arrest records, remove themselves from gang databases, and verify eligibility for DACA, among many others. Another successful legal aid service mentioned by a study participant is Illinois Legal Aid Online, a statewide website offering self-help forms to legal aid clients across Illinois.

Be clear about political and ethical positions

Luna, a Member of a Tech Cooperative, mentioned that her web development cooperative is vocal about its politics, and that they get clients primarily because people know about their politics. She has ethical and political objections to most tech spaces, and prefers to stay in politically conscious, cooperative, and free software communities. Another practitioner notes that it is essential to ground countersurveillance work in how technology has historically been used to marginalize, victimize, and oppress communities. Surveillance of Black and brown people did not start with the NSA and cell phones,21 and recognition of this history is itself a political position.

Prioritize resilient and simple solutions over “cool new tech”

Ahmed, a Technology Lead at a West Coast City Government, noted that working in government is not about finding cool new solutions, but rather about building solutions that are resilient and last over time. For instance, Lulu, a Technology Funder, integrated a simple text messaging system into the legal aid system in Northern Virginia: clients get a text message reminding them of their legal aid appointment. This simple but important solution cut no-shows by over 40%.
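The report does not describe how the Northern Virginia reminder system was implemented; the sketch below is a hypothetical illustration of the pattern Lulu describes, with a stubbed-out send_sms function standing in for whatever SMS gateway an organization actually uses, and hard-coded records standing in for a case-management database.

```python
from datetime import date, timedelta

# Stand-in data: in a real system this would be a query against the legal aid
# clinic's case-management database (hypothetical fields).
appointments = [
    {"client": "A. Rivera", "phone": "+15551230001", "date": date.today() + timedelta(days=1)},
    {"client": "B. Chen", "phone": "+15551230002", "date": date.today() + timedelta(days=5)},
]

def send_sms(phone: str, message: str) -> None:
    """Stub for whatever SMS gateway the organization uses."""
    print(f"SMS to {phone}: {message}")

def send_reminders(appointments, days_ahead=1):
    """Text every client whose appointment is `days_ahead` days away."""
    target = date.today() + timedelta(days=days_ahead)
    for appt in appointments:
        if appt["date"] == target:
            send_sms(
                appt["phone"],
                f"Reminder: your legal aid appointment is on {appt['date']:%B %d}.",
            )

send_reminders(appointments)  # run daily, e.g., from a scheduled job
```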

MODELS THAT DON’T WORK

Projects that lack engagement and understanding of technology needs and use on the ground fail

Tech projects that do not engage with or understand the needs of their users tend to fail. Several practitioners used civic gamification platforms as examples. “[These] platforms tried to get people engaged with civic planning without understanding that they had to be able to implement what people were talking about. You can’t just ask people for their opinion. You also have to act on their opinion” (Hardy, Technology Capacity Builder and Crisis Response Specialist).

Even when there is a need for a new technology solution, user research needs to precede design and development. “We funded an earned income tax credit tool [because] … unfortunately billions of dollars each year go unclaimed by the working poor because they don’t know they’re entitled to it. So, we built a system like that, and it got a lot of usage in English, but when we built it in Spanish and Vietnamese almost nobody used it. We built some automated documents in the Detroit area for the Arabic-speaking population. Almost no usage. So either we don’t understand how to deliver technology to these special language groups, or we’re not doing the right outreach, or it’s not culturally appropriate, I don’t know” (Lulu, Technology Project Funder at a National Legal Nonprofit Funder).

Projects with good intentions are not immune from failure. Alda, a Community Organizer and Consultant at a National Newspaper, explains, “I was working for a company that […] built this SMS based voter registration system. It was directly related to a community need where registration was a really difficult task because of how rural some of the landscape was. People had to travel long and far to get registered. It really tried to fill that gap. SMS technology was researched and deemed a preferable way to get that registration done because folks had access to phones. […] They also built a voting component in it. The voting component was something that essentially was, you can use this if you want to, or you don’t have to use it. It wasn’t really thought through. It was kind of just built because it could be built. […] There was never any user research for the voter component. […] There was no analysis on the political context of what could happen if they started using that and different groups got hold of telecoms and could ask telecoms to turn over that data. SMS is clear text. It’s very easy to see then who you voted for, depending on what your mobile number was. There’s just so many things wrong with that. I feel like that was something built with good intentions, but they did not do any of the risk modeling that they should have done.”

Thinking technology is a silver bullet, without understanding the problem, is dangerous

Practitioners shared many stories of failed projects. The common theme amongst these projects was that they all put the solution before the problem, technology before people. At best, this approach wastes scarce resources and time. Tivoli describes one failed project that stood out for her as a user researcher: “This group tried to make [a self-assessment tool] for elderly people, and it was iPad […] the idea was that it would be for patient activation, get people into the system. And it completely failed, because it was a technology solution. And, I don’t remember if it was the same group that redid it or if it was a parallel project. Someone did a brochure, and it was much more successful. That just stuck in my mind, because technology isn’t always the right solution. We don’t have to always make an app for it.”

Another practitioner shared the following story, of an organization that assumed that technology and folks with tech skills could magically solve a particular housing issue: “I was involved in an attempt in the civic tech space for tech workers to come in and do pro bono work for organizations and I was placed on a project […] that was trying to do work around folks who basically had heat violations in their apartment. Meaning that in the winter, their landlords didn’t turn on the heat. Broke the heater in an attempt, often, to get rid of folks. This felt very meaningful but I think that like all of the things you might expect to happen in terms of the scope was way too large, these folks at this nonprofit weren’t organizers so they weren’t actually as connected to housing organizers who are directly working with these folks. They had this assumption that if you gave people evidence, then they would be able to take it to housing court and win their cases. It’s not eviden[t] that’s the problem. There are so many different levels for which I think the assumption that if you add a [technology and] tech people to a thing, that it will work out, [it didn’t]” (Jay, Digital Security Trainer at a Nonprofit).

Technology solutions that do not factor in organizational and community readiness are setting themselves up for failure

Before implementing technology solutions, it is essential to verify whether they meet organizational needs. Organizations are at times eager to adopt new tech solutions, but pushing the wrong tool can result in backlash, mistrust, and over the long run, even greater inefficiency. As one practitioner put it: “First they’re like, ‘We really need this database’ […] but it’s because this one person really thought that this database, they liked it because it’s sort of cool looking. They kept pushing it through the organization, but it didn’t meet their needs. They went through like a year of transition, and it was just horrific. […] I think folks see technology as just a Band-Aid, rather than as an actual culture shift. It can be a game changer. […] The tool itself can cause all sorts of stuff, and then it causes distrust. Those cause distrust for everything, not just that tech person, but distrust for the technology overall. Then they resort back to doing things in a way that take more human hours, […] it becomes more difficult to do and then they’re not able to build on it” (Matija, Worker/Owner at Tech Cooperative).

COMMUNITY ACCOUNTABILITY

Center community needs over tools

Technologists often confuse means and ends. What ultimately matters is not tool adoption but people’s struggles and the outcomes in their lived experience. “The struggle is not access to encryption tools. It is organizing day labor communities in order to protect against ICE raids and things like that. We’re confusing means and ends. […] I think that’s the central problem that I think the technologists continually go through is they pretend like technology is the thing that matters when it’s actually people’s fight that matters and the outcome that matters” (Gertruda, Digital Security Researcher). This is not to say that technology doesn’t matter. However, technology design processes should be accountable to the community and its struggles. One community accountability mechanism practitioners suggest is a community advisory board that is representative of the community and participates meaningfully in the tool development process.

“Parachuting” rarely works

Funders should prioritize building capacity within communities over flying in support from outside. Too often, funders back parachuters for a quick fix instead of building capacity within a community, and quick fixes are not sustainable beyond the parachuter’s involvement: “We have funders that will fund large organizations who have large amounts of money to fly in to communities of color and basically tell them, this is how things should be done. We disagree. I disagree with that methodology and that strategy. One is that there are people within the communities already with knowledge, or lots of knowledge, who are not being lifted up. Two, we believe that if we’re really going to build power, we need to build power in the communities, which means we need to let go of our ego and we need to sort of build, mentor, build that power in the community, build the skills there. […] Funders are not into that work. They want to do something else. They think this is the faster way. I mean I know everybody wants the fast solution, but this is not going to be a fast solution. That’s where I know it impacts our funding greatly. There’s only a handful of funders now that are focusing on building capacity” (Charley, Executive Director at a Technology Nonprofit).

Practitioners said that funders need to listen to community organizers, not only to techies: “I trust the organizations I work with to be able to assess, to some degree, what kind of technology stuff they need. I don’t hear that reflected in some of these initiatives that I hear about. […] I think that people need to really listen to community organizers, not the techies” (Matija, Worker/Owner at Tech Cooperative).

Funders often support projects not because they emerge from the real needs of community organizations, but because of personal relationships or because a technology sounds “cool”

Those with power and resources often get to dictate who gets funding, which projects are funded, and which technological “solutions” are pursued, with little consideration for the community, the context, or the broader implications of their proposed approach. As one practitioner described: “I would go in to interview people about what they needed from an online directory of community organizations. We soon found that people don’t need that! But the funders really wanted to. […] It wasn’t necessarily something that the organizations were saying they needed, though of course they said it sounded great” (Matija, Worker/Owner at Tech Cooperative).

Tech practitioners need to use access to elite spaces to open them, and to share knowledge and power

“I think a lot of tech practitioners who have skills and access to elite spaces need to use that position of power and knowledge to teach other people, and also to make those spaces accessible to more. One thing when I got into [an elite university], my grandmother called me and said, ‘You are now entering another type of space that you need to be the conduit for anybody in our family or in our community to be able to access that space. You are the gateway now to that.’ […] I think that’s something that people need to think about. How can we make the knowledge and power that we hold more accessible to more people and redistribute that power and knowledge?” (Chandra, Research Associate at a National Think Tank)

EVALUATION & SUCCESS

We asked practitioners how they evaluate their work. There was no single evaluation rubric; instead, success is contextual, depending on organizational goals. Participants gave a very wide range of concrete examples.

For example, when evaluating status quo systems, participants noted a tension between the recognition that existing projects need maintenance and the observation that sometimes we stick with existing systems because we are locked in, or because “people who already get money keep getting money.” In other words, it is important to tease out the difference between status quo projects that get support because they are good projects and those where we keep pouring money into failed systems. Ultimately, the question should be: Is this project meeting the needs of the community?

Practitioners do not agree on a single rubric to measure the impact of their work. However, most agree that success is a process, rather than a single outcome. Depending on the service their organization provides, the meaning of success can change quite drastically. At the same time, some practitioners opined that their peers in this field “far too often, don’t have a sense of what success is” (Gertruda, Digital Security Researcher).

For instance, for digital security experts within social justice movements, success is about mitigating the harms of state surveillance and infiltration of movements—a longstanding, ongoing battle. For folks that are organizers, success is the ability to organize grassroots movements for a mobilization, cultural, policy, and/or transformative outcome.22 For practitioners working with local, state, or federal legislative processes, success implies a favorable shift in policy. Some measure their success by the relationships fostered within a community. For others, the yardstick is their ability to conduct thorough user research to better inform the design of technological affordances. For public office holders, success is defined by the public’s perception of their work, and ultimately by re-election. For others, success is the number of app downloads, active users, pageviews, encrypted messages exchanged, and so on.

Use of technology to catalyze organizing

Organizations like Color of Change, Presente.org, weareultraviolet.org, Coworker.org, Control Shift Lab, and many others use technology as a catalyst for online and offline organizing. They use digital technology as a tool to empower their constituencies and further their movements. These organizations understand the “speed, scale, and power of new media and technology to raise people’s voices” (Alexis, Director of a National Nonprofit). Others develop websites, tools, apps, and platforms, and build and host ICT infrastructure to help nonprofits and movement groups advance their work.

Technology and science have been used to discriminate against, marginalize, and control communities of color for centuries. Some organizers, activists, and grassroots practitioners leverage their tech skills to teach their communities about the threats and harms of digital technology, covering security, privacy, data extraction and manipulation, malware attacks, and more. Others spend a good chunk of their time facilitating cryptoparties, leading threat modeling exercises, and providing contextualized digital security training. They develop and use technology to protect communications infrastructure and the open web, avoid censorship, and access information.

Technology can also be used to enable professionals to work “at the top of their license,” as one practitioner put it. For instance, one organization is developing software for legislative drafting that keeps track of changes made to draft bills and ultimately helps drafters avoid mistakes (Loredana, Cofounder of a National Tech Policy Organization). One technology funder we interviewed supports legal aid clinics in creating tech tools that automate most of their redundant work, so that they can spend more time with their clients.

Common unmet tech needs of organizations

We asked practitioners to describe their organizations’ unmet tech needs. These were very diverse across the ecosystem. Many nonprofits do not have the skills or resources to develop digital tools for their work (Garnett, Technology Consultant). For instance, lawyers providing legal aid services often spend much of their time dealing with paperwork that could potentially be automated. When they are able to automate some aspects of client service delivery, they have more time to solve their clients’ legal problems.

When nonprofits do have a system or a platform, it is often a hand-me-down from the corporate world (Joss, Developer at a National Policy Think Tank). These systems are often expensive and are optimized for the needs of the private sector. Frequently, nonprofits would prefer to use free software and autonomous infrastructure, but adopt corporate services like Gmail because they are more user-friendly and secure than an in-house email server. In addition, most nonprofits find it challenging to expand their in-house systems. Maintaining non-corporate services is demanding: it requires ongoing upkeep, response to threats and attacks, training, regular upgrades, and so on. Those nonprofits, co-ops, and collectives that do provide tech services to movement groups, nonprofits, and the public often struggle to maintain and update these services.

Unlike their nonprofit counterparts, those working within local, state, and federal governments often deal with government procurement bureaucracy. For instance, practitioners within government noted that smaller vendors are usually either not able to comply with government requirements or are not interested in “jumping through all the hoops” to do business with the government.

Referring to specific unmet tech skills, practitioners identified the need for more data journalists, data engineers, data visualization experts, app developers, system and infrastructure administrators, malware researchers, and practitioners with machine learning skills.