At some point in your tech career, a hard choice has to be made: do you continue down the path your managers laid out, or go against the grain? It’s never an easy choice, but it might be better to take a stand. In the wake of scandals at big tech companies such as Facebook and Uber, engineers and developers were left wondering whether their work had a detrimental effect on the world, or at least on their users’ data. Many of those tech pros left rather than stay at a company that committed an ethical violation. But their departure raises more questions: should they have sounded the alarm earlier – if at all? In a blog post titled Activist Engineering, independent developer Matthew Bischoff thinks so (emphasis his):

“We’re better than this. As software engineers and designers, we’re in the room when decisions are shaped, and the only ones who have the power to actually execute them. It’s our responsibility not to forsake the people who trust the apps we make with our silence. To stand up and refuse to implement unethical systems and dark patterns. And even more, to educate stakeholders on the real human costs of their business decisions: the time, attention, money, and trust of their customers. It’s harder, yes, and riskier. But they can’t build it without us. We get a say. Even if it’s not in that meeting, we can think about the goals they’re trying to accomplish and propose alternatives. We don’t have to hide in our sit-stand nap pods and eye-roll while we engineer a worse world. We can do more than write code. We can research and present better alternatives. We can write memos and make slide decks to convince them of our position. We can be activist engineers.”

In a nutshell, Bischoff claims tech pros at some firms are simply not proactive, and only react when times are tough, often dire. He’s not wrong, and it’s not right. There are plenty of excuses for staying quiet when you know something is amiss at a company, but no good reasons. It may be, as Bischoff points out, that tech pros don’t have a universal code of ethics. He points to the ‘Order of the Engineer’ as a boilerplate example of what tech pros should emulate, but the fact remains that it’s up to you individually to follow your moral compass. We don’t know what will come of the latest imbroglio involving Facebook, but government oversight is on the table. No one is louder than Oregon Senator Ron Wyden when it comes to Facebook and government oversight; his comments distill down to ‘fix it, or we will.’ Such oversight inevitably won’t focus just on Facebook, and nobody wants government involvement in any tech product or service – much less all of it.

But we should keep in mind that it was the corps of engineers within Facebook that led us here. They weren’t proactive. They didn’t speak up when it was clear things were wrong internally. And they’re not the first group; in fact, this happens all too often. Data breaches are also a result of a blasé attitude about a stack or technology. A great example is T-Mobile, which thinks storing usernames and passwords in plain text is fine because it believes its security is “amazingly good.” It all shows that engineers, developers, or anyone else involved in a consumer-facing product can – and should – speak up when they notice something isn’t right. Facebook, Uber and the rest promise sweeping change, but an internal culture of ignoring issues is actually to blame. A company might have hired you for your technical acumen, but it inherited your morality. Saying ‘no’ the next time you see that something is wrong, or otherwise incongruous with what you know to be best practices, might save your company a lot of trouble down the line.