What does it mean to be an ethical software company?

Posted on 12th November 2019


Written by Joshua Keel, Senior Software Developer at NRECA

Entrepreneur, web designer and writer Paul Jarvis posed a question on Twitter this week about what it means to be an ethical company.

I immediately started mentally listing the values and behaviors I believe make an ethical company. Ethical software is something I care deeply about. Questions of ethics confront all of us daily, but how much more important are they for tech companies, which literally drive customer behavior through the design of their products?

They can make us buy more or less, design experiences that will keep us glued to our screens, or connect us with others face to face. They are not just designers of products — they are designers of human behavior.

Make your customers’ lives better

So what makes a software company (or any company, for that matter) ethical? The foremost principle that I believe should guide us is “Are we making our users’ lives better, or worse?”

Hidden in every individual’s answer to that question are implicit value judgments. What does it mean to make someone’s life better? How do we know their lives are better? One person’s better is another person’s worse. Maybe we should just stay out of it.

This “let’s just stay out of it” attitude seems to be what drives companies like Facebook to not address the major problems they have created. The idea is that they’re building something that can’t possibly be other than good. We’re connecting the world! There’s no way that’s bad, right?

It helps to really analyze whether simply connecting the world is truly making lives better. That’s a tough question, but one that science can help us answer. In his book The Moral Landscape, neuroscientist and philosopher Sam Harris describes a landscape of peaks and valleys of well-being that a society might achieve, where several different peaks can be equally high. There is not necessarily just one right answer.

Data can help us find a better answer, though. Studying the actual effects of what a company like Facebook or Twitter is doing to us, and specifically what particular design decisions do to our human brains, is very important.

It seems utterly clear to me that employees at Facebook are not asking the question “does this make the user’s life better?” very often or very rigorously. One has only to look at the notifications system to see how ruthlessly they exploit our attention. It would be hard for the notifications bar to be a noisier or more useless contraption.

It’s not that notifications are evil, but I truly do not care that someone in a group I maybe look at five times a year has posted a new comment. It doesn’t enhance my life in any way to know that. It only means I have another place (much like my email inbox) to check, dismiss, and ignore.

A level of empathy is required to put the user’s needs above profit. Tech companies — nay most companies — exploit one group of people (workers, users, the world at large) for the benefit of another: shareholders.

Be responsible for the effects you create in the world

The fact that companies primarily take action to enhance the well-being of their shareholders rather than their employees or customers makes sense. Shareholders are, after all, the owners, and the people they are ultimately answerable to. Shareholders are the boss.

Thankfully, shareholder and customer concerns are often aligned. Very often what’s good for the customer is also good for the bottom line. Amazon’s obsessive focus on customer satisfaction has certainly paid off.

Customers aren’t the only people to think about, though. Employees and contractors are also people with their own complex lives and problems, who deserve to have their boats rise with the tide of corporate success.

Yet so often, success for a company means profits for the shareholders, while employees are left working minimum wage jobs, struggling to support their families, and being laid off when the CEO (who makes millions per year) deems it necessary to lean out by cutting staff.

Profit isn’t bad, but when that profit is hoarded and used to satisfy the greed of the already spectacularly wealthy, profit isn’t serving the public good. A society in which the well-being of less privileged individuals is protected is one in which we all have a chance to flourish. We all benefit from our neighbors, coworkers and acquaintances living meaningful and fulfilling lives.

More corporations need to adopt the aims of B (or benefit) corporations, whose stated legal goals are to have a “positive impact on society, workers, the community and the environment in addition to profit.”

Community and environment should be valued corporate stakeholders. In their quest for profit, too many companies victimize both the earth and their local communities. Many companies are so bottom-line oriented that they frankly don’t care what impact they’re having on those around them. Their implicit values are selfishness and self-preservation, not making their communities better, or taking care of the planet.

Value the user’s privacy

For many tech companies, privacy seems to be an afterthought, if it’s considered at all. Data is seen as an all-important good which can be “mined” to achieve competitive advantages.

Very few companies treat their user data with the respect it deserves. Software startups should aim to collect as little data on their users as possible: the less data they hold, the less there is to protect, and the less is exposed when security fails.

Hacks are inevitable, and the likelihood that some portion of a company’s user data will eventually be compromised is high. It makes sense to take low-tech measures to limit the impact of those breaches, and the very best option is simply not to store the data in the first place.
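As a concrete illustration of that principle, here is a minimal sketch of data minimization at the persistence boundary. The types and field names are hypothetical, not taken from any particular product: the point is simply that fields the product does not strictly need are dropped before they can ever be stored, and therefore can never be leaked.

```typescript
import { randomBytes, scryptSync } from "crypto";

// Hypothetical signup payload: forms often ask for far more than login requires.
interface SignupRequest {
  email: string;
  password: string;
  fullName?: string;
  birthDate?: string;   // frequently requested, rarely needed
  phoneNumber?: string; // likewise
}

// Only what authentication actually needs is ever persisted.
interface StoredAccount {
  email: string;
  passwordHash: string;
}

function toStoredAccount(req: SignupRequest): StoredAccount {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(req.password, salt, 64).toString("hex");
  // The optional fields never reach the database, so a breach can't expose them.
  return { email: req.email, passwordHash: `${salt}:${hash}` };
}
```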

Paul Jarvis’ own software company, Fathom Analytics, is a great example of a product that values privacy. Fathom is a website analytics company, similar to tools like Google Analytics. It tracks the number of hits on your webpages so you can see, for instance, which blog posts are your most popular.

Paul and his co-founders have thought not just about the customers who buy Fathom, but also about the innocent people having their data collected every time they visit a site which uses Google Analytics (a huge portion of the Internet). Fathom doesn’t even give their customers the option of tracking individual user data. Everything is aggregated and anonymized, so you get just the data you need to make business decisions, without compromising the privacy of your website visitors.
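To make the idea of aggregated, anonymized analytics concrete, here is a minimal sketch of one way such counting can work. This is not Fathom’s actual implementation; the rotating-salt hash and the function names are my own assumptions for illustration. Raw IP addresses and user agents are never stored, only per-page totals and a count of hashed visitors that cannot be linked across days.

```typescript
import { createHash, randomBytes } from "crypto";

// Per-page totals are the only thing that persists long-term.
const pageViews = new Map<string, number>();

// Hashed visitor ids, kept only to count uniques; cleared when the salt rotates.
let seenToday = new Set<string>();
let dailySalt = randomBytes(16).toString("hex"); // regenerated every 24 hours

function rotateSalt(): void {
  dailySalt = randomBytes(16).toString("hex");
  seenToday = new Set(); // yesterday's hashes become meaningless and are dropped
}

function anonymousVisitorId(ip: string, userAgent: string): string {
  // One-way hash; the raw inputs are used transiently and never written anywhere.
  return createHash("sha256").update(`${dailySalt}:${ip}:${userAgent}`).digest("hex");
}

function recordPageView(path: string, ip: string, userAgent: string): void {
  pageViews.set(path, (pageViews.get(path) ?? 0) + 1);
  seenToday.add(anonymousVisitorId(ip, userAgent));
}

// Example: only aggregates survive (paths, totals, and a unique-visitor count).
recordPageView("/blog/ethical-software", "203.0.113.7", "Mozilla/5.0");
console.log(Object.fromEntries(pageViews), "unique visitors today:", seenToday.size);
```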

Humane tech

In their ever-growing, over-zealous need for time on device, eyeballs on ads, and “engagement”, technology companies are pulling us into an invasive, habit-forming vortex, one we cannot easily escape.

We need to think a lot harder about what technology does well, and what it does poorly. I’m no Luddite, but technology certainly isn’t an unqualified good. Just look at the atom bomb.

We need to thoughtfully, humanely, and carefully consider our relationship to technology. People like Cal Newport, with his books about how tech is affecting the workforce, are pointed in the right direction. We need more people to care about privacy, like Paul Jarvis. And we need people like Tristan Harris at the Center for Humane Technology, an organization devoted to advancing the ideas of ethical design through thought leadership and political advocacy.

In short, we need to care, to show a little concern for the effects our actions have on others. Technology is wonderful, and we can use it to build a better world. I’m so excited about that possibility! But we need to be wise and awake to the risks that unbridled and unregulated, profit-centered tech development poses to our well-being.

Our technological future is not written in stone. The actions we individually and collectively take over the next years and decades will decide our fate. Will we bring the dystopian world of Pixar’s WALL-E to its full fruition, or will we consciously shape our tech to bring us more fulfillment, better relationships, and the greatest possible well-being?


Originally posted here
