22.4 Ethical Issues

Technology is an ever-changing space, but some ethical issues come up over and over again, no matter how much else changes. Whether it's dealing with countries that have repressive laws, handling products designed for children, or creating addictive experiences, you might be faced with decisions that don't feel right to you. It can be difficult to know what the right thing to do is.

Learning objective

In this checkpoint, you'll learn about some of the ethical issues product managers could face. A deep understanding of these issues will make you a better leader in the product space, and a more well-rounded and thoughtful PM candidate. It will also help you consider all the relevant information if you need to handle difficult ethical choices in your work.

By the end of this checkpoint, you should be able to do the following:

  • Explain how specific legal and ethical issues can impact product development



What are ethics?

Ethics are the rules of conduct that groups use to decide what is right or wrong. Ethics contrast with morals: ethics are the behavioral expectations defined externally by cultures or countries, while morals are the internal values that define your personal integrity.

As technology pervades more and more areas of people's lives, it's worth reflecting on the ethical and moral issues you'll face in product management. Below, you will learn how to handle situations where a decision makes you uneasy about likely outcomes, or what to do if you are being pushed by other stakeholders to make decisions that you don't agree with on an ethical level.

Ethics and tech

From voting machines to technology used in surgery or by children, there are many topics where the intersection of ethics and technology can be tricky. Some of the most common ones are covered in this checkpoint, but this is by no means a comprehensive overview. Dedicated research groups are emerging, mostly in academia, to explore the complex questions created by the intersection of technology and ethics. Some examples of these institutions, which you may want to explore further, are the W. Maurice Young Centre for Applied Ethics, the Center for Humane Technology, and the Ethics and Emerging Sciences Group.

Operating in restrictive countries

As mentioned in the last checkpoint, some countries, like Russia and China, have laws that require companies operating there to disclose data about any user on demand. That can include turning over access and encryption keys. So, if your product operates in such countries, their governments can access all the information you hold about your users without your knowledge or consent. Your product might also be used in a country whose government you don't want to implicitly support; for example, if the government is oppressive or the country is torn by internal strife.

And these issues get even more complicated. You also need to think about how operating in one of these countries can affect users outside of it. If you store US user data in Russia, even briefly, it becomes subject to Russia's data localization laws. That means the Russian government or law enforcement can get access to your US users' data!
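
To make that risk concrete, here is a minimal sketch, in Python, of how a team might route each user's data to an approved storage region so it never lands in a jurisdiction whose disclosure laws it can't accept. The region names and the store_record() helper are illustrative assumptions, not a real cloud API.

```python
# Hypothetical sketch: route each user's data to an approved storage region
# so it never touches a jurisdiction whose disclosure laws you can't accept.
# Region names and store_record() are illustrative, not a real storage API.

APPROVED_REGIONS = {
    "US": ["us-east", "eu-west"],   # regions where US user data may live
    "DE": ["eu-west"],              # EU users stay in the EU
}

RESTRICTED_REGIONS = {"ru-central"}  # regions subject to on-demand disclosure laws


def choose_region(user_country: str) -> str:
    """Pick a storage region for a user, refusing restricted jurisdictions."""
    candidates = APPROVED_REGIONS.get(user_country, ["us-east"])
    for region in candidates:
        if region not in RESTRICTED_REGIONS:
            return region
    raise ValueError(f"No acceptable storage region for {user_country}")


def store_record(user_country: str, record: dict) -> None:
    region = choose_region(user_country)
    # In a real system, this would call your storage layer pinned to `region`.
    print(f"Storing record in {region}: {record}")


store_record("US", {"user_id": 42, "email": "user@example.com"})
```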

Data brokers

You're already familiar with third parties like Google Analytics or Segment that receive, share, and store data generated by your product. Some companies go further and outright sell data to other companies, and those companies resell it to other parties in turn. Why would they resell data? Because there's a market for it. Some companies want to buy lists of people and data about them so that they can sell them products. This data can include huge lists of email or mailing addresses, as well as demographic and psychographic data that allows for sophisticated segmented marketing. Where do you think all those credit card applications you receive in the mail come from?

Purchasing or sharing data is also the basis of advertising and ad targeting. If an advertiser wants to place their ad for camping equipment in front of the right person, they need to know a lot about that person. Knowing what websites they've visited, products they've bought, and searches they've run, among other indicators, tells the advertiser where their ad spend is most likely to yield results. Some companies are transparent about the information they've collected about you (for example, Google lets you review what it knows about you in your account's activity and ad settings), but not all companies are as forthcoming.
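
To give a rough sense of how much a single analytics event can reveal, here is a hedged sketch of the kind of payload a product might forward to a third-party analytics service. The field names and the send_to_analytics() helper are hypothetical, not any vendor's actual API.

```python
# Hypothetical sketch of a product analytics event. Every field below is data
# your product hands to a third party, and data like this is exactly what
# brokers buy, resell, and use for ad targeting.

event = {
    "user_id": "u_1234",
    "event": "Viewed Product",
    "properties": {
        "category": "camping",
        "price_usd": 129.99,
        "referrer": "search",
    },
    "context": {
        "ip": "203.0.113.7",   # reveals approximate location
        "device": "iPhone 13",
        "locale": "en-US",
    },
}


def send_to_analytics(payload: dict) -> None:
    """Stand-in for a third-party SDK call; in production, this data leaves your systems."""
    print(f"Forwarding to analytics vendor: {payload}")


send_to_analytics(event)
```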

Addictive tech

Many companies try to create habits in their users so they'll use their product on a daily basis. These products will hammer users with notifications about activity or create rewards for frequent use. This is especially easy to see in games and social media products. You've encountered some of these techniques when learning about the hooked loop in the checkpoint on increasing product growth. When these products are successful, people can get addicted to the product—to the rush of winning a game or the validation of others on social networks.

You might be facing similar pressure in your product role and asking yourself how you can create a product that people use every day. How do you increase the likelihood of users returning to your product again and again? Some users and critics see this as product designers deliberately building negative habits, ones that don't benefit users and can even harm them. You will have to consider whether your product is helping people and adding value to their lives, or hurting them with more frequent use.
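
One way teams try to balance engagement with user well-being is to cap how often re-engagement nudges can fire. The sketch below is purely illustrative; the cap value and helper names are assumptions, not a recommendation from this checkpoint.

```python
from datetime import datetime, timedelta

# Hypothetical frequency cap: send at most MAX_NUDGES_PER_DAY re-engagement
# notifications per user, so habit-building features don't tip into harassment.

MAX_NUDGES_PER_DAY = 2
_sent_log = {}  # user_id -> list of recent notification timestamps


def can_send_nudge(user_id: str, now=None) -> bool:
    """Return True if the user is still under the daily notification cap."""
    now = now or datetime.utcnow()
    recent = [t for t in _sent_log.get(user_id, []) if now - t < timedelta(days=1)]
    _sent_log[user_id] = recent
    return len(recent) < MAX_NUDGES_PER_DAY


def send_nudge(user_id: str, message: str) -> None:
    if not can_send_nudge(user_id):
        return  # respect the cap instead of hammering the user
    _sent_log[user_id].append(datetime.utcnow())
    print(f"Notify {user_id}: {message}")


send_nudge("u_1", "Your friends posted 3 new photos!")
send_nudge("u_1", "You're on a 6-day streak!")
send_nudge("u_1", "Come back for your daily reward!")  # silently dropped by the cap
```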

Children and tech

Many countries have laws governing how products can be marketed to and used by children. In the US, COPPA (the Children's Online Privacy Protection Act) explicitly governs when children must get parental approval to use a service. This law also restricts how data can be collected and used based on age. The law is intended to counteract the way certain products, like games, target children because of their susceptibility to offers that generate ad revenue or in-app purchases. This targeting can be seen as furtive or sneaky, since children are not as savvy about how companies try to hook them with products or about the fees those products might charge.
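
To show how a parental-approval requirement might surface in a product flow, here is a minimal, hedged sketch of an age gate. The under-13 threshold reflects COPPA's general cutoff, but the consent flow and function names are illustrative assumptions, not legal guidance.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA generally applies to children under 13


def age_in_years(birthdate: date, today: date) -> int:
    """Whole years between birthdate and today."""
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )


def can_create_account(birthdate: date, has_parental_consent: bool, today: date) -> bool:
    """Hypothetical gate: under-13 users need verified parental consent."""
    if age_in_years(birthdate, today) < COPPA_AGE_THRESHOLD:
        return has_parental_consent
    return True


signup_date = date(2024, 1, 1)
print(can_create_account(date(2015, 6, 1), has_parental_consent=False, today=signup_date))  # False
print(can_create_account(date(2015, 6, 1), has_parental_consent=True, today=signup_date))   # True
print(can_create_account(date(2000, 6, 1), has_parental_consent=False, today=signup_date))  # True
```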

Another distinct area of concern when making products for children is predators. If your product lets users communicate with other users, then you should be very concerned about adults trying to take advantage of children using your app. This applies to many general consumer products that a child could use, not only to child-oriented products. Take very special care to ensure that children are safe when using your product, especially if they can interact with other users.

Autonomous systems

You're probably familiar with how self-driving cars use detection and automation to navigate city streets. But autonomous systems go way beyond self-driving cars. All kinds of domains, from factories and medical devices to delivery services, farming, and aviation, use automated systems in their processes. Even the probes that landed on Mars were autonomous systems, programmed to land and navigate on their own without human intervention.

Automated systems come with inherent dangers, because any mistake can put people's lives at risk: automated cars hitting pedestrians, airplanes crashing, or worse. These aren't hypothetical risks. They have really happened, and they'll become bigger risks as humans depend more and more on technology and automation for daily tasks.

Content moderation

Any site that lets users create and share content should also plan for how it will moderate the content published on the service. Stories about "fake news," online harassment, and toxic content appear in the news every day. It's up to the companies running those services to moderate the content by setting policies about what content is allowed, creating systems for detecting and removing bad content, enabling users to report inappropriate content, and establishing a protocol for how such reports are handled.
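
As a sketch of what a report-handling protocol can look like in practice, here is a simplified, hypothetical flow. The report threshold, statuses, and function names are assumptions for illustration, not any platform's actual policy.

```python
from dataclasses import dataclass

AUTO_HIDE_THRESHOLD = 5  # hypothetical: queue content for review after this many reports


@dataclass
class Content:
    content_id: str
    body: str
    report_count: int = 0
    status: str = "visible"  # visible | hidden_pending_review | removed


def report_content(content: Content, reason: str) -> None:
    """Record a user report and queue heavily reported content for human review."""
    content.report_count += 1
    print(f"Report on {content.content_id}: {reason}")
    if content.report_count >= AUTO_HIDE_THRESHOLD and content.status == "visible":
        content.status = "hidden_pending_review"


def moderator_decision(content: Content, violates_policy: bool) -> None:
    """A human moderator applies the written content policy."""
    content.status = "removed" if violates_policy else "visible"


post = Content("c_001", "example post")
for _ in range(AUTO_HIDE_THRESHOLD):
    report_content(post, reason="harassment")
moderator_decision(post, violates_policy=True)
print(post.status)  # removed
```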

Companies have to balance complaints from users with the business goals the company is trying to achieve. As a result, YouTube might leave up content that communicates false information, since stepping in as a fact-checker and removing content could negatively impact their revenue or clash with their mission statement "to give everyone a voice and show them the world." When a company decides to moderate content, they must also consider the impact on their employees, as moderators at large companies experience trauma due to the extended exposure to violent content. It's a difficult balance to strike. Companies struggle daily to balance defensible policies, freedom of expression, healthy moderators, and their business interests.

Dealing with ethical choices

Ethical decisions can be extremely difficult to handle. You might be pressured to allow content or functionality in your product that you find objectionable, whether because your boss disagrees with your assessment or because the more ethical choice would have severe negative consequences for your company's bottom line.

In these situations, discuss the problem with others—both those inside your company and outside of it, if the details are okay to share and you have professional colleagues or mentors you trust to advise you. Usually, you'll find that you're not alone in having objections or concerns. Having those discussions will help you come up with new ways to frame your problem, put things into perspective, or inform your decision about what to do next.

When you're ready to make a decision, a good framework to consider is Exit, Voice, and Loyalty, from Albert Hirschman's book of the same name. The basic idea is that when problems occur, you have to make one of three choices. Say your company is going to expand to Russia, and you strongly disapprove of that move because it is likely to put your users' data privacy at risk. You'll need to make one of the following choices:

  • Exit. Quitting your job might not be the best option, but choosing to exit can also include changing projects and focusing on something that doesn't create a moral quandary for you.

  • Voice. Speak up. Make clear arguments for why you think this is a bad idea. Demonstrate the potential costs in things like user trust and market valuation. Object to the decision with your boss, executives, and others. But be careful; being adamant about your view could put you at risk of losing your job, so be prepared to accept the consequences.

  • Loyalty. Buy in. Say yes. One way to "just do it" is to think about the benefits of going with the decision, like giving new users access to your product and adding to your company's financial stability. If the benefits outweigh the costs, it could be worth putting aside your reservations.

While this framework breaks your choices down to three options, in practice these decisions are more complicated and hard to make. Don't feel bad about struggling. Society in general is still navigating the ramifications of technology. If you're faced with an ethical dilemma, work hard to clarify your perspective and explain your reservations. That's the starting point to ensuring that the right questions are being asked and to helping you navigate challenging situations thoughtfully.

And don't forget that as a product manager, you are also a team leader. Even if you come to terms with a company decision that has ethical ramifications, not everyone on your team will necessarily feel the same way. Be prepared to explain your ethical decision-making, and empathetically listen to and work through any ethical reservations your team members might bring up.