Watch Out: The Privacy Police Are Coming


Consumers and businesses are closely watching the media and political debates surrounding privacy—both what consumers are entitled to expect and how regulatory oversight of companies' collection and use of consumer data should work. However, these discussions have not made their way into most boardrooms in America.

With more and more regulatory agencies at every level leaping into the fray—from the EU's General Data Protection Regulation (GDPR) to California's Consumer Privacy Act—that must change. It is imperative that board members educate themselves on the future of A.I. and the ethical issues surrounding this rapidly advancing technology. Boards must equip themselves to have thoughtful discussions on how the company can adopt new tools responsibly, in ways that maximize the benefits for consumers and shareholders—before regulators force the issue in ways likely to stifle innovation and growth.

The gauntlet has already been thrown down. On April 10, two U.S. senators introduced the Algorithmic Accountability Act of 2019. If passed, the act would require large entities that use, store or share personal information to conduct impact assessments of their automated decision systems and data protection practices. Companies would be required to evaluate their algorithms and, if a report flags a risk of discrimination, privacy problems or other issues, to address those issues in a timely fashion.

These requirements pose myriad existential questions that must be carefully considered: Who will define what constitutes a privacy violation, a misuse of customer information or a discriminatory filter in an algorithm? Will board members be responsible for the outcomes of the company's use of A.I., machine learning, data and algorithms?

Would that mean the board must take on the role of company watchdog to ensure these higher standards are met? Would it mean that a director's duty of loyalty must extend, on an equal footing, to shareholders, the community and employees, along with an affirmative obligation to do good equally across all these constituencies? If so, this would push the corporation into the role of public policymaker and government.

An avalanche of new regulations that have not been stress-tested could be detrimental to innovation, the economy and the citizens who benefit greatly from emerging technologies.

Today, in the U.S., we have clarity about to whom directors owe their duty of loyalty. As a board member, my duty lies with the shareholders; it is unambiguous.

That’s not to say that corporations should not bear responsibility for the impact of their actions. Of course they should not be bad actors, and they should strive to support their communities. But as a board member, I have a duty of loyalty first and foremost to the shareholders. I must weigh the short-term economic impact of a decision against the long-term viability of the enterprise and the value of the brand, and make a business judgment. In no uncertain terms, the board’s and the corporation’s first priority is the health of the enterprise.

If a regulation were to pass that dramatically shifted a company’s priorities, it could force the company to redistribute profits into a community rather than reinvest in its core business to drive innovation and bring new products to market.

It is the responsibility of a company to decide whether investing in the “social good” will be part of its overall mission. Consumers are then free to decide whether to support that company.

I believe this market-based approach works. Most people in corporate America, and in its boardrooms, are well aware that it is critically important not only to make a good product and be profitable but also to be a good corporate citizen. There are now numerous discussions about a company’s mission and purpose, the values it should live up to and how its decisions will resonate with stakeholders and customers.

Consider that $1 of every $4 under professional investment management is invested with consideration of environmental, social and governance (ESG) criteria—a clear indication that the market already values and prioritizes socially responsible companies, without the interference and nudging of additional regulation.

A hastily enacted regulation that limits innovation and technological advancements and imposes a broader set of requirements on companies and board members is not the solution to the potential ethical issues surrounding A.I.
