Looking to The Future
Those who say their company will survive or even thrive as a result of digital trends attribute that strength to their own actions and digital capabilities, while those who say their company will weaken or die blame market forces.
Respondents to our survey mention a number of social and ethical concerns brought about by digital innovation. When asked about their biggest concerns, apart from privacy, they most often cite cybersecurity or digital crime, job replacement, and the unethical use of data.
A commitment to greater flexibility, in the service of innovation, brings with it the need for sturdy ethical guardrails around increased employee autonomy. If the benefit of loose coupling is greater agility, its drawback is a loss of control. Executives and managers, therefore, must strive to foresee risks and equip employees so that they know how to respond — or at least know to slow down and seek help — when ethical questions arise, as they surely will.
That might explain why digitally maturing companies are more likely to have adopted policies to support their organisations’ ethical standards with regard to digital initiatives. This year’s survey found that 76% of them had such policies in place, compared with 62% of developing companies and 43% of early-stage ones.
Policies Are Not Enough
Policies are often outdated or insufficient; leaders need to spend more time considering and communicating the societal impact of their organisation’s digital initiatives.
(Percentage of respondents who agree or strongly agree)
A common mistake managers make vis-à-vis digital ethics is assuming that their companies’ legacy policies are adequate. After all, nearly every company, once it has grown beyond the start-up stage, has some sort of employee handbook that at least begins to spell out expectations regarding proper and improper employee behaviour. More mature organisations might have even taken the time to compose an overarching values statement and craft ethics policies.
But the mere existence of guidelines doesn’t guarantee that those guidelines are up to the task of steering digital innovators through the ethical dilemmas they might face, says Michael Santoro, a management professor at Santa Clara University’s Leavey School of Business. Digital ethics is one of Santoro’s fields of expertise, and he’s often asked to consult with companies on their ethical standards.
He has found what he calls “a very serious, systemic hardware problem.” Often, a company’s code of conduct will have been “written 10, 15, or 20 years ago — even for tech companies — and hasn’t been revisited since.” At times, that code will be more remarkable for what’s missing than for what’s addressed. Santoro says he’ll often notice “a lack of a statement in the code of conduct about all of the business areas that a company is working in, a lack of board responsibility for any of the principles that the company has vowed to uphold, and a lack of channels to report up into the board.”
His advice for digital innovators thus begins with the suggestion that they consider their hardware needs from the beginning, instead of waiting until trouble arrives. Ethical considerations should be part of product design “so that you’re designing your product with a consciousness about the impact that it’s going to have on society,” he says.
The sort of approach Santoro recommends has been adopted by the identity verification and fraud protection company Socure as it has built out its product suite. The financial services companies that make up Socure’s target market have robust risk systems in place to identify biases in how they offer their products to the public, says Johnny Ayers, the company’s co-founder and senior vice president. (Lending bias is closely regulated, notably by Regulation B, which implements the Equal Credit Opportunity Act and is enforced by the Consumer Financial Protection Bureau.)
These companies are extraordinarily careful to ensure that their credit-reviewing and credit-granting models don’t contain biases related to “age and gender and race and socioeconomic status,” Ayers says. As a result, Socure, whose team members are all consumers themselves, had to be equally mindful of these issues as it created its services. Ayers adds: “Even when we were only 10 people, we were building a lot of very specific controls into how we build and train models, knowing that, when you sit down with any number of the major credit issuers, their expectation is that you have stress-tested any of your models that you’re proposing to ensure that none of the aforementioned biases are implicit in your models.”
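The kind of model stress test Ayers describes can be sketched in simplified form as a demographic-parity check: compare a model’s approval rates across demographic groups and flag any large gap. The function names, group labels, and threshold below are illustrative assumptions, not Socure’s actual process.

```python
# Minimal sketch of a demographic-parity stress test for a decision model.
# "decisions" pairs each applicant's group with the model's approve/deny output.

def approval_rates(decisions):
    """Map each group to its approval rate, given (group, approved) pairs."""
    totals, approvals = {}, {}
    for group, approved in decisions:
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def parity_gap(decisions):
    """Largest difference in approval rate between any two groups."""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values())

# Hypothetical model outputs for two groups of applicants.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

# group_a is approved 3/4 of the time, group_b 2/4: a gap of 0.25.
if parity_gap(decisions) > 0.05:  # threshold is an arbitrary assumption
    print("Warning: approval rates diverge across groups; review the model.")
```

In practice, regulated lenders apply far more rigorous statistical tests, but the underlying question is the one this sketch poses: does the model treat comparable applicants from different groups comparably?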
Santoro sees the potential for applying financial industry thinking to other fields. When he’s called in to help a company solve a problem, he says, “what I’m usually doing is designing something that looks like Sarbanes-Oxley,” the 2002 law enacted in the wake of financial frauds that governs corporate record keeping and disclosure.