Thursday, 5 March 2020

Overlooked in the digital transformation: Ethical, Societal, Legal, Regulatory issues

A friend and I were talking recently about the dramatic change that digital transformation will create.  The potential impact on our lives is tremendous, from autonomous vehicles that whisk us to and from work, to drones that deliver goods directly to our homes or other locations, to the insights generated from data harvested by IoT devices. We can expect significant upgrades in customer experience and business models from the use of data massaged by artificial intelligence and machine learning. 

Of course, on the flip side, there are concerns. Governments and private companies are gathering more data about individuals than ever before.  What will they do with that data?  Who decides who owns your image, and what your rights are to privacy?  What will Apple, Google, Facebook and others do with all the data they collect about your activities and selections online? 

We've reached a point where technology and what it can do have easily outpaced consideration of the consequences and the impact on our ethical and regulatory frameworks.  This isn't the first time this has happened, however.  Recall that just a few years ago a scientist edited the genes of human embryos, causing an uproar in the scientific community.  The fact that we can do something because of advances in science does not mean that everyone is prepared for, or comfortable with, the ability to do it.

In the past, non-technical factors - ethical, moral, legal, societal, regulatory - slowed the advancement of science.  If we look back only a few centuries, we can see that ethical, religious and moral issues resisted science and what it could do or tell us.  From Galileo challenging the church's geocentric model of the universe to doctors desecrating graves to obtain human cadavers for dissection, science has often been held back by societal, regulatory or ethical concerns.
 

Today, the opposite is true

Today, science and technology move so quickly that ethical, moral, societal, legal and regulatory frameworks struggle to understand what is happening, much less keep up with the changes.  Airbnb can enter a city and dramatically change the housing landscape before the city government forms an opinion on short-term leases.  Facial recognition programs tied to cameras in public places can capture millions of images, and those images can be used to infer the attitudes and potential behavior of citizens.  The speed of implementation has shifted: science and technology now move faster than people, societies and beliefs.  Our ways of living, the rules we agree to, and our expectations about security and privacy are no longer aligned with the new technology.

Confounding the new technology is the fact that society in general, our politicians, and the laws, regulations and administrative rules they create won't change quickly, and this lag will delay the promise of many emerging technologies.

Pushing more than one rock uphill

The use of autonomous vehicles and drones provides an excellent example.  The adoption rate of these devices is NOT based on technology.  Autonomous vehicles are arguably already at least as safe as human drivers, but that does not mean we'll see rapid adoption.  The barriers that exist are important and diverse.  They include:

 - Multiple jurisdictions.  Just because California likes and supports AVs does not mean that Arizona or Nevada will.  This could even be true between local jurisdictions or counties.  Who will risk buying a car that can be used in only certain locations?
 - Insurance.  Until the insurance industry can determine how to price the risk of autonomous vehicles, it will be challenging to get many autonomous vehicles on the road.  More importantly, who bears the risk?  Does the passenger in an AV carry the liability for an accident, or the company that is technically in control of the vehicle?
 - Blended traffic.  The perfect world for AVs is when every car is an AV, because they will be more consistent and predictable.  As long as AVs and humans are sharing the road, the level of danger and unpredictability goes up dramatically, meaning that more accidents are likely, which will probably be blamed on the AVs.
 - Standards.  While it is good to have competition in AV technology, we will probably need a unified set of standards so that cars and the devices that control them all work with and on a set of agreed standards. 

Thus, the full scale implementation of a technology that is already reasonably mature will not depend on the technology, but on the legal, jurisdictional, administrative and societal acceptance.  Who is doing the work to prepare the population, revise the laws, change expectations?

Why drones are an even more interesting challenge

Keeping the challenges of the autonomous vehicle in mind, let's make the problem slightly more difficult, and three-dimensional, by considering the challenge of building a drone business.  We take the complexity of the AV and add the issue of flying a large object overhead, where risks are greatly increased and where even more regulatory bodies are involved (the FAA, for one).  If you are trying to build a business in this sector, you are facing a problem that even Sisyphus would find difficult: pushing several different rocks uphill at the same time.  Two points in that sentence are critical: several rocks, and simultaneous advancement.

Several rocks: to win in this space you have to 1) demonstrate that the technology works and provides benefits over existing solutions, 2) convince local, state and federal authorities that the benefits are worth the risks and that laws and regulations should change, 3) convince people within the industry that the new solution is worth adopting, while keeping your end customer or consumer from turning against the technology, and 4) demonstrate to the consumer that the value of the new technology outweighs the cost.

Simultaneous advancement: to win with these very complex technologies, you'll need to do all of these things more or less simultaneously.  This is the definition of a "wicked" problem: one with many participants and constituents, whose interests often conflict.  You don't want to gin up too much excitement among consumers and then be unable to demonstrate that the technology works or adds value.  You don't want to over-invest in a technology only to find that regulators are unwilling to change the laws to accommodate it.

Many public implementations of digital technologies - especially those that interact with the public, like robots, AVs and drones - need to consider the ethical, moral, societal and regulatory challenges.  While Asimov may have created the Three Laws of Robotics, his stories don't consider how the population reacted to the advancement of robots, or how people were compensated for the loss of their jobs.  While AVs, drones and robots are risky because they could interact with humans, other digital technologies like AI, facial recognition, natural language processing and IoT are also interesting and potentially problematic, because they could lead to a loss of privacy and security.

How do we prepare the population for the advent of digital technologies?

What we need is more thinking and more investment in the secondary and tertiary impacts of digital transformation - what does it mean that governments have more of our data and images?  How should they use them?  What could it mean that robots and other digital transformation eliminate jobs?  What risks are we willing to accept to live with and among AVs and drones, and what are unacceptable risks?  How do we condition people to the fact that technology will become more prevalent and more overt in their lives?

As we've seen, in the not-so-distant past, societal norms, religious authorities and governments had significant control over the pace and impact of new scientific advancement.  Queen Elizabeth I once refused a patent for a mechanical knitting machine because she was concerned that her subjects would lose their jobs.  Today, the reverse is true: we base our hopes, companies and futures on rapidly emerging science and technologies, often with little understanding of how much change these technologies unleash, how unprepared the population is for the actual impacts, and how existing laws and regulations may limit the value or use of the technology.

There's an opportunity in here somewhere for someone to do some serious thinking: to bring together different constituencies to provide a pathway for more information, more analysis of the impact of digital transformation, more sense of the changes necessary in legal and regulatory frameworks, and more understanding of the risks to income, privacy and security.  Who is doing this work?

The best way to address this is to bring people from different disciplines together in one team or organization:

 - technologists who understand what the new tools can do
 - people from the political realm, who can change laws or create new ones
 - people who control funding mechanisms
 - educators, because we need to teach both young people and older people about the possibilities and impacts of new technologies
 - people who focus on privacy and ethics
 - sociologists
 - and, of course, experts in the law

It would work best if we could create integrated information about these topics, with these diverse perspectives, because the technologies will have impacts that cross all of these functions - and probably more.