Back in the 1980s, shows like Transformers and Knight Rider shaped and invigorated the imaginations of young kids by pairing well-known cars with computerized, autonomous, and intelligent personalities. The cars drove themselves, both literally and in personality.
Today that story, the one that invigorated the imaginations of these lawyers, is to a large extent no longer fiction but lived reality. Self-driving cars are no longer just a thing of the future; they are quickly driving themselves into the present.
Wicked! Radical! Right? Unfortunately, we have to burst your bubble: as with any up-and-coming technology, there are significant privacy and insurance implications that have to be addressed. Fortunately, we at Siskinds are here to guide you through these potential issues.
Most car manufacturers have their own self-driving artificial intelligence
Car manufacturers throw around many buzzwords: Autopilot, Super Cruise, Sensing; the list goes on. But what do these buzzwords actually mean? Quite simply, each one describes a set of interconnected hardware and software systems that automate some driving-related task in your car. To date, however, no system has successfully automated all driving-related tasks.
SAE International, a transportation standards organization, has developed a standard that creates a classification system for autonomous vehicles. The standard categorizes automated cars by the extent of the driving-related tasks performed by the computer. This classification has been adopted both by Ontario in a regulation (which we discuss in greater detail below) and by the US National Highway Traffic Safety Administration, which incorporated it into the Federal Automated Vehicles Policy issued in September 2016.
The standard has six levels, counting Level 0 (which you can also conveniently see in chart form by clicking here):
Level 0 – No Automation: all aspects of driving are performed by a human driver.
Level 1 – Driver Assistance: the car’s automation system can control either the speed or the steering of the car, but the driver can disengage the system at any time. An example is a car with adaptive cruise control (i.e. one that brakes when it senses it is too close to another car, or speeds up when traffic allows).
Level 2 – Partial Automation: the car’s automation system can control both the speed and the steering of the car, but the driver can disengage the system at any time.
Level 3 – Conditional Automation: the car can drive itself without active human supervision, but its fallback is still the human driver. If the car encounters a situation it doesn’t understand, or if its autonomous system fails, it will notify the driver and, after a short period of time, disengage. The driver must therefore be prepared to take over. Importantly, the driver can disengage the system at any time.
Level 4 – High Automation: the car can drive itself, but only in certain circumstances, such as city driving or highway driving. Outside those circumstances (e.g. bad weather on a country road), the human driver must drive. Significantly, if the human driver wishes to take back control of the car, the car may decide to delay handing it over.
Level 5 – Full Automation: the car can drive itself under all circumstances and can delay a driver’s request to disengage the automation system.
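For the technically inclined, the classification above can be summarized as a simple lookup table. This is only an illustrative sketch: the one-line summaries paraphrase this article's descriptions, and the `is_automated_vehicle` helper reflects how Ontario's regulation (discussed below) treats Levels 3 to 5; neither the field layout nor the helper is part of the SAE standard itself.

```python
# Illustrative sketch only: the summaries below paraphrase this article's
# descriptions of the SAE levels; they are not quotations from the standard.
SAE_LEVELS = {
    0: ("No Automation", "a human performs all driving tasks"),
    1: ("Driver Assistance", "system controls speed OR steering; driver can disengage at any time"),
    2: ("Partial Automation", "system controls speed AND steering; driver can disengage at any time"),
    3: ("Conditional Automation", "car drives itself; the human driver is the fallback"),
    4: ("High Automation", "car drives itself in limited settings; it may delay handing back control"),
    5: ("Full Automation", "car drives itself everywhere; it may delay handing back control"),
}

def is_automated_vehicle(level: int) -> bool:
    # Ontario's pilot regulation treats Levels 3, 4, and 5 as "automated vehicles".
    return level in (3, 4, 5)
```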
Can you guess the highest level we have achieved at the mass-production stage? Give yourself a moment to guess before reading on.
On November 11, 2020, the Japanese Ministry of Land, Infrastructure, Transport and Tourism gave Honda clearance to mass produce Level 3 vehicles for consumers. Click here to see Honda’s press release.
Japan won’t be the last jurisdiction to certify Level 3 vehicles. In fact, Ontario currently has a pilot project working on automated vehicles. The Pilot Project is creatively named “O Reg 306/15: Pilot Project – Automated Vehicles” (the “Regulation”) and is passed through the Minister’s Authority derived from the Highway Traffic Act, RSO 1990, c H8.
Ontario’s Pilot Project allows the testing of automated vehicles
The Regulation defines “automated vehicle” as one that “operates at driving automation Level 3, 4, or 5.” It then defines “level” by reference to the SAE standard cited above. Importantly, the pilot project does not apply to vehicles at Level 0, 1, or 2 (or to certain Level 3 vehicles). The Regulation also prohibits driving an automated vehicle (i.e. certain Level 3 cars, and all Level 4 and 5 cars) on any public road without the “Registrar’s” approval.
The Regulation also contains liability and insurance provisions. In terms of liability, if the autonomous car gets into an accident, the Regulation ascribes liability to its owner or lessee. As for the insurance provisions, liability insurance must be maintained at a minimum of $5,000,000 or $8,000,000, depending on the seating capacity of the autonomous vehicle.
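As a back-of-the-envelope sketch, the Regulation's two-tier insurance minimum can be expressed as a single rule. Note that the seating-capacity cut-off below is a hypothetical placeholder for illustration, not the actual threshold; consult O Reg 306/15 itself for the real figure.

```python
def minimum_liability_insurance(seating_capacity: int, cutoff: int = 8) -> int:
    """Return the minimum liability coverage, in dollars.

    The `cutoff` seating capacity is a hypothetical placeholder;
    the Regulation itself sets the actual threshold.
    """
    return 5_000_000 if seating_capacity < cutoff else 8_000_000
```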
However, s 18 of the Regulation throws us a curveball: it provides that “[t]his Regulation is revoked on the tenth anniversary of the day it comes into force.” The tenth anniversary, on a cursory calculation, is sometime in October of 2025. The question is, what happens after?
Liability Insurance Implications
While we’re on the theme of overproduced, under-thought-out movies, does anyone remember Total Recall? If so, picture JohnnyCab, the self-driving taxi. For those who haven’t watched Total Recall, just imagine a self-driving taxi with a fake driver that talks to you and a big Austrian Oak of a bodybuilder sitting in the back.
Now, think of the following scenario: say you’re JohnnyCab’s owner, and while your autonomous cab is transporting the big Austrian bodybuilder, it veers off course, hits a wall, and seriously injures him in the back of the car. Who is responsible?
Currently, the Regulation, as noted above, ascribes liability to the owner or lessee of the automated vehicle—therefore you. But once it is revoked, would you still be liable?
For example, imagine further that forensic crash investigators determine that the cab’s sensors were defective. Perhaps you would argue that the manufacturer should be liable? But instead, let’s hypothesize that some of the sensors are vision-based, and those sensors had dirt on them. Would the owner be responsible for walking around the car and checking the sensors immediately before operating it? Maybe you would say that is too burdensome and unrealistic to expect. But what if the manufacturer expressly told you to do so when you bought the vehicle?
Many of these questions are novel, but they may become big insurance issues in the years ahead. But what about the privacy implications mentioned above?
If you (total-ly) recall, we wrote an article about the movement to give consumers the right to access their vehicle’s telematics data. As a reminder, telematics data is all the information that the vehicle collects and transmits wirelessly to someone else. For example, if your odometer passed 35,000 km, your car could hypothetically send this “milestone” to your dealer, who would then text you to book your next maintenance appointment.
The more technologically sophisticated cars become, the more data they will produce. Autonomous vehicles are no exception. Autonomous cars will (obviously) drive you places and thus necessarily have information on where they are taking you and which routes they are using. This data could then be processed to determine, for example, how long you stayed at each place. One could argue that our phones already do this, but at least people have the option of simply leaving their phones at home.
The next question is: where will this information be stored? Will it be stored on your manufacturer’s servers, and thereby be accessible over the internet, or will it be stored locally in your car and never transmitted? If the former, will car manufacturers sell us cheaper cars and, in return, take our information and sell it to marketing companies to better target us with ads? Again, this is a new technology that is still being developed, and its use has not been clearly defined, but we need to be careful.
The point is, as cars become more autonomous, they will store more and more data, which, although potentially beneficial to us, will come with consequences. For example, as we noted in our article about the right to repair our cars, the more information we put online, the more susceptible we are to data breaches.
How does PIPEDA apply to all of this?
Imagine you decide to start a taxi business on the eve of St. Paddy’s Day. While you’re out partaking in the celebrations, you figure you can make a few quick bucks by having your self-driving car take locals to the pub. Unsurprisingly, your car picks up a few excited students, and one of them spills a drink in your car. You would probably want to know who they are so you can have them pay to clean your car. To avoid chasing this college student yourself, you may think it’s a good idea to request their name, address, and credit card information before letting them into your self-driving car.
In Canada, PIPEDA (along with the relevant PCI requirements) would regulate your taxi business, because it is for-profit and you are collecting people’s personal information. If you need a quick refresher on how PIPEDA works, please check out our previous post about how a new Privacy Commissioner may be coming to town.
What are the legal ramifications for you if PIPEDA applies? Well, imagine someone hacks into your car and steals all of your customers’ data. Unfortunately, your regular car insurance likely won’t help you. To make matters worse, you would also have to notify all the impacted customers, who may sue you. Further, you may have to fend off the Privacy Commissioner.
Lastly, consider the cybersecurity issues
Just as your personal computer can be affected by viruses, so can your car. Hypothetically, a hacker could find a way to remotely control your future autonomous car. Scary, right? That is why the cars of the present and future need strong security safeguards. We can also help you ensure that your cybersecurity practices comply with industry standards.
Consumers, manufacturers, and businesses need to better understand consumer privacy rights, their data management responsibilities, and upcoming technological trends. They should also consider what safeguards to put in place to better mitigate their cyber-risk. These may include cyber-insurance, data protection agreements, privacy policies, appropriate cybersecurity systems, and online terms and conditions.
Should you have any questions or comments, you can reach out to the authors Michael Weinberger, Ontario attorney specializing in business and privacy law, at [email protected] or Savvas Daginis, Ontario student-at-law and Illinois attorney, at [email protected].
This article was written in collaboration with lead co-author Savvas Daginis, student-at-law.