{"id":28792,"date":"2020-01-07T08:45:37","date_gmt":"2020-01-07T14:45:37","guid":{"rendered":"http:\/\/law.marquette.edu\/facultyblog\/?p=28792"},"modified":"2020-01-07T08:47:07","modified_gmt":"2020-01-07T14:47:07","slug":"autonomous-vehicle-malfunctions-may-not-be-so-complicated-after-all","status":"publish","type":"post","link":"https:\/\/law.marquette.edu\/facultyblog\/2020\/01\/autonomous-vehicle-malfunctions-may-not-be-so-complicated-after-all\/","title":{"rendered":"Autonomous Vehicle Malfunctions May Not Be So Complicated After All"},"content":{"rendered":"<p><em><img loading=\"lazy\" decoding=\"async\" class=\"alignleft size-full wp-image-28797\" src=\"http:\/\/law.marquette.edu\/facultyblog\/wp-content\/uploads\/2020\/01\/ntsb_investigation-e1578408249988.jpg\" alt=\"\" width=\"300\" height=\"300\" \/>NTSB\u2019s Final Report on Pedestrian Fatality Involving an Uber AV Highlights Obvious Programming Missteps<\/em><\/p>\n<p>On a dark street in Tempe, Arizona, just before 10 p.m. on March 18, 2018, an Uber vehicle being tested in autonomous mode hit and killed a pedestrian.\u00a0 This was the first pedestrian fatality involving an autonomous vehicle, and it triggered a media firestorm that caused Uber to suspend its autonomous vehicle program for nine months as it worked with the NTSB to understand the causes of the crash.\u00a0 With the adoption by the NTSB of its <a href=\"https:\/\/dms.ntsb.gov\/pubdms\/search\/document.cfm?docID=479021&amp;docketID=62978&amp;mkey=96894\">final report on the crash<\/a> on November 19, that work is now complete.<\/p>\n<p>The NTSB\u2019s final report paints a vivid picture of programming and human missteps that belies the argument commonly advanced in legal scholarship about AV liability &#8212; that crashes involving AVs will be impossible for the judges, juries, and doctrines that make up our current system of tort law to \u201cunderstand.\u201d\u00a0 Indeed, the errors that led to the crash were all too simple. 
<!--more--><\/p>\n<p>Each of the autonomous vehicle fatalities that have occurred so far has involved several overlapping missteps, and the Uber crash was no different. It did not take long for investigators to discover that the pedestrian, Elaine Herzberg, was high on methamphetamine and marijuana when she jaywalked across an unlit portion of the road.\u00a0 She did not react in any way to the approaching car. (NTSB, <em>Human Performance Group Chairman\u2019s Factual Report<\/em>, HWY18MH010, Nov. 5, 2019, at 16.)<\/p>\n<p>The Uber\u2019s safety operator, meanwhile, was plainly not paying attention to the road and did not see Herzberg until a fraction of a second before the crash; <a href=\"https:\/\/www.azcentral.com\/story\/money\/business\/tech\/2019\/11\/19\/driver-fatal-arizona-uber-crash-mostly-blame-ntsb-report-finds\/4232936002\/\">police investigators determined<\/a> that she had been watching <em>The Voice <\/em>on her phone.<a href=\"#_ftn1\" name=\"_ftnref1\">[1]<\/a> Much of the NTSB\u2019s final report focuses on this flagrant lapse and the underlying corporate culture that helped make it possible.<a href=\"#_ftn2\" name=\"_ftnref2\">[2]<\/a><\/p>\n<p>But the performance of the algorithm itself initially presented something of a puzzle.\u00a0 The car\u2019s various sensors detected Herzberg five and a half seconds before the crash (plenty of time to slow down), but the algorithm struggled to decide what exactly it was seeing as it sped toward the unknown object, and did not appreciate the need to slow down or change course until a second before impact, by which time it was too late.<\/p>\n<p>Many scholars and commentators have argued that applying traditional principles of tort law to accidents involving autonomous vehicles is undesirable or impractical, in part because algorithms will be impossible for regular people like judges and jurors to understand.<a href=\"#_ftn3\" name=\"_ftnref3\">[3]<\/a>\u00a0 Algorithms, after all, are not programmed 
as a series of if-then instructions.\u00a0 Instead they are, broadly speaking, told to pursue goals within a set of constraints, allowing them to \u201clearn\u201d how to accomplish tasks by trial and error.\u00a0 They thus display \u201cemergent behavior,\u201d acting in ways their human programmers neither instructed nor, in some cases, even anticipated.\u00a0 The Uber\u2019s inscrutable indecision as it hurtled towards Herzberg might at first have been taken to support the argument that tracing algorithmic flaws back to some sort of human fault or negligence is a fool\u2019s errand.<\/p>\n<p>The NTSB\u2019s final report on the crash seriously undermines this argument, at least as it applies to this particular incident.\u00a0 Instead of an erratic algorithm whose behavior can\u2019t be explained after the fact, the NTSB highlighted specific, easily understandable and, in retrospect, obviously negligent programming decisions made by ordinary humans.<\/p>\n<p>Most glaringly, the report notes that the algorithm was not programmed to understand that people sometimes jaywalk.\u00a0 The system was programmed to classify objects it detected in the road and to then make assumptions about those objects\u2019 future positions based on both its record of past positions and inferences about how objects of a given type usually behave.\u00a0 Pedestrians detected in crosswalks were, sensibly enough, assigned the goal of crossing the street.\u00a0 But pedestrians in the road anywhere other than a crosswalk were not assigned a goal, meaning that the system could only guess where they might be headed based on its own observations of their past positions.\u00a0 As the NTSB put it, \u201cthe system design did not include a consideration for jaywalking pedestrians.\u201d<a href=\"#_ftn4\" name=\"_ftnref4\">[4]<\/a><\/p>\n<p>Even without a goal, the system could predict an object\u2019s future path based on past observations of its position.\u00a0 Unfortunately, this process too contained 
glaring flaws.\u00a0 If the system changed its classification of an object, its data about the object\u2019s prior position did not carry over.\u00a0 In other words, each time the system changed its mind about whether Herzberg was a bicycle, an object, or a vehicle, it essentially \u201cforgot\u201d all it had learned about where she had been and where she was going, and started observing her position and direction of travel with a clean slate.<a href=\"#_ftn5\" name=\"_ftnref5\">[5]<\/a><\/p>\n<p>Ironically, NTSB investigators found that the relatively mundane driver assistance systems with which the vehicle, a Volvo XC90, had come factory-equipped would have prevented the accident.<a href=\"#_ftn6\" name=\"_ftnref6\">[6]<\/a>\u00a0 These systems included forward collision warning, which alerts the driver of the need to brake to avoid a collision, and automatic emergency braking, which stops the car if the driver does not respond to the warning.\u00a0 Such features are fairly commonplace on new cars these days.<\/p>\n<p><a href=\"https:\/\/www.degruyter.com\/view\/j\/jtl.2019.12.issue-2\/jtl-2019-0029\/jtl-2019-0029.xml\">In an article published in November in the <em>Journal of Tort Law<\/em><\/a>, I argued that fears about the tort system\u2019s inability to make sense of collisions involving autonomous vehicles were overblown, and that existing doctrines are capable of resolving such disputes.<a href=\"#_ftn7\" name=\"_ftnref7\">[7]<\/a>\u00a0 The new details released by the NTSB support that view, as they highlight several relatively simple and (in hindsight) glaring programming errors that caused the AV to barrel into Herzberg at 39 mph despite having detected her more than five seconds earlier.<\/p>\n<p>Whether these facts represent negligence or a defective design, they suggest that the autonomous vehicles that may one day shuttle us around will, like most things created by humans, sometimes malfunction.\u00a0 Whether these malfunctions will someday involve emergent behavior too 
esoteric for us mortals to understand is an open question.\u00a0 So far, the answer appears to be no.<\/p>\n<p><a href=\"#_ftnref1\" name=\"_ftn1\">[1]<\/a> Local prosecutors have not yet ruled out the possibility of criminal charges being brought against the vehicle\u2019s operator.<\/p>\n<p><a href=\"#_ftnref2\" name=\"_ftn2\">[2]<\/a> Uber removed backup operators from its cars and paid insufficient attention, the NTSB noted, to \u201cautomation complacency,\u201d a widely known phenomenon that results from humans\u2019 poor performance on tasks requiring \u201cpassive vigilance.\u201d For example, investigators determined that the safety operator had made the exact same trip 73 times without incident.<\/p>\n<p><a href=\"#_ftnref3\" name=\"_ftn3\">[3]<\/a> <em>See, e.g.,<\/em> Kenneth S. Abraham &amp; Robert L. Rabin, <em>Automated Vehicles and Manufacturer Responsibility for Accidents: A New Legal Regime for a New Era<\/em>, 105 Va. L. Rev. 127, 144 (2019); Bryan H. Choi, <em>Crashworthy Code<\/em>, 94 Wash. L. Rev. 39, 44 (2019).<\/p>\n<p><a href=\"#_ftnref4\" name=\"_ftn4\">[4]<\/a> Ensar Becic, NTSB, <em>Vehicle Automation Report<\/em>, HWY18MH010, Nov. 5, 2019, at 12.\u00a0 This has since been changed, so that Uber\u2019s automation system now understands that \u201cjaywalking is . . . a possible pedestrian goal.\u201d\u00a0 NTSB Report at 30.<\/p>\n<p><a href=\"#_ftnref5\" name=\"_ftn5\">[5]<\/a> Id. at 8 (\u201c[I]f the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories.\u201d); id. at 13.\u00a0 This too has since been fixed.\u00a0 Id. at 30.<\/p>\n<p><a href=\"#_ftnref6\" name=\"_ftn6\">[6]<\/a> David Pereira, NTSB, <em>Vehicle Factors Group Chairman\u2019s Factual Report<\/em>, HWY18MH010, Nov. 
5, 2019, at 9-10.\u00a0 These features had been turned off by Uber so as not to interfere with its testing.<\/p>\n<p><a href=\"#_ftnref7\" name=\"_ftn7\">[7]<\/a> Alexander B. Lemann, <em>Autonomous Vehicles, Technological Progress, and the Scope Problem in Products Liability<\/em>, 12 J. Tort L. 157 (2019).<\/p>\n","protected":false},"excerpt":{"rendered":"<p>NTSB\u2019s Final Report on Pedestrian Fatality Involving an Uber AV Highlights Obvious Programming Missteps On a dark street in Tempe, Arizona just before 10 p.m. on March 18, 2018, an Uber vehicle being tested in autonomous mode hit and killed a pedestrian.\u00a0 This was the first pedestrian fatality involving an autonomous vehicle, and it triggered [&hellip;]<\/p>\n","protected":false},"author":279,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"ocean_post_layout":"","ocean_both_sidebars_style":"","ocean_both_sidebars_content_width":0,"ocean_both_sidebars_sidebars_width":0,"ocean_sidebar":"","ocean_second_sidebar":"","ocean_disable_margins":"enable","ocean_add_body_class":"","ocean_shortcode_before_top_bar":"","ocean_shortcode_after_top_bar":"","ocean_shortcode_before_header":"","ocean_shortcode_after_header":"","ocean_has_shortcode":"","ocean_shortcode_after_title":"","ocean_shortcode_before_footer_widgets":"","ocean_shortcode_after_footer_widgets":"","ocean_shortcode_before_footer_bottom":"","ocean_shortcode_after_footer_bottom":"","ocean_display_top_bar":"default","ocean_display_header":"default","ocean_header_style":"","ocean_center_header_left_menu":"","ocean_custom_header_template":"","ocean_custom_logo":0,"ocean_custom_retina_logo":0,"ocean_custom_logo_max_width":0,"ocean_custom_logo_tablet_max_width":0,"ocean_custom_logo_mobile_max_width":0,"ocean_custom_logo_max_height":0,"ocean_custom_logo_tablet_max_height":0,"ocean_custom_logo_mobile_max_height":0,"ocean_header_custom_menu":"","ocean_menu_typo_font_family":"","ocean_men
u_typo_font_subset":"","ocean_menu_typo_font_size":0,"ocean_menu_typo_font_size_tablet":0,"ocean_menu_typo_font_size_mobile":0,"ocean_menu_typo_font_size_unit":"px","ocean_menu_typo_font_weight":"","ocean_menu_typo_font_weight_tablet":"","ocean_menu_typo_font_weight_mobile":"","ocean_menu_typo_transform":"","ocean_menu_typo_transform_tablet":"","ocean_menu_typo_transform_mobile":"","ocean_menu_typo_line_height":0,"ocean_menu_typo_line_height_tablet":0,"ocean_menu_typo_line_height_mobile":0,"ocean_menu_typo_line_height_unit":"","ocean_menu_typo_spacing":0,"ocean_menu_typo_spacing_tablet":0,"ocean_menu_typo_spacing_mobile":0,"ocean_menu_typo_spacing_unit":"","ocean_menu_link_color":"","ocean_menu_link_color_hover":"","ocean_menu_link_color_active":"","ocean_menu_link_background":"","ocean_menu_link_hover_background":"","ocean_menu_link_active_background":"","ocean_menu_social_links_bg":"","ocean_menu_social_hover_links_bg":"","ocean_menu_social_links_color":"","ocean_menu_social_hover_links_color":"","ocean_disable_title":"default","ocean_disable_heading":"default","ocean_post_title":"","ocean_post_subheading":"","ocean_post_title_style":"","ocean_post_title_background_color":"","ocean_post_title_background":0,"ocean_post_title_bg_image_position":"","ocean_post_title_bg_image_attachment":"","ocean_post_title_bg_image_repeat":"","ocean_post_title_bg_image_size":"","ocean_post_title_height":0,"ocean_post_title_bg_overlay":0.5,"ocean_post_title_bg_overlay_color":"","ocean_disable_breadcrumbs":"default","ocean_breadcrumbs_color":"","ocean_breadcrumbs_separator_color":"","ocean_breadcrumbs_links_color":"","ocean_breadcrumbs_links_hover_color":"","ocean_display_footer_widgets":"default","ocean_display_footer_bottom":"default","ocean_custom_footer_template":"","ocean_post_oembed":"","ocean_post_self_hosted_media":"","ocean_post_video_embed":"","ocean_link_format":"","ocean_link_format_target":"self","ocean_quote_format":"","ocean_quote_format_link":"post","ocean_gallery_link
_images":"on","ocean_gallery_id":[],"footnotes":""},"categories":[32,122,29],"tags":[],"class_list":["post-28792","post","type-post","status-publish","format-standard","hentry","category-computer-law","category-public","category-tort-law","entry"],"_links":{"self":[{"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/posts\/28792","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/users\/279"}],"replies":[{"embeddable":true,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/comments?post=28792"}],"version-history":[{"count":4,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/posts\/28792\/revisions"}],"predecessor-version":[{"id":28798,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/posts\/28792\/revisions\/28798"}],"wp:attachment":[{"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/media?parent=28792"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/categories?post=28792"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/law.marquette.edu\/facultyblog\/wp-json\/wp\/v2\/tags?post=28792"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}