"It seems like if we leave it to the general public, they may unintentionally go soft on AV manufacturers to improve their safety standards," said study researcher Edmond Awad.
The study found that in crashes involving cars under shared human and AI control, the public presumes the machine is less accountable.
For the study, the research team asked members of the public to consider hypothetical cases in which a pedestrian was killed by a car operated under shared control between humans and machines - and to indicate how blame should be allocated.
It found that when only one driver made an error, that driver was blamed more, regardless of whether it was a machine or a human.
Significantly, however, when both drivers made errors in human-machine shared-control vehicles, the blame attributed to the machine was reduced: there was a greater presumption that the human driver should be held more accountable.
The research team believes the results could have significant ramifications for how juries apportion blame in future cases involving fatal crashes.
While fully 'driverless' cars have been the goal of many manufacturers in recent years, such vehicles remain some way from becoming a reality, the study said.