Texas motorists who are interested in the development of self-driving cars may have heard about a March 18 accident in which an autonomous vehicle struck and killed a pedestrian. A professor in Arizona, where the accident occurred, says it happened because autonomous cars are being programmed to drive the same way humans do.

The Tempe police chief says the Uber vehicle was not at fault in the accident. Video of the incident shows the pedestrian stepping from a dark area into a part of the road without a pedestrian crossing. However, the professor maintains that the underlying problem was that the car proceeded as a human driver would, assuming the path ahead was clear even though it could not visually confirm that. He argues that autonomous vehicles should instead travel at a speed that allows them to stop for any object that comes into their range of vision.
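The professor's principle, that a vehicle should never outrun its own sensing, can be expressed as a simple stopping-distance calculation. The sketch below is illustrative only; the function name and the numeric values (sensor range, braking deceleration, reaction delay) are assumptions, not figures from the article. It solves v·t + v²/(2a) ≤ d for the maximum speed v at which a car with detection range d, braking deceleration a, and reaction time t can still stop short of an obstacle.

```python
import math

def max_safe_speed(detection_range_m, decel_mps2, reaction_s):
    """Largest speed v (m/s) such that reaction distance plus braking
    distance stays within the detection range:
        v * t + v**2 / (2 * a) <= d
    Solved as the positive root of the quadratic v**2/(2a) + t*v - d = 0.
    """
    a, t, d = decel_mps2, reaction_s, detection_range_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# Hypothetical values: 40 m of reliable sensing, 7 m/s^2 braking,
# 0.1 s of computing delay.
v = max_safe_speed(40, 7.0, 0.1)
print(f"Max safe speed: {v:.1f} m/s ({v * 3.6:.0f} km/h)")
```

On these assumed numbers the safe speed comes out near typical urban limits, which is the professor's point: driving faster than the sensors can see is a bet that the road ahead is clear.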

His work involves guaranteeing the behavior of computers that operate physical systems. One example he gave is a car that can brake within a millisecond of detecting an obstacle. The professor also points out that self-driving vehicles are held to a much higher standard than human drivers: a fatal accident caused by a human is considered a tragedy, but a fatality caused by a self-driving car could shut the industry down.

When an accident happens, whether it results from an autonomous car's error or human error, a person or entity may be held legally liable for the resulting personal injury, meaning compensation might be owed to the injured person. If the accident is caused by a manufacturer's negligence, such as faulty brakes, an attorney could be of assistance in seeking appropriate compensation.