Earlier this month, YouTuber and former NASA engineer Mark Rober released a video titled “Can You Fool a Self Driving Car?” In it, a Tesla running Autopilot was fooled by a wall painted to look like the road ahead.

Footage shows the car plowing straight through the Styrofoam barrier, underscoring concerns about the company basing its driver assistance software on camera feeds alone. Some of its rivals in the self-driving car market also use lidar and radar, sensors that make it far easier to distinguish the actual road from a wall painted to look like it.

The video sparked a heated debate over Tesla’s driver assistance technology. Furious fans argued that Rober should have used the company’s more advanced, and more expensive, Full Self-Driving (FSD) software instead of Autopilot.

But a fresh attempt to repeat the experiment shows there are still plenty of reasons to be worried about Tesla’s reliance on cameras alone, along with a small glimmer of hope for the electric car maker.

Over the weekend, YouTuber Kyle Paul posted his own response video, showing that even with FSD engaged, a Model Y equipped with the older HW3 computer will still drive straight through a wall painted to look like the road ahead.

“Sees the wall, does not see the wall,” Paul narrated as he slowly inched toward the wall, which was also painted to blend into the horizon, before slamming on the brakes. “Starting to see its own shadow on the wall, and if I get really close, about touching it, it sees the wall.”

“With no doubt, the Model Y would have gone through the wall,” he said. “I had to brake full force because I saw that the camera, the visualization, was not seeing the wall.”

However, that’s not the whole story. A Cybertruck equipped with the newest HW4 computer and camera system quickly detected the same wall and came to a stop.

“Sees the wall,” Paul said as the truck, which was running last month’s FSD version 13.2.8, came to a halt. “Stops.”

“At no point did I feel like it was going to hit the wall,” he said.

The mixed results make it hard to draw any firm conclusions. It also bears noting that Tesla drivers don’t exactly encounter Wile E. Coyote-style painted walls on a daily basis.

Still, hundreds of thousands of Teslas on the road may be running the older HW3 computer, which may struggle now that Tesla has committed to relying exclusively on camera sensors.

Musk has previously promised those customers a free upgrade to HW4, but given his track record, it’s unclear whether he’ll follow through.

If anything, that promise suggests Musk himself is worried that most of the Teslas on the road today won’t be capable of driving themselves after all.

Paul also didn’t repeat Rober’s tests involving heavy artificial fog and rain, conditions that are arguably far more relevant to real-world drivers than to those living in a Looney Tunes universe.

In both of those tests, the Tesla plowed into a child-sized mannequin, a sign that the electric car maker has a long way to go before it can deliver on Musk’s promise of a safe, self-driving future.