“We’re trying to have those conversations with Elon to establish what the sensors would need to do,” Baglino added. “And they were really difficult conversations, because he kept coming back to the fact that people have just two eyes and they can drive the car.”
Yes, and people crash cars all the time, Elon…
If you want an autopilot with the failure rate of a human, then you might only need two eyes. If you want an autopilot with a near-zero failure rate, you need much better telemetry data.
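To put rough numbers on that, here’s a back-of-the-envelope sketch. The crash rate and fleet mileage below are made-up illustrative figures for the sake of the arithmetic, not official statistics.

```python
# Back-of-the-envelope sketch with assumed, illustrative numbers only.
# Neither rate below is an official statistic; they just show how the
# "as good as a human" bar differs from a "near-zero" bar at fleet scale.

human_crash_rate = 1 / 500_000          # assumed: one crash per 500k miles driven
fleet_miles_per_year = 50_000_000_000   # assumed: 50 billion miles across a large fleet

# An autopilot that merely matches the assumed human rate:
crashes_human_parity = human_crash_rate * fleet_miles_per_year

# An autopilot that is 100x better than that baseline:
crashes_100x_better = crashes_human_parity / 100

print(f"Human-parity system: ~{crashes_human_parity:,.0f} crashes/year")
print(f"100x-better system:  ~{crashes_100x_better:,.0f} crashes/year")
```

Even the system that is 100x better than the assumed human baseline still produces on the order of a thousand crashes a year at that scale, which is the whole argument for piling on sensors rather than just matching two eyes.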
Our heads are just loaded with sensory capabilities that go well beyond the two eyes. Our proprioception, balance, and mental mapping allow us to move our heads around, take in visual data from almost any direction at a glance, and then internally model that three-dimensional space as the universe around us. Meanwhile, our ears handle direction-finding for sounds and synthesize that information with our visual processing.
On top of that, the tactile feedback of the steering wheel and the vibration of the car itself (felt by the body and heard by the ears) give us plenty of sensory information about our speed, our acceleration, and the mechanical condition of the car. The squeal of tires, the screech of brakes, and the indicators on our dash are all part of the information we use to understand how we’re driving.
Much of it is trained through experience. But the fact is, I can tell when I have a flat tire or when I’m hydroplaning even if I can’t see the tires. I can feel inclines or declines that affect my speed or lateral movement even when there aren’t easy visual indicators, like at night.
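That “I can tell without seeing the tires” point has a direct software analogue: cross-checking independent sensors and flagging disagreement. Here’s a toy sketch with hypothetical signal names and made-up thresholds — not any real vehicle’s API.

```python
# Toy sketch of detecting loss of traction (hydroplaning) or a flat tire
# by cross-checking independent signals, the way a driver cross-checks
# feel, sound, and the speedometer. All names and thresholds are invented.

from dataclasses import dataclass

@dataclass
class WheelSample:
    wheel_speed_mps: float       # speed implied by wheel rotation
    gps_speed_mps: float         # speed over ground from GPS/IMU fusion
    vertical_vibration_g: float  # body vibration amplitude

def assess(sample: WheelSample) -> str:
    speed_gap = sample.wheel_speed_mps - sample.gps_speed_mps
    if speed_gap > 3.0:
        # Wheels spinning much faster than the car is moving: likely hydroplaning.
        return "possible hydroplaning"
    if sample.vertical_vibration_g > 0.5 and abs(speed_gap) > 1.0:
        # Strong vibration plus a speed mismatch: likely a flat or damaged tire.
        return "possible tire failure"
    return "normal"

print(assess(WheelSample(wheel_speed_mps=30.0, gps_speed_mps=24.0, vertical_vibration_g=0.1)))
print(assess(WheelSample(wheel_speed_mps=25.0, gps_speed_mps=23.5, vertical_vibration_g=0.8)))
```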
To be fair, 98% of drivers seem to barely be able to hold a straight line and can’t see past the end of their hood, let alone do shoulder checks and be able to hear anything over the stereo turned up to 11. So I’d take my chances with the half-baked autopilot that can at least discern what a red light looks like.
I followed one gentleman for about 10 blocks before he stopped and I could tell him that he was missing the entire tire on the rear left of his car. There were a lot of sparks and metal screeching. Not a clue.
Just adding to your point: when F1 drivers were asked to race in a sim, they couldn’t perform the way they do in real life, because no matter how good the sim is, it doesn’t provide the feedback of a real car.
And people turn their heads, move their eyes across the windshield, change focus to look farther ahead or closer, check their mirrors, listen for sounds (emergency vehicles, car honks, etc.), and can even look through gaps and other cars’ windows to adjust to partial obstructions.
The fact that he doesn’t realize you need a multitude of sensors to do even a little bit of what a human can do tells you all you need to know about Elon’s so-called brilliance.
Even the social aspect of driving eludes him. You and another driver arrive at a four-way stop at the same time, crossing paths. They wave you on to be polite. You wave back and go first. How and when does he plan to handle that behavior?
If it isn’t your turn, the car waits, ignores the wave, and after a long standoff either pulls forward very slowly or goes only if the driver takes over.
Or the asocial side, where you come up to a stop sign, look right, see a guy coming way too fast to stop in time, and don’t go until after he’s blown through the intersection.
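Taken together, the two comments above basically describe a very conservative decision procedure. Here’s a toy sketch of that policy — invented inputs and thresholds, not anything Tesla actually ships.

```python
# Toy sketch of the ultra-conservative stop-sign policy described above:
# ignore the polite wave, wait out the other car, and treat cross traffic
# that can't stop in time as an automatic "hold".

def stop_sign_action(my_turn: bool,
                     other_driver_waved: bool,
                     seconds_waiting: float,
                     cross_traffic_can_stop: bool) -> str:
    # Defensive case: someone is barreling in way too fast to stop --
    # stay put until they've blown through, whoever's turn it is.
    if not cross_traffic_can_stop:
        return "hold"

    if my_turn:
        return "proceed"

    # Not our turn: the wave is ambiguous to a camera, so it gets ignored.
    # After a long standoff, creep forward slowly rather than sit forever.
    if seconds_waiting > 8.0:
        return "creep forward slowly"
    return "wait"

print(stop_sign_action(my_turn=False, other_driver_waved=True,
                       seconds_waiting=3.0, cross_traffic_can_stop=True))   # wait
print(stop_sign_action(my_turn=False, other_driver_waved=True,
                       seconds_waiting=9.0, cross_traffic_can_stop=True))   # creep forward slowly
print(stop_sign_action(my_turn=True, other_driver_waved=False,
                       seconds_waiting=0.0, cross_traffic_can_stop=False))  # hold
```

The wave parameter is deliberately unused: the point of the comment above is that the polite wave is exactly the kind of social signal a policy like this throws away.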
Well, we perform pretty well with just two eyes, but the difference is that we’re highly skilled general pattern-recognition machines that you just can’t recreate in software yet. A few lines diverging with a bigger and a smaller circle under them? Guess that’s a truck going that way. Oh, the lines are changing angles? Holy shit, the truck is coming into this lane!!
Anybody else remember the now-removed Tesla blog post from 2016 arguing that FSD would require LIDAR? Idk why they (read: Elon) are so stubborn about it. LIDAR can see through fog and darkness. Add that data to their model and they’d probably already be near deployment-ready real FSD.
Automotive lidar costs around $500-1000 to add to a car.
That’s it. That’s the whole reason.
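For scale, here’s the arithmetic using the per-unit range quoted above and an assumed annual production volume (the volume is a made-up round number, not a reported figure):

```python
# Rough fleet-cost arithmetic using the $500-$1,000 per-unit range quoted
# above and an assumed annual production volume. Purely illustrative.

unit_cost_low, unit_cost_high = 500, 1_000   # USD per vehicle (from the comment)
vehicles_per_year = 1_800_000                # assumed annual volume

low = unit_cost_low * vehicles_per_year
high = unit_cost_high * vehicles_per_year
print(f"Added hardware cost per year: ${low/1e9:.1f}B to ${high/1e9:.1f}B")
# Roughly $0.9B-$1.8B per year, before any integration, wiring, or compute cost.
```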
A person approaching on foot or a bicycle from my right side at the coincidentally perfect speed can accidentally stay within both my human eyes’ blind spots (behind the support pillar) as I come to a stop at a 4-way. I have learned I need to crane around a bit before proceeding, or their frightened and angry face will suddenly lurch into view too close for comfort. The robot must be designed to have zero blind spots because humans are ridiculously good at hiding in them. Especially the little humans.
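The “coincidentally perfect speed” part is just geometry: if the cyclist’s bearing from the driver’s eye barely changes during the approach, they can sit inside the narrow angle blocked by the support pillar the whole way in. A small sketch with invented dimensions and speeds:

```python
# Geometry sketch of the "perfect speed" blind spot: a cyclist approaching
# from the right stays hidden whenever their bearing from the driver's eye
# falls inside the narrow angular band blocked by the support pillar.
# Every number here (pillar angle, speeds, distances) is invented.

import math

PILLAR_CENTER_DEG = 35.0     # assumed bearing of the right-side pillar from straight ahead
PILLAR_HALF_WIDTH_DEG = 4.0  # assumed angular half-width of the occluded band

def bearing_right_deg(forward_m: float, right_m: float) -> float:
    """Bearing of a point forward_m ahead and right_m to the right of the driver."""
    return math.degrees(math.atan2(right_m, forward_m))

# Car braking toward a corner 30 m ahead; cyclist closing on the same corner
# from the right along a ~55 degree approach line.
for t in range(6):
    car_x = 10.0 * t - 1.0 * t * t   # car position (m), decelerating to a stop
    cyc_dist = 35.0 - 6.0 * t        # cyclist's remaining distance to the corner (m)
    forward = (30.0 - car_x) + cyc_dist * math.cos(math.radians(55))
    right = cyc_dist * math.sin(math.radians(55))
    b = bearing_right_deg(forward, right)
    hidden = abs(b - PILLAR_CENTER_DEG) <= PILLAR_HALF_WIDTH_DEG
    print(f"t={t}s  bearing={b:5.1f} deg  hidden_behind_pillar={hidden}")
```

With these made-up numbers the cyclist only drops out of the occluded band in the final second, about five meters from the corner — exactly the “lurch into view too close for comfort” moment.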
I wish people would talk about this more: Elon really isn’t that smart, and he certainly isn’t a genius. I learned a long time ago that “smart” is relative and really shouldn’t be foisted onto people as a label. Elon has a BA in Physics from a school known for business degrees. He also got a BS in Economics from Wharton, but UPenn and Wharton are known more for how hard it is to get in than for how hard the classes are.
The website CollegeVine says UPenn is known as the “Social Ivy” and “UPenn’s admissions is highly-selective, but students applying to the UPenn College of Arts & Science (CAS) will find it less academically competitive than schools like Harvard, Yale, Princeton, and Stanford (although exceptional academics are still a must).”
By the way, he started college in 1990, transferred to UPenn in 1992, and states he graduated in 1995, but UPenn disputes that, saying he graduated in 1997. This is a school where 96% of those who are accepted graduate within 150% of the nominal degree time (a four-year degree within six years) (https://www.collegetuitioncompare.com/edu/215062/university-of-pennsylvania/graduation/).
Musk of course says he completed the coursework in 1995, but that there was some sort of mix-up with an English and a History credit that delayed the degree by two years.