The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
How are these people always such pathetic suckers?
Same in Kansas. I was in a car that hit one in the '80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.
Deer are the opposite of an edge case in the majority of the US.
Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position: they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a "snowflake liberal" by comparison.
Not really; a good fascist should always be ready to fight for their place in the sun, on all levels, their collective included. There's no rightful domination there, or right per se, but there is fighting and the resulting domination of the strongest. So if you disobey and lose, you have contributed to fascism to the best of your ability. If you disobey and win, you are the most virtuous fascist. Apathy is the worst crime there. It's the "jungle" ideology in some sense.
It would be fine if not for the fact that it doesn't contribute anything to the human; it just describes the basic level and how to succeed there, but there are better levels.
Still I think it's important to deeply understand fascism and how it's not all evil, because we must understand why and when it's in demand. It's an ideology of chaotic life and violent evolution, and the demand for it arises when more gracious alternatives erode, and nothing around is certain other than one's will to fight.
Umberto Eco's "Foucault's Pendulum" is a wonderful book that deeply explores the fascist aesthetic, by the way.
The issue with fascist followers (an important word) is that it doesn't take anything to pretend to be a fascist, while being a submissive slave in fact.
I actually find it funny how if you remove NAP from anarcho-capitalism, it can become both classical fascism and classical anarchism, with the difference being in what people of these ideologies want from the future, not the rules these ideologies impose.
Edge cases (NOT features) are what keeps them from reaching higher levels of autonomy. The levels differ precisely in scope: "most circumstances," "nearly all circumstances," "really all circumstances."
Since Tesla cares so much more about features, they will remain on level 2 for another very long time.
I'd go even farther and say most driving is an edge case. I used a 30-day trial of Full Self-Driving and the results were eye-opening. Not how it did: it performed pretty much as expected. The eye-opener was looking at where it went wrong.
Full self driving did very well in “normal” cases, but I never realized just how much of driving was an “edge” case. Lane markers faded? No road edge but the ditch? Construction? Pothole? Debris? Other car does something they shouldn’t have? Traffic lights not aligned in front of you so it’s not clear what lane? Intersection not aligned so you can’t just go straight across? People intruding? Contradictory signs? Signs covered by tree branches? No sight line when turning?
After that experiment, it seems like "edge" cases are more common than "normal" cases when driving. Humans just handle it without thinking about it, but the car needs more work here.
Deer on the road is an edge case that humans cannot handle well. In general every option other than hitting the deer is overall worse - which is why most insurance companies won't increase your rates if you hit a deer and file a claim for repairs.
The only way to not hit/kill hundreds of deer (thousands? I don't know the number) every year is to reduce rural speed limits to unreasonably slow speeds. Deer jump out of dark places right in front of cars all the time. The only avoidance options that might work are to drive into the other lane (which sometimes means into an oncoming car) or into the ditch (you have no clue what might be there; if you are lucky the car just rolls, but there could be large rocks or strong fence posts, and the car stops instantly). Note that this all happens fast: you can't think, you only get to react. Drivers in rural areas are taught to hit the brakes and maintain their lane.
Drivers in rural areas are taught to hit the brakes and maintain their lane.
Which the Tesla didn't do. It plowed full speed into the deer, which arguably made the collision much much worse than it could have been. I doubt the thing was programmed to maintain speed into a deer. The more likely alternative is that the FSD couldn't tell there was a deer there in the first place.
The problem is not that the deer was hit, a human driver may have done so as well. The actual issue is that the car didn't do anything to avoid hitting it. It didn't even register that the deer was there and, what's even worse, that there was an accident. It just continued on as if nothing happened.
Deer on the road is an edge case that humans cannot handle well.
If I'm driving at dawn or dusk, when they're moving around in low light I'm extra careful when driving. I'm scanning the treeline, the sides of the road, the median etc because I know there's a decent chance I'll see them and I can slow down in case they make a run across the road. So far I've seen several hundred deer and I haven't hit any of them.
Tesla makes absolutely no provision in this regard.
This whole FSD thing is a massive failure of oversight. No car should be doing self-driving without using both cameras and radar, and Tesla should be forced to refund the suckers, er, customers who paid for this feature.
Yeah this Tesla owner is dumb. wdym "we just need to train the AI to know what deer butts look like"? Tesla had radar and sonar, it didn't need to know what a deer's butt looks like because radar would've told it something was there! But they took it away because Musk had the genius idea of only using cameras for whatever reason.
The day he said that "ReGULAr CAmErAs aRe ALl YoU NeEd" was the day I lost all trust in their implementation. And I'm someone who's completely ready to turn over all my driving to an autopilot lol
I believe we can someday make a self-driving car with only optical sensors that performs as well as a human. I just don't think today is that day, and we should be aiming for self-driving to be far better than human drivers anyway.
Deer aren’t edge cases. If you are in a rural community or the suburbs, deer are a daily way of life.
As more and more of their forests are destroyed, deer are a daily part of city life. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together. I see deer in my lawn regularly.
But I agree with your point that the overpopulation is impossible to miss. I'm also in the suburbs of a major Midwestern city and the deer are everywhere. My city tags them so, oddly, you kind of get to know them.
Last year #100 and #161 both had fawns in my back yard (for a total of 3 babies). This year, #161 dropped 2 more back there. I still see #100 around, but I don't think she had offspring this year. She might have been sterilized, but I heard that the city stopped doing that because some of our tagged deer were tracked to 2 states away. Now we just cull them.
Two days ago I saw a buck (rare for the 'burbs) chasing a few of this year's fawns around. I thought "you dummy, those girls are too young to breed," but then I looked it up, and apparently sexual maturity in deer is determined by weight, not age. Does can participate in their first-year rut if they've had enough to eat. And those little shits have had plenty of flowers out of my garden.
I see bucks all the time in my neighborhood! I was on a walk earlier this summer and turned a corner to be face to face with a small herd that was hopping fences to graze. The buck was across the street and just stared at me.
At first I was afraid because they can get big, but now I’ve seen them a few times and I’m thinking they are used to people. I’m still not getting close if I can help it. They are much bigger than you would expect.
I like seeing them but I feel bad that they are stuck in the city.
People are acting like drivers don't hit deer at full speed while they're in control of the car. Unless we get numbers comparing self-driving vs. human-driven cars, this is just a non-story whose only goal is discrediting Musk, when there's so much other shit that can be used to discredit him.
To quote OP "How many kids will we let Elon kill before we shut him down?", by this logic, how many kids will we let drivers kill before we take all cars off the road then?
People are acting like drivers don't hit deer at full speed while they're in control of the car.
I should be very surprised if people don't generally try to brake or avoid hitting an animal (with some exceptions), if only so that they don't break the car. Whether they succeed at that is another question entirely.
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn't whether Tesla is better or worse than anyone in particular, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
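The apples-to-apples comparison people keep asking for is just a rate calculation; the hard part is getting honest exposure data, not the math. A minimal sketch (every figure below is a made-up placeholder for illustration, not a real Tesla or NHTSA number):

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by exposure (miles driven)."""
    return crashes / (miles / 1_000_000)

# Hypothetical placeholder numbers, for illustration only.
human_rate = crashes_per_million_miles(crashes=9_000_000, miles=2_900_000_000_000)
fsd_rate = crashes_per_million_miles(crashes=1_500, miles=500_000_000)

print(f"human: {human_rate:.2f} crashes per million miles")
print(f"fsd:   {fsd_rate:.2f} crashes per million miles")
```

The point is that raw crash counts are meaningless without the miles-driven denominator, and that denominator is exactly the data nobody outside Tesla can verify.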
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.
It sure seems like they aren't being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren't telling the truth.
It sure seems like they aren't being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren't telling the truth.
I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn't you be shoving that into every single selling point you have? Why wouldn't that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla's FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?
One trick used is to disengage Autopilot when it senses an imminent crash. This would vastly lower the crash count, shifting all blame to the human driver.
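This attribution question is why counting rules matter. NHTSA's standing general order, for instance, counts a crash as system-involved if the driver-assist was engaged at any point in the roughly 30 seconds before impact, precisely to defeat last-instant disengagement. A toy sketch of the two counting rules (all field and function names here are made up for illustration):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Crash:
    # Seconds before impact that the driver-assist disengaged;
    # None means it was still engaged at the moment of impact.
    disengaged_before_impact_s: Optional[float]

def count_at_impact(crashes):
    """Naive rule: blame the system only if engaged at the instant of impact."""
    return sum(1 for c in crashes if c.disengaged_before_impact_s is None)

def count_within_window(crashes, window_s=30.0):
    """NHTSA-style rule: system engaged any time within window_s of impact."""
    return sum(1 for c in crashes
               if c.disengaged_before_impact_s is None
               or c.disengaged_before_impact_s <= window_s)

# Three crashes: engaged at impact, disengaged 0.5 s before, disengaged 5 min before.
fleet = [Crash(None), Crash(0.5), Crash(300.0)]
print(count_at_impact(fleet))       # the 0.5 s disengagement vanishes from the tally
print(count_within_window(fleet))   # the windowed rule counts it again
```

Under the naive rule, shutting the system off a fraction of a second before impact moves the crash into the "human was driving" column; the windowed rule closes that loophole.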
Being safer than humans is a decent starting point, but safety should be maximized to the best of a machine's capability, even if that means adding a sensor or two. Keeping screws loose on a Boeing airplane still leaves the plane safer than driving, so by that logic Boeing should not be made to take responsibility either.
Yes. The question is whether the Tesla is better than anyone in particular. People are given the benefit of the doubt once they pass the driver's test. Companies and AI should not get that. The AI needs to be as good as or better than a GOOD human driver. There is no valid justification for allowing a poorly driving AI just because it's better than the average human. If we are going to allow these on the road, they need to be good.
The video above is HORRID. The weather was clear, there was no opposing traffic, and the deer was standing still. The auto-drive absolutely failed.
If a human driving in these conditions plowed through a deer at 60 mph without even attempting to swerve or stop, they shouldn't be driving.
I've been able to get demos of autopilot in one of my friend's cars, and I'll always remember autopilot correctly stopping at a red light, followed by someone in the next lane over blowing right through it several seconds later at full speed.
Unfortunately "better than the worst human driver" is a bar we passed a long time ago.
From recent demos I'd say we're getting close to the "average driver", at least for clear visibility conditions, but I don't think even that's enough to have actually driverless cars driving around.
There were over 9M car crashes with almost 40k deaths in the US in 2020, and it would be insane to just decide that's acceptable for self-driving cars as well. No company is going to want that blood on their hands.
If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it.
This idea has a serious problem: THE BUG.
We hear this idea very often, but you are disregarding the problem with a programmed solution: it makes its mistakes all the time. Infinitely.
Humans are also bad drivers who get edge cases wrong all the time.
So this is not exactly true.
Humans can learn, and humans can tell when they made an error, and try to do it differently next time. And all humans are different. They make different mistakes. This tiny fact is very important. It secures our survival.
The car does not know when it made a mistake, for example when it killed a deer, or a person, and smashed its windshield and bent lots of its metal. It does not learn from it.
It would do it again and again.
And all the others would do exactly the same, because they run the same software with the same bug.
Now imagine 250 million people having 250 million Teslas, and then comes the day when each one of them decides to kill a person...
When people want to say that something is very rare, they should say "corner case," but this doesn't seem to have made it out of QA lingo and into the popular lexicon.
I notice nobody has commented on the fact that the driver should've reacted to the deer. It's not Tesla's responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle's movements at the end of the day.
Meh, the "full self driving" will shut off 0.00001 seconds before impact so they can say "the system was not active at the time of the impact." Easy peasy.
True but if Tesla keeps acting like they're on the verge of an unsupervised, steering wheel-free system...this is more evidence that they're not. I doubt we'll see a cybercab with no controls for the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.
Note that part of the discussion is we shouldn't settle for human limitations when we don't have to. Notably things like LIDAR are considered to give these systems superhuman vision. However, Tesla said 'eyes are good enough for folks, so just cameras'.
The rest of the industry said LIDAR is important and focus on trying to make it more practical.
Hell, even without lidar, the thing was pretty clearly a large road obstacle a second and a half out. They had a whole left lane open, and at least enough time to make a significant speed reduction.
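Back-of-the-envelope kinematics supports that: even without a full stop, a second and a half of hard braking sheds a lot of speed, and impact energy falls with the square of speed. A rough sketch (the 0.8 g figure is an assumed dry-pavement deceleration, not a measured value):

```python
MPH_TO_MS = 0.44704  # miles per hour to meters per second
G = 9.81             # gravitational acceleration, m/s^2

def speed_after_braking(v0_mph: float, brake_time_s: float,
                        decel_g: float = 0.8) -> float:
    """Speed (mph) remaining after braking for brake_time_s at decel_g."""
    v0 = v0_mph * MPH_TO_MS
    v = max(0.0, v0 - decel_g * G * brake_time_s)
    return v / MPH_TO_MS

v_impact = speed_after_braking(60, 1.5)   # roughly 34 mph at impact
energy_fraction = (v_impact / 60) ** 2    # kinetic energy scales with v^2
print(f"impact at {v_impact:.0f} mph, {energy_fraction:.0%} of original energy")
```

Under those assumptions, braking for the full 1.5 seconds cuts the impact energy to roughly a third of a full-speed hit, which is the difference the comment is pointing at.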
Remember when they removed ultrasonic and radar sensors in favor of "Tesla Vision"? That decision demonstrably cost people their lives and yet older, proven tech continues to be eschewed in favor of the cutting edge new shiny.
I'm all for pushing the envelope when it comes to advancements in technology and AI in its many forms, but those of us that don't buy Teslas never signed up to volunteer our lives as training data for FSD.
Reading this, I am scared at how dulled I have become to the danger posed by my 45-minute daily commute back from work: 65 kilometers driving into the black at 100 km/h.
It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla won't have to take responsibility when it inevitably breaks the law.
Real Autopilot also needs constant attention, the term comes from aviation and it's not fully autonomous. It maintains heading, altitude, and can do minor course correction.
It's the "full self driving" wording they use that needs shit on.
Newer "real" autopilot systems absolutely do not need constant attention; many of them can do full landing sequences now. The definition should match what people commonly use the word for, not what it "originally" meant. Most people believe "autopilot" means the plane pilots itself automatically. Most normal people have zero intuition about what a pilot actually does in the cockpit. And the technology bears out that thought process, as autopilot in its modern form can actually do 99% of flying; take-off and landing aren't exempt anymore.
Inb4 it actually stopped with hazards like I've seen in other videos. Fuck elon and fuck teslas marketing of self driving but I've seen people reach far for karma hate posts on tesla sooooooo ¯\_(ツ)_/¯
Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
I mean, to be honest...if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into air to crash through windshield and into your face.
Official advice I heard many times. Prolly doesn't apply if you are going slow.
Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I've never had to put it i to test.
So, a kid on a bicycle or scooter is an edge case? Fuck the Muskrat and strip him of US citizenship for illegally working in the USA. Another question. WTF was the driver doing?
In regards to the deer, it looks like it might have been hard to see for the driver. I remember learning in driver's ed that it is better to hit the animal than to swerve to miss it, as you might hit a car to your side, so maybe that is what they were thinking?
Friendly reminder that Tesla Autopilot is an AI training on live data. If it hasn't seen something enough times, it won't know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
It was an illegal deer immigrant, it recognised it, added it to the database on Tesla servers, and mowed it down before it took any jobs or whatever the hate-concern was.
/s
... but some actual technically human people do the same when they see an animal, don't they?
:(
If you want to motivate people to action, frame it in terms of the property damage they’ll experience to their car when it hits a child. We’ve already seen how far the American public is willing to go for children’s lives, and it’s not very far at all.
What if I told you that this has happened with kids, and been recorded with kid stand-ins: https://www.youtube.com/watch?v=3mnG_Gbxf_w
Not perfect evidence but the best you will find without signing a Tesla Non Disclosure.
How come human drivers have more fatalities and injuries per mile driven?
Musk can die in a fire, but self driving car tech seems to be vastly safer than human drivers when you do apples to apples comparisons. It's like wearing a seatbelt, you certainly don't need to have one to go from point A to point B, but you're definitely safer with it - even if you are giving up a little control. Like a seatbelt, you can always take it off.
I honestly think it shouldn't be called "self driving" or "autopilot" but should work more like the safety systems in Airbuses, by simply not allowing the human to make a decision that would create a dangerous situation.
The real companies doing this as a serious endeavor, yes. With all the added sensors, processing, and tech, they are safer. Elon's cars are years behind the competition. It's not Tesla gathering the safe-driving data; it's companies like Waymo.
That's a low bar when you consider how stringent airline safety is in comparison, and that kills way less people than driving does. If sensors can save people's lives, then knowingly not including them for profit is intentionally malicious.
Is there a longer video anywhere? Looking closely I have to wonder where the hell did that deer come from? There's a car up ahead of the Tesla in the same lane, I presume quickly moved back in once it passed the deer? The deer didn't spook or anything from that car?
This would have been hard for a human driver to avoid hitting, but I know the issue is the right equipment would have been better than human vision, which should be the goal. And it didn't detect the impact either since it didn't stop.
But I just think it's peculiar that that deer just literally popped there without any sign of motion.
It depends. If it's on the side of the road it may do the opposite and jump in front of you. This one actually looked like it was going to start moving, but not a chance.
It's the gap between where the deer is in the dark and the car in front that's odd. Only thing I can figure is the person was in the other lane and darted over just after passing the deer.
Sure and living in Wyoming I've seen that happen often enough right in front of me but the more I watch this video the more I want to know how that deer GOT there.
I can see a small shrub in the dark off the (right) side of the road but somehow you can't see the deer enter the lane from either the right or left. The car in front of the Tesla is maybe 40 feet past the deer at the start of the video (watch the reflector posts) but somehow that car had no reaction to the deer standing in the middle of the lane?!
That is because at a distance they freeze, in case a predator hasn't noticed them yet. They don't bolt until they think an attack is imminent, and cars move too fast for them to react.
Is there a longer video anywhere? Looking closely I have to wonder where the hell did that deer come from?
I have the same question. If you watch the video closely the deer is located a few feet before the 2nd reflector post you see at the start of the video. At that point in time the car in front is maybe 20' beyond the post which means they should have encountered the deer within the last 30-40 feet but there was no reaction visible.
You can also see both the left and right sides of the road at the reflector well before the deer is visible; you can even make out a small shrub off the road on the right, but somehow you can't see the deer enter the road from either side?!
It's like the thing just teleported into the middle of the lane.
The more I watch this the more suspicious I am that the video was edited.
I don’t keep pictures like that on me, and I don’t feel like doing a google search for you. Travel blue ridge parkway or skyline drive, or any back road in the Appalachians and you will see what happens when a ten point meets metal.
I've seen semi-trucks disintegrate after hitting a deer.
I’d like to see that, I’ve seen modern regular full size trucks annihilate a deer without disintegrating. Semis wouldn’t be bothered much unless you’re talking about something larger like a moose. Deer are about the same weight as humans, whatever is good at killing humans is usually good for deer.
The average weight of an adult male is 203 lb (maximum, 405 lb). The average weight of a female is about 155 lb (maximum, 218 lb).
Granted the semi I saw had a guard on the front of it, but I witnessed one smoke a fully grown cow at 70mph. Sent the cow and pieces of it flying about 100 feet, with no visible damage to the truck at all. There was a tremendous amount of blood and spatter everywhere and my own car got a ton of blood on it from the cloud of guts and blood made by the truck. Mostly there was just shit everywhere leading up to the remnants of the carcass, but the truck gave no fucks whatsoever. I asked the driver if he was ok and he didn't even seem to have any agitation whatsoever, more like "oh, another one".
A truck will not disintegrate. There might be damage if it didn't have a guard, but against a deer? That must've been a papier-mâché piece of shit truck if it disintegrated on a deer.
Deer often travel in herds, so where there is one there are often more. In rural areas you can go miles without seeing one, and then see 10 in a few hundred feet. There are deer in those miles where you didn't see them as well; they just happened not to be near the road then.
I know a lot of people here are/will be mad at Musk simply for personal political disagreement, but even just putting that aside, I've never liked the idea of self-driving cars. There's just too much that can go wrong too easily, and in a 1-ton piece of metal and glass moving at speeds up to near 100 mph, you need to be able to have the control enough to respond within a few seconds if the unexpected happens, like a deer jumping in the middle of the road. Computers don't, and may never, have the benefit of contextual awareness to make the right decision as often as a human would in those situations. I'm not going to cheer for the downfall of Musk or Tesla as a whole, but they do severely need to reconsider this idea or else there will be a lot of people hurt and/or killed and a lot of liability on them when it happens. That's a lot of risk to take on for a smaller auto maker like them, just thinking in business terms.
I mean we do let humans drive cars and some of them are as dumb as bricks and some are malicious little freaks.
Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is it would have to be all or nothing. You can't have a mix of robots and people driving around.
The problem is that with dumb drivers you can easily place blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can't really blame the driver, since he wasn't driving the car, but neither was the engineer or the company itself. We'd have to draw up entirely new frameworks to define and place criminal negligence, if any exists. Is the company responsible for a malicious developer? Is the company responsible for a driver who ignores a set guideline and sits impaired behind the emergency stop? Is the driver responsible for a software fault?
All of these questions and many more need to be answered. Some probably can't be, and must remain a so-called "act of God" with no blame to place. And people are not fond of blaming just the software; they're out for blood when an accident happens, and software doesn't bleed. Of course, the above questions might be the easiest to answer, but the point still stands.
An FSD car that makes perfect decisions would theoretically be safer than a human driver who also makes perfect decisions, if for no other reason than the car could do it faster.
Personally, I would love to see autonomous cars see widespread use. They don't have to be perfect, just safer mile-for-mile than human drivers. (Which means that Teslas, with Musk's gobsmackingly stupid insistence on only using cameras, will never reach that threshold).
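The "could do it faster" point is easy to quantify: at highway speed, every saved second of reaction time is several car-lengths of braking room. A quick sketch (the 1.5 s human and 0.3 s machine figures are illustrative assumptions, not measured latencies):

```python
MPH_TO_MS = 0.44704  # miles per hour to meters per second

def reaction_distance_m(speed_mph: float, reaction_time_s: float) -> float:
    """Distance (meters) covered before braking even begins."""
    return speed_mph * MPH_TO_MS * reaction_time_s

human = reaction_distance_m(60, 1.5)    # assumed human perception-reaction time
machine = reaction_distance_m(60, 0.3)  # assumed machine detection latency
print(f"human covers {human:.0f} m before braking; machine covers {machine:.0f} m")
```

At 60 mph those assumptions give the machine roughly 32 extra meters of braking room, which is exactly why an otherwise-equal machine driver would come out ahead.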
I thought the deer would be running or something, but no, it's just standing straight on from the car and doesn't move at all! How the fuck does a deer standing dead center in front of you not get caught by the camera?
I wouldn't be against using teslas to clean up the deer overpopulation problem in the US. I'm in favor of rolling this code into all Tesla models in the next update.
The one thing I will say is this isn't a human... deer probably aren't in their training data at anywhere near the rates humans are.
It's definitely still concerning, but also still maybe more trustworthy than some human drivers. We seriously give licenses to too many people. Within the last week I've seen a guy that went into the other lane by like 4' multiple times and I also saw a lady who blocked 2 lanes of traffic so she could make an illegal U turn on a 4 lane city street (rather than you know turning off on a side street/one of many nearby parking lots and turning around).
Yeah, but Elon's self-driving cars aren't self-driving, nor are they necessarily as safe as a good driver.
There are people out there who shouldn't be able to drive, and in sane countries many of them don't manage to get their licenses. But in the US for an example, apparently you can't get anywhere without a car, so until the public transit situation is solved, drivers licenses need to be given out like candy :/ Exception being some cities with awesome public transit. The only one I've been to is NYC, where most people don't really need to drive. I'd say the transit there is better than in my country.
And the worst part is that even once real SDCs exist and can be bought, not everyone can afford them. Or maybe they'll be more like Uber or Bolt in that you hail one from an app and it picks you up - but then people in rural areas are still fucked without being able to drive themselves.
I'm sure you all know, but to be clear, the above are the beginnings of sentences from people who don't hate elon. They are sentences from people who like elon, but think you will hate them, or not consider their opinions, if they say out loud that they do, in fact, like elon.
On a separate but related note, this is elon speaking at a hate-filled rally featuring a series of bigoted speakers, including himself. The rally very intentionally cosplayed an American Nazi rally that famously occurred at Madison Square Garden in the 1930s. To emphasize how on the nose this all was, elon wore a specially made hat, one that very deliberately used an especially prominent font from the Nazi era. They are literally SCREAMING it in your face and tattooing it on their foreheads.
elon has done nothing good or admirable with his life, and elon will not do anything good or admirable with his life. You can't compartmentalize your opinion on this; he sucks, on the whole.
I prefer electric cars over gas ones, and think that at some point computer controlled transportation will be more reliable than the average human driver. These aren't endorsements of any person or company.
Why are you wrapping this up so much with the one dude?
Its like saying that you can't endorse online shopping unless you also like Bezos.
I hit a deer on the highway in the middle of the night going about 80mph. I smelled the failed airbag charge and proceeded to drive home without stopping. By the time I stopped, I would never have been able to find the deer. If your vehicle isn't disabled, what's the big deal about stopping?
I've struck two deer and my car wasn't disabled either time. My daughter hit one and totaled our van. She stopped.
Whether or not a human should stop seems beside the point. Autopilot should immediately get the driver to take back control if something unexpected happens, and stop if the driver doesn't take over. Getting into an actual collision and just continuing to drive is absolutely the wrong behavior for a self-driving car.
It was an expressway. There were no lights other than cars. You're not wrong, had a human sprinted at 20mph across the expressway in the dark, I'd have hit them, too. That being said, you're not supposed to swerve and I had less than a second to react from when I saw it. It was getting hit and there was nothing I could've done.
My point was more about what happened after. The deer was gone and by the time I got to the side of the road I was probably about 1/4 mile away from where I struck it. I had no flashlight to hunt around for it in the bushes and even if I did I had no way of killing it if it was still alive.
Once I confirmed my car was drivable I proceeded home and called my insurance company on the way.
The second deer I hit was in broad daylight at lunch time going about 10mph. It wasn't injured. I had some damage to my sunroof. I went to lunch and called my insurance when I was back at the office.
If your vehicle isn’t disabled, what’s the big deal about stopping?
If you're just careening down the highway at 80, you're not really giving your car a fair chance to let you know that it's in a disabled state now, are you?
It's just common sense that after a major impact you should evaluate the safety of continuing in your current state. Stopping and doing the bare minimum of just looking at your car would be the first step of that process.
How does that compare to the deer-strike rate per mile traveled of "regular" cars? That's the important part.
If you live in deer country you know how often you see dead ones on the side of the road. It's only scandalous because it was a car on autopilot, but if it's still safer per mile traveled than having humans behind the wheel, then it's still a win.
If it had braked and just couldn't stop it would be comparable to a person. Continuing at full speed into a solid object on the road is the same as the worst human driver.
If it weren't limited to only cameras because Musk is an arrogant jackass, then the lidar/radar/other non-camera sensors would have detected the deer as an object even if the camera didn't recognize it.
That's one example, do you seriously believe there are no cases of human drivers not slowing down before hitting something because they're busy checking their phone or they're DUI?
But I learned in my driving lessons that you shouldn't hit the brakes for animals running into your lane, because it can result in a car crash that's way worse. (Think of a truck behind you with a much longer braking distance.)
You absolutely need to hit the brakes, but don't swerve. A deer weighs over 200lbs and will likely crash into your windshield if you hit it head on. You need to safely lose as much speed as you can, because even a side hit on the deer is likely to wreck your axle and prevent you from driving.
If you watch the video, the deer was standing on a strip of off-coloured pavement, and it was also about the same length as the dotted lane line. Not sure how much colour information comes through at night on those cameras.
The point here isn't actually "should it have stopped for the deer", it's "if the system can't even see the deer, how could it be expected to distinguish between a deer and a child?"
The calculus changes incredibly between a deer and a child.
Agree, it didn't do anything to avoid the obstacle. A human could probably have seen it as an obstacle and tried to swerve to the side, albeit not knowing what it was. Not saying it was possible to avoid, but some reaction would have been made.
You learned wrong if you think that is a universal rule for all animals.
You might have been told that for small animals like squirrels, but that is more about not overreacting. You should absolutely brake for a deer, whether or not you are being tailgated, just like you would brake for any large object on the road.
Hitting a deer at speed is going to cause far more problems for you AND the people behind you than trying to not hit the deer.
That's why humans have brains, for situational awareness.
And it's less about not braking for an animal than it is about not wildly swerving.
Also, you should probably revise your thinking on this before you visit any states that have large animals like moose on the roads, because if you plow into one with a car, it can easily kill you when it crushes you after impact.
Also, on motorbikes you are more stable at high speed, so it's better to hit a dog at speed than to slow down, which could lead to the person behind you hitting you, or to you crashing.
Ok seems I was wrong.
Absolutely not true. No amount of speed is going to keep you safe if you strike an animal on a bike. You're better off slowing down so that you have less momentum when you wreck. Drivers should be giving you enough space (even though they rarely do). A deer weighs more than a grown man and will kill you if you hit it at highway speed. A dog will take out your front wheel and cause you to wreck whether you hit it at 15mph or 80mph.
Are you honestly defending this? The software took a life and didn't react. I'm not on a Skynet buzz, but it is concerningly bad software and implementation.
I don't care if humans do it; they shouldn't, and that should be the easy bar to clear in implementing a replacement for humans.
Honestly, if the software is better than humans, yeah. I'm also very much in favor of shifting insurance costs to the software maker and basing insurance rates on exactly those statistics.
If the software is safer than humans, I don't mind that it still makes mistakes, since it makes them at a lower rate.