OK, it’s just a deer, but the future is clear. These things are going to start killing people left and right.
How many kids is Elon going to kill before we shut him down? What’s the number of children we’re going to allow Elon to murder every year?
Self-driving cars should absolutely be banned from public roads.
As much as I hate Elon, self-driving cars are the future and will be way safer than some idiot behind the wheel
All cars are death machines
In the day, we sweat it out on the streets
Of a runaway American dream
At night we ride through the mansions of glory
In the suicide machines
“Born to Ride” - Bruce Springsteen
I watched the whole video… Mowed down like 90 deer in a row.
Friendly reminder that Tesla Autopilot is an AI training on live data. If it hasn’t seen something enough times, it won’t know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
I hit a deer on the highway in the middle of the night going about 80mph. I smelled the failed airbag charge and proceeded to drive home without stopping. By the time I stopped, I would never have been able to find the deer. If your vehicle isn’t disabled, what’s the big deal about stopping?
I’ve struck two deer and my car wasn’t disabled either time. My daughter hit one and totaled our van. She stopped.
That said, fuck Musk.
If your vehicle isn’t disabled, what’s the big deal about stopping?
If you’re just careening down the highway at 80, you’re not really giving your car a fair chance to let you know that it’s in a disabled state now, are you?
It’s just common sense that after a major impact you should evaluate the safety of continuing in your current state. Stopping and doing the bare minimum of just looking at your car would be the first step of that process.
Maybe drive a little slower at night. If you can’t spot and react to animals in your path, you won’t be able to react when it’s a human
It was an expressway. There were no lights other than cars. You’re not wrong, had a human sprinted at 20mph across the expressway in the dark, I’d have hit them, too. That being said, you’re not supposed to swerve and I had less than a second to react from when I saw it. It was getting hit and there was nothing I could’ve done.
My point was more about what happened after. The deer was gone and by the time I got to the side of the road I was probably about 1/4 mile away from where I struck it. I had no flashlight to hunt around for it in the bushes and even if I did I had no way of killing it if it was still alive.
Once I confirmed my car was drivable I proceeded home and called my insurance company on the way.
The second deer I hit was in broad daylight at lunch time going about 10mph. It wasn’t injured. I had some damage to my sunroof. I went to lunch and called my insurance when I was back at the office.
Great on paper, but it’s literally not okay to slow down to 35 mph on the freeway… which is where most wild animals are hit at night.
Nobody is asking you to go 35 mph. But going 60 mph instead of 80 mph means your stopping distance will be nearly halved and you will have noticeably more time to react.
https://www.automotive-fleet.com/driver-care/239402/driver-care-know-your-stopping-distance
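For what it’s worth, here’s a rough back-of-the-envelope check of that claim. The numbers are assumptions (a ~1.5 s reaction time and ~7 m/s² of braking on dry pavement), not figures from the linked article:

```python
# Rough stopping-distance comparison at different speeds.
# Assumed values, not from the linked article: 1.5 s perception-reaction
# time and 7 m/s^2 braking deceleration on dry pavement.

MPH_TO_MS = 0.44704      # miles per hour -> metres per second
REACTION_TIME_S = 1.5
DECEL_MS2 = 7.0

def stopping_distance_m(speed_mph: float) -> float:
    """Reaction distance plus braking distance (v^2 / 2a)."""
    v = speed_mph * MPH_TO_MS
    return v * REACTION_TIME_S + v ** 2 / (2 * DECEL_MS2)

for mph in (80, 60, 35):
    d = stopping_distance_m(mph)
    print(f"{mph} mph: ~{d:.0f} m ({d * 3.28:.0f} ft) to stop")
```

With those assumptions you get roughly 145 m at 80 mph, 92 m at 60 mph, and 41 m at 35 mph, so dropping from 80 to 60 cuts the braking portion by close to half and the total stopping distance by about a third.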
Have you ever hit a deer, or almost hit one in the dark? Yes, absolutely, 60 mph will shorten your stopping distance and give you more time to react, but not nearly enough. Even at 35 mph people hit deer all the time, because they typically jump out in front of you. Much faster than 35 mph, and even a deer standing still in the middle of the road is tough to see and stop for. At 60 mph, not a chance.
I haven’t hit a deer, or even come close, since they aren’t a problem in my country. You are most probably right, and I have seen videos of deer just jumping onto the road at the last second, causing an unavoidable accident. My viewpoint is that when you hit a creature (animal or human) at 80 mph, they are almost certainly dead. If you hit them at 60, they might survive but be gravely wounded. If you are able to react and slow down to about 30 before contact, they will be hurt, but at least they have a much better chance of survival. Somehow going the same speed at night as during the day seems very risky
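To put some rough physics behind that intuition: impact energy scales with the square of speed, so the difference between 80 and 30 is much bigger than it sounds. A minimal sketch (relative energies only, ignoring everything else about the collision):

```python
# Kinetic energy scales with the square of speed, which is the rough
# physical reason impact speed matters so much for survival odds.

def relative_impact_energy(speed_mph: float, reference_mph: float = 80.0) -> float:
    """Impact energy at speed_mph as a fraction of the energy at reference_mph."""
    return (speed_mph / reference_mph) ** 2

for mph in (80, 60, 30):
    print(f"{mph} mph carries ~{relative_impact_energy(mph):.0%} of an 80 mph impact's energy")
```

With these inputs that comes out to roughly 100%, 56%, and 14% respectively.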
Whether or not a human should stop seems beside the point. Autopilot should immediately get the driver to take back control if something unexpected happens, and stop if the driver doesn’t take over. Getting into an actual collision and just continuing to drive is absolutely the wrong behavior for a self-driving car.
It doesn’t have to not kill people to be an improvement, it just has to kill fewer people than people do
That’s a low bar when you consider how stringent airline safety is in comparison, and flying kills far fewer people than driving does. If sensors can save people’s lives, then knowingly not including them for profit is intentionally malicious.
I roll my eyes at the dishonest, bad-faith takes in the comments about how people do the same thing behind the wheel, as if that excuses self-driving cars. At least a person can react, can slow down, can do something that an unthinking, going-by-the-pixels computer can’t do on a whim.
How come human drivers have more fatalities and injuries per mile driven?
Musk can die in a fire, but self-driving car tech seems to be vastly safer than human drivers when you do apples-to-apples comparisons. It’s like wearing a seatbelt: you certainly don’t need one to get from point A to point B, but you’re definitely safer with it, even if you are giving up a little control. And like a seatbelt, you can always take it off.
I thought the deer would be running or something, but no, it’s just standing straight on in front of the car and doesn’t move at all! How the fuck does a deer standing dead center in front of you not get caught by the camera?!
I know a lot of people here are/will be mad at Musk simply for personal political disagreement, but even putting that aside, I’ve never liked the idea of self-driving cars. There’s just too much that can go wrong too easily, and in a 1-ton piece of metal and glass moving at speeds approaching 100 mph, you need to have enough control to respond within a few seconds if the unexpected happens, like a deer jumping into the middle of the road. Computers don’t, and may never, have the benefit of contextual awareness to make the right decision as often as a human would in those situations. I’m not going to cheer for the downfall of Musk or Tesla as a whole, but they severely need to reconsider this idea, or else a lot of people will be hurt and/or killed and a lot of liability will land on them when it happens. That’s a lot of risk to take on for a smaller automaker like them, just thinking in business terms.
I mean we do let humans drive cars and some of them are as dumb as bricks and some are malicious little freaks.
Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is it would have to be all or nothing. Like, you can’t have a mix of robots and people driving around.
The problem is that with dumb drivers you can easily place blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can’t really blame the driver, since he wasn’t driving the car, but neither was the engineer or the company itself. We’d have to draw up entirely new frameworks in order to define and assign criminal negligence where it should exist. Is the company responsible for a malicious developer? Is the company responsible for a driver who ignores a set guideline and sits impaired behind the emergency stop? Is the driver responsible for a software fault?
All of these questions and many more need to be answered. Some probably can’t be and must remain a so-called “act of God” with no blame to place. And people are not fond of blaming just the software; they’re out for blood when an accident happens, and software doesn’t bleed. Of course, the above questions might be the easiest to answer, but the point still stands.
Full self-driving should only be implemented when the system is good enough to completely take over all driving functions, and it should only be available in vehicles without steering wheels. The Tesla solution of having “self driving” while relying on the copout of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road. Systems like Tesla’s FSD officially require you to always be ready to intervene at a moment’s notice. They know their system isn’t ready for independent use yet, so they require that manual oversight. But of course this encourages disengaged driving; no one actually pays attention to the road the way they would need to in order to intervene at a moment’s notice. Tesla’s FSD imitates true self-driving, but it pawns off the liability to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can’t truly perform better than a human, it’s better for humans to be the only ones actively driving the vehicle.
This also solves the civil liability problem. Tesla’s current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn’t even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn’t good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.
This also resolves the criminal liability question. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn’t a driver anymore, only passengers. Even if you’re sitting in what would normally be the driver’s seat, it doesn’t matter. Legally, you are just a passenger. You can be as tired, distracted, drunk, or high as you like; you’re not getting any criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.
This would actually work under the same theory as existing drunk-driving law. People can get ticketed for drunk driving while sleeping in their cars. Even if the cops never see you driving, you can get charged with drunk driving if they find you in a position where you could drive drunk. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to the vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.
I wouldn’t be against using teslas to clean up the deer overpopulation problem in the US. I’m in favor of rolling this code into all Tesla models in the next update.
Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
People are well known for never ever running over anything or anyone.
The autopilot knows deer can’t sue
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn’t whether Tesla is better or worse in any one situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept that. If Tesla is overall worse, then they shouldn’t be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I’ll accept a few edge cases where they are worse.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
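To make “overall” concrete, here’s a minimal sketch of the kind of apples-to-apples, per-mile comparison regulators would need. Every number in it is a made-up placeholder, not real Tesla or human crash data:

```python
# Minimal sketch of an exposure-adjusted safety comparison.
# All inputs below are made-up placeholders, NOT real crash statistics.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1_000_000)

# Hypothetical inputs: crashes and miles driven under comparable conditions
# (same road types, weather, time of day) for each group.
human_rate = crashes_per_million_miles(crashes=200, miles=100_000_000)
fsd_rate = crashes_per_million_miles(crashes=150, miles=100_000_000)

print(f"human: {human_rate:.2f} crashes per million miles")
print(f"FSD:   {fsd_rate:.2f} crashes per million miles")
```

The comparison only means something if the mileage is matched by conditions; comparing highway-only FSD miles against all human miles would bias the result.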
Being safer than humans is a decent starting point, but safety should be maximized to the best of a machine’s capability, even if it means adding a sensor or two. Leaving screws loose on a Boeing airplane still leaves the plane safer than driving, so by that logic Boeing shouldn’t have to take responsibility either.
Given that they market it as “supervised”, the question only has to be “are humans safer when using this tool than when not using it?”
One of the cool things I’ve noticed since recent updates is the car giving a nudge to help keep me centered, even when I’m not using Autopilot
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.
It sure seems like they aren’t being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren’t telling the truth.