OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

  • Fleur_@lemm.ee · 2 months ago

    I cannot support tesla now that I know they aren’t vegan smh

  • Rhaedas@fedia.io · 2 months ago

    Is there a longer video anywhere? Looking closely I have to wonder where the hell did that deer come from? There's a car up ahead of the Tesla in the same lane; I presume it quickly moved back in once it passed the deer? The deer didn't spook or anything from that car?

    This would have been hard for a human driver to avoid hitting, but the real issue is that the right equipment would have done better than human vision, and that should be the goal. And it didn't detect the impact either, since it didn't stop.

    But I just think it's peculiar that the deer just literally popped up there without any sign of motion.

    • TimeSquirrel@kbin.melroy.org · 2 months ago

      Ever hear the phrase “like a deer caught in headlights”? That’s what they do. They see oncoming headlights and just freeze.

      • Rhaedas@fedia.io · 2 months ago

        It depends. If it’s on the side of the road it may do the opposite and jump in front of you. This one actually looked like it was going to start moving, but not a chance.

        It’s the gap between where the deer is in the dark and the car in front that’s odd. Only thing I can figure is the person was in the other lane and darted over just after passing the deer.

        • snooggums@lemmy.world · 2 months ago

          The front car is probably further ahead than you think, and a deer can move onto the road quickly and freeze when looking at headlights or slow down if confused. I think in this case the deer was facing away and may not have even heard the vehicle approaching so it wasn’t trying to avoid danger.

          I avoided a deer in a similar situation while driving last week, and the car ahead of us was closer than this clip. Just had to brake and change lanes.

      • Pandemanium@lemm.ee · 2 months ago

        That’s why you flash your lights on and off at them, to get them to unfreeze before you get too close.

    • Buelldozer@lemmy.today · edited · 2 months ago

      Is there a longer video anywhere? Looking closely I have to wonder where the hell did that deer come from?

      I have the same question. If you watch the video closely, the deer is located a few feet before the 2nd reflector post you see at the start of the video. At that point in time the car in front is maybe 20' beyond the post, which means they should have encountered the deer within the last 30-40 feet, but there was no visible reaction.

      You can also see both the left and right sides of the road at the reflector well before the deer is visible (you can even make out a small shrub off the road on the right), but somehow can't see the deer enter the road from either side?!

      It’s like the thing just teleported into the middle of the lane.

      The more I watch this the more suspicious I am that the video was edited.
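
As a rough sanity check on the gap described above, here is a quick sketch of how long a 30-40 ft stretch of road lasts at typical rural speeds. The speeds are assumptions; the clip doesn't report one:

```python
# Rough sanity check: how briefly would the lead car have had the deer
# in view over the claimed 30-40 ft gap? Speeds below are assumptions.
FT_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def feet_per_second(mph: float) -> float:
    """Convert a speed in mph to feet per second."""
    return mph * FT_PER_MILE / SECONDS_PER_HOUR

for mph in (45, 60, 70):
    fps = feet_per_second(mph)
    t = 40 / fps  # seconds to cover a 40 ft gap
    print(f"{mph} mph = {fps:.0f} ft/s -> 40 ft passes in {t:.2f} s")
```

At highway speed the whole gap passes in under half a second, which is consistent with the lead car showing no visible reaction.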

  • Hubi@feddit.org · 2 months ago

    The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.

    How are these people always such pathetic suckers?

    • bluGill@fedia.io · 2 months ago

      Deer on the road is an edge case that humans cannot handle well. In general every option other than hitting the deer is overall worse - which is why most insurance companies won’t increase your rates if you hit a deer and file a claim for repairs.

      The only way to not hit/kill hundreds of deer (thousands? I don't know the number) every year is to reduce rural speed limits to unreasonably slow speeds. Deer jump out of dark places right in front of cars all the time, and every option to avoid one is risky: swerve into another lane (which sometimes means into an oncoming car), or into the ditch (you have no clue what might be there; if you are lucky the car just rolls, but there could be large rocks or strong fence posts and the car stops instantly). Note that this all happens fast - you can't think, you only get to react. Drivers in rural areas are taught to hit the brakes and maintain their lane.

      • Hubi@feddit.org · 2 months ago

        The problem is not that the deer was hit, a human driver may have done so as well. The actual issue is that the car didn’t do anything to avoid hitting it. It didn’t even register that the deer was there and, what’s even worse, that there was an accident. It just continued on as if nothing happened.

        • snooggums@lemmy.world · 2 months ago

          Yeah, the automated system should be better than a human. That is the whole point of collision detection systems!

          • AA5B@lemmy.world · 2 months ago

            Right. I was trying to decide whether to mention that deer can be hard to spot in time. Even in the middle of the road like this, they're non-reflective and there may be no movement to catch the eye. It's very possible for a human to be zoning out and not notice this deer in time.

            But yeah, this is where we need the car to help. This is what the car should be better than a human at. This is what would make AI a good tool to improve safety - if it had actually seen the deer.

      • 0x0@programming.dev · 2 months ago

        In general every option other than hitting the deer is overall worse

        You’re wrong. The clear solution here is to open suicide-prevention clinics for the depressed deer.

    • leftytighty@slrpnk.net · 2 months ago

      Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position: they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a “snowflake liberal” by comparison.

    • tacosanonymous@lemm.ee · 2 months ago

      Sunk cost? Tech worship?

      I’m so jaded, I question my wife when she says the sun will rise tomorrow so I really don’t get it either.

    • AA5B@lemmy.world · 2 months ago

      I'd go even farther and say most driving is an edge case. I used the 30-day trial of Full Self-Driving and the results were eye-opening. Not because of how it did - it was pretty much as expected - but because of where it went wrong.

      Full self driving did very well in “normal” cases, but I never realized just how much of driving was an “edge” case. Lane markers faded? No road edge but the ditch? Construction? Pothole? Debris? Other car does something they shouldn’t have? Traffic lights not aligned in front of you so it’s not clear what lane? Intersection not aligned so you can’t just go straight across? People intruding? Contradictory signs? Signs covered by tree branches? No sight line when turning?

      After that experiment, it seems like “edge” cases are more common than “normal” cases when driving. Humans just handle them without thinking about it, but the car needs more work here.

  • themurphy@lemmy.ml · 2 months ago

    I hate Tesla as much as the next guy in here.

    But I learned in my driving lessons that you shouldn't hit the brakes for animals running into your lane, because it can result in a car crash that's way worse (think of a truck behind you with a much longer braking distance).

    Don’t know if there’s different rules.

    • BakerBagel@midwest.social · 2 months ago

      You absolutely need to hit the brakes, but don't swerve. A deer weighs over 200 lbs and will likely crash into your windshield if you hit it head on. You need to safely lose as much speed as you can, because even a side hit on the deer is likely to wreck your axle and prevent you from driving.

    • snooggums@lemmy.world · 2 months ago

      You learned wrong if you think that is a universal rule for all animals.

      You might have been told that for small animals like squirrels, but that is more about not overreacting. You should absolutely brake for a deer, whether or not you are being tailgated, just like you would brake for any large object on the road.

      Hitting a deer at speed is going to cause far more problems for you AND the people behind you than trying to not hit the deer.

      • themurphy@lemmy.ml · 2 months ago

        You're probably right. I've encountered maybe two or three deer running out in front of my car so far, and I hit the brakes every time in pure reflex anyway.

        Dodged them so far, but damn I’m scared I might hit one at some point.

    • Windex007@lemmy.world · 2 months ago

      If you watch the video, the deer was standing on a strip of off-coloured pavement and was about the same length as a dash of the lane line. Not sure how much colour information comes through at night on those cameras.

      The point here isn’t actually “should it have stopped for the deer” , it’s “if the system can’t even see the deer, how could it be expected to distinguish between a deer and a child?”

      The calculus changes incredibly between a deer and a child.

      • Kecessa@sh.itjust.works · 2 months ago

        At the same time, it would have detected it if it were using radar, but Musk decided that cameras are the future (contrary to all other brands).

        • Windex007@lemmy.world · 2 months ago

          Yeah. I mean, I understand the premise, I just think it’s flawed. Like, you and I as vehicle operators use two cameras when we drive (our two eyes). It’s hypothetically sufficient in terms of raw data input.

          Where it falls apart is that we also have brains which have evolved in ways we don’t even understand to consume those inputs effectively.

          But most importantly, why aim for parity at all? Why NOT give our cars the tools to “see” better than a human? I want that!

          • Turbonics@lemmy.sdf.org · 2 months ago

            No human could have avoided that deer without swerving their car.

            Lidar provides superhuman vision that works in the dark and through fog. Elon is making a car that sees like a human and ignoring all the limits we have that can be solved in other ways.

            A human is a general purpose organism. We are not designed as specialized driving machines.

            • Windex007@lemmy.world · 2 months ago

              I completely agree that if there are tools that can allow a vehicle to “see” better than a human it’s absurd not to implement them. Even if musk could make a car exactly as good as a human, that’s a low bar. It isn’t good enough.

              As for humans: if you are operating a vehicle such that you could not avoid killing an unexpected person on the road, you are not safely operating the vehicle. In this case, it’s known as “over driving your headlights”, you are driving at a speed that precludes you from reacting appropriately by the time you can perceive an issue.

              Imagine if it wasn't a deer but a chunk of concrete that would kill you if struck at speed. Perhaps a boulder on a mountain pass, or a vehicle that has broken down.

              Does Musk’s system operate safely? No. The fact that it was a deer is completely irrelevant.
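
The "over-driving your headlights" point above can be illustrated with standard stopping-distance physics. This is only a sketch; the reaction time, friction coefficient, and low-beam range below are assumed round numbers, not measured values:

```python
# "Over-driving your headlights": if total stopping distance exceeds
# the distance your headlights illuminate, you cannot stop in time for
# something you only see when it enters the beam. All constants assumed.
G = 9.81           # gravitational acceleration, m/s^2
MU = 0.7           # assumed tyre-road friction (dry asphalt)
REACTION_S = 1.5   # assumed driver reaction time, seconds
LOW_BEAM_M = 50.0  # assumed low-beam illumination range, metres

def stopping_distance_m(speed_kmh: float) -> float:
    """Reaction distance plus kinematic braking distance, in metres."""
    v = speed_kmh / 3.6              # convert km/h to m/s
    reaction = v * REACTION_S        # distance covered before braking starts
    braking = v * v / (2 * MU * G)   # v^2 / (2*mu*g)
    return reaction + braking

for kmh in (60, 80, 100):
    d = stopping_distance_m(kmh)
    verdict = "OK" if d <= LOW_BEAM_M else "over-driving low beams"
    print(f"{kmh} km/h: stops in ~{d:.0f} m -> {verdict}")
```

With these assumptions, anything much above 60 km/h already out-runs low beams, which is why an unlit obstacle is unavoidable at highway speed without better sensing or slower driving.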

  • Dark Arc@social.packetloss.gg · 2 months ago

    The one thing I will say is this isn't a human… Deer probably aren't in their training data at anywhere near the rates humans are.

    It's definitely still concerning, but also still maybe more trustworthy than some human drivers. We seriously give licenses to too many people. Within the last week I've seen a guy who drifted into the other lane by like 4' multiple times, and I also saw a lady who blocked two lanes of traffic so she could make an illegal U-turn on a four-lane city street (rather than, you know, turning off onto a side street or into one of many nearby parking lots and turning around).

    • snooggums@lemmy.world · 2 months ago

      Deer and other animals are extremely common on rural roads. If they don’t have enough training data then they are being willfully incompetent.

  • Jo Miran@lemmy.ml · 2 months ago

    Oooooh, can we shut Elon down? I mean literally shut down actual Elon. Does he have an off switch? He’s gone wonky and I’d like to turn him off now.

    • bluGill@fedia.io · 2 months ago

      Deer often travel in herds, so where there is one there are often more. In rural areas you can go miles without seeing one and then see 10 in a few hundred feet. There are deer in those miles where you didn't see them as well; they just happened not to be near the road.

  • Evil_Shrubbery@lemm.ee · 2 months ago

    It was an illegal deer immigrant, it recognised it, added it to the database on Tesla servers, and mowed it down before it took any jobs or whatever the hate-concern was.

    /s

    … but some actual technically human people do the same when they see an animal, don’t they?
    :(

    • snooggums@lemmy.world · 2 months ago

      … but some actual technically human people do the same when they see an animal, don’t they?

      Not deer…

  • Nytixus@kbin.melroy.org · 2 months ago

    I roll my eyes at the dishonest, bad-faith takes in the comments about how people do the same thing behind the wheel, as if that makes autopiloting self-driving cars an exception. At least a person can react, slow down, or do anything that an unthinking, going-by-the-pixels computer can't do on a whim.

    • Lets_Eat_Grandma@lemm.ee · 2 months ago

      How come human drivers have more fatalities and injuries per mile driven?

      Musk can die in a fire, but self-driving car tech seems to be vastly safer than human drivers when you do apples-to-apples comparisons. It's like wearing a seatbelt: you certainly don't need one to go from point A to point B, but you're definitely safer with it, even if you are giving up a little control. And like a seatbelt, you can always take it off.
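
A per-mile comparison like the one above can be sketched as follows. The figures below are placeholders, not real statistics, and a fair comparison would also need to match road types, since driver-assist miles skew toward easy highway driving:

```python
# Sketch of a per-mile fatality-rate comparison. The inputs are
# PLACEHOLDER numbers for illustration only, not real crash statistics.
def fatalities_per_100m_miles(fatalities: int, miles: float) -> float:
    """Normalize a fatality count to a rate per 100 million miles."""
    return fatalities / (miles / 100_000_000)

# Hypothetical populations: many human-driven miles vs. fewer assisted miles.
human = fatalities_per_100m_miles(fatalities=150, miles=10_000_000_000)
assisted = fatalities_per_100m_miles(fatalities=5, miles=1_000_000_000)

print(f"human:    {human:.2f} fatalities per 100M miles")
print(f"assisted: {assisted:.2f} fatalities per 100M miles")
```

Normalizing per mile is what makes the comparison meaningful at all; raw counts tell you nothing when the two groups drive vastly different total distances.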

  • whotookkarl@lemmy.world · 2 months ago

    It doesn't have to never kill people to be an improvement, it just has to kill fewer people than people do.

    • ano_ba_to@sopuli.xyz · 2 months ago

      That's a low bar when you consider how stringent airline safety is in comparison, and flying kills far fewer people than driving does. If sensors can save people's lives, then knowingly leaving them out for profit is intentionally malicious.

  • brbposting@sh.itjust.works · 2 months ago

    Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.

  • Grangle1@lemm.ee · 2 months ago

    I know a lot of people here are/will be mad at Musk simply for personal political disagreement, but even just putting that aside, I’ve never liked the idea of self-driving cars. There’s just too much that can go wrong too easily, and in a 1-ton piece of metal and glass moving at speeds up to near 100 mph, you need to be able to have the control enough to respond within a few seconds if the unexpected happens, like a deer jumping in the middle of the road. Computers don’t, and may never, have the benefit of contextual awareness to make the right decision as often as a human would in those situations. I’m not going to cheer for the downfall of Musk or Tesla as a whole, but they do severely need to reconsider this idea or else there will be a lot of people hurt and/or killed and a lot of liability on them when it happens. That’s a lot of risk to take on for a smaller auto maker like them, just thinking in business terms.

    • dependencyinjection@discuss.tchncs.de · 2 months ago

      I mean we do let humans drive cars and some of them are as dumb as bricks and some are malicious little freaks.

      Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is it would have to be all or nothing - you can't have a mix of robots and people driving around.

      • VonReposti@feddit.dk · 2 months ago

        The problem is that with dumb drivers you can easily place blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can't really blame the driver, since he wasn't driving the car, but neither was the engineer or the company itself. We'd have to draw up entirely new frameworks in order to define and place criminal negligence, if any should exist. Is the company responsible for a malicious developer? Is the company responsible for a driver ignoring a set guideline and sitting impaired behind the emergency stop? Is the driver responsible for a software fault?

        All of these questions and many more need to be answered. Some probably can't be and must remain so-called “acts of God” with no blame to place. And people are not fond of blaming just the software; they're out for blood when an accident happens, and software doesn't bleed. Of course, the above questions might be the easiest to answer, but the point still stands.

        • WoodScientist@lemmy.world · 2 months ago

          Full self-driving should only be implemented when the system is good enough to completely take over all driving functions, and it should only be available in vehicles without steering wheels. The Tesla solution of having “self driving” but relying on the copout of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road.

          Systems like Tesla's FSD officially require you to always be ready to intervene at a moment's notice. They know their system isn't ready for independent use yet, so they require that manual input. But of course this encourages disengaged driving; no one actually pays attention to the road like they should, ready to intervene at a moment's notice. Tesla's FSD imitates true self-driving, but it pawns off the liability to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can't truly perform better than a human, it's better for humans to be the only ones actively driving the vehicle.

          This also solves the civil liability problem. Tesla’s current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn’t even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn’t good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.

          This also resolves criminal liability. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn't a driver anymore, only passengers. Even if you're sitting in the seat that would normally be the driver's seat, it doesn't matter; you are just a passenger legally. You can be as tired, distracted, drunk, or high as you like, and you're not getting any criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.

          This would actually work under the same theory as existing drunk-driving law. People can get ticketed for drunk driving for sleeping in their cars: even if the cops never see you driving, you can get charged if they find you in a position where you could drunk drive. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to a vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.