• set_secret@lemmy.world
    2 months ago

    Verge articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns, but no, Tesla isn’t bad just because it’s Tesla.

    It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.

    Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.

    We’re left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).

    It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

    I feel like any opportunity to jump on the Elon hate wagon is getting tiresome (and yes, I hate Elon too).

    • PersnickityPenguin@lemm.ee
      2 months ago

      A couple of my criticisms of the article, which is about “Autopilot” and not FSD:

      -conflating Autopilot and FSD numbers; they are not interchangeable systems. They are separate code bases with different functionality.

      -the definition of “autopilot” seems to have been lifted from the aviation industry, where the term describes a system that controls the vector of a vehicle, i.e. its speed and direction. That’s all. That does seem like an accurate description of what the Autopilot system does, while “FSD” does not seem to live up to expectations, not being a true Level 5 driving system.

      Merriam Webster defines autopilot thusly:

      “A device for automatically steering ships, aircraft, and spacecraft; also: the automatic control provided by such a device”

    • WormFood@lemmy.world
      2 months ago

      a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)

      I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?

      and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model

  • antlion@lemmy.dbzer0.com
    2 months ago

    These numbers span from the earliest adopters up until August of last year - plenty of idiots using a cruise control system and trusting their lives to beta software. That’s not the same as the current FSD software.

    Your own car insurance isn’t based on your driving skill when you had your learners permit. When Tesla takes on the liability and insurance for CyberCab, you’ll know it’s much safer than human drivers.

    • Hegar@kbin.social
      2 months ago

      Plenty of idiots using a cruise control system and trusting their lives to beta software.

      Using it exactly as it was marketed doesn’t make you an idiot.

      • halcyoncmdr@lemmy.world
        2 months ago

        You really want to get into reality versus marketing in this world? Very little marketing actually shows real world products and use cases in a real world environment. Heck, advertising often doesn’t even show the actual product at all.

        Your McDonald’s burger is NEVER going to look like the marketing photo. You don’t want to get anywhere near that “ice cream” or “milkshake” from the ad either, mashed potatoes and glue are often used for those advertising replacements.

        This doesn’t even get into things like disclaimers and product warnings, or people ignoring them.

      • Thorny_Insight@lemm.ee
        2 months ago

        The car prompts you every single time you enable this system to keep your eyes on the road and be prepared to take over at any moment.

  • root@precious.net
    2 months ago

    There are some real Elon haters out there. I think they’re ugly as sin but I’m happy to see more people driving vehicles with all the crazy safety features, even if they aren’t perfect.

    You’re in control of a massive vehicle capable of killing people and destroying property, you’re responsible for it.

    • Thorny_Insight@lemm.ee
      2 months ago

      I’m quite certain that there will be some humble pie served to the haters in not too distant future. The performance of FSD 12.3.5 is all the proof you need that an actual robotaxi is just around the corner. Disagree with me all you want. All we need to do is wait and see.

      However I’m also sure that the cognitive dissonance is going to be so strong for many of these people that even a mountain of evidence is not going to change their mind about it because it’s not based in reason in the first place but emotions.

  • Toes♀@ani.social
    2 months ago

    This is speculation, but were most of them from people who disabled the safety features?

  • tearsintherain@leminal.space
    2 months ago

    Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it’s too late, money already in the bank.

  • AutoTL;DR@lemmings.worldB
    2 months ago

    This is the best summary I could come up with:


    In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation that published today.

    The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

    NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.

    Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

    Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.

    The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.


    The original article contains 788 words, the summary contains 158 words. Saved 80%. I’m a bot and I’m open source!

    • ForgotAboutDre@lemmy.world
      2 months ago

      Cameras and AI aren’t a match for radar/lidar. This is the big issue with Tesla’s approach to autonomy: you’ve only a guess whether there are hazards in the way.

      Most algorithms are designed to work and then statistically tested to validate that they work. When you develop an algorithm with AI/machine learning, there is only the statistical step, and you have to infer whole-system performance purely from that. There isn’t a separate process for verification and validation; it’s validation alone.

      When something is developed with only statistical evidence that it works, you can’t be reliably sure it works in most scenarios, only in the exact ones you tested for. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm is obscured and uncertain (unless it’s only used for parameter optimisation).

      Machine learning is never used because it’s a better approach; it’s only used when the engineers don’t know how to develop the algorithm. Once you understand this, you understand the hazard it presents. If you don’t understand this, or refuse to, you build machines that drive into children. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.
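The validation-only point above can be made concrete with the classic "rule of three" from statistics: a test campaign that observes zero failures only bounds the failure rate, it says nothing about scenarios outside the test set. A minimal sketch with illustrative numbers (not Tesla data):

```python
# Rule of three: after N independent trials with zero failures, an
# approximate 95% upper confidence bound on the per-trial failure
# rate is 3/N. Zero observed failures never proves a rate of zero.

def failure_rate_upper_bound(trials_without_failure: int) -> float:
    """Approximate 95% upper bound on the failure rate after zero failures."""
    return 3.0 / trials_without_failure

# Even a million clean test drives only bounds the failure rate at
# about 3 in a million per drive - and only for the conditions tested.
print(failure_rate_upper_bound(1_000_000))  # 3e-06
```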

  • curiousPJ@lemmy.world
    2 months ago

    If Red Bull can be successfully sued for false advertising from their slogan “It gives you wings”, I think it stands that Tesla should too.

  • froh42@lemmy.world
    2 months ago

    “If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.

    That’s a very problematic claim - it might only be true if you compare completely unassisted vehicles to L2 Teslas.

    Other brands also have a plethora of L2 features, but they are marketed and designed differently: the L2 features are active, but designed in a way that keeps the driver engaged in driving.

    So L2 features are for better safety, not for a “wow we live in the future” show effect.

    For example, lane keeping in my car: you don’t notice it when driving, it is just below your level of attention. But when I’m unconcentrated for a moment, the car just stays in the lane, even on curving roads. It’s simply designed to steer a bit later than I would.

    Adaptive speed control is just sold as adaptive speed control - though I did notice it uses radar AND the cameras: it considers my lane free as soon as the car in front of me clears the lane markings with its wheels (when changing lanes).

    It feels like the software in my car could do a lot more, but its features are undersold.

    The combination of a human driver and the driver-assist systems makes driving a lot safer than relying on the human or the machine alone.

    In fact, the braking assistant once stopped my car in tight traffic before I could even react, when the guy in front of me suddenly slammed on their brakes. If the system had failed to detect the situation, it would have been my job to react in time. (I did react, but I can’t say whether I would have been fast enough.)

    What Tesla does with technology is impressive, but I feel the system could be so much better if they didn’t compromise safety in the name of marketing and hyperbole.

    If Tesla’s Autopilot were designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.

    I feel they are rather designed to be able to show off “cool stuff”.
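The "statistically significant amount of data" in the quoted Musk claim can be sketched as a simple two-sample Poisson rate comparison (normal approximation). All counts and mileages below are invented for illustration, not real Tesla or NHTSA figures:

```python
import math

# z-score for the difference between two crash rates per mile, treating
# crash counts as Poisson. |z| > 1.96 is roughly significant at the 95%
# level - though it says nothing about confounders like road type.

def rate_z_score(events_a: int, miles_a: float,
                 events_b: int, miles_b: float) -> float:
    rate_a, rate_b = events_a / miles_a, events_b / miles_b
    # variance of a Poisson rate estimate is events / miles^2
    se = math.sqrt(events_a / miles_a**2 + events_b / miles_b**2)
    return (rate_a - rate_b) / se

# Made-up example: 400 crashes in 300M assisted miles vs.
# 1500 crashes in 750M unassisted miles.
z = rate_z_score(400, 300e6, 1500, 750e6)
print(abs(z) > 1.96)  # True
```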

    • ForgotAboutDre@lemmy.world
      2 months ago

      Tesla’s Autopilot isn’t the best around; it’s just the most deployed and advertised. People creating autopilot systems responsibly don’t beta test them with the kind of idiots who think Tesla Autopilot is the best approach.

      • Thorny_Insight@lemm.ee
        2 months ago

        If Tesla’s self-driving isn’t the best one around then which one is? I’m not aware of any other system capable of doing what FSD does. Manufacturers like Mercedes may have more trust in their system because it only works on a limited number of hand-picked roads and under ideal conditions. I still wouldn’t say that what essentially is a train is better system for getting around than a car with full freedom to take you anywhere.

  • TypicalHog@lemm.ee
    2 months ago

    It only matters whether the autopilot causes more deaths than an average human driver over the same distance traveled.

    • PresidentCamacho@lemm.ee
      2 months ago

      This is the actual logical way to think about self driving cars. Stop down voting him because “Tesla bad” you fuckin goons.

      • doubtingtammy@lemmy.ml
        2 months ago

        It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.

        And where’s the LIDAR again?

        • PresidentCamacho@lemm.ee
          2 months ago

          My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If the self-driving cars kill 500 people a year but humans kill 1,000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…

        • TypicalHog@lemm.ee
          2 months ago

          I SAID: “IT ONLY MATTERS IF AUTOPILOT CAUSES MORE NET DEATHS PER MILE TRAVELED RATHER THAN LESS, WHEN COMPARED TO HUMAN DRIVERS!”

      • gallopingsnail@lemmy.sdf.org
        2 months ago

        Tesla’s self driving appears to be less safe and causes more accidents than their competitors.

        NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

        Tesla bad.

        • Socsa@sh.itjust.works
          2 months ago

          I don’t quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?

        • TypicalHog@lemm.ee
          2 months ago

          Can you link me the data that says Tesla’s competitors’ self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

    • Geobloke@lemm.ee
      2 months ago

      No it doesn’t. Every life stolen matters, and if it turns out that Tesla skipped industry best practice that could have saved more lives, just so they could sell more cars, then that is on them.

    • SirEDCaLot@lemmy.today
      2 months ago

      This is 100% correct. Look at the average rate of crashes per mile driven with autopilot versus a human. If the autopilot number is lower, they’re doing it right and should be rewarded and NHTSA should leave them be. If the autopilot number is higher, then yes by all means bring in the regulation or whatever.
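The per-mile comparison described above is just normalization, but raw crash counts are easy to misread without it. A hedged sketch with invented numbers (not real NHTSA or fleet figures):

```python
# Normalize crash counts by miles driven before comparing. The inputs
# here are made up for illustration only.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Crash rate per million miles driven."""
    return crashes / (miles / 1_000_000)

human = crashes_per_million_miles(crashes=1_500, miles=750_000_000)    # 2.0
autopilot = crashes_per_million_miles(crashes=400, miles=300_000_000)  # ~1.33

# Caveat raised elsewhere in the thread: Autopilot miles skew toward
# highways, so the rates are only comparable after matching road type
# and driving conditions.
print(human > autopilot)  # True
```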

      • flerp@lemm.ee
        2 months ago

        Humans are extremely flawed beings, and if your standard for leaving companies alone to make as much money as possible is that they are at least marginally better than extremely flawed, I don’t want to live in the world you want to live in.

        • AdrianTheFrog@lemmy.world
          2 months ago

          Having anything that can save lives over an alternative is an improvement. In general. Yes, we should be pushing for safer self driving, and regulating that. But if we can start saving lives now, then sooner is better than later.

  • EmperorHenry@discuss.tchncs.de
    2 months ago

    And the pedestrian emergency brake on Tesla cars (and many other cars with that feature) will sometimes malfunction, causing people behind you to rear-end you.

  • bitwolf@lemmy.one
    2 months ago

    I just read a LinkedIn post from a laid-off Tesla engineer.

    He said “I checked my email while auto piloting to work”.

    The employees know its capabilities better than anyone, and they still take the same stupid risk.

    • n3m37h@sh.itjust.works
      2 months ago

      Just like in Fight Club, they’re imagining themselves crashing into every transport they come close to.

  • Betide@lemmy.world
    2 months ago

    The same people who are upset over self-driving cars are the ones who scream at the self-checkout that they shouldn’t have to scan their own groceries because the store isn’t paying them.

    32% of all traffic crash fatalities in the United States involve drunk drivers.

    I can’t wait until the day this kind of technology is required by law. I’m tired of sharing the road with these idiots, and I absolutely trust self-driving vehicles more than I trust other humans.

    • Flying Squid@lemmy.world
      2 months ago

      people who are upset over self driving cars

      If you are talking about Teslas, you can’t be upset about something a car doesn’t actually do unless you think it’s actually capable of doing it.

      The only thing I don’t like is that Tesla is able to claim it has a “full self driving” mode which is not full self driving. Seems like false advertising to me.

    • EvolvedTurtle@lemmy.world
      2 months ago

      I recently learned that at least half of the drivers where I live think it’s fine to cut me off, with no signal, while we’re going 70 mph on the highway.

    • TypicalHog@lemm.ee
      2 months ago

      I swear some people in this thread would call airplane autopilot bad because it causes SOME deaths from time to time.

    • letsgo@lemm.ee
      2 months ago

      OK.

      Question: how do you propose I get to work? It’s 15 miles, there are no trains, the buses are far too convoluted and take about 2 hours each way (no I’m not kidding), and “move house” is obviously going to take too long (“hey boss, some rando on the internet said “stop using cars” so do you mind if I take indefinite leave to sell my house and buy a closer one?”).

        • letsgo@lemm.ee
          2 months ago

          Sure, but the challenge was “Don’t use cars”, not “Don’t use cars where there is viable mass transit in place”.

        • letsgo@lemm.ee
          2 months ago

          I already have one (a Yamaha MT10), but presumably it has the same problem cars do (burning fossil fuels); also it’s no good in shit weather (yeah, I know, that means I need better clothing).