• randoot@lemmy.world · 20 days ago

    Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.

    • Tja@programming.dev · 19 days ago

      The driver is always to blame, even if it was on. They turn it off for marketing claims.

      PS: fuck elon

      • Sonicdemon86@lemmy.world · 20 days ago

        Mark Rober made a video testing the autopilot systems of several cars, including his own Tesla. The car turned off Autopilot right before he crashed through a styrofoam wall.

        • randoot@lemmy.world · 20 days ago

          This is how they claim Autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those crashes happen while Autopilot is “off”.
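          A quick back-of-the-envelope sketch in Python (all numbers invented purely for illustration, not real Tesla figures) of how that bookkeeping flips the comparison:

          ```python
          # Made-up illustrative numbers (NOT real Tesla data), just to show the
          # selection-bias trick: crashes where Autopilot disengaged moments before
          # impact get counted against the human driver instead of the system.

          MILES_AUTOPILOT = 1_000_000_000   # hypothetical miles driven on Autopilot
          MILES_MANUAL    = 1_000_000_000   # hypothetical miles driven manually

          crashes_started_on_autopilot = 100   # hypothetical
          disengaged_just_before_hit   = 80    # of those, "off" at moment of impact
          crashes_purely_manual        = 120   # hypothetical

          def per_million(crashes, miles):
              """Crashes per million miles driven."""
              return crashes / miles * 1e6

          # Honest bookkeeping: the crash counts as an Autopilot crash.
          print(per_million(crashes_started_on_autopilot, MILES_AUTOPILOT))            # 0.10

          # "Off at impact" bookkeeping: most of them move to the human column.
          print(per_million(crashes_started_on_autopilot - disengaged_just_before_hit,
                            MILES_AUTOPILOT))                                          # 0.02
          print(per_million(crashes_purely_manual + disengaged_just_before_hit,
                            MILES_MANUAL))                                             # 0.20
          ```

          Same crashes, same miles; only the labeling changes, and suddenly the “Autopilot” column looks ten times safer than the human column it just inflated.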

      • JcbAzPx@lemmy.world · 20 days ago

        Most states apply liability to whoever is in the driver’s seat anyway. If you are operating the vehicle, even if you’re not controlling it at that moment, you are expected to maintain safe operation.

        That’s why the Uber self-driving car that killed someone was considered the test driver’s fault and left Uber mostly off the hook.

        Not sure how it works for the robotaxis, though.

  • supersquirrel@sopuli.xyz · 20 days ago

    Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian Genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies denying claims, and in landlord software colluding to raise rents.

    The economic function of AI is to abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that we are in a wild amount of danger.

    Just substitute for Elon the vague idea of a company that will become a legal and ethical scapegoat for brutal choices made by individual humans.

  • ZkhqrD5o@lemmy.world · 20 days ago

    Tldr: Take the train and be safe.

    Rant: In the EU, you are 35x more likely to die in a car crash than in a train crash. The union has created the so-called Vision Zero program, which aims to reach zero road deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by that arbitrarily chosen date, we focused on the real-world solution we already have? But well, the car industry’s investors would make less money, so I can answer that myself. :(

    Edit: Also, Musk is a Nazi cunt who should die of cancer.

    • bleistift2@sopuli.xyz · 20 days ago

      Speaking as a German: There are fewer train-related deaths because the trains don’t drive.

    • dorumon@lemm.ee · 20 days ago

      Too bad I live in hell country, where there are no sidewalks or public transportation, just roads, and we have fascist dipshits bought out by big car companies. I would love to take a train or a bus, but that stuff doesn’t exist here and never will until we re-educate and remake America from the ground up. America is just too far gone at this rate to even want public transportation, or even bike lanes. Cities would rather destroy themselves for big-box stores and highways, thinking they are a good thing, only to realize that ensures they will cease to exist alongside their local businesses. I’m sorry, but I’m forced to walk on the road and nearly get run over, legally speaking with zero repercussions for the driver, because I shouldn’t have been walking on the road anyway.

  • guywithoutaname@lemm.ee · 20 days ago

    I’d imagine you are always responsible for what you do when you’re driving, even if a system like autopilot is helping you drive.

    • NιƙƙιDιɱҽʂ@lemmy.world · 20 days ago

      Nah, it just disengages a fraction of a second before impact so they can claim “it wasn’t engaged at the moment of impact, so not our responsibility.”

      There were rumours about this for ages, but I honestly didn’t fully buy it until I saw it in Mark Rober’s vision vs. lidar video and various other follow-ups to it.

      • Tja@programming.dev · 19 days ago

        It’s not about responsibility, it’s about marketing. At no point do they assume responsibility, same as any Level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does when on Autopilot (or the so-called “Full Self-Driving”). It’s just a lane-keeping assistant.

        If you trust your life (or the lives of others) to a lane-keeping assistant, you deserve to go to jail, be it Tesla, VW, or BYD.

    • koper@feddit.nl · 20 days ago

      Don’t worry, DOGE will just fire the investigators before that happens.