Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg

The 45-minute video was meant to demonstrate v12 of Tesla’s Full Self-Driving but ended up being a list of thi…

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

  • NotYourSocialWorker@feddit.nu · 10 months ago

    If most drivers are rolling through stop signs and you’re the only one stopping completely, then while you might technically be in the right, your behaviour could lead to accidents because it’s unpredictable.

    Simply no. If you as a driver aren’t prepared for the car in front of you to actually stop where a sign says stop, and if you aren’t keeping enough distance to be able to brake, then the car in front isn’t the problem or the cause of the accident; it’s you and only you.

    The same applies to speeding. Driving significantly slower than the flow of traffic might slow it down, leading to unsafe overtakings and such.

    Again no. If they are driving at the posted speed, keeping a steady pace and driving predictably, then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtakings and such”. Also, just because you, from your vantage point, can’t see a reason for the car in front of you driving slowly doesn’t mean that there isn’t one.

    While a dose of humility is good, a dose of personal responsibility is also great.

    • Thorny_Thicket@sopuli.xyz · 10 months ago

      then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtakings and such”.

      I’m not claiming this is so, but it’s conceivable that if the autonomous vehicle drives slightly over the speed limit, with the flow of traffic, it may actually lead to a statistically significant drop in accidents compared to the scenario where it strictly follows the speed limit. Yes, no one is forcing other drivers to behave that way, but they do, and because of that, people die. In that case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean choosing to kill and injure more people.
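
      To make that trade-off concrete, here is a toy calculation (a minimal sketch in Python; the risk curve and every number in it are invented for illustration, not real crash statistics):

      ```python
      # Toy expected-risk comparison (all numbers invented for illustration):
      # an AV that obeys the posted limit exactly versus one that matches the
      # slightly faster flow of traffic. The assumed mechanism is that a
      # larger speed differential between vehicles raises interaction risk.

      def crash_risk(speed_delta_kmh: float) -> float:
          """Hypothetical per-trip crash probability as a function of the
          AV's speed difference from surrounding traffic; the quadratic
          shape is an assumption, not an empirical result."""
          base_risk = 1e-6
          return base_risk * (1 + (speed_delta_kmh / 5) ** 2)

      flow_speed = 110   # km/h, what traffic around the AV actually drives
      speed_limit = 100  # km/h, the posted limit

      risk_obey = crash_risk(flow_speed - speed_limit)  # obey the limit
      risk_flow = crash_risk(0)                         # match the flow

      print(f"obey limit: {risk_obey:.2e} expected crashes per trip")
      print(f"match flow: {risk_flow:.2e} expected crashes per trip")
      # With these made-up parameters, strictly following the rules comes
      # out about 5x riskier -- the paradox described above.
      ```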

      I don’t think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other examples where the genuinely safer thing to do isn’t what you’d intuitively expect.