• over_clox@lemmy.world
    link
    fedilink
    English
    arrow-up
    36
    ·
    6 months ago

    They just recalled all the Cybertrucks, because their ‘smort’ technology is too stupid to realize when an accelerator sensor is stuck…

    • Jesus@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      edit-2
      6 months ago

      The accelerator sensor doesn’t get stuck; the pedal does. The face of the accelerator pedal comes off and wedges the pedal in the down position.

      • over_clox@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        6 months ago

        I realize it’s the pedal that gets stuck, but the computer registers the state of the pedal via a sensor.

        The computer should be smart enough to realize something ain’t right when it registers that both the accelerator and brake pedals are being pressed at the same time. And in that case, the brake should always take priority.

        • Jesus@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          6 months ago

          The stories I’ve heard around the recall have been saying that the brakes override the accelerator in the cyber truck.

  • TypicalHog@lemm.ee
    link
    fedilink
    English
    arrow-up
    32
    ·
    6 months ago

    It only matters if the autopilot kills more people than an average human driver would over the same distance traveled.

    • NIB@lemmy.world
      link
      fedilink
      English
      arrow-up
      38
      ·
      6 months ago

      If a car runs someone over at 30 km/h because it relies on cameras and a bug smashed into the camera and made the car go crazy, that is not acceptable, even if the cars crash “less than humans”.

      Self-driving needs to be highly regulated by law and required to have some bare minimum of sensors, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can’t see in snow or the dark or whatever. Anyone who has a phone knows how fucky a camera can get under specific light exposures, etc.

      No one but Tesla is doing camera-only “self driving”, and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his Bioshock uber-capitalistic dream. Who cares if a few people die in the process of developing vision-based self-driving.

      https://www.youtube.com/watch?v=Gm2x6CVIXiE

      • TypicalHog@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        ·
        6 months ago

        What are you? Some kind of lidar shill? Camera only should obviously be the endgame goal for all robots. Also, this article is not even about camera only.

          • TypicalHog@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            ·
            6 months ago

            Because that’s expensive and can be done with a camera. And once you figure the camera stuff out - you gucci. Now you can do all kinds of shit without needing a lidar on every single robot.

            • AdrianTheFrog@lemmy.world
              link
              fedilink
              English
              arrow-up
              9
              ·
              6 months ago

              Because that’s expensive and can be done with a camera.

              Expensive, as in probably less than $600? Compared to the $35,000 cost of a Tesla?

              (Comparing the cost of the iPhone 12 (without lidar) and the iPhone 12 Pro (with lidar), we can guess that the sensor probably costs less than $200, so three of them (for left, right, and front) would cost less than $600.)

              Lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (besides some makers of super-high-end models).

              There have been a lot of promising research papers on the technology lately, though, so I expect more, higher-resolution, and cheaper lidar sensors to be available relatively soon (in the next couple of years, probably).

              • Grippler@feddit.dk
                link
                fedilink
                English
                arrow-up
                2
                ·
                edit-2
                6 months ago

                Yeah, that’s not even remotely the same type of sensor used in robotics and autonomous cars. Yes, lidar is getting cheaper, but for high-detail, long-range detection the sensors are much more expensive than your iPhone example suggests. The iPhone “lidar” is less than useless in an automotive context.

              • TypicalHog@lemm.ee
                link
                fedilink
                English
                arrow-up
                1
                ·
                6 months ago

                Perhaps. Idk, maybe I’m wrong. But it sure seems like it would be so much better if we achieved the same shit with a cheaper, simpler sensor.

                • BURN@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  ·
                  6 months ago

                  To get the same resolution and image quality in all lighting scenarios, cameras are actually going to be more expensive than LiDAR. Cameras suffer in low-light, low-contrast situations due to the physical limits of bending light: more light means bigger lenses, which means higher cost, while LiDAR works better and is cheaper.

        • mojofrododojo@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          ·
          6 months ago

          Camera only should obviously be the endgame goal for all robots.

          I can’t tell if you’re a moron or attempting sarcasm but this is the least informed opinion I’ve seen in ages.

          • TypicalHog@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            ·
            6 months ago

            I wasn’t attempting sarcasm, so maybe I’m a moron, idk. Fair, it’s likely I’m uninformed. I just know my daddy Elon said something about how solving shit with cameras only is probably the best path and will pay off.

    • Geobloke@lemm.ee
      link
      fedilink
      English
      arrow-up
      18
      ·
      6 months ago

      No it doesn’t. Every life lost matters, and if it’s found that Tesla could have followed industry best practice and saved more lives, but didn’t so that they could sell more cars, then that is on them.

    • PresidentCamacho@lemm.ee
      cake
      link
      fedilink
      English
      arrow-up
      17
      ·
      6 months ago

      This is the actual logical way to think about self driving cars. Stop down voting him because “Tesla bad” you fuckin goons.

      • gallopingsnail@lemmy.sdf.org
        link
        fedilink
        English
        arrow-up
        14
        ·
        6 months ago

        Tesla’s self driving appears to be less safe and causes more accidents than their competitors.

        NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

        Tesla bad.

        • TypicalHog@lemm.ee
          link
          fedilink
          English
          arrow-up
          6
          ·
          6 months ago

          Can you link me the data that says Tesla’s competitors’ self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

        • PresidentCamacho@lemm.ee
          cake
          link
          fedilink
          English
          arrow-up
          4
          ·
          edit-2
          6 months ago

          My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year, but humans kill 1000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…

        • TypicalHog@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          6 months ago

          I SAID: “IT ONLY MATTERS IF AUTOPILOT CAUSES MORE NET DEATHS PER MILE TRAVELED RATHER THAN LESS, WHEN COMPARED TO HUMAN DRIVERS!”

    • mojofrododojo@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      6 months ago

      this is bullshit.

      A human can be held accountable for their failure; bet you a fucking emerald mine Musk won’t be held accountable for these and all the other foolish self-drive fuckups.

      • sabin@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        6 months ago

        So you’d rather live in a world where people die more often, just so you can punish the people who do the killing?

        • mojofrododojo@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          ·
          6 months ago

          That’s a terrifically misguided interpretation of what I said, wow.

          LISTEN UP BRIGHT LIGHTS, ACCOUNTABILITY ISN’T A LUXURY. It’s not some ‘nice to have add-on’.

          Musk’s gonna find out. Gonna break all his fanboys’ hearts too.

          • sabin@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            6 months ago

            Nothing was misguided and if anything your tone deaf attempt to double down only proves the point I’m making.

            This stopped being about human deaths for you a long time ago.

            Let’s not even bother to ask the question of whether or not this guy could ultimately be saving lives. All that matters to you is that you have a target to take your anger out on in the event that a loved one dies in an accident or something.

            You are shallow beyond belief.

            • mojofrododojo@lemmy.world
              link
              fedilink
              English
              arrow-up
              5
              ·
              6 months ago

              This stopped being about human deaths for you a long time ago.

              Nope, it’s about accountability. The fact that you can’t see how important accountability is just says you’re a musk fan boy. If Musk would shut the fuck up and do the work, he’d be better off - instead he’s cheaping out left and right on literal life dependent tech, so tesla’s stock gets a bump. It’s ridiculous, like your entire argument.

              • sabin@lemmy.world
                link
                fedilink
                English
                arrow-up
                3
                ·
                6 months ago

                I don’t give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I think the current state of Tesla’s Autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations where the tech can be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.

                If you need to make up lies about me and strawman me to disagree you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don’t care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.

                Your political opinions should be based on principles, not whatever feels convenient in the moment.

                • mojofrododojo@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  6 months ago

                  You stopped being a rational agent who weighs the good and bad of things a long time ago.

                  sure thing, you stan musk for no reason, and call me irrational. pfft. gonna block you now, tired of your bullshit

      • TypicalHog@lemm.ee
        link
        fedilink
        English
        arrow-up
        2
        ·
        6 months ago

        Where did I say that a human shouldn’t be held accountable for what their car does?

  • curiousPJ@lemmy.world
    link
    fedilink
    English
    arrow-up
    31
    ·
    6 months ago

    If Red Bull can be successfully sued for false advertising from their slogan “It gives you wings”, I think it stands that Tesla should too.

  • axo@feddit.de
    link
    fedilink
    English
    arrow-up
    29
    ·
    6 months ago

    According to the math in this video:

    • 150,000,000 miles have been driven with Tesla’s “FSD”, which comes to
    • 375 miles per Tesla purchased with FSD capabilities
    • 736 known FSD crashes with 17 fatalities
    • which equals 11.3 deaths per 100M miles of Tesla’s FSD

    Doesn’t sound too bad, until you hear that a human produces 1.35 deaths per 100M miles driven…

    It’s rough math, but holy moly, that is already a completely different class of deadly than a non-FSD car.
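    A quick sketch of that rough math, using the comment’s own numbers as-is (they are the video’s claims, not verified figures):

    ```python
    # Re-running the rough fatality math above. All inputs are the
    # comment's claimed figures, not verified data.
    fsd_miles = 150_000_000   # claimed miles driven on Tesla "FSD"
    fsd_deaths = 17           # fatalities in known FSD crashes
    human_rate = 1.35         # US average deaths per 100M vehicle miles

    fsd_rate = fsd_deaths / fsd_miles * 100_000_000
    print(f"FSD:   {fsd_rate:.1f} deaths per 100M miles")   # 11.3
    print(f"Human: {human_rate} deaths per 100M miles")
    print(f"Ratio: {fsd_rate / human_rate:.1f}x deadlier")  # 8.4x
    ```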

    • dufkm@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      6 months ago

      a human produces 1.35 deaths per 100M miles driven

      My car has been driven around 100k miles by a human, i.e. it has produced 0.00135 deaths. Is that like a third of a pinky toe?

    • NotMyOldRedditName@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      6 months ago

      That number is like 1.5 billion now and rising exponentially fast.

      Also, those deaths weren’t all FSD; they were AP.

      The report says 1 FSD related (not caused by but related) death. For whatever reason the full details on that one weren’t released.

      Edit: There are billions of miles on AP. In 2020 it was 3 billion

      Edit: Got home and tried finding AP numbers through 2024, but haven’t seen anything recent. Given 3 billion in 2020 and 2 billion in 2019, with an accelerating rate of usage as car sales increase, 2023 is probably closer to 8 billion miles. I imagine we’d hear when they reach 10 billion.

      So 8 billion miles and 16 AP fatalities (since that 1 FSD one isn’t the same system) is 1 fatality per 500,000,000 miles, or, put into the terms above, 0.2 fatalities per 100 million miles, about 6.75 times fewer than human drivers produce. And nearly all of these fatal accidents involved blatant misuse of the system, like driving drunk (at least a few) or using a phone and playing games.
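      Spelled out (using the estimated 8 billion miles and 16 fatalities above, which are guesses rather than official numbers):

      ```python
      # Fatality-rate estimate from the guessed figures above.
      ap_miles = 8_000_000_000   # estimated Autopilot miles through 2023
      ap_deaths = 16             # AP fatalities, excluding the 1 FSD-related one
      human_rate = 1.35          # human deaths per 100M vehicle miles

      ap_rate = ap_deaths / ap_miles * 100_000_000
      print(f"AP:    {ap_rate:.1f} deaths per 100M miles")   # 0.2
      print(f"Human/AP ratio: {human_rate / ap_rate:.2f}x")  # 6.75x
      ```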

  • kava@lemmy.world
    link
    fedilink
    English
    arrow-up
    27
    ·
    6 months ago

    Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small number.

    I would want an article to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

    Because while it’s clear by now Teslas aren’t the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.

    We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by something like 100 to get the total number of accidents.

    • Blackmist@feddit.uk
      link
      fedilink
      English
      arrow-up
      33
      ·
      6 months ago

      The question isn’t “are they safer than the average human driver?”

      The question is “who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?”

      Because if the answer is “nobody”, they shouldn’t be on the road. There’s zero accountability, and because it’s all wibbly-wobbly AI bullshit, there’s no way to prove that the issues are actually fixed.

        • Blackmist@feddit.uk
          link
          fedilink
          English
          arrow-up
          17
          ·
          6 months ago

          Accountability is important. If a human driver is dangerous, they get taken off the roads and/or sent to jail. If a self driving car kills somebody, it’s just “oops, oh well, these things happen, but shareholder make a lot of money so never mind”.

          I do not want “these things happen” on my headstone.

      • kava@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        6 months ago

        Because if the answer is “nobody”, they shouldn’t be on the road

        Do you understand how absurd this is? Let’s say AI driving results in 50% fewer deaths. That’s 20,000 people every year who aren’t going to die.

        And you reject that for what? Accountability? You said in another comment that you don’t want “shit happens sometimes” on your headstone.

        You do realize that’s exactly what’s going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. “Shit happens”

        By not changing it, ironically, you’re advocating for exactly what you claim you’re against.

      • slumberlust@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        6 months ago

        The question for me is not what margins the feature is performing at, as they will likely be better than human error rates, but how irresponsibly they market the product.

    • machinin@lemmy.world
      link
      fedilink
      English
      arrow-up
      13
      ·
      edit-2
      6 months ago

      I was looking up info for another comment and found this site. It’s from 2021, but the information seems solid.

      https://www.flyingpenguin.com/?p=35819

      This table was probably the most interesting:

      | Car | 2021 Sales So Far | Total Deaths |
      | --- | --- | --- |
      | Tesla Model S | 5,155 | 40 |
      | Porsche Taycan | 5,367 | ZERO |
      | Tesla Model X | 6,206 | 14 |
      | Volkswagen ID | 6,230 | ZERO |
      | Audi e-tron | 6,884 | ZERO |
      | Nissan Leaf | 7,729 | 2 |
      | Ford Mustang Mach-e | 12,975 | ZERO |
      | Chevrolet Bolt | 20,288 | 1 |
      | Tesla Model 3 | 51,510 | 87 |

      So many cars with zero deaths compared to Tesla.

      The question isn’t whether Tesla’s FSD is safer than humans; it’s whether Tesla is keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
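      One way to make the raw death counts comparable is to normalize by units sold. A rough sketch using the table’s figures (caveat: the death counts span each model’s whole production life while the sales column is 2021-only, so these rates are only meaningful relative to each other):

      ```python
      # Deaths per 1,000 units sold, from the table's figures.
      # Caveat: deaths are lifetime totals, sales are 2021-only,
      # so the rates only make sense relative to each other.
      data = {
          "Tesla Model S": (5_155, 40),
          "Porsche Taycan": (5_367, 0),
          "Tesla Model X": (6_206, 14),
          "Nissan Leaf": (7_729, 2),
          "Chevrolet Bolt": (20_288, 1),
          "Tesla Model 3": (51_510, 87),
      }
      for car, (sales, deaths) in data.items():
          rate = deaths / sales * 1_000
          print(f"{car:16s} {rate:5.2f} deaths per 1,000 sold")
      ```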

      • TypicalHog@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        6 months ago

        Wait. Do all of these have the same level and capability of self-driving as Tesla?

      • NιƙƙιDιɱҽʂ@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        6 months ago

        For example, I don’t really trust mine and mostly use it in slow bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.

    • suction@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      edit-2
      6 months ago

      Only Elon calls his level 2 automation “FSD” or even “Autopilot”. That alone proves that Tesla is more guilty of these deaths than other makers are who choose less evil marketing terms. The dummies who buy Elon’s crap take those terms at face value and the Nazi CEO knows that, he doesn’t care though because just like Trump he thinks of his fans as little more than maggots. Can’t say I blame him.

  • froh42@lemmy.world
    link
    fedilink
    English
    arrow-up
    25
    ·
    edit-2
    6 months ago

    “If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.

    That’s a very problematic claim, and it might only be true if you compare completely unassisted vehicles to L2 Teslas.

    Other brands also have a plethora of L2 features, but they are marketed and designed differently: the L2 features are active but designed to keep the driver engaged in driving.

    So L2 features are for better safety, not for a “wow we live in the future” show effect.

    For example, lane keeping in my car: you don’t notice it when driving; it sits just below your level of attention. But when I lose concentration for a moment, the car just stays in the lane, even on curving roads. It’s designed to steer a bit later than I would. (Also, the wheel turns slightly more easily in the direction that keeps the car centered in the lane than in the other direction; the effect is just below what you notice unless you concentrate on it.)

    Adaptive speed control is just sold as adaptive speed control. I did notice once that it uses radar AND the cameras: it considers my lane free as soon as the car in front of me clears the lane markings with its wheels when changing lanes.

    It feels like the software in my car could do a lot more, but its features are undersold.

    The combination of a human driver and the driver assist systems in combination makes driving a lot safer than relying on the human or the machine alone.

    In fact, the braking assistant once stopped my car in tight traffic before I could even react, when the guy in front of me suddenly slammed on the brakes. If the system had failed to detect the situation, it would have been my job to react in time. (I did react, but I can’t say whether I would have been fast enough.)

    What Tesla does with technology is impressive, but I feel the system could be so much better if they didn’t compromise safety in the name of marketing and hyperbole.

    If Tesla’s Autopilot were designed from the ground up to keep the driver engaged, I believe it would really make for the safest car on the road.

    I feel they are rather designed to be able to show off “cool stuff”.

    • ForgotAboutDre@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      6 months ago

      Tesla’s autopilot isn’t the best around. It’s just the most deployed and advertised. People creating autopilot systems responsibly don’t beta-test them with the kind of idiots who think Tesla’s approach is the best one.

      • Thorny_Insight@lemm.ee
        link
        fedilink
        English
        arrow-up
        4
        ·
        edit-2
        6 months ago

        If Tesla’s self-driving isn’t the best one around, then which one is? I’m not aware of any other system capable of doing what FSD does. Manufacturers like Mercedes may have more trust in their systems because they only work on a limited number of hand-picked roads under ideal conditions. I still wouldn’t say that what is essentially a train is a better system for getting around than a car with full freedom to take you anywhere.

        • machinin@lemmy.world
          link
          fedilink
          English
          arrow-up
          19
          ·
          edit-2
          6 months ago

          All throughout these comments, you seem deeply, deeply confused. Let’s go over this sloooowly.

          Mercedes has two autonomous systems. Let’s call them MB FSD and MB Autodrive.

          MB FSD has similar features to Tesla’s. It isn’t geo-restricted. You have to pay attention, just like Tesla. It isn’t true autonomous driving, just like Tesla. If you have an accident, you are responsible, just like Tesla.

          MB Autodrive is another feature set. It is L3 autonomy, which means it is limited geographically and the driver should be available to take over when prompted. It also means that the driving is completely autonomous. The driver can be reading, playing on their phone, or simply laying there with their eyes closed. Mercedes will even take legal and financial responsibility for any accidents that happen on their system.

          So, to summarize:

          FSD-type systems: Mercedes and Tesla (and many other car makers)

          Level 3: Mercedes, but not Tesla

          True autonomous driving is when the manufacturer takes responsibility for the car’s actions. Anything else is assisted driving. Until Tesla takes responsibility for accidents, you can’t consider them to have certified autonomous driving.

          Is that any clearer to you? After seeing some of your other shilling for Tesla in other posts, maybe there is a reason you don’t want to recognize the advantages of other systems?

          • suction@lemmy.world
            link
            fedilink
            English
            arrow-up
            8
            ·
            6 months ago

            Absolutely correct. It’s so disheartening how many guys like him are out there, hurting us all with their admiration for con men like Trump and Musk and their absolute inability to fact-check.

        • suction@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          ·
          6 months ago

          It’s level 2 automation; a lot of other makers have that. You need to look past the juicy marketing language: there are standards and norms Tesla cannot go beyond, because then it would be illegal to drive the cars on public roads.

  • set_secret@lemmy.world
    link
    fedilink
    English
    arrow-up
    23
    ·
    6 months ago

    VERGE articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns, but no, Tesla isn’t bad just because it’s Tesla.

    It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.

    Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.

    We’re left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).

    It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

    I feel like any opportunity to jump on the Elon hate wagon is getting tiresome. (and yes i hate Elon too).

    • WormFood@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      ·
      6 months ago

      a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)

      I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?

      and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model

      • mojofrododojo@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        6 months ago

        a driver who was fully asleep. or maybe a driver who was dead?

        Why does it need to become a specious comparison to be valid, in your expert opinion? Those comparisons are worthless.

    • PersnickityPenguin@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      6 months ago

      A couple of my criticisms of the article, which is about “Autopilot” and not FSD:

      -conflating Autopilot and FSD numbers; they are not interchangeable systems. They are separate code bases with different functionality.

      -the definition of “autopilot” seems to have been lifted from the aviation industry, where the term describes a system that controls the vector of a vehicle, i.e. its speed and direction. That’s all. That does seem like a correct description of what the Autopilot system does, while “FSD” does not live up to expectations, not being a true Level 5 driving system.

      Merriam Webster defines autopilot thusly:

      “A device for automatically steering ships, aircraft, and spacecraft also : the automatic control provided by such a device”

  • nek0d3r@lemmy.world
    link
    fedilink
    English
    arrow-up
    18
    ·
    6 months ago

    I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

    • machinin@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      ·
      edit-2
      6 months ago

      Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

      The better question - is Tesla’s FSD causing drivers to have more accidents than other driving assist technologies? It seems like a yes from this article and other data I’ve linked elsewhere in this thread.

      • nek0d3r@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        6 months ago

        I appreciate this response amongst all the malding! My understanding of the difference in assistive technologies across different companies is lacking, so I’ll definitely look more into this.

  • unreasonabro@lemmy.world
    link
    fedilink
    English
    arrow-up
    11
    ·
    edit-2
    6 months ago

    Obviously the time to react to the problem was before the system told you about it, that’s the whole point, THE SYSTEM IS NOT READY. Cars are not ready to drive themselves, and obviously the legal system is too slow and backwards to deal with it so it’s not ready either. But fuck it let’s do it anyway, sure, and while we’re at it we can do away with the concept of the driver’s license in the first place because nothing matters any more and who gives a shit we’re all obviously fucking retarded.

    • letsgo@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      6 months ago

      OK.

      Question: how do you propose I get to work? It’s 15 miles, there are no trains, the buses are far too convoluted and take about 2 hours each way (no I’m not kidding), and “move house” is obviously going to take too long (“hey boss, some rando on the internet said “stop using cars” so do you mind if I take indefinite leave to sell my house and buy a closer one?”).

        • letsgo@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          6 months ago

          I already have one (Yamaha MT10), but presumably it has the same problem cars do (burning fossil fuels); also it’s no good in shit weather (yeah, I know, that means I need better clothing).

    • TypicalHog@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      6 months ago

      I swear some people in this thread would call airplane autopilot bad because it causes SOME deaths from time to time.

  • Betide@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    ·
    6 months ago

    The same people who are upset over self driving cars are the ones who scream at the self checkout that they shouldn’t have to scan their own groceries because the store isn’t paying them.

    32% of all traffic crash fatalities in the United States involve drunk drivers.

    I can’t wait until the day this kind of technology is required by law. I’m tired of sharing the road with these idiots, and I absolutely trust self-driving vehicles more than I trust other humans.

    • kat_angstrom@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      6 months ago

      I’ve never heard of anyone screaming because they had to scan their own groceries at a self-checkout. Is this a common thing?

    • EvolvedTurtle@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      6 months ago

      I recently learned that at least half of the drivers where I live think it’s fine to cut me off with no signal while we’re going 70 mph on the highway.

    • Flying Squid@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      6 months ago

      people who are upset over self driving cars

      If you are talking about Teslas, you can’t be upset about something a car doesn’t actually do unless you think it’s actually capable of doing it.

      The only thing I don’t like is that Tesla is able to claim it has a “full self driving” mode which is not full self driving. Seems like false advertising to me.

  • NutWrench@lemmy.world
    link
    fedilink
    English
    arrow-up
    6
    ·
    6 months ago

    “Self-driving cars” are not going to be a thing within our lifetimes. It’s a problem that requires MUCH smarter AI than we currently have.