• deafboy@lemmy.world · 61 points · 2 months ago

    And they managed to do it without us obsessing about their CEO several times a day? I refuse to believe that!

  • cAUzapNEAGLb@lemmy.world · 53 points · edited · 2 months ago

    As of April 11, there were 65 Mercedes autonomous vehicles available for sale in California, Fortune has learned through an open records request submitted to the state’s DMV. One of those has since been sold, which marks the first sale of an autonomous Mercedes in California, according to the DMV. Mercedes would not confirm sales numbers. Select Mercedes dealerships in Nevada are also offering the cars with the new technology, known as “level 3” autonomous driving.

    Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500.

    Mercedes is also working on developing level 4 capabilities. The automaker’s chief technology officer Markus Schäfer expects that level 4 autonomous technology will be available to consumers by 2030, Automotive News reported.
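
    A minimal sketch of the kind of activation gating those conditions imply (condition names are my assumptions from the article, not Mercedes’s actual API):

        # Hypothetical Drive Pilot-style gating; names are illustrative.
        from dataclasses import dataclass

        @dataclass
        class VehicleState:
            speed_mph: float
            is_daytime: bool
            on_approved_freeway: bool  # pre-mapped CA/NV freeways only
            in_dense_traffic: bool     # traffic-jam conditions with a lead vehicle

        def can_engage_level3(state: VehicleState) -> bool:
            """Level 3 may engage only inside the approved operating domain."""
            return (state.speed_mph < 40
                    and state.is_daytime
                    and state.on_approved_freeway
                    and state.in_dense_traffic)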

    • Ilovethebomb@lemm.ee · 34 points · 2 months ago

      Hmm, so only on a very small number of predetermined routes, and at very slow speeds for those roads.

      Still impressive, but not as impressive as the headline makes out.

          • ours@lemmy.world · 8 points · 2 months ago

            Having known one of their customers, I can say some of them love their feature-loaded cars to brag about and feel extra special. Some will definitely pay the $2.5k gladly.

        • Veraxus@lemmy.world · 10 points · edited · 2 months ago

          If they assume full liability for any collisions while the feature is active (and it looks like they do), then I can see that being fair.

      • Turun@feddit.de · 5 points · 2 months ago

        Yes, but it’s actually level 3.

        Not the Tesla “full self driving - no wait we actually lied to you, you need to be alert at all times” bullshit.

  • eee@lemm.ee · 48 points · 2 months ago

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500

    yeah, fuck that.

  • merthyr1831@lemmy.world · 44 points · 2 months ago

    Love how companies, and not an actual safety authority, get to decide who has to supervise their car’s automated driving. Absolutely nuts.

    • Trollception@lemmy.world · 12 points · 2 months ago

      Who said there was no safety authority involved? I thought it was part of the SAE level system the government adopted for assisted driving.

    • Cosmic Cleric@lemmy.world · 6 points · 2 months ago

      Paywalled.

      On a different subject, why would someone downvote a one-word comment that accurately describes what the content is behind?

      • stoly@lemmy.world · 5 points · 2 months ago

        There are people who are pathologically contrarian. I’ve had to end a friendship over it—the endless need to say something negative about literally everything that ever happens and an unwillingness to be charitable to others.

  • nucleative@lemmy.world · 24 points · 2 months ago

    Wonder how this works with car insurance. Is there a future where the driver doesn’t need to be insured? Can the vehicle software still be “at fault”, and how will the actuaries deal with assessing this new risk?

    • machinin@lemmy.world · 18 points · 2 months ago

      I believe Mercedes takes responsibility if there is an accident while driving autonomously.

      • Rinox@feddit.it · 13 points · 2 months ago

        Will it pull a Tesla and switch off the autopilot seconds before an accident?

          • T156@lemmy.world · 2 points · edited · 2 months ago

            If memory serves, that’s not an intentional feature, but more a coincidence, since if the driver thinks the cruise control is about to crash the car, they’ll pop the brakes. Touching the brakes disengages the cruise control by design, so you end up with it shutting down before a crash happens.
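
            A sketch of that disengage-on-brake behavior (illustrative, not any manufacturer’s actual code); the point is that the last logged state before impact can read “off” even though the system was driving moments earlier:

                import time

                class CruiseController:
                    def __init__(self):
                        self.engaged = False
                        self.log = []  # (timestamp, event) pairs for later review

                    def engage(self):
                        self.engaged = True
                        self.log.append((time.time(), "engaged"))

                    def on_brake_pedal(self):
                        # Driver braking always overrides and disengages, by design.
                        if self.engaged:
                            self.engaged = False
                            self.log.append((time.time(), "disengaged: driver braked"))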

            • nucleative@lemmy.world · 6 points · 2 months ago

              That makes perfect sense. If the driver looks up, notices he’s in a dangerous, unfixable situation, and slams the brakes, disconnecting the autopilot (which had been responsible for letting the situation develop), hopefully the automaker can’t simply say “not our fault, the system wasn’t even engaged at the time of the collision”.

      • Sizzler@slrpnk.net · 10 points · 2 months ago

        And this is how they will push everyone into driverless: through insurance costs. Who would insure 1 human driver vs. 100 bots (once the systems have a few billion miles on them)?

        • nucleative@lemmy.world · 4 points · edited · 2 months ago

          You’re probably right. Another decade or two and human-driven cars might be prohibitively expensive to insure for some, or even not allowed in certain areas.

          I can imagine an awesome world where that’s a great thing, but also a dystopian world like WALL-E. I guess we’ll know then which one we chose.

    • Hugin@lemmy.world · 16 points · 2 months ago

      Berkshire Hathaway owns GEICO, the car insurance company. In one of his annual letters, Buffett said that autonomous cars are going to be great for humanity and bad for insurance companies.

      “If [self-driving cars] prove successful and reduce accidents dramatically, it will be very good for society and very bad for auto insurers.”

      Actuaries are by definition bad at assessing new risk, but as data gets collected they quickly adjust. There are a lot of cars on the road, so if driverless cars become even a few percent of them, insurers will quickly be able to build good actuarial tables.
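
      As a sketch of how fast that adjustment can be, here’s a toy Gamma-Poisson update of a claim-frequency estimate; all numbers are made up for illustration:

          # Toy Bayesian (Gamma-Poisson) update: fleet data quickly swamps the prior.
          def update_claim_rate(prior_claims, prior_miles, observed_claims, observed_miles):
              """Posterior mean claims-per-mile after observing fleet data."""
              return (prior_claims + observed_claims) / (prior_miles + observed_miles)

          # Weak prior guess of ~2 claims per million miles, then 200M observed miles.
          rate = update_claim_rate(2, 1e6, observed_claims=150, observed_miles=2e8)
          print(f"{rate * 1e6:.2f} claims per million miles")  # ~0.76, dominated by the data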

  • daikiki@lemmy.world · 23 points · 2 months ago

    According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

    • maynarkh@feddit.nl · 17 points · 2 months ago

      According to who? Did the NTSB clear this?

      Yes.

      If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

      Yes, the judge will let the driver off the hook, because Mercedes told them it will assume the liability instead.

    • Trollception@lemmy.world · 13 points · edited · 2 months ago

      You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?
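
      Rough arithmetic behind that claim (annual totals are approximate NHTSA and WHO figures):

          us_deaths_per_year = 40_000        # approx. NHTSA, recent years
          world_deaths_per_year = 1_300_000  # approx. WHO estimate
          print(us_deaths_per_year / 365)    # ~110 per day in the US alone
          print(world_deaths_per_year / 365) # ~3,600 per day worldwide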

      • KredeSeraf@lemmy.world · 23 points · 2 months ago

        Sure. But no system is 100% effective, and all of their questions are legit and important to answer. If I got hit by one of these tomorrow, I want to know that the processes for fault, compensation, and improvement are already settled, not something my accident is going to be the landmark case for.

        But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.

        I think this idea is sound, but that doesn’t mean there aren’t things to address around it.

        • Trollception@lemmy.world · 5 points · 2 months ago

          Honestly I’m sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.

          • KredeSeraf@lemmy.world · 10 points · 2 months ago

            That still doesn’t address all the issues surrounding it. I’m unsure if you’re just young and unaware of how these things work, or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology always gets better; plenty just gets abandoned.

            But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault, because no one had settled it yet? So you’re in and out of a hospital recovering, draining all of your money on bills both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.

            That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.

            • Trollception@lemmy.world · 5 points · edited · 2 months ago

              To be clear, I never said that I didn’t care about an individual’s safety; you inferred that somehow from my post, and quite frankly it’s disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve with time.

              The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies available. Tesla doesn’t count, as it’s not an SAE level 3 autonomous driving vehicle. There are some references in the liability section of the wiki.

              https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

            • Llewellyn@lemm.ee · 4 points · edited · 2 months ago

              But then it’s good that the manufacturer states the driver isn’t obliged to watch the road, because it shifts responsibility towards the manufacturer, and thus it’s a great incentive to make the technology as safe as possible.

      • Adanisi@lemmy.zip · 3 points · edited · 2 months ago

        *at 40mph on a clear straight road on a sunny day in a constant stream of traffic with no unexpected happenings, Ts&Cs apply.

      • stoly@lemmy.world · 1 point · 2 months ago

        Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.

  • Ultragigagigantic@lemmy.world · 17 points · 2 months ago

    If it can drive a car, why wouldn’t it be able to drive a truck?

    I’m surprised companies don’t just build their own special highway for automated trucking and use people for last mile stuff.

      • blackn1ght@feddit.uk · 4 points · 2 months ago

        The human does it out of self-preservation, but the car doesn’t need to preserve itself.

        By getting in the car, the passengers should be aware of the risks and that, if there is an accident, the car will protect pedestrians over the occupants. The pedestrians had no choice, but the passengers have a choice not to get in the vehicle.

        I feel like car manufacturers are going to favour protecting the passengers as a safety feature, and then governments will eventually legislate it to go the other way after a series of high profile deaths of child pedestrians.

        • Thorny_Insight@lemm.ee · 5 points · 2 months ago

          You’re probably over-estimating the likelihood of a scenario where a self-driving car needs to make such a decision. Also take into account that if a self-driving car is a significantly better driver than a human, then it’s by definition going to be much safer for pedestrians as well, even if it’s programmed to prioritize the passengers.

    • Thorny_Insight@lemm.ee · 8 points · 2 months ago

      Who would buy a car that will sacrifice the passengers in the event of an unavoidable accident? If it’s a significantly better driver than a human would be, then it’s safer for pedestrians as well.

    • Skates@feddit.nl · 7 points · 2 months ago

      Yes. As it should be. I’ll buy the car that chooses to mow down a sidewalk full of pregnant babies instead of mildly inconveniencing myself or my passengers. Why the hell would you even consider any other alternative?

    • Rinox@feddit.it · 4 points · 2 months ago

      It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.

      I mean, what’s the situation where a car can’t brake but has enough control that it HAS to kill a pedestrian in order to save the passengers?

      • MeanEYE@lemmy.world · 2 points · 2 months ago

        A Tesla on Autopilot at night. All the time, basically. There were a number of motorcycle deaths where a Tesla just mowed them down. The reason? The bikes had two tail lights side by side instead of one big light. The Tesla thought this was a car far away and just ran through people.
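
        The geometry of that failure mode is straightforward: a camera alone measures angular separation, so close-set lights nearby look identical to wide-set lights far away. A worked example with illustrative numbers:

            # Small-angle approximation: angle = separation / distance.
            moto_sep_m, moto_dist_m = 0.25, 30.0  # motorcycle tail lights, close by
            car_sep_m = 1.5                       # typical car tail-light spacing

            # Distance at which a car's lights subtend the same angle:
            equivalent_car_dist_m = car_sep_m * moto_dist_m / moto_sep_m
            print(equivalent_car_dist_m)  # 180.0 m -- "a car far away", so no braking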

        • Rinox@feddit.it · 3 points · 2 months ago

          That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time, the issue was that the car didn’t know what was happening. This situation wouldn’t have engaged the autopilot in the way we are discussing.

          As an aside, if what you said is true, people at Tesla should be in jail. WTF

  • elrik@lemmy.world · 16 points · 2 months ago

    How is this different from the capabilities of Tesla’s FSD, which is considered level 2? It seems like Mercedes just decided they’ll take on liability to classify an equivalent level 2 system as level 3.

    • rsuri@lemmy.world · 10 points · edited · 2 months ago

      According to the Mercedes website, the cars have radar and lidar sensors. Tesla’s FSD originally had radar too, but they apparently decided to move away from it towards optical-only; I’m not sure if radar currently has any role in FSD.

      That’s important because FSD relies on optical sensors alone to tell not only where an object is, but that it exists at all. Based on videos I’ve seen of FSD, I suspect that if it hasn’t ingested the data to recognize, say, a plastic bucket, it won’t know that it’s not just part of the road (or at best can recognize that the road looks a little weird). Radar and lidar sensors, though, directly measure distance and can build 3-D data about the world without needing to recognize objects. That means the car can say “hey, there’s something there I don’t recognize, time to hit the brakes and alert the driver about what to do next”.
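
      A toy version of that “unknown object, brake anyway” logic, assuming a lidar point cloud in vehicle coordinates (thresholds are illustrative):

          # Toy lidar gate: anything rising out of the road plane ahead is an
          # obstacle, whether or not a classifier knows what it is.
          def obstacle_ahead(points, lane_half_width=1.5, max_range=60.0, min_height=0.3):
              """points: (x forward, y left, z up) in meters. True if our lane is blocked."""
              return any(0.0 < x < max_range and abs(y) < lane_half_width and z > min_height
                         for x, y, z in points)

          # A plastic bucket 20 m ahead: no recognition needed to decide to brake.
          print(obstacle_ahead([(20.0, 0.2, 0.4)]))  # True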

      Of course this still leaves a number of problems, like understanding at a higher level what happened after an accident for example. My guess is there will still be problems.

    • GoodEye8@lemm.ee · 9 points · 2 months ago

      You’ve inadvertently pointed out how Tesla deliberately skirts the law. Teslas are way more capable than what level 2 describes, but they choose to stay at level 2 so they don’t have to take responsibility for their public testing.

    • philpo@feddit.de · 1 point · 2 months ago

      It’s not about the sensors, it’s about the software. That’s the solution.

      • skyspydude1@lemmy.world · 1 point · 2 months ago

        Please tell me how software will be able to detect objects in low/no-light conditions if they say, have cameras with poor dynamic range and no low-light sensitivity?

    • ShepherdPie@midwest.social · 22 points · 2 months ago

      Because it’s an extremely narrowly defined set of requirements to use it: approved freeways with clear markings, moderate to heavy traffic, under 40 mph, during daytime hours, in clear conditions. In other words, it will inch forward for you in bumper-to-bumper traffic, provided you’re in an approved area, and that’s it.

      https://www.mbusa.com/en/owners/manuals/drive-pilot

          • JasonDJ@lemmy.zip · 2 points · 2 months ago

            In theory. In practice, it just beeps at you if your sandwich hand is steering.

          • jj4211@lemmy.world · 1 point · 2 months ago

            Well, not always hands on wheel. I have spent over an hour straight on an interstate with hands off. Ford’s system watches your eyes and lets your hands stay off if conditions are decent and you’re on a LIDAR-mapped freeway. I wouldn’t trust it at night (there have been two crashes, both at night with stopped vehicles on the freeway), but then I wouldn’t entirely trust myself at night either; there are many, many more human-caused crashes at night, and I’m not sure a human at freeway speed could avoid a crash with a surprise stationary vehicle in the middle of the road.

      • Evotech@lemmy.world · 1 point · 2 months ago

        Still doesn’t seem legal to not pay attention to the road. Wouldn’t fly over here, at least.

    • KISSmyOSFeddit@lemmy.world · 15 points · 2 months ago

      They got certification from the authorities, and in the event of an accident, the manufacturer takes on responsibility.

      • melpomenesclevage@lemm.ee · 4 points · 2 months ago

        lol, ‘manufacturer takes on responsibility’ so… I’m just fucked if one of these hits me?

        see a mercedes, shoot a mercedes. destroy it in whatever way you can.

        • KISSmyOSFeddit@lemmy.world · 27 points · 2 months ago

          No, you’re guaranteed that the Mercedes that hit you is better insured to pay out your damages than pretty much anyone else on the road who could hit you.

          • melpomenesclevage@lemm.ee · 9 points · 2 months ago

            lol corporations don’t have responsibility though. that’s the whole point of them. they’re machines for avoiding responsibility.

          • Tankton@lemm.ee · 5 points · 2 months ago

            The sad part of this is somehow thinking that payment solves any problem. Like, idk what they would pay me, just bring back my dead wife/child/father whatever. You can’t fix everything with money.

            • QuaternionsRock@lemmy.world · 8 points · 2 months ago

              It only works on a small handful of freeways (read: no pedestrians) in California/Nevada, and only under 40 MPH. The odds of a crash within those parameters resulting in a fatality are quite low.

            • Llewellyn@lemm.ee · 6 points · 2 months ago

              Human drivers are far more dangerous on the road, and you should be applauding assisted driving development.

              • jj4211@lemmy.world · 1 point · edited · 2 months ago

                This presumes the options are only:

                • Human and no autonomous system watching
                • Autonomous system, with no meaningful human attention

                Key word is ‘assisted’ driving. ADAS should roughly be a nice add, so long as human attention is policed. Ultimately, the ADAS systems are better able to react to some situations, but may utterly make some stupid calls in exceptional scenarios.

                Here, the bar of ‘no human paying attention at all’ is one I’m not entirely excited about celebrating. Of course the conditions are “daytime traffic jam only”, where risk is pretty small, you might have a fender bender, pedestrians are almost certainly not a possibility, and the conditions are supremely monotonous, which is a great area for ADAS but not a great area for bored humans.

  • just_another_person@lemmy.world · 14 points · 2 months ago

    It will be litigated almost immediately. There is no current combination of model and hardware platform that a car could reasonably run that could be called “fully self driving” at any useful speed. This thing sounds like parking assist on steroids maybe, or “stalled traffic assist”. They will be sued.

    • explodes@lemmy.world · 23 points · 2 months ago

      Did you read the article? There are already plenty of conditions for activating the self driving mode.

    • cm0002@lemmy.world · 16 points · 2 months ago

      There’s tons of conditions

      when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control.

      I doubt this is a mistake; they must have really high confidence in the tech as well as in the restrictions. Not even Tesla had the balls to announce that you could drive distracted.

      • Thorny_Insight@lemm.ee · 7 points · edited · 2 months ago

        not even Tesla had the balls to announce that you could drive distracted.

        That’s the difference between Level 2 and Level 3 full self driving. Teslas are Level 2.
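
        For reference, the SAE J3016 levels in a nutshell (my paraphrase, not the spec text):

            SAE_LEVELS = {
                0: "No automation: driver does everything",
                1: "Driver assistance: steering OR speed is assisted; driver supervises",
                2: "Partial automation: steering AND speed assisted; driver must supervise",
                3: "Conditional automation: system drives within its domain; driver may "
                   "disengage but must take over when prompted",
                4: "High automation: no takeover needed within the design domain",
                5: "Full automation: drives anywhere a human could",
            }
            print(SAE_LEVELS[2], "->", SAE_LEVELS[3])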

        • cm0002@lemmy.world · 1 point · 2 months ago

          That’s what I’m saying: they could have called this an “ultra-advanced level 2” and avoided opening themselves up to a TON of liability. Once you say this is a level 3 system and you don’t need to pay attention to the road, that shuts the door on many defenses they could have used if it were “just” level 2. So they must be really confident in their system.

    • Thorny_Insight@lemm.ee · 1 point · 2 months ago

      There is no current combination of model and hardware platform that a car could reasonably run that could be called “fully self driving” at any useful speed.

      It’s still not flawless and requires an attentive driver, but Tesla FSD Beta V12 is pretty damn impressive. They made a huge leap forward by going from human-written code to 100% neural nets. I don’t think we’re too far away from a true robo-taxi, and there’s going to be some humble pie served to the LiDAR/radar advocates. I highly recommend watching some reviews on YouTube if you aren’t up to speed with the recent changes they’ve made.