• MangoCats@feddit.it · 3 hours ago

      Waymo knowing when it is stumped is actually a pretty good thing. Better than just running over cats & small children.

  • CmdrShepard49@sh.itjust.works · 1 day ago

    Curious what the law is with regard to someone in the Philippines driving a car on US roads without a US driver’s license.

  • NotMyOldRedditName@lemmy.world · 1 day ago

    For anyone who is curious, Waymo actually is capable of remotely moving the vehicles, despite what they say. They do their best not to admit it’s possible, but it’s right there in the CPUC filings as a footnote, and that’s probably the only place they’ll ever admit it.

    https://www.cpuc.ca.gov/-/media/cpuc-website/divisions/consumer-protection-and-enforcement-division/documents/tlab/av-programs/tcp0038152a-waymo-al-0003_a1b.pdf

    In very limited circumstances such as to facilitate movement of the AV out of a freeway lane onto an adjacent shoulder, if possible, our Event Response agents are able to remotely move the Waymo AV under strict parameters, including at a very low speed over a very short distance.

    I’m not opposed to them being able to do this, and I’m not knocking it, but they are lying to or misleading people when they say it can’t be done.

    • ageedizzle@piefed.ca · 16 hours ago

      Waymo really seems to be winning out over Tesla with the self-driving thing. I wonder how much of that is really just because Waymo cars have a remote human driving them in situations where a Tesla would just crap out.

      • NotMyOldRedditName@lemmy.world · 11 hours ago

        I’m not fully up to speed on Waymo and whether they have ever released details on remote assistance per mile, but when Cruise went through that shitstorm a year or two ago, it came out that the cars were asking for help every few miles.

        Cruise was essentially all smoke and mirrors.

        • ageedizzle@piefed.ca · 11 hours ago

          Interesting. Stuff like this makes me suspicious of the current LLM hype. I know it’s not necessarily language models per se being used by these vehicles, but still. If we were really on the cusp of AGI then I’d expect us to have at least cracked autonomous driving by now.

          • NotMyOldRedditName@lemmy.world · 11 hours ago

            Ya, I don’t buy the hype around AGI. For example, a Waymo drove into a telephone pole because of something they had to fix in their code. I’m not doubting there’s AI involved - neural nets, machine learning, whatever - but this isn’t an AGI-level development. Nor do I think they need an AGI to do this.

            I’m also not convinced this LLM stuff can ever lead to AGI either. I think it can do some pretty impressive things with some very real drawbacks/caveats and there is definitely room to keep improving them, but that the whole architecture is flawed if you want to make an AGI.

            • ageedizzle@piefed.ca · 9 hours ago

              Nor do I think they need an AGI to do this.

              Yeah, I guess there’s a lot of interesting stuff we can do with AI without necessarily achieving AGI. What about programming? Even if we don’t get AGI soon, do you still think LLMs will be snatching up a sizeable chunk of programming jobs?

              • NotMyOldRedditName@lemmy.world · 60 minutes ago

                So I’m a developer, I do mobile apps, and I do use Claude/GPT.

                I could be wrong, but I don’t foresee any imminent collapse of developer jobs. It does have its uses, though. If anything, I think there will be fewer lower-end positions, but if you don’t hire and teach new devs, that’s going to have repercussions down the road.

                For example, I needed to make a webpage, and I’m not a webdev, and it helped me create a static landing page. I can tell that the page’s code is pretty shitty, but it works for its purposes. This either replaced a significant amount of time spent learning how to do it, or replaced me hiring a contractor to do it. But I’m also not really any better at writing a webpage if I need to make a second one, as I didn’t learn much in the process.

                Setting it all up also meant working on the infrastructure behind it. The AI was able to help guide me through that as well, but it did less of it. That part I did learn, and I’d be able to leverage it for future work.

                When it comes to my actual mobile work, I don’t like asking it to do anything substantial, as the quality is usually pretty low. I might ask it to build a skeleton of something that I can fill out, or I’ll ask its opinion on a small piece of code I wrote and look for a better way to write it - in that case it has helped me learn new things. I’ll also talk to it about planning something out to get some insights on the topic before I write any code.

                It gives almost as many wrong or flawed answers as right ones if there’s even a tiny bit of complexity, so you need to know how to sift through the crap - which you won’t know how to do if you aren’t a developer. It will tell you APIs exist that don’t. It will recommend APIs that were deprecated years ago. The list goes on and on. This also happened while I was making the webpage, so my developer skills were still required to get to the end product I wanted.

                I can’t see how it will replace a sizeable chunk of developers yet, but I think if used properly, it could enhance existing devs and lead to fewer hires needed.

                When I hear things like 30% of Microsoft code is now written by AI, it makes sense why shit is breaking all the time and quality is going down. They’re forcing it to do what it can’t do yet.

                • ageedizzle@piefed.ca · 3 hours ago

                  Thank you for the detailed response. Most opinions on this topic are very much based on vibes rather than real experience, so it’s interesting to hear an informed take from someone on the inside.

                  I hope to become a software developer one day too (it’s a slow process, because I’m teaching myself in my free time) so I sometimes worry if all the effort I’m putting in is even worth it if LLMs will be doing all the programming in a few years. Do you think that’s a concern? Will these tools continue to develop to that point or are they hitting a wall, like some people are saying?

      • titanicx@lemmy.zip · 11 hours ago

        Given that Tesla can remotely control their robots in 90% of circumstances, I’m pretty sure the same thing is happening here - Teslas are being remotely driven and they’re just not admitting to it either.

        • NotMyOldRedditName@lemmy.world · 11 hours ago

          So the link I posted was about proving Waymo truly can remote control them if needed, even though they deny it. But I’d be pretty surprised if Tesla said it wasn’t possible, because their cars have the “summon” feature, and literally any owner can remotely drive their car with a forward/back button. So regardless of whether they do or don’t, they clearly can.

          • titanicx@lemmy.zip · 11 hours ago

            Again, others have said that Waymo doesn’t keep it a secret that their cars can be remote driven. Apparently when you ride in them they give you a button to call for help, specifically to be driven. I’m positive Tesla does this too, but they’re going to hide it all day long.

  • Jason2357@lemmy.ca · 1 day ago

    Automation has always been about de-skilling to cheaper, more abusable labour, not about actually eliminating work. This goes all the way back to the broad looms and the Luddites. There were still loom workers in the new factories - it’s just that they were children who could be worked to death for pennies.

  • Deacon@lemmy.world · 2 days ago

    This would have actually been a great thing to not only acknowledge but promote, if they weren’t so caught up in their own hype.

    Not that I will ever get into one of those death traps, but if you told the average consumer that any failure in autonomy immediately engages a tele-operator “to keep you moving on your way”, they would probably feel better about riding.

    I’ve done tele-driving before and it’s remarkably good, even if latency is a concern.

    It’s the facade of it all, the need to seem to live up to the hype. It’s going to get more people killed.

    • titanicx@lemmy.zip · 11 hours ago

      I don’t know, I trust these over most of the drivers out there. Hell, I took a Lyft last night and on the freeway we were pretty scared by the way the driver was driving.

    • ToTheGraveMyLove@sh.itjust.works · 13 hours ago

      Nothing could make me feel better about my vehicle being operated remotely by someone in another country. Granted, nothing could make me feel better about my vehicle being operated by a computer either. I’ll drive my damn self, thank you.

    • ChickenLadyLovesLife@lemmy.world · 1 day ago

      any failures in autonomy immediately engage a tele-operator

      One of the problems is that these “failures in autonomy” could include a failure to engage a tele-operator when one is needed.

    • chiliedogg@lemmy.world · 2 days ago

      I work near downtown Austin, where both Waymo and Robotaxi operate.

      Waymo cars are some of the best drivers on the road because they actually tested their product, use multiple lidar sensors instead of just cameras, and have remote driver backups for unusual situations.

      Teslas drive like maniacs and will end a ride and tell the passenger to get out in the middle of a lane.

    • nixon@sh.itjust.works · 2 days ago

      I’ve ridden in a few Waymos before; in SF they can be more dependable or easier to get than other ride options. I never felt like I was in danger in one.

      Within my handful of experiences with them, I’ve never had to use the help button or the features to request assistance from a tele-operator, but it was clear that they weren’t trying to hide the function from passengers - the feature was explained and clearly labeled.

      A friend who uses them often told me about the one time he needed to ask for assistance, when his Waymo was stuck behind a DoorDash scooter with its hazard lights on that was either delivering or picking up and was blocking a turn lane in downtown SF. The Waymo didn’t know how to get around it, my friend hit the button for assistance, a voice came over the speakers asking how they could help, my friend explained the situation, and the tele-operator drove the car to safely navigate around it. He said it was probably 1.5-2 minutes of total inconvenience, with 75% of that time being him wondering whether he should hit the help button or not.

      I understand a lot of AI implementations, such as Amazon Fresh and other business models, have been hiding offshored human assistance within their “AI” features, which I agree with you is deceitful, but my experience with Waymo was not that. They did not hide or obfuscate that function of the service; they actively informed the passenger of its existence.

      Granted, I haven’t ridden in one for almost a year at this point and I only did so in the SF market so things may have changed since or are different elsewhere.

      Also, I can’t say that I follow the news about Waymo intently. I know they have run over a couple of cats, but I hadn’t heard anything about them killing people. Has that happened?

  • Zwuzelmaus@feddit.org · 2 days ago

    And these foreign crowd workers know the local traffic rules? Maybe they even have regular driver’s licenses?

    • Perspectivist@feddit.uk · 2 days ago

      I think the interventions here are more like: “that’s a trash can someone pushed onto the road - let me help you around it” rather than: “let me drive you all the way to your destination.”

      It’s usually not the genuinely hard stuff that stumps AI drivers - it’s the really stupid, obvious things it simply never encountered in its training data before.

      • MoffKalast@lemmy.world · 2 days ago

        Saw this blog post recently about Waymo’s sim setup for generating synthetic data, and they really do seem to be generating pretty much everything in existence. The level of generalization of the model they’re using is either shockingly low, or they abort immediately at the earliest sign of high perplexity.

        • Kushan@lemmy.world · 2 days ago

          I’m guessing it’s the latter; they need to keep accidents to a minimum if they’re ever going to get broad legislation to legalise them.

          Every single accident is analysed to death by the media and onlookers alike, with a large group of people wanting it to fail.

          This is a prime example: we’ve known about the human intervention for a while now, but people seem surprised that those people are in another country.

      • Zwuzelmaus@feddit.org · 2 days ago

        it’s the really stupid, obvious things

        Hm. Interesting. But that makes them look even more incapable than I feared.

        • Perspectivist@feddit.uk · 2 days ago

          Broadly speaking, an AI driver getting stumped means it’s stuck in the middle of the road - while a human driver getting stumped means plowing into a semi truck.

          I’d rather be inconvenienced than killed. And from what I’ve seen, even our current AI drivers are already statistically safer than the average human driver - and they’re only going to keep getting better.

          They’ll never be flawless though. Nothing is.

          • MrScottyTay@sh.itjust.works · 2 days ago

            AI drivers have slowly run over and crushed people before too, though, because they didn’t register the person as an “obstacle” to be avoided, or because the person was on the ground and the car didn’t see them.

            • Perspectivist@feddit.uk · 2 days ago

              And they always will. You need to look at the big picture here, not individual cases. If we replaced every single car on US roads with one driven by AI - proven to be 10 times better a driver than a human - that would still mean 4,000 people getting killed by them each year. That, however, doesn’t mean we should go back to human drivers and 40,000 people killed annually.

              • ltxrtquq@lemmy.ml · 2 days ago

                You need to look at the big picture here, not individual cases.

                By that logic…

                We should really be investing in trains and buses, not cars of any type.

                • walden@wetshav.ing · 2 days ago

                  I think your logic is flawed. The discussion is about a specific form of transportation. By your own logic, you should be suggesting that people fly everywhere.

                • RobotToaster@mander.xyz · 2 days ago

                  Tesla made the idiotic decision to rely entirely on cameras; Waymo uses lidar and other sensors to augment vision.

                • Pennomi@lemmy.world · 2 days ago

                  That’s Tesla, not Waymo. Tesla’s hardware is shit and does not even include lidar. You can’t judge the entire industry by the worst example.

              • zbyte64@awful.systems · 1 day ago

                The big picture is that AI not being able to operate under unusual conditions means the “10 times better” stat (if it were even true) comes with a big fucking caveat: we can’t say it will hold if we replace all drivers.

          • Zwuzelmaus@feddit.org · 2 days ago

            current AI drivers are already statistically safer than

            As long as they use level 3 autonomous cars and then cheat with remote operators instead of using real level 5 cars, such statistics remain quite meaningless.

            They do, however, tell you something about the people who use them as arguments.

            • errer@lemmy.world · 2 days ago

              As the OP stated, the low-velocity cases aren’t causing deadly accidents. And you can’t drive by wire at high speed (too much latency). So I doubt it’s affecting the stats in any meaningful way.

              Honestly I much prefer they have a human as a backup than not.
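
              To put rough numbers on the latency point, here’s a back-of-the-envelope sketch in Python - the speed and delay figures are my assumptions, not anything Waymo has published:

                  # How far a car travels before a remote input takes effect.
                  speed_kmh = 105    # ~65 mph freeway speed (assumed)
                  latency_s = 0.25   # round-trip network + processing delay (assumed)

                  meters_per_second = speed_kmh * 1000 / 3600
                  print(f"{meters_per_second * latency_s:.1f} m")  # ~7.3 m per correction

              Several meters of blind travel per correction is why remote driving only makes sense at very low speeds.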

              • [deleted]@piefed.world · 2 days ago

                As the OP stated, the low velocity cases are not causing deadly accidents.

                Make humans drive as slow as these cars and deaths will drop too.

    • Chozo@fedia.io · 2 days ago

      This used to be my job. They’re not controlling the cars. They’re basically completing real-time CAPTCHAs, telling the car whether the cameras see a stop sign, a bicycle, temporary barriers, etc. If the car can’t identify an object that could possibly cross its path, it pulls over and stops until an operator can do a sanity-check on whatever the car’s confused by. They only need to be able to identify objects on the road, not know the rules of the road.
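
      In other words, something like this human-in-the-loop fallback - a toy sketch in Python, where every name is made up and has nothing to do with Waymo’s actual internals:

          CONFIDENCE_THRESHOLD = 0.95

          def perception_model(detection):
              """Stand-in for the onboard classifier: returns (label, confidence)."""
              return ("unknown object", 0.40)

          def ask_remote_operator(detection):
              """Stand-in for the real-time 'CAPTCHA' sent to a human operator."""
              return input(f"Car is unsure - what is '{detection}'? ")

          def classify(detection):
              label, confidence = perception_model(detection)
              if confidence >= CONFIDENCE_THRESHOLD:
                  return label  # confident enough: keep driving
              # Not confident: car pulls over and waits for a human sanity-check.
              return ask_remote_operator(detection)

          print(classify("plastic bag or bicycle?"))

      The human only answers the classification question; the driving policy itself stays onboard.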

      • NotMyOldRedditName@lemmy.world · 1 day ago

        This is how it generally behaves, but they are capable of taking direct control in more difficult situations. It’s only very slow maneuvers though; it’s not like they would be driving it down the street. They could move it off the road onto the shoulder if needed, though.

        Edit: I am trying to find the source, but I’m having problems. It was only ever mentioned in one official Waymo document I’ve seen that it was technically possible. My guess is they say their remote helpers can’t/don’t do it because those helpers truly can’t, and it’s some highly restricted type of person who can, one who isn’t classified like these other employees. The whole misleading-but-technically-true kind of speak. I’ll keep looking though, because I was really surprised to see them admit it in an official document.

        Found it

        https://www.cpuc.ca.gov/-/media/cpuc-website/divisions/consumer-protection-and-enforcement-division/documents/tlab/av-programs/tcp0038152a-waymo-al-0003_a1b.pdf

        In very limited circumstances such as to facilitate movement of the AV out of a freeway lane onto an adjacent shoulder, if possible, our Event Response agents are able to remotely move the Waymo AV under strict parameters, including at a very low speed over a very short distance.

        Looks like I was right on the terminology as well: it’s not the remote operators who can do it, it’s the “Event Response” team.

        As far as I know this is the only official acknowledgement that it’s possible. Everywhere else they say it isn’t, and this is a footnote in that document.

        • Zwuzelmaus@feddit.org · 23 hours ago

          at a very low speed over a very short distance.

          LOL, so when they get into a situation in a tunnel that is 10 or 20 km long (ok, you only have 4 km ones in poor USA, but we have them here), they first drive it at 10 km/h and then give up after 300 m? Because the rules are the rules??

          • NotMyOldRedditName@lemmy.world · 23 hours ago

            From the description it’s really not meant to solve that. In a situation like that they’d have to send someone, but they would be able to get the car out of the middle of a lane and off to the side, even if that only gives an extra foot or two of space to pass the vehicle.

            Edit: And that’s assuming their remote helpers couldn’t direct the car to drive itself out using their other tool, where the AI drives itself with their suggestions.

      • [deleted]@piefed.world · 2 days ago

        That is like a person steering to avoid a collision while cruise control and lane assist are on - it isn’t actually fully autonomous.

  • Etterra@discuss.online · 1 day ago

    If you keep doing the work for them, they’ll never learn. They need to figure it out for themselves.

  • THE_GR8_MIKE@lemmy.world · 2 days ago

    I still have no idea how these are legally able to operate on public roads. Shit seems wild to me. Wouldn’t last 5 seconds here in Chicago, for numerous reasons lol