hayst4ck 10 hours ago

That aside, Elon has been shown to be someone who unapologetically has no integrity whatsoever.

When lack of integrity is rewarded, society decays. If you want to be part of a culture people would choose to adopt, then lack of integrity must be punished severely. If abuse of trust is rewarded, people at large will start to treat distrust, rather than trust, as the default.

steveBK123 12 hours ago

These numbers raise good points. A lot of historical Tesla-bull "FSD vs human" comparisons overlook the fact that the "average driver" includes 18-year-old males, 75-year-old seniors, the drunk, the high, people texting behind the wheel, etc. This is why the accident stats look something like: a whole lot of people who get in one crash per decade or two, and a small slice of people who get in four accidents per year.

Simply not being any of those things makes you a 10-100x safer driver than the average. So buying FSD, or getting into a RoboTaxi that is claimed to be "just as good as the average human", is actually quite a bad trade.
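The skew does real arithmetic work here. A toy sketch (all numbers made up, purely to illustrate how a small risky slice inflates the population mean):

```python
# Toy mixture model of crash rates. All numbers are illustrative,
# not real statistics: 95% of drivers crash once per 15 years,
# 5% crash four times per year.
safe_share, safe_rate = 0.95, 1 / 15   # crashes per driver-year
risky_share, risky_rate = 0.05, 4.0

avg_rate = safe_share * safe_rate + risky_share * risky_rate
print(f"population average: {avg_rate:.3f} crashes/year")   # ~0.263
print(f"safe-driver rate:   {safe_rate:.3f} crashes/year")  # ~0.067
print(f"average is {avg_rate / safe_rate:.1f}x the safe-driver rate")
```

Even with only 5% risky drivers in this made-up split, merely matching "the average" means crashing roughly four times as often as the typical careful driver, which is the comparison the bull case quietly leans on.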

Oh and of course this is all flagged now because Elon is god here.

  • xnx 11 hours ago

    > Simply not being any of those things makes you a 10-100x driver by comparison to the average.

    Good points. I wonder if the most accurate comparison is rideshare drivers? I don't know how their safety record compares overall, but they drive far too dangerously for my taste.

    • steveBK123 7 hours ago

      Rideshare drivers aren't even that bad against the general population.

      Think about it - if they were that bad, constantly dealing with insurance and/or paying out of pocket to fix their wrecked cars would be an expensive & time-consuming problem!

  • theahura 10 hours ago

    Oh, did it get flagged? I was wondering! I saw this doing numbers on Substack while I was at dinner, and then it suddenly seemed to stop.

    EDIT: lol looks like it got flagged again

    • steveBK123 7 hours ago

      Typical, anything critical of Elon Inc gets flagged & re-flagged enough to break up discussion.

      Self-driving is a real technical problem squarely in the HN domain and worthy of discussion. Seeing it censored here the same as on X is disheartening, in terms of the entire VC-industrial-complex omertà.

jasonthorsness 12 hours ago

I've ridden a Waymo (in Phoenix) and it felt very safe. The few times I've tried FSD in a Tesla (~1 year ago) it did pretty well, but I had to intervene a few times. However, I think part of the problem is that I worry about my responsibility, damage to the vehicle, and being thought of as "that guy" by other drivers (for example, if it waits too long at a four-way stop for the other car when it's actually our right-of-way). If I were not responsible for the car, as in a Waymo, I would probably let Tesla FSD do its thing with less micro-management, and it would do fine. I'll try the Robotaxi when it arrives.

  • JumpCrisscross 11 hours ago

    Waymo is magic. I’ve been in a rented Tesla and the FSD was wildly unpredictable. Good enough 99% of the time. But sure to cause a crash in the edge cases. Never experienced that in Waymo.

    • steveBK123 7 hours ago

      Right - this has always been, and continues to be, the problem with Tesla's solution. 99% good / 1% terrifying is not a good way to travel 10k miles/year. Even 99.99% / 0.01% is insufficient.

      One tell - the number of FSD guys who will quietly admit, if you ask, that their wife won't let them use it when she is in the car :-). Certainly my wife felt that way about AP/EAP in all its variations.

      Women have a higher bar for technology, in that they expect it to actually work, not just be a neat idea.
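      A rough way to see why even 99.99% isn't obviously enough (assuming, purely for illustration, one independent chance to fail per mile - the percentages above don't specify a denominator):

```python
# Back-of-envelope: expected bad events per year at a given
# per-mile success rate, over 10,000 miles/year. The one-
# failure-opportunity-per-mile assumption is illustrative only.
miles_per_year = 10_000

for success_rate in (0.99, 0.9999, 0.999999):
    bad_events = miles_per_year * (1 - success_rate)
    print(f"{success_rate:.6f} good per mile -> {bad_events:g} bad events/year")
```

      Under those assumptions, 99% means roughly a hundred incidents a year and 99.99% still means about one a year; you need 99.999% per mile before a bad event becomes rarer than once a decade.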

    • tomjakubowski 11 hours ago

      I'll add that when I am walking or cycling anywhere, I feel a lot safer around Waymos.

      Just yesterday in SF I nearly collided on my bike with a human SUV driver (likely an Uber), who pulled out impatiently into the bike lane to pass a Waymo which was stopped at a light waiting for a pedestrian to finish crossing (against the signal).

  • satisfice 8 hours ago

    The reason I will never get into such a vehicle is not necessarily that I know it to be unsafe, but rather that the arrogance of the people who created the technology means they probably didn't bother to test it well.

osrec 12 hours ago

The reluctance to use LiDAR boggles the mind. The rationale Elon provides is also questionable (humans don't use it, so cars shouldn't need it either).

The cars themselves are nothing special at best, and given the political shenanigans the CEO is pulling, a lot of people will be turned off trusting anything associated with him.

  • steveBK123 12 hours ago

    Yes, the avoidance of LIDAR is some combination of stubbornness and cheapness. Couple that with the fact that the cameras used for vision-only are quite bad resolution-wise - like 10-year-old-iPhone bad for the new HW4, and the HW3 & prior are even worse. Does not inspire confidence.

    • osrec 12 hours ago

      Yes, plus the ever-shifting deadlines make me think this is all being somewhat rushed, to finally deliver what they promised ages ago. I personally don't want to use tech like this if it has been rushed. Realistic deadlines, a smooth landing, and a stable narrative are what you need to inspire confidence.

      Cybertruck build quality was also a red flag, suggesting that the company lacks rigor.

  • b8 11 hours ago

    Comma.ai is doing fine just by using cameras. Waymo's self-driving cars won't drive in conditions that would hinder cameras. Thermal cameras are the next step imo.

    • theahura 10 hours ago

      I'm doubtful. I don't see any comma.ai robotaxis coming online any time soon. When you say "doing fine", what do you mean?

      • bdangubic 10 hours ago

        they’ll be coming up right around the same time as robotaxis… in june… of 2095 :)

jsight 11 hours ago

I've ridden in a Waymo. It was awesome! I've only ridden it a couple of times and for no more than 10 miles each time.

If I had been a safety driver, I would have intervened ~5 times:

- ~3 (maybe more) when it stupidly turned on its turn signal while stuck in traffic in a middle lane at a light. Changing lanes wouldn't have even been desirable.
- Another time when it parked itself for pickup behind a car that was ready to leave. It blocked him for really no reason.
- Another time, it was seriously confused by a car backing out of a space in a parking lot near its intended dropoff point and just behaved... strangely, with weird unnecessary lurches.

None of these were safety related. TBH, measuring interventions is really hard for this reason. I remember hearing anecdotal reports that Waymo had a decent number of non-safety interventions back when they had safety drivers.

I wouldn't necessarily suggest being an early adopter of robotaxi with no safety driver either, but I don't think these numbers can be extrapolated to show that with any meaningful confidence. This is especially true given that the initial drives will likely be in geofenced areas.

dlokshin 12 hours ago

"Interventions" might be experiencing the effects of Goodhart's Law. I agree that I intervene in my Tesla about once every two weeks - but not because it's about to get into an accident; rather because it's doing something I really don't want it to do. Uber drivers do things I don't want them to do, too. If I could intervene, I would.

  • steveBK123 12 hours ago

    Sure - I always see these claims that "this time FSD is good", and as a 2017-2022 Tesla owner I remain doubtful.

    My last experience of FSD was in an Uber a year ago, where the driver proudly wanted to show it to us. It worked for about 60 seconds, until the third turn in stop-and-go traffic, where it nearly launched into and rear-ended the stopped traffic ahead. Lots of beeping and a human stomping on the brakes.

    This was, again, during a "they finally solved it" claim cycle.

    You can tell how much people believe in the product by how quickly these threads get flagged to death.

tjpnz 8 hours ago

How does one get into it without either my companion or me stepping into traffic? From everything I've seen, there isn't any wiggle room for both passengers to enter from the same side, and I'm guessing that's why it's shown with both doors open. Excellent design all around.

throwaway69123 12 hours ago

Surely this doesn't account for robotaxi coverage? I seriously doubt the robotaxis will support the same coverage a Tesla does (i.e. everything) - most likely they'll be limited to major cities, you'd imagine.

  • JumpCrisscross 11 hours ago

    > most likely be limited to major cities youd imagine?

    Cities have the bureaucracy to regulate. They also, currently, mostly hate Elon.

    Tesla would be better placed trialling in suburbia, where accidents can be more readily blamed on factors outside the company's control.

    • steveBK123 7 hours ago

      To me, in terms of Tesla "putting their money where their mouth is" for automated driving, there are two things they've yet to do, and I'll believe it when I see it.

      1) Take on any liability (even a limited amount / limited use case / limited region) when FSD is engaged, the way Mercedes already has.

      2) Actually launch Robotaxi for real, at any sort of scale, the way Waymo has.

      Right now it's the same situation it's been for years & years - a lot of talk, and FSD cannot fail.. only be failed, by the driver.

      That is - if it crashes, the driver failed to intervene. But if the driver intervenes & complains about the frequency of interventions, the response is that the driver is probably too conservative and intervenes too often - that the car wouldn't have crashed anyway. Circular logic.

      • dragonwriter 7 hours ago

        Actually, they've taken on liability just by shipping FSD, under the normal rules of legal product liability. They just didn't build a PR campaign around the fact that manufacturers are fully liable for product defects.

        • steveBK123 7 hours ago

          Their lawyers would not agree with you.

          • dragonwriter 3 hours ago

            Their lawyers would in any specific case look to find a loophole providing an out, but I’m guessing that, even given Elon’s close association with Trump and the latter’s problems with selecting lawyers, Tesla probably hires lawyers that are familiar with the basic rules of product liability.

  • steveBK123 7 hours ago

    It's actually the opposite - FSD is demonstrably worse in traffic-congested big cities full of pedestrians, bikers, etc.

AStonesThrow 10 hours ago

My Waymo account [Phoenix market] is currently showing 225 rides; 1,143 miles; 3,912 minutes.

Honestly, I've never regretted summoning a Waymo for a ride. I was typically a public transit user, and bus/train operators are safe, reliable, and impassive people. I can recount with horror all the bizarre, unbelievable, fucked-up situations that transpired when a human being was summoned with their own damn car (taxi/Lyft/Uber/Veyo). None of that shit goes down when I slide into a Waymo.

Sure, there are technical issues and I have little quibbles with the quality of service provided, but at the end of the day, it gets me from Point A to Point B and it doesn't interpose a creepy 3rd party running the show (unless you consider Alphabet to be the epitome of creepy 3rd parties).

Alphabet has earned my confidence, from their mapping activities and Big Data capability, to all the other infrastructure and logistics necessary to run a project of this scale. Their deployment of Waymo has been simply an evolutionary step in the robot apocalypse, I mean 21st Century urban convenience landscape.

I wouldn't accord any of the same confidence to Tesla or Uber due to the vastly different structure and scope of their business models. I just hope that Waymo is here to stay, because it really is working out well between us.