News
The saga continues with Model X driver involved in Montana crash
Mr. Pang is back, this time with a second open letter to Tesla
The Tesla Model X driver involved in a Montana crash while using Autopilot is stirring up controversy once again, this time asking Tesla Motors to reveal additional details from the incident. It seems that language differences play a large role in this dispute. Acting as his representative, Steven Xu sent us a second open letter Mr. Pang penned to Elon Musk, in which he takes issue with Tesla’s account of the accident. The open letter reads as follows:
Here is the second letter from my friend, Mr. Pang.
To Tesla Team:
It has been weeks since I published my first letter. No one has tried to contact us to discuss the crash. Fully understanding what caused this crash is critical for all Tesla drivers. After a while, Tesla published a response to our letter. Most of it fits the story. However, there are a few points I would like to address.
1. “From this data, we learned that after you engaged Autosteer, your hands were not detected on the steering wheel for over two minutes. This is contrary to the terms of use when first enabling the feature and the visual alert presented to you every time Autosteer is activated.”
I admit that my hands were off the steering wheel after I engaged Autopilot. I did so because I put too much faith in this system. I also believe that most Tesla drivers, Elon included, would do the same thing when they engage Autopilot. The problem here is that Tesla has over-advertised this feature by calling it “Autopilot.” It should be named “advanced driving assistant” instead. It is possible that Tesla knew an accident like this would come sooner or later. Tesla might think that setting a term like “please keep your hands on the steering wheel at all times” would free it of responsibility.
2. “As road conditions became increasingly uncertain, the vehicle again alerted you to put your hands on the wheel.”
The road conditions were better than fine. The lane markings were absolutely clear, the road was flat, and there was no oncoming car. My eyes never left the road. However, everything happened too fast for me to take control, in less than a second.
3. “No steering torque was then detected until Autosteer was disabled with an abrupt steering action. Immediately following detection of the first impact, adaptive cruise control was also disabled, the vehicle began to slow, and you applied the brake pedal.”
No one should sidestep the question of what caused the Autopilot feature to malfunction. From your explanation, I realize that you are implying that I applied some sort of force to the steering wheel. I have no idea how Tesla got this clue. There are two points I want to make here. First, my hands were not on the steering wheel. Second, there was no obstacle on the road to alter the steering direction. The one and only thing controlling this vehicle and steering it off the road was the Autopilot software itself. It also occurs to me: if my hands had been on the steering wheel applying force, would Tesla blame me for the collision? To me it looks like whenever an accident occurs on Autopilot, whether hands are on the steering wheel or not, Tesla can always find a way out by citing an “abrupt steering action.”
Tesla also claimed that, after the abrupt steering action, “adaptive cruise control was also disabled, the vehicle began to slow.”
This is nowhere near the truth. In reality, the vehicle NEVER attempted to slow between hitting the first pole and the last. It took only about a second to hit 12 wooden poles. I believe that if I had not braked, the vehicle would have continued cruising. Mr. Huang was injured severely by the high-speed impact.
Tesla, as a company with global impact, should respect the truth of every incident. Nothing is more important than human life. Lying to or manipulating the public about what really happened is unacceptable.
Weeks ago, Tesla contacted me regarding this accident. Since you could not find a Mandarin translator, we rescheduled the call for four hours later. However, that was the last time Tesla tried to contact me. What I am asking is for Tesla to fully reveal the driving data from the collision. The reliability of the Autopilot software matters to hundreds of thousands of Tesla drivers. I wish to know the whole story of what really happened to us in that collision.
Thanks,
Sincerely,
Mr. Pang
Steven Xu pointed us to comments on the Tesla Motors Club forum that seemingly offer Mr. Pang no support at all. In fact, based on those comments, there almost seems to be a cultural bias at play in this situation. One wonders whether those commenters would feel differently if they were driving a car in China that displayed its instructions only in Mandarin.
Pang’s complaint is very similar to one lodged by a Chinese customer last month whose Tesla crashed on the highway on the way to work. He claimed that the salesman he spoke to before purchasing his car told him specifically that the car could drive itself and proved it by driving with his hands off the wheel during a test drive. Tesla later amended the language it uses to describe its Autopilot system on its Chinese website. It’s possible the same linguistic confusion had a bearing on Mr. Pang’s unfortunate accident.
At this point, it seems the matter will be handled by insurance companies and lawyers. Tesla apparently has had no further contact with Pang since the rescheduled call he describes in his letter, and his central request stands: that Tesla fully reveal the driving data from the collision, data that, as he notes, matters to hundreds of thousands of Tesla drivers.
News
Tesla is not sparing any expense in ensuring the Cybercab is safe
Images shared by a longtime Giga Texas watcher showed 16 Cybercab prototypes parked near the factory’s dedicated crash test facility.
The Tesla Cybercab could very well be the safest taxi on the road when it is released and deployed for public use. This was, at least, hinted at by the intensive safety tests that Tesla seems to be putting the autonomous two-seater through at its Giga Texas crash test facility.
Intensive crash tests
As per recent images from longtime Giga Texas watcher and drone operator Joe Tegtmeyer, Tesla seems to be very busy crash testing Cybercab units. The images, captured just before the holidays, showed 16 Cybercab prototypes parked near Giga Texas’ dedicated crash test facility.
Tegtmeyer’s aerial photos showed the prototypes clustered outside the factory’s testing building. Some uncovered Cybercabs showed notable damage, and one even had its airbags deployed. With Cybercab production expected to start in about 130 days, it appears that Tesla is working hard to ensure that its autonomous two-seater ends up becoming the safest taxi on public roads.
Prioritizing safety
With no human driver controls, the Cybercab demands exceptional active and passive safety systems to protect occupants in any scenario. Considering Tesla’s safety reputation, it is understandable that the company seems to be sparing no expense in ensuring that the Cybercab is as safe as possible.
Tesla’s focus on safety was recently highlighted when the Cybertruck achieved a Top Safety Pick+ rating from the Insurance Institute for Highway Safety (IIHS). This was a notable victory for the Cybertruck, as critics have long claimed that the vehicle would be one of the most unsafe trucks on the road, if not the most unsafe, due to its appearance. The vehicle’s Top Safety Pick+ rating, if anything, simply proved that Tesla never neglects to make its cars as safe as possible, and that definitely includes the Cybercab.
Elon Musk
Tesla’s Elon Musk gives timeframe for FSD’s release in UAE
Provided that Musk’s timeframe proves accurate, FSD could begin rolling out across the Middle East, starting with the UAE, next year.
Tesla CEO Elon Musk stated on Monday that Full Self-Driving (Supervised) could launch in the United Arab Emirates (UAE) as soon as January 2026.
Musk’s estimate
In a post on X, UAE-based political analyst Ahmed Sharif Al Amiri asked Musk when FSD would arrive in the country, quoting an earlier post where the CEO encouraged users to try out FSD for themselves. Musk responded directly to the analyst’s inquiry.
“Hopefully, next month,” Musk wrote. The exchange attracted a lot of attention, with numerous X users sharing their excitement at the idea of FSD being brought to a new country. FSD (Supervised), after all, would likely allow hands-off highway driving, urban navigation, and parking under driver oversight in traffic-heavy cities such as Dubai and Abu Dhabi.
Musk’s comments about FSD’s arrival in the UAE came shortly after his visit to the country. Over the weekend, images were shared online of Musk meeting with UAE Defense Minister, Deputy Prime Minister, and Dubai Crown Prince HH Sheikh Hamdan bin Mohammed. Musk also shared a supportive message about the country, posting “UAE rocks!” on X.
FSD recognition
FSD has been getting quite a lot of support from foreign media outlets. FSD (Supervised) earned high marks from Germany’s largest car magazine, Auto Bild, during a test in Berlin’s challenging urban environment. The demonstration highlighted the system’s ability to handle dense traffic, construction sites, pedestrian crossings, and narrow streets with smooth, confident decision-making.
Journalist Robin Hornig was particularly struck by FSD’s superior perception and tireless attention, stating: “Tesla FSD Supervised sees more than I do. It doesn’t get distracted and never gets tired. I like to think I’m a good driver, but I can’t match this system’s all-around vision. It’s at its best when both work together: my experience and the Tesla’s constant attention.” Only one intervention was needed when the system misread a route, showcasing its maturity while relying on vision-only sensors and over-the-air learning.
News
Tesla quietly flexes FSD’s reliability amid Waymo blackout in San Francisco
“Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Tesla highlighted its Full Self-Driving (Supervised) system’s robustness this week by sharing dashcam footage of a vehicle in FSD navigating pitch-black San Francisco streets during the city’s widespread power outage.
While Waymo’s robotaxis stalled and caused traffic jams, Tesla’s vision-only approach kept operating seamlessly without remote intervention. Elon Musk amplified the clip, highlighting the contrast between the two systems.
Tesla FSD handles total darkness
The @Tesla_AI account posted a video from a Model Y operating on FSD during San Francisco’s blackout. As can be seen in the video, streetlights, traffic signals, and surrounding illumination were completely out, but the vehicle drove confidently and cautiously, much like a proficient human driver.
Musk reposted the clip, adding context to reports of Waymo vehicles struggling in the same conditions. “Tesla Robotaxis were unaffected by the SF power outage,” Musk wrote in his post.
Musk and the Tesla AI team’s posts highlight the idea that FSD operates much like an experienced human driver. Because the system relies on vision rather than a complicated symphony of sensors and pre-mapped infrastructure, its vehicles can adapt to challenging circumstances as they emerge. That certainly seemed to be the case in San Francisco.
Waymo’s blackout struggles
Waymo faced scrutiny after multiple self-driving Jaguar I-PACE taxis stopped functioning during the blackout, blocking lanes, causing traffic jams, and requiring manual retrieval. Videos shared during the power outage showed fleets of Waymo vehicles simply stopping in the middle of the road, seemingly confused about what to do when the lights went out.
In a comment, Waymo stated that its vehicles treat nonfunctional signals as four-way stops, but “the sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion.”
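To make that fallback concrete, here is a minimal Python sketch of the behavior Waymo describes: a dark signal is treated as a four-way stop, with a longer confirmation dwell during a widespread outage. This is purely illustrative; the class names, thresholds, and decision flow are assumptions made for this sketch and do not reflect Waymo's actual software.

```python
from dataclasses import dataclass
from enum import Enum, auto


class SignalState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    DARK = auto()  # signal present but unlit, e.g. during a power outage


@dataclass
class Intersection:
    signal: SignalState
    cross_traffic_clear: bool
    observation_seconds: float  # how long the vehicle has watched the intersection


# Hypothetical dwell times -- illustrative only, not Waymo's real parameters.
NORMAL_CONFIRM_S = 2.0   # typical pause at a four-way stop
OUTAGE_CONFIRM_S = 8.0   # longer pause during a widespread outage


def may_proceed(ix: Intersection, widespread_outage: bool) -> bool:
    """Decide whether the vehicle may enter the intersection.

    A dark signal is treated as a four-way stop, as Waymo describes.
    During a widespread outage the vehicle waits longer to confirm the
    intersection's state, the behavior Waymo says contributed to
    traffic friction.
    """
    if ix.signal == SignalState.GREEN:
        return True
    if ix.signal in (SignalState.RED, SignalState.YELLOW):
        return False
    # DARK signal: fall back to four-way-stop rules.
    required = OUTAGE_CONFIRM_S if widespread_outage else NORMAL_CONFIRM_S
    return ix.cross_traffic_clear and ix.observation_seconds >= required


# Example: a dark signal observed for 5 seconds with cross traffic clear.
ix = Intersection(SignalState.DARK, cross_traffic_clear=True, observation_seconds=5.0)
print(may_proceed(ix, widespread_outage=True))   # False -- still confirming
print(may_proceed(ix, widespread_outage=False))  # True -- normal dwell satisfied
```

Under a policy like this, a city-wide outage lengthens the dwell at every dark intersection simultaneously, which is consistent with the gridlock the company describes.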
A company spokesperson also shared some thoughts about the incidents. “Yesterday’s power outage was a widespread event that caused gridlock across San Francisco, with non-functioning traffic signals and transit disruptions. While the failure of the utility infrastructure was significant, we are committed to ensuring our technology adjusts to traffic flow during such events,” the Waymo spokesperson stated, adding that it is “focused on rapidly integrating the lessons learned from this event, and are committed to earning and maintaining the trust of the communities we serve every day.”