News
How will Tesla Version 8 compare to current Autopilot in the real world?
Tesla’s upcoming Version 8 software will be the company’s most significant Autopilot upgrade since its October 2014 initial release, but how will these updates compare to current Autopilot behavior in the real world?
This will be the first time the company shifts away from the front-facing camera as the core hardware for visual image recognition; radar will now become the primary sensor used to create a virtual picture of the vehicle’s surroundings.
With these improvements, to be rolled out via an over-the-air software update in the coming weeks, Model S and Model X vehicles equipped with the Autopilot hardware suite should theoretically be able to handle emergency braking situations with more precision, provide a smoother Traffic-Aware Cruise Control (TACC) experience, take highway exits on their own, and give drivers and passengers an overall safer experience.
Let’s take a look at each of these features and see how Autopilot in Version 8 will differ from current Version 7 capabilities.
Automatic Emergency Braking
Following the much-publicized death of Joshua Brown, whose Model S crashed into the side of a tractor trailer while driving on Autopilot, the reliability of Autopilot’s Automatic Emergency Braking (AEB) feature was immediately called into question. Tesla released a statement explaining that the high, white side of the tractor trailer, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire. “Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature,” said Tesla.
Spy shots taken at the Naval Air Station reveal that Tesla was testing and calibrating its AEB system this past summer. But despite tests that seemingly show a Model S automatically braking in a staged collision event, Tesla has been overly cautious about when its AEB feature activates. AEB relies on imagery from the front-facing camera, supplemented by radar input, to determine whether confidence is high enough to trigger a braking event.
Some Tesla owners have even taken it upon themselves to stage scenarios that would seemingly trigger the vehicle’s AEB response, but to no avail, leaving further mystery as to how AEB works.
The current Autopilot system under Version 7 is limited in its ability to reliably detect people or filter out false positives such as reflective objects that may appear larger than they are. Tesla uses the concave bottom of a soda can as an example: when the radar signal reflects off the can’s dish-shaped bottom, the returned signal is amplified to many times the can’s actual size, leading the radar to believe there is a large object ahead. Because of this, programming the AEB system to engage suddenly could create a dangerous situation, so Tesla decided to limit the scenarios that can actually trigger an automatic emergency braking response.
However, Version 8 will combine the power of fleet learning with “radar snapshots” to improve the vehicle’s ability to more accurately assess the circumstances of an event. In other words, we can expect Autopilot under Version 8 to engage automatic emergency braking with a much higher degree of confidence. Tesla CEO Elon Musk believes this setup will improve safety by a factor of three over existing Autopilot.
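One plausible reading of “fleet learning with radar snapshots” is a geocoded whitelist: locations where many vehicles recorded a strong radar return but no actual obstacle (an overhead sign, a bridge) get flagged, and braking is suppressed there. The sketch below is our own minimal illustration of that idea, not Tesla’s implementation; the coordinates, radius, and confidence threshold are invented for the example.

```python
import math

# Hypothetical geocoded whitelist of radar returns the fleet has repeatedly
# driven past safely (coordinates are made up for illustration).
WHITELIST = [
    (37.4220, -122.0841),  # e.g. an overhead road sign location
]

SUPPRESS_RADIUS_M = 30.0  # treat returns within this distance as whitelisted


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def should_brake(radar_confidence, lat, lon, threshold=0.8):
    """Brake only on a confident radar return that is not a known false positive."""
    whitelisted = any(
        haversine_m(lat, lon, wlat, wlon) < SUPPRESS_RADIUS_M
        for wlat, wlon in WHITELIST
    )
    return radar_confidence >= threshold and not whitelisted
```

The design point is that the confidence threshold can be raised or lowered per location as fleet data accumulates, rather than being a single global setting.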
Traffic Aware Cruise Control
Beyond being able to track the vehicle directly in front of the car, Version 8 of Autopilot will also be able to see the vehicle ahead of that one. Tesla describes this update as follows: “Tesla will also be able to bounce the radar signal under a vehicle in front – using the radar pulse signature and photon time of flight to distinguish the signal – and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not.”
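The “photon time of flight” Tesla mentions is simply the round-trip travel time of the radar pulse, from which range follows as c × t / 2. A quick illustrative calculation (our own numbers, not Tesla’s) shows how an echo that bounced under the lead car and returned from a vehicle farther ahead arrives measurably later, letting the receiver separate the two returns:

```python
C = 299_792_458.0  # speed of light in m/s; radar pulses travel at c


def range_from_tof(round_trip_seconds):
    """Radar range from round-trip time of flight: distance = c * t / 2."""
    return C * round_trip_seconds / 2


# Hypothetical echoes: the lead car 20 m ahead, and a second vehicle 45 m
# ahead reached by bouncing the pulse underneath the lead car.
t_lead = 2 * 20.0 / C  # ~133 nanoseconds round trip
t_far = 2 * 45.0 / C   # ~300 nanoseconds round trip

print(range_from_tof(t_lead))  # recovers ~20 m
print(range_from_tof(t_far))   # recovers ~45 m
```

Because the two echoes differ by well over a hundred nanoseconds, time-gating the receiver is enough in principle to attribute each return to the correct vehicle.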
The improvement will lead to smoother braking when TACC is engaged, since Autopilot will no longer rely solely on the actions of the vehicle directly ahead. If a hard braking event occurs two cars ahead, Version 8 will be able to identify it and begin slowing the Model S (or Model X) before the vehicle immediately in front has even applied its brakes.
The following video captures an incident in which the vehicle being tracked by Version 7 of Autopilot could not see a hard braking event two cars ahead, and TACC seemingly did not have enough time to stop the Model S.
Being able to see two cars ahead in Version 8 will provide a smoother TACC experience and increased safety.
Improved Auto Lane Change and Freeway Exiting
What we’re particularly excited about is the new feature in Version 8.1 that will allow an Autopilot-equipped Model S and Model X to take highway exits using the onboard navigation system.
Currently, Version 7 of Autopilot can handle lane changes when the driver explicitly uses the turn signal stalk: signal left and the vehicle will make a left lane change, and vice versa. However, the ability to punch in a destination through Tesla Nav and have the vehicle handle freeway exiting, assuming the exit is part of the route, represents, in our minds, a critical step toward the ultimate goal of fully autonomous, self-driving vehicles. It is a small step, but a notable one.
Photo credit: Rob M.
Full details of Tesla Version 8 can be found here.
News
Tesla shares AI5 chip’s ambitious production roadmap details
Tesla CEO Elon Musk has revealed new details about the company’s next-generation AI5 chip, describing it as “an amazing design.”
Tesla CEO Elon Musk has revealed new details about the company’s next-generation AI5 chip, describing it as “an amazing design” that could outperform its predecessor by a notable margin. Speaking during Tesla’s Q3 2025 earnings call, Musk outlined how the chip will be manufactured in partnership with both Samsung and TSMC, with production based entirely in the United States.
What makes AI5 special
According to Musk, the AI5 represents a complete evolution of Tesla’s in-house AI hardware, building on lessons learned from the AI4 system currently used in its vehicles and data centers. “By some metrics, the AI5 chip will be 40x better than the AI4 chip, not 40%, 40x,” Musk said during the Q3 2025 earnings call. He credited Tesla’s unique vertical integration for the breakthrough, noting that the company designs both the software and hardware stack for its self-driving systems.
To streamline the new chip, Tesla eliminated several traditional components, including the legacy GPU and image signal processor, since the AI5 architecture already incorporates those capabilities. Musk explained that these deletions allow the chip to fit within a half-reticle design, improving efficiency and power management.
“This is a beautiful chip,” Musk said. “I’ve poured so much life energy into this chip personally, and I’m confident this is going to be a winner.”
Tesla’s dual manufacturing strategy for AI5
Musk confirmed that both Samsung’s Texas facility and TSMC’s Arizona plant will fabricate AI5 chips, with each partner contributing to early production. “It makes sense to have both Samsung and TSMC focus on AI5,” the CEO said, adding that while Samsung has slightly more advanced equipment, both fabs will support Tesla’s U.S.-based production goals.
Tesla’s explicit objective, according to Musk, is to create an oversupply of AI5 chips. The surplus units could be used in Tesla’s vehicles, humanoid robots, or data centers, which already use a mix of AI4 and NVIDIA hardware for training. “We’re not about to replace NVIDIA,” Musk clarified. “But if we have too many AI5 chips, we can always put them in the data center.”
Musk emphasized that Tesla’s focus on designing for a single customer gives it a massive advantage in simplicity and optimization. “NVIDIA… (has to) satisfy a large range of requirements from many customers. Tesla only has to satisfy one customer, Tesla,” he said. This, Musk stressed, allows Tesla to delete unnecessary complexity and deliver what could be the best performance per watt and per dollar in the industry once AI5 production scales.
Energy
Tesla VP hints at Solar Roof comeback with Giga New York push
The comments hint at possible renewed life for the Solar Roof program, which has seen years of slow growth since its 2016 unveiling.
Tesla’s long-awaited and arguably underrated Solar Roof may finally be getting its moment. During the company’s Q3 2025 earnings call, Vice President of Energy Engineering Michael Snyder revealed that production of a new residential solar panel has started at Tesla’s Buffalo, New York facility, with shipments to customers beginning in the first quarter of 2026.
Tesla Energy’s strong demand
Responding to an investor question about Tesla’s energy backlog, Snyder said demand for Megapack and Powerwall continues to be “really strong” into next year. He also noted positive customer feedback for the company’s new Megablock product, which is expected to start shipping from Houston in 2026.
“We’re seeing remarkable growth in the demand for AI and data center applications as hyperscalers and utilities have seen the versatility of the Megapack product. It increases reliability and relieves grid constraints,” he said.
Snyder also highlighted a “surge in residential solar demand in the US,” attributing the spike to recent policy changes that incentivize home installations. Tesla expects this trend to continue into 2026, helped by the rollout of a new solar lease product that makes adoption more affordable for homeowners.
Possible Solar Roof revival?
Perhaps the most intriguing part of Snyder’s remarks, however, was Tesla’s move to begin production of its “residential solar panel” in Buffalo, New York. He described the new panels as having “industry-leading aesthetics” and shape performance, language Tesla has used to market its Solar Roof tiles in the past.
“We also began production of our Tesla residential solar panel in our Buffalo factory, and we will be shipping that to customers starting Q1. The panel has industry-leading aesthetics and shape performance and demonstrates our continued commitment to US manufacturing,” Snyder said during the Q3 2025 earnings call.
Snyder did not explicitly name the product, though his reference to aesthetics has fueled speculation that Tesla may finally be preparing a large-scale and serious rollout of its Solar Roof line.
Originally unveiled in 2016, the Solar Roof was intended to transform rooftops into clean energy generators without compromising on design. However, despite early enthusiasm, production and installation volumes have remained limited for years. In 2023, a report from Wood Mackenzie claimed that there were only 3,000 operational Solar Roof installations across the United States at the time, far below forecasts. In response, the official Tesla Energy account on X stated that the report was “incorrect by a large margin.”
News
Tesla VP explains why end-to-end AI is the future of self-driving
Using examples from real-world driving, the VP said Tesla’s AI can learn subtle value judgments.
Tesla’s VP of AI/Autopilot software, Ashok Elluswamy, has offered a rare inside look at how the company’s AI system learns to drive. After speaking at the International Conference on Computer Vision, Elluswamy shared details of Tesla’s “end-to-end” neural network in a post on social media platform X.
How Tesla’s end-to-end system differs from competitors
As per Elluswamy’s post, most other autonomous driving companies rely on modular, sensor-heavy systems that separate perception, planning, and control. In contrast, Tesla’s approach, the VP stated, links all of these together into one continuously trained neural network. “The gradients flow all the way from controls to sensor inputs, thus optimizing the entire network holistically,” he explained.
He noted that the benefit of this architecture is scalability and alignment with human-like reasoning. Using examples from real-world driving, he said Tesla’s AI can learn subtle value judgments, such as deciding whether to drive around a puddle or briefly enter an empty oncoming lane. “Self-driving cars are constantly subject to mini-trolley problems,” Elluswamy wrote. “By training on human data, the robots learn values that are aligned with what humans value.”
This system, Elluswamy stressed, allows the AI to interpret nuanced intent, such as whether animals on the road intend to cross or stay put. These nuances are quite difficult to code manually.
Tackling scale, interpretability, and simulation
Elluswamy acknowledged that the challenges are immense. Tesla’s AI processes billions of “input tokens” from multiple cameras, navigation maps, and kinematic data. To handle that scale, the company’s global fleet provides what he called a “Niagara Falls of data,” generating the equivalent of 500 years of driving every day. Sophisticated data pipelines then curate the most valuable training samples.
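As a back-of-envelope sanity check (our own arithmetic, not a Tesla figure), “500 years of driving every day” converts to vehicle-hours like this:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a year

# "500 years of driving every day" expressed in vehicle-hours per day
vehicle_hours_per_day = 500 * HOURS_PER_YEAR

# If the average car is driven roughly one hour per day, this volume
# implies a data-reporting fleet on the order of a few million vehicles.
print(vehicle_hours_per_day)  # 4380000
```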
Tesla built tools to make its network interpretable and testable. The company’s Generative Gaussian Splatting method can reconstruct 3D scenes in milliseconds and model dynamic objects without complex setup. Apart from this, Tesla’s neural world simulator allows engineers to safely test new driving models in realistic virtual environments, generating high-resolution, causal responses in real time.
Elluswamy concluded that this same architecture will eventually extend to Optimus, Tesla’s humanoid robot. “The work done here will tremendously benefit all of humanity,” he said, calling Tesla “the best place to work on AI on the planet currently.”