
Tesla’s Full Self-Driving has a different driving personality depending on who you ask: cautious, aggressive, decisive, hesitant. But one TikTok creator says her version has a single defining trait: rudeness. In her viral clip, she says the system routinely cuts off other drivers, and she’s tired of being blamed for behavior she didn’t choose.
The post from worried Tesla owner Cindy K (@cindy.k690) appears to have put words to a common complaint in the EV world: there are times when a Tesla’s assisted and self-driving modes take on a road-warrior attitude and become aggressively risk-tolerant and combative.
“There’s no car in front of me, there’s no car to my left, but there is a car to my right, maybe, like, a car behind, and the Tesla decides to go in that lane, so it completely looks like I just cut that person off, and I don’t know why it does that,” Cindy laments in the clip that’s been viewed more than 5,000 times.
‘Mine Does The Same’
Once her clip began circulating, other Tesla owners quickly filled the comments with their own experiences, many of them nearly identical. Some of the most striking replies came from drivers saying FSD can be startlingly bold during low-speed turns or cross-traffic maneuvers.
One owner wrote that both their Model Y and Model 3 “will cut off oncoming cars when I’m turning right at a stop sign/red light and when I’m making a left turn across a lane,” adding they “don’t understand why it just won’t wait until all cars pass.” Another commenter said their car “stays in the furthest lane from where it needs to be then last minute lane changes,” a behavior YouTube testers and independent analysts have documented on earlier FSD builds as well.
Even simple repositioning can cause headaches. One commenter said their Model Y makes routine lane changes that “make me look erratic,” while another said they try to hide behind the mirror out of embarrassment when the car moves too aggressively.
These anecdotes aren’t unusual in the Tesla world. FSD’s lane-change logic has long been one of the most discussed aspects of the system, showing up in countless user reviews and road-test channels on YouTube, including Dirty Tesla, AI DRIVR and Chuck Cook, whose videos include dozens of examples of inconsistent turns and unexpected lane selections in real-world traffic.
The thread wasn’t all frustration. A smaller group of commenters pushed back, saying they haven’t experienced unpredictable moves at all. Several pointed out that Cindy never mentioned which version of FSD she was running, which can drastically change how the system behaves.
Tesla’s neural network–based FSD stack has evolved rapidly over the last year, especially since the debut of Full Self-Driving (Supervised), which is built on end-to-end machine learning. Owners running v12 or v14 frequently chimed in to say their cars rarely perform abrupt or socially awkward maneuvers.
Tesla does not publish detailed public release notes explaining how lane-selection logic changes with each update, though executives have said on X that FSD’s driving behavior will “improve dramatically” with each new version as the neural networks are refined. The company markets FSD (Supervised) as a driver-assistance feature that requires continuous driver monitoring, and the National Highway Traffic Safety Administration has overseen multiple recalls aimed at ensuring drivers stay alert and remain responsible for the vehicle at all times.
Why Tesla’s Lane Choices Can Feel Unpredictable
Part of the friction stems from how Tesla’s end-to-end neural network interprets traffic. Instead of following explicitly programmed rules, such as always waiting for a larger gap or holding lane position until near an exit, FSD relies on patterns learned from millions of real-world video clips. Tesla described this architecture shift when it rolled out FSD v12 and in subsequent public statements.
This can result in driving styles that feel uncannily human in some settings and overly confident in others, especially when interacting with cross-traffic or cars lingering in a blind spot. The system also performs predictive path planning, meaning it may pre-emptively move toward an upcoming turn far earlier than a human would, or try to optimize throughput by selecting what it believes is the “faster” lane, even if that choice clashes with social driving norms.
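To make that trade-off concrete, here’s a toy sketch; it is not Tesla’s code and is not based on any published FSD internals, and every name and number in it is invented for illustration. It shows how a lane scorer that optimizes for speed alone picks a move that a gap-aware scorer would decline:

```python
from dataclasses import dataclass

# Hypothetical example only: NOT Tesla's algorithm. It contrasts a
# throughput-only objective with one that respects a courtesy gap.

@dataclass
class Lane:
    name: str
    est_speed_mph: float  # how fast traffic in this lane is moving
    rear_gap_m: float     # space to the nearest car approaching from behind

def throughput_score(lane: Lane) -> float:
    """Score a lane on estimated speed alone, ignoring courtesy."""
    return lane.est_speed_mph

def courteous_score(lane: Lane, min_gap_m: float = 30.0) -> float:
    """Same speed score, but heavily penalize squeezing into a small gap."""
    penalty = 0.0 if lane.rear_gap_m >= min_gap_m else 1000.0
    return lane.est_speed_mph - penalty

lanes = [
    Lane("current", est_speed_mph=55.0, rear_gap_m=999.0),
    Lane("right", est_speed_mph=62.0, rear_gap_m=12.0),  # car close behind
]

# A speed-only objective "cuts off" the car behind; a gap-aware one waits.
print(max(lanes, key=throughput_score).name)  # -> right
print(max(lanes, key=courteous_score).name)   # -> current
```

The point is only that the objective, not malice, produces the “rude” choice: change the scoring and the same planner waits its turn.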
Environmental factors can also play a role. Road markings, GPS precision, merges, diverging routes and even slight inconsistencies in traffic flow can nudge the system toward decisions that look abrupt to nearby drivers. Tesla says the car continuously runs “occupancy networks” to predict where vehicles, pedestrians and obstacles are and will be, though the company does not share the full parameters of its decision-making hierarchy.
For the creator, the emotional fallout is as big a part of the story as the technical one.
“I don’t need my car to make me an even worse driver,” Cindy jokes in the clip, pointing out that she already considers herself imperfect behind the wheel. Her comments hint at a growing cultural tension around autonomous and semi-autonomous driving: When the car makes the decision but you take the blame, embarrassment becomes part of the ownership experience.
Tesla owners who defend the system say these issues will diminish as the software matures.
The TikTok post adds another real-world data point to a conversation happening across Tesla forums, EV groups, and online communities: Full Self-Driving is advancing fast, but it’s still learning to navigate not just roads, but people.
Drivers like Cindy hope the next update will make the car less eager to slip into occupied lanes and, ideally, make her look a little less like a menace on the road.
InsideEVs reached out to Cindy via direct message and a comment on the clip. We’ll update this story if she responds.