Issue 09
Getting Closer to FSD
(An EV driving in Full Self-Driving mode)
January 2019
Tesla is leaps and bounds ahead of any competition in this field
(like Google's Waymo and GM's Cruise, especially in 2023, as you'll see below)
"Tesla's Full Self-Driving feature uses vision-based computer neural networks
to perceive and understand the world, just like humans do. But its
"eyes" (cameras) are looking in six different directions
simultaneously, which is five more than humans can do.
Via our unique fleet learning approach, we are able to collect anonymized
data from our millions of vehicles on the road, meaning the
neural networks have learned from many thousands of (good) drivers,
which is more driving scenarios than the average human driver could
ever learn from (including the unusual and weird ones)."
Keep in mind, what
you see on the screen is what the car's computer "sees",
and if it sees a stop sign, it will stop, and it will treat a red light
just as you would.
This is thanks to Tesla's industry-leading software and hardware.
These cars are traveling around driven by their onboard computers.
Because the software is still in the beta testing phase, the 14,000+
beta testers must
keep their hands on the wheel and their foot covering the brake pedal
just in case.
(UPDATE: As of 2023, no injuries or fatalities. Just one dented traffic
cone.)
The screen you see
in the photos below is in the
middle of the dashboard on Tesla vehicles...
A better look at
the traffic light visualization
(This shot was from someone driving at night;
you can tell because the screen background is dark
and the car's headlight icon shows that the headlights are on.)
Whatever the car
"sees" it will avoid.
But if it must hit a trashcan or a person, it "knows" to hit
the trashcan.
It "sees"
all the cones!
This pic shows you that the stop signs are not "GPS mapped";
they are actually being "seen" and recognized by the car's computer.
BTW, a half second later, the other pedestrian appeared on the screen
too.
Here in a UK Tesla,
the car can now hug the correct side
of this two-way road when lane markings completely disappear.
People are reporting that this has greatly improved because the car
can now see and comprehend what we see and comprehend.
And it will dodge puddles unless it can't see far enough ahead to know
it can do so safely.
Note: This person was using AutoPilot on secondary roads
where it's not supposed to be used, but this does give the
fleet neural network a lot of data needed for Level 4 FSD
(which is when you can go to sleep in the car).
And the computer
can now "see" lane markings too!
But it doesn't just recognize them, it knows what they mean.
The cars can now
recognize stop signs, traffic lights, and other road markings and objects,
and this is being demonstrated to Tesla owners through the "FSD
Visualization Preview Mode" as seen above. This is not yet
full Level 4 FSD... it's still Level 2 (driver-assistance features),
so there still needs to be someone in the driver's seat and paying attention
to the road. With Level 4, no human driver is needed.
There is now a button
you can tap when the car doesn't recognize something that
it should, and this will send "Tesla Central" the video footage,
GPS location, and other
data to help improve FSD. With the "fleet learning" of hundreds
of thousands of
cars on the road, the fleet's "neural net" will learn quickly,
and this info is then
shared with all Teslas via Over-The-Air updates.
[End of the January
2019 portion]
UPDATE!
(If you're not familiar with Tesla's self-driving
technology, start at the top of this page)
August 25, 2023
An auspicious date
Today,
an "alpha" release of version 12 of Tesla's self-driving software
was demonstrated via a livestream of Tesla's CEO Elon Musk having this
software drive his car around Palo Alto, California. Only a handful
of people are testing this new version because version 12 is a huge
change from all previous versions. It is "end-to-end" AI neural
networks, and no longer contains hundreds of thousands of lines of hardcoded
instructions written by programmers. Instead, the main AI training computer
teaches itself how to drive by watching tons of videos
of (good) Tesla drivers driving, much like when you taught your teenager
to drive.
So when, during this drive by Elon, the car approached a roundabout,
unlike the previous versions that had programming that said, "When
entering a roundabout, do this if there are no cars coming, and do this
if there are cars coming, and stop if there is a stop sign, etc",
there was no such programming... the car simply knew how to handle the
roundabout because the multi-billion-dollar master training computer
watched tons of Tesla cars handling all kinds of roundabouts, and then
wrote its own software which was incorporated into the main software
that was sent to Elon's car via an Over-The-Air software update (which
all Teslas can do). And because it no longer has to step through tons of lines
of computer code, the car's self-driving computer can respond even more quickly
than before, which will result in even fewer accidents, and
in even more lives being saved.
This is a game-changing advancement, and it's only possible because
of the billions of dollars Tesla has spent and continues to spend on
"compute hardware", and because of the billions of miles of
driving data from its fleet of 4.5 million cars that Tesla owners are
driving around... all cars send driving data to "Tesla Central"
(not locations and not where you like to shop, just driving data, like
speed, acceleration, deceleration, steering wheel position, angular
momentum, brake and accelerator pedal positions, and video from its
eight cameras that correspond with all that data).
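To make that concrete, here is a toy sketch in Python of what one such anonymized driving-data sample might look like. Every field name here is purely hypothetical, chosen only to mirror the list above (speed, acceleration, steering position, pedal positions, camera video); this is not Tesla's actual telemetry format.

```python
from dataclasses import dataclass, field

@dataclass
class DrivingSample:
    """Hypothetical sketch of one anonymized fleet-telemetry sample.

    Field names are illustrative only. The idea is simply that each sample
    pairs the vehicle-dynamics signals described above with synchronized
    frames from the eight cameras (no locations, no shopping habits).
    """
    timestamp_ms: int
    speed_mps: float               # vehicle speed
    accel_mps2: float              # acceleration (+) / deceleration (-)
    steering_angle_deg: float      # steering wheel position
    yaw_rate_dps: float            # angular momentum about the vertical axis
    brake_pedal_pct: float         # brake pedal position, 0-100
    accel_pedal_pct: float         # accelerator pedal position, 0-100
    camera_frames: dict = field(default_factory=dict)  # camera name -> frame bytes

# One sample: gently braking at ~30 MPH, wheel nearly straight.
sample = DrivingSample(
    timestamp_ms=1_692_988_800_000,
    speed_mps=13.4,
    accel_mps2=-0.8,
    steering_angle_deg=2.5,
    yaw_rate_dps=0.1,
    brake_pedal_pct=12.0,
    accel_pedal_pct=0.0,
)
```

A training pipeline would consume millions of records like this, using the human control signals as the "answer key" for what the video showed.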
And think about this: No other company would do a livestream
of their latest autonomous driving software; they would record the video
and edit out what they didn't want shown. What if Elon's car made an
error in "judgment"? That would get negative press all over
the mainstream news (actually, it did make one mistake, but because
this is unfinished software, the driver is ready to take control at
all times, so there were no consequences. More about the mistake below).
Livestreaming the 45-minute drive shows the confidence Tesla's CEO has
in the new software (and the chief engineer of this new software was
along for the drive).
And there were no pre-planned routes (as would be done by other car
companies to minimize unexpected results); you could see that Elon was
picking random spots to drive to, and one was a busy college campus
area with lots of pedestrians (don't worry, there is a neural network
devoted to VRUs... Vulnerable Road Users, such as adults, children,
dogs, bicyclists, motorcyclists, wheelchairs, baby strollers, and runners/joggers).
And he picked that location based on comments he was getting during
the livestream!
Watching the video of this game-changing technology had to be akin to
watching the first successful flight of the Wright Brothers' airplane.
Imagine a time in the not-too-distant future when all
cars are self-driving, and instead of 40,000 deaths annually due to
vehicular accidents, that number goes down to 400, where 380 are due
to small children and distracted adults walking out into the road from
behind a parked car where even the best driver couldn't stop in time,
and the other 20 deaths per year are caused by computer error. That's
39,600 fewer deaths thanks to self-driving cars.
The
mistake
Elon's red
car (shown below), running an early build of version 12 of the
self-driving software, was stopped at a red light in the lane
shown below, intending to go straight. Both lights were red, and
then the light for the left turn lane changed to green, and Elon's
car started moving! But it was a round green light
and not a green arrow. So the computer, seeing
the red light change to green, mistook the round green light for
its light, and started accelerating (Elon immediately hit the
brakes, so, no, his car did not run a red light
as some headlines would have you believe). If the left-turn light
had been an arrow, as it should be, the mistake
would not have happened. The fix? Teach the software that sometimes
left-turn lights can be round lights (when the Department of
Traffic screws up royally). Below is a photo of the actual lights
as viewed from Elon's car as part of the video streamed from his
phone.
Along for the historic
drive was the chief engineer of this new software technology,
Tesla's head of AI Software Engineering, Ashok Elluswamy, who Tweeted...
And
if you're wondering about the multi-billion-dollar giant super-computer
that watches all the videos and creates the software that will be sent
to the cars, it includes 10,000 of these chips (at $40,000 each)...
And
Tesla is building a data center that will use their own custom-made
chips
(because NVIDIA can't provide Tesla with enough of their chips)
Here's a sneak peek at a tiny portion...
Yep, Tesla is obviously
not just a car company!
November 24,
2023
FSD v12 rolls out to Tesla employees
Version 12 is a highly anticipated update because it transitions FSD Beta
from a rules-based approach to a neural-network-based AI approach. As such,
it will base decisions on human driving data rather than rules set by a
programmer. The straight programming code went from 300,000 lines of code
down to 300 (presumably the remaining code affords extra protection for
pedestrians).
"Tesla in FSD"
= 14,000 cars
And GM's self-driving
Cruise division just shut down after one
of its cars had an altercation with a pedestrian, and after
Cruise lied to authorities about what happened.
How Tesla
haters attempt to get you
to believe that Tesla is in last place
when it comes to Self-Driving software
when they're really in first place
(It's easy, just fabricate a chart)
Look where Tesla
is placed (circled in red). Last place.
Look where Tesla actually is by all metrics (red dot).
Big difference.
(And some of those names shouldn't even be on the chart!)
An
example of a "mistake" that FSD made
The
Tesla in the photo below is in the left turn lane, signaling to make
a left. It doesn't have to wait for a left green arrow; it can go. But
even though there are no oncoming cars, it's not going. It waits until
that white opposing car (which is also making a left) goes, and then
it makes its left. The driver said that he would have
gone sooner and not waited.
So why did FSD make this "mistake"? Its "eyes" are
in the center of the car (behind the rearview mirror, shown in orange),
so the driver's eyes had a better view of possible left lane oncoming
traffic than the computer did, so the computer waited until it had a
clear view. It waited until it was safe. So, not really a "mistake".
I've been in that same position, and I had to move my head as far left
as I could to see if there was oncoming traffic in that left lane. I
know people who did not have a good enough view of that
oncoming left-lane traffic, went anyway, and got hit. This is an example of
"better to be safe than sorry". (But I'm sure if there was
a car behind the Tesla, they would have been blowing their horn.)
How
Teslas are using their computer's neural network processing to provide
a feature...
in this case, "Park Assist"
The
photo below shows the Park Assist screen. The representations
of the surrounding vehicles are being created by the car's computer
"seeing" them with the car's eight cameras, remembering where
the vehicles are in space, and then creating representations of where
they are on the screen so the Tesla can self-park without hitting anything.
We're told that in the future, the renderings of the surrounding vehicles
will look better, but that's just for our benefit. Note:
None of this is being done with the ultrasonic sensors that most cars
have today (those eight little button-shaped things around a car). This
is being done just with vision!
How
the mainstream media misrepresents
news about Tesla to paint them in a negative light
(and there are multiple
motivations for them to do this)
The
screenshot below displays a graphic that is misleading (intentionally).
It would appear that Tesla's AutoPilot has caused 736 accidents and
17 fatalities. But it hasn't. Here's how they accomplish this deceit.
1) They use the term "AutoPilot" instead of the correct term
"Full Self-Driving" (FSD). AutoPilot is part of FSD, and FSD
has been in the testing phase since 2019 and is still in the testing
phase; it has not been released as "finished" as of the publishing
of the "news report" below (and they know this). When the
drivers who are testing this software turn on this software, they are
reminded to keep their hands on the wheel and be ready to apply the
brakes if needed. They are reminded that they are responsible
for the car's behavior. And there have been occasions where the test
drivers were not paying close enough attention to the road, and an accident
resulted. This is not a fault of the software.
2) There have been many reports of Teslas crashing where FSD was engaged,
but when investigated, FSD was found not to be engaged,
and the crash was due to "driver error". So to include those
crashes in the below "crashes" figure is irresponsible. But
that describes lots of media outlets to a T (including The Washington
Post). Consider that there are 1.4 million Teslas on the road in the
U.S., and their rate of crashes is lower than that of any other
car brand (primarily due to AutoPilot which is used on highways). But
does Yahoo Finance mention this in the spirit of "balanced reporting"?
No.
So, to report those figures below and attribute them to "AutoPilot"
is negligent and irresponsible. To date, no one has died as a direct
result of properly using AutoPilot; in fact, the opposite is true.
Yes, people have died driving Teslas, but people have also died driving
every other car model, and fewer people die in Teslas than in other brands
(also not mentioned by Yahoo Finance). And the people who died driving
Teslas where it was their fault were found to be driving recklessly,
negligently, or irresponsibly. And some Tesla drivers died because it
was the other car's fault. So that "fatalities" figure is
an attempt by this media outlet to spread FUD (fear, uncertainty, doubt)
about Tesla vehicles.
Other Tesla Software
News
With the car's
latest regular software update, Tesla implemented another cool
customer-requested feature. Now when you honk your horn, 20 seconds
of the built-in dashcam footage (10 seconds on either side
of the honk) will automatically be saved to the cloud. The eight
cameras are always recording when you're driving, but the system
only records the last 10 minutes of time (after 10 minutes it
overwrites anything that came before those 10 minutes). You can
manually save the last ten minutes by tapping a button on the
screen, but this new feature makes it easier and quicker to save
camera footage.
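The rolling-buffer behavior described above (keep only the most recent footage, then save a short window around an event like a honk) can be sketched like this. It's a simplified illustration under assumed parameters, not Tesla's actual implementation; the class and method names are made up for the example.

```python
from collections import deque

class DashcamBuffer:
    """Toy sketch of a rolling dashcam buffer (hypothetical, not Tesla's code).

    Keeps only the most recent `window_s` seconds of frames; when an event
    (e.g. a horn honk) occurs, saves the clip from `pre_s` seconds before
    the event to `post_s` seconds after it.
    """

    def __init__(self, window_s=600, pre_s=10, post_s=10):
        self.window_s = window_s       # rolling window: last 10 minutes
        self.pre_s = pre_s             # seconds saved before the honk
        self.post_s = post_s           # seconds saved after the honk
        self.frames = deque()          # (timestamp, frame) pairs
        self.pending_event = None      # timestamp of a honk awaiting its clip

    def add_frame(self, t, frame):
        """Record a frame; returns a saved clip once one is complete."""
        self.frames.append((t, frame))
        # Overwrite (drop) anything older than the rolling window.
        while self.frames and self.frames[0][0] < t - self.window_s:
            self.frames.popleft()
        # Once post_s seconds have passed since the honk, save the clip.
        if self.pending_event is not None and t >= self.pending_event + self.post_s:
            clip = self._save_clip(self.pending_event)
            self.pending_event = None
            return clip
        return None

    def honk(self, t):
        """Mark an event; the clip is finalized post_s seconds later."""
        self.pending_event = t

    def _save_clip(self, event_t):
        start, end = event_t - self.pre_s, event_t + self.post_s
        return [f for ts, f in self.frames if start <= ts <= end]
```

With one frame per second, a honk at t=15 yields a clip covering t=5 through t=25 once the buffer has seen ten seconds of post-honk footage.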
I remember when someone Tweeted this suggestion to Elon, and he
Tweeted back, "Good idea!". And, boom, here it is! Try
getting that level of customer service from the
CEOs of GM or Toyota!
And
with the latest software update, Tesla also implemented another
customer-requested feature, and this one is great if more than
one person shares the car. Each person's keyfob is now "keyed"
to a ton of settings that can be part of a "driver profile".
No more having to choose your profile when you get into the car.
Now the car knows who you are and enacts the settings in your
driver profile, automatically.
And look at all the features and settings that can be saved...
More misrepresentation
JerryRigEverything
is a sellout. He's now being paid by Ford to talk up Ford EVs, and he
thinks this should include him trashing Teslas, which he had spoken
highly of prior to his Ford deal. Jerry had this to say about Mercedes
beating Tesla at the self-driving game...
But
there are a ton of restrictions with Mercedes' "Level 3" Drive Pilot
autonomous system that Jerry fails to mention (because if he mentioned
them, you'd likely say, "What good is this system!")
It only works on 0.2% of U.S. roads, and can only be used on highways
The car's speed must be under 40 MPH (only good for stop-and-go
traffic)
You can't be the only car on the road (there must be a car
in front of you)
The weather must be good (no rain, and above 40°F)
It must be daytime
No construction zones (no orange cones or white bollards)
Tesla's
Full Self-Driving system has none of these restrictions.
You can use it on any road in the U.S. Because it is currently a "supervised"
system, you just need to keep your eyes on the road and lightly touch
the steering wheel every 30 seconds (for now). When this system is good
enough to be Level 3 (no constant driver attention needed), in an apples-to-apples
comparison with Mercedes' system, it blows it away. And it's very close
to reaching Level 3 status, and when it does, all 2.5 million Teslas
in the U.S. can have it. But, to be fair, the 9,000 Mercedes EQS EVs
that have been sold can have Drive Pilot now. Yep, context is important,
Jerry.
More
about Tesla