Issue 09
Getting Closer to FSD
(An EV driving in Full Self-Driving mode)
January 2019
Tesla is leaps and bounds ahead of any competition in this field
(like Google's Waymo and GM's Cruise), especially in 2023, as you'll see below.
"Tesla's
Full Self-Driving feature uses vision-based computer neural networks
to perceive and understand the world, just like humans do. But its
"eyes" (cameras) are looking in six different directions
simultaneously, which is five more than humans can do.
Via our unique fleet learning approach, we are able to collect anonymized
data from our millions of vehicles on the road, meaning the
neural networks have learned from many thousands of (good) drivers,
which is more driving scenarios than the average human driver could
ever learn from (including the unusual and weird ones)."
Keep in mind, what
you see on the screen is what the car's computer "sees",
and if it sees a stop sign, it will stop, and it will treat a red light
just as you would.
This is thanks to Tesla's industry-leading software and hardware.
These cars are traveling around driven by their onboard computers.
Because the software is still in the beta testing phase, the 14,000+
beta testers must
keep their hands on the wheel and their foot covering the brake pedal
just in case.
(UPDATE: As of 2023, no injuries or fatalities. Just one dented traffic
cone.)
The screen you see
in the photos below is in the
middle of the dashboard on Tesla vehicles...
A better look at
the traffic light visualization
(This shot was from someone driving at night;
you can tell because the screen background is dark
and the icon for the car's headlights is lit.)
Whatever the car
"sees" it will avoid.
But if it must hit a trashcan or a person, it "knows" to hit
the trashcan.
It "sees"
all the cones!
This pic shows you
that the stop signs are not "GPS mapped";
they are actually being "seen" and recognized by the car's
computer.
BTW, a half second later, the other pedestrian appeared on the screen
too.
Here in a UK Tesla,
the car can now hug the correct side
of this two-way road when lane markings completely disappear.
People are reporting that this has greatly improved because the car
can now see and comprehend the road the way we do.
And it will dodge puddles unless it can't see far enough ahead to know
it can do so safely.
Note: This person was using AutoPilot on secondary roads
where it's not supposed to be used, but this does give the
fleet neural network a lot of data needed for Level 4 FSD
(which is when you can go to sleep in the car).
And the computer
can now "see" lane markings too!
But it doesn't just recognize them, it knows what they mean.
The cars can now
recognize stop signs, traffic lights, and other road markings and objects,
and this is being demonstrated to Tesla owners through the "FSD
Visualization Preview Mode" as seen above. This is not yet
full Level 4 FSD... it's still Level 2 (driver assistance features),
so there still needs to be someone in the driver's seat paying attention
to the road. With Level 4, no human driver is needed.
There is now a button
you can tap when the car doesn't recognize something that
it should, and this will send "Tesla Central" the video footage,
GPS location, and other
data to help improve FSD. With the "fleet learning" of hundreds
of thousands of
cars on the road, the fleet's "neural net" will learn quickly,
and this info is then
shared with all Teslas via Over-The-Air updates.
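The report button described above essentially bundles up a snapshot of what the car saw and where. As a rough illustration only (these field names are my assumptions, not Tesla's actual telemetry schema), such a snapshot might look like this:

```python
# Hypothetical sketch of what a "report" button might bundle and upload.
# All field names here are illustrative assumptions, not Tesla's real schema.
from dataclasses import dataclass, field

@dataclass
class FeedbackSnapshot:
    timestamp_utc: float                      # when the driver tapped the button
    gps_lat: float
    gps_lon: float
    speed_mph: float
    camera_clip_ids: list = field(default_factory=list)  # short clips per camera

def build_snapshot(timestamp, lat, lon, speed, cameras):
    """Bundle the data the fleet needs to learn from a missed detection."""
    return FeedbackSnapshot(timestamp, lat, lon, speed, list(cameras))

snap = build_snapshot(1700000000.0, 37.42, -122.08, 35.0,
                      ["front", "left", "right"])
print(snap.camera_clip_ids)  # ['front', 'left', 'right']
```

The key design point is that one tap captures video, location, and vehicle state together, so engineers can replay exactly what the computer failed to recognize.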
[End of the January
2019 portion]
UPDATE!
(If you're not familiar with Tesla's self-driving
technology, start at the top of this page)
August 25, 2023
An auspicious date
Today,
an "alpha" release of version 12 of Tesla's self-driving software
was demonstrated via a livestream of Tesla's CEO Elon Musk having this
software drive his car around Palo Alto, California. Only a handful
of people are testing this new version because version 12 is a huge
change from all previous versions. It is "end-to-end" AI neural
networks, and no longer contains hundreds of thousands of lines of hardcoded
instructions written by programmers. The main AI training computer
teaches itself how to drive by watching tons of videos
of Tesla drivers driving (the good drivers), much like how you
taught your teenager to drive.
So when, during this drive by Elon, the car approached a roundabout,
there was no programming that said, "When entering a roundabout, do this
if there are no cars coming, do this if there are cars coming, and stop
if there is a stop sign", as there was in all previous versions.
The car simply knew how to handle the
roundabout because the multi-billion-dollar master training computer
had watched tons of Tesla cars handling all kinds of roundabouts, and then
wrote its own software, which was incorporated into the main software
sent to Elon's car via an Over-The-Air software update (which
all Teslas can do). And because it no longer has to step through tons of
lines of computer code, the car's self-driving computer can respond even
quicker than before, which will result in even fewer accidents, and
in even more lives being saved.
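The shift described above can be caricatured in a few lines of code. This is purely illustrative (nothing here is Tesla's actual code): the old approach spells out every roundabout case by hand, while the new approach hands the whole decision to one learned function.

```python
# Illustrative contrast only -- NOT Tesla's code.
# v11-style: every roundabout case hand-written by a programmer.
def roundabout_rules(cars_coming: bool, stop_sign: bool) -> str:
    if stop_sign:
        return "stop"
    if cars_coming:
        return "yield"
    return "proceed"

# Stand-in for a neural network trained on fleet video; a real one
# would map raw camera frames to steering/pedal outputs.
def trained_network(camera_frames):
    return "proceed"

# v12-style: one learned function replaces the rule tree entirely.
def end_to_end_policy(camera_frames) -> str:
    return trained_network(camera_frames)

print(roundabout_rules(cars_coming=True, stop_sign=False))  # yield
```

The point of the contrast: in the rule-based version, every situation a programmer didn't anticipate is a gap; in the end-to-end version, coverage comes from the variety of driving video the network was trained on.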
This is a game-changing advancement, and it's only possible because
of the billions of dollars Tesla has spent and continues to spend on
"compute hardware", and because of the billions of miles of
driving data from its fleet of 4.5 million cars that Tesla owners are
driving around... all cars send driving data to "Tesla Central"
(not locations and not where you like to shop, just driving data, like
speed, acceleration, deceleration, steering wheel position, angular
momentum, brake and accelerator pedal positions, and video from its
seven cameras that correspond with all that data).
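The list of signals above (speed, acceleration, steering, pedals, plus synchronized video from all seven cameras) amounts to one time-stamped record per moment of driving. A minimal sketch of such a record, using an assumed schema for illustration:

```python
# Sketch of one per-frame driving record built from the signals the text
# lists. This is an assumed layout for illustration, not Tesla's format.
from dataclasses import dataclass

@dataclass
class DrivingFrame:
    speed_mph: float
    acceleration: float         # longitudinal, m/s^2
    steering_angle_deg: float   # negative = turning left (assumed convention)
    brake_pedal: float          # 0.0 (released) to 1.0 (floored)
    accel_pedal: float
    camera_frames: dict         # camera name -> video frame reference

frame = DrivingFrame(
    speed_mph=42.0, acceleration=0.3, steering_angle_deg=-1.5,
    brake_pedal=0.0, accel_pedal=0.2,
    camera_frames={f"cam{i}": None for i in range(7)},  # seven cameras
)
print(len(frame.camera_frames))  # 7
```

Keeping the control signals and the seven camera views tied to the same timestamp is what lets a training computer learn "when a good driver saw this, they did that".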
And think about this: No other company would do a livestream
of their latest autonomous driving software; they would record the video
and edit out what they didn't want shown. What if Elon's car made an
error in "judgment"? That would get negative press all over
the mainstream news (actually, it did make one mistake, but because
this is unfinished software, the driver is ready to take control at
all times, so there were no consequences. More about the mistake below).
Livestreaming the 45-minute drive shows the confidence Tesla's CEO has
in the new software (and the chief engineer of this new software was
along for the drive).
And there were no pre-planned routes (as would be done by other car
companies to minimize unexpected results); you could see that Elon was
picking random spots to drive to, and one was a busy college campus
area with lots of pedestrians (don't worry, there is a neural network
devoted to VRUs... Vulnerable Road Users, such as adults, children,
dogs, bicyclists, motorcyclists, wheelchairs, baby strollers, and runners/joggers).
And he picked that location based on comments he was getting during
the livestream!
Watching the video of this game-changing technology had to be akin to
watching the first successful flight of the Wright Brothers' airplane.
Imagine a time in the not-too-distant future when all
cars are self-driving, and instead of 40,000 deaths annually due to
vehicular accidents, that number goes down to 400, where 380 are due
to small children and distracted adults walking out into the road from
behind a parked car where even the best driver couldn't stop in time,
and the other 20 deaths per year are caused by computer error. That's
39,600 fewer deaths thanks to self-driving cars.
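The projection above is simple arithmetic, and it can be checked directly:

```python
# Checking the projected numbers from the paragraph above:
# 40,000 annual deaths today; 400 remaining in the projection
# (380 unavoidable pedestrian cases + 20 computer errors).
deaths_today = 40_000
unavoidable = 380
computer_error = 20

remaining = unavoidable + computer_error
prevented = deaths_today - remaining
print(remaining, prevented)  # 400 39600
```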
The
mistake
Elon's red
car (shown below), running an early build of version 12 of the
self-driving software, was stopped at a red light in the lane
shown below, intending to go straight. Both lights were red, and
then the light for the left turn lane changed to green, and Elon's
car started moving! But it was a round green light
and not a green arrow. So the computer, seeing
the red light change to green, mistook the round green light for
its light, and started accelerating (Elon immediately hit the
brakes, so, no, his car did not run a red light
as some headlines would have you believe). If the left turn light
had been an arrow, as it should be, the mistake
would not have happened. The fix? Teach the software that sometimes
left turn lights can be a round light (when the Department of
Traffic screws up royally). Below is a photo of the actual lights
as viewed from Elon's car as part of the video streamed from his
phone.
Along for the historic
drive was the chief engineer of this new software technology,
Tesla's head of AI Software Engineering, Ashok Elluswamy, who Tweeted...
And
if you're wondering about the multi-billion-dollar giant supercomputer
that watches all the videos and creates the software that will be sent
to the cars, it includes 10,000 of these chips (at $40,000 each)...
And
Tesla is building a data center that will use their own custom-made
chips
(because NVIDIA can't provide Tesla with enough of theirs).
Here's a sneak peek at a tiny portion...
Yep, Tesla is obviously
not just a car company!
November 24,
2023
FSD v12 rolls out to Tesla employees
Version 12 is a
highly-anticipated update because it transitions FSD Beta from a rules-based
to a neural network-path-based AI approach. As such, it will base decisions
on human driving data rather than rules set by a programmer. The straight
programming code went from 300,000 lines of code down to 300 (presumably
the remaining code affords extra protection for pedestrians).
"Tesla in FSD"
= 14,000 cars
And GM's self-driving
Cruise division just shut down after one
of its cars struck and dragged a pedestrian, and after
Cruise lied to authorities about what happened.
How Tesla
haters attempt to get you
to believe that Tesla is in last place
when it comes to Self-Driving software
when they're really in first place
(It's easy, just fabricate a chart)
Look where Tesla
is placed (circled in red). Last place.
Look where Tesla actually is by all metrics (red dot).
Big difference.
(And some of those names shouldn't even be on the chart!)
An
example of a "mistake" that FSD made
The
Tesla in the photo below is in the left turn lane, signaling to make
a left. It doesn't have to wait for a left green arrow; it can go. But
even though there are no oncoming cars, it's not going. It waits until
that white opposing car (which is also making a left) goes, and then
it makes its left. The driver said that he would have
gone sooner and not waited.
So why did FSD make this "mistake"? Its "eyes" are
in the center of the car (behind the rearview mirror, shown in orange),
so the driver's eyes had a better view of possible left lane oncoming
traffic than the computer did, so the computer waited until it had a
clear view. It waited until it was safe. So, not really a "mistake".
I've been in that same position, and I had to move my head as far left
as I could to see if there was oncoming traffic in that left lane. I
know people who did not have a good enough view of that
left lane oncoming traffic, went, and got hit. This is an example of
"better to be safe than sorry". (But I'm sure if there was
a car behind the Tesla, they would have been blowing their horn.)
How
Teslas are using their computer's neural network processing to provide
a feature...
in this case, "Park Assist"
The
photo below shows the Park Assist screen. The representations
of the surrounding vehicles are being created by the car's computer
"seeing" them with the car's seven cameras, remembering where
the vehicles are in space, and then drawing them on the screen
so the Tesla can self-park without hitting anything.
We're told that in the future, the renderings of the surrounding vehicles
will look better, but that's just for our benefit. Note:
None of this is being done with the ultrasonic sensors that most cars
have today (those eight little button-shaped things around a car). This
is being done with vision alone!
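The "remembering where the vehicles are in space" idea can be sketched very simply: once a camera has seen an obstacle, its position is kept in a small car-centered map, and as the car moves, those remembered positions are shifted the opposite way. The names and structure below are my assumptions, for illustration only.

```python
# Minimal sketch (assumed design, not Tesla's) of vision-based spatial
# memory: remembered obstacle positions persist even after an object
# leaves the cameras' field of view, shifted as the car moves.

class SpatialMemory:
    def __init__(self):
        self.obstacles = {}  # obstacle id -> (x_m, y_m) relative to the car

    def observe(self, obstacle_id, x_m, y_m):
        """Record where a camera last saw an obstacle."""
        self.obstacles[obstacle_id] = (x_m, y_m)

    def shift_for_ego_motion(self, dx_m, dy_m):
        """When the car moves, remembered positions shift the opposite way."""
        self.obstacles = {k: (x - dx_m, y - dy_m)
                          for k, (x, y) in self.obstacles.items()}

mem = SpatialMemory()
mem.observe("car_left", 2.0, 1.0)   # seen 2 m ahead, 1 m to the side
mem.shift_for_ego_motion(0.5, 0.0)  # the car crept forward half a meter
print(mem.obstacles["car_left"])    # (1.5, 1.0)
```

This is why the car can avoid a vehicle that is now beside its bumper and out of every camera's view: it still "knows" where it was.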
How
the mainstream media misrepresents
news about Tesla to paint them in a negative light
(and there are multiple
motivations for them to do this)
The
screenshot below displays a graphic that is intentionally misleading.
It would appear that Tesla's AutoPilot has caused 736 accidents and
17 fatalities. But it hasn't. Here's how they accomplish this deceit.
1) They use the term "AutoPilot" instead of the correct term
"Full Self-Driving" (FSD). AutoPilot is part of FSD, and FSD
has been in the testing phase since 2019 and is still in the testing
phase; it has not been released as "finished" as of the publishing
of the "news report" below (and they know this). When the
drivers who are testing this software turn on this software, they are
reminded to keep their hands on the wheel and be ready to apply the
brakes if needed. They are reminded that they are responsible
for the car's behavior. And there have been occasions where the test
drivers were not paying close enough attention to the road, and an accident
resulted. This is not a fault of the software.
2) There have been many reports of Teslas crashing where FSD was engaged,
but when investigated, FSD was found not to be engaged,
and the crash was due to "driver error". So to include those
crashes in the below "crashes" figure is irresponsible. But
that describes lots of media outlets to a T (including The Washington
Post). Consider that there are 1.4 million Teslas on the road in the
U.S., and their crash rate is lower than that of any other
car brand (primarily due to AutoPilot, which is used on highways). But
does Yahoo Finance mention this in the spirit of "balanced reporting"?
No.
So, to report those figures below and attribute them to "AutoPilot"
is negligent and irresponsible. To date, no one has died as a direct
result of properly using AutoPilot; in fact, the opposite is true.
Yes, people have died driving Teslas, but people have also died driving
every other car model, and fewer people die in Teslas than in other brands
(also not mentioned by Yahoo Finance). And the people who died driving
Teslas where it was their fault were found to be driving recklessly,
negligently, or irresponsibly. And some Tesla drivers died because it
was the other car's fault. So that "fatalities" figure is
an attempt by this media outlet to spread FUD (fear, uncertainty, doubt)
about Tesla vehicles.
Advantages
of a computer doing the driving
instead of a human
* It
can see in six different directions at the same time; a human can only
see in one direction at a time, meaning a human must take their eyes
off the road to look at the side-view mirrors, the rearview mirror,
or their blind spots. A computer is always looking forward.
* A computer can't be distracted
* A computer can't get tired or fall asleep at the wheel
* A computer can't drive drunk
* A computer can't get into a heated argument with the passenger and
take its eyes off the road for too long
* A computer can't do "road rage" (which is one reason why
I want all the cars around me to be driven by a computer)
* A computer can be programmed to be courteous to pedestrians and other
cars
* A computer can have a faster reaction time than a human can (could
be the difference between life and death for a distracted pedestrian)
* A computer can't be scared and freeze up for a moment before reacting,
like a human can (like when a child suddenly appears in the road from
between two parked cars)
* A computer can't have a "senior moment" (which is when seniors
start thinking about turning in their driver's license)
* When driving, a computer can think only about what it's doing and nothing
else; its full attention is on driving, unlike a human (a computer can't
daydream)
* A computer can be programmed to be a great driver from its very first
day on the road
* If the road's lane lines are too faint to see, it's a problem for
us but not for a computer. It simply sizes up the "drivable space",
calculates how many lanes should be there, and drives
accordingly! On a Tesla, you even get to see the lane lines represented
on the screen even though they can't be seen on the road! Humans can't
do this.
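At its simplest, "calculating how many lanes should be there" is a division of the measured drivable width by a typical lane width. A toy version (the lane-width constant is an illustrative assumption, and a real system would use learned estimates, not a constant):

```python
# Toy version of inferring lane count from drivable width when the paint
# is too faint to see. The lane width constant is an assumption.

LANE_WIDTH_M = 3.6  # roughly a typical US lane width

def infer_lanes(drivable_width_m: float) -> int:
    """Estimate how many lanes fit in the measured drivable space."""
    return max(1, round(drivable_width_m / LANE_WIDTH_M))

print(infer_lanes(7.4))   # 2
print(infer_lanes(10.9))  # 3
```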
Yes, computers can "crash", and if it's a driving computer,
this could result in an actual crash. But the reasons for computer crashes
are a known quantity, and a crash-proof computer can be designed, like
the computers in use on spacecraft. And that's how Tesla designs its
driving computer (can't say the same for other car makers). But even
a crash-proof computer could malfunction, and people could die because
of this. But if the approximately 44,000 vehicular deaths per year in
the U.S. were reduced to 300 because cars became computer driven,
wouldn't those 43,700 prevented deaths be worth the transition? Of course
they would! And over time, those 300 deaths would get closer and closer to zero.
"It's getting more difficult to improve our Full Self-Driving software"
- Elon Musk
Why? Because
it's so good now that Tesla's professional test drivers are getting
bored: there's an "intervention" only about every
10,000 miles. That's a long time between interventions, so it's
getting difficult to know which build of the latest version of
the software is best because they're all extremely good. This
is a good problem to have. By the way, an "intervention"
is when the safety driver must intervene and take over the steering,
or press the brake or accelerator pedal. And interventions come
in two flavors: a "safety intervention" and a "comfort
intervention". Safety interventions could be serious, but
with the latest version, there are no more safety interventions...
they're all comfort interventions, where the car chose the wrong
lane and, if not immediately corrected, will have to reroute
itself to get back on course to its destination. Or the car isn't
going fast enough, so its speed must be manually increased so
as not to piss off those driving behind it. These issues are
corrected as they come up, and they're becoming less and less
frequent as time goes by.
"But
there are already other driverless ride-hail cars on the road,
so Tesla has lost the race."
Nope. Just
because Tesla wasn't "first to market" doesn't mean
they lost the race. Not when you consider the following...
* When you're
a passenger in a Waymo driverless ride-hail vehicle (in certain
areas of California), there is a human behind the
wheel; you just don't see him. He is sitting in an office somewhere
with multiple screens in front of him, and a steering wheel and
brake and accelerator controls. And these remote monitors are
intervening every now and then, but the passenger has no idea
(and the press doesn't mention it because Waymo is owned by the
very powerful Google). And remember that those humans cost the
company money (salary).
* Waymo driverless vehicles can't drive around in an area until
Waymo pre-maps the entire area so that info can be programmed
into the cars. The Tesla self-driving system doesn't require this.
You can plunk down one of their self-driving cars anywhere, and
it will drive fine, no pre-mapping necessary.
* Waymo (and many others) use many different types of sensors
on the car (cameras, LiDARs, Radar, ultrasonic detectors), and
this can (and does) result in "sensor conflict" where
one type of sensor says one thing, but another type of sensor
says the opposite. This is why, if we can get a vision-only system
to work (just cameras), it is better/safer. And Tesla has done
that. Plus, if you want the self-driving feature on your
car (because it can drive safer than you can), you can't have
that if it's a system that uses all those different sensors (too
expensive). But a vision-only system just has seven inexpensive
cameras, and millions of Teslas have been manufactured from Day
One with those cameras as standard equipment, along with the self-driving
computer! (Yep, Tesla spent money on this equipment many years
before its self-driving software was ready; talk about forward-thinking;
and no, this cost was not simply passed on to the consumer).
* Ride-hail services exist to make money... i.e., to be profitable.
All the other ride-hail services are not making profit yet (and
maybe never will when Tesla's ride-hail service opens for business).
Why is this? As mentioned above, other driverless cars have tons
of very expensive sensors on them, so it's difficult to scale
their fleet. But in the U.S., there are 2 million Teslas that have
the self-driving equipment already installed, and they are ready
to be part of Tesla's ride-hail service at the push of a button
wherever there are Teslas (and there are tons of them in California).
And when that happens, goodbye to all the other ride-hail services
(because Tesla's will be less expensive to use due to economies of scale).
So, when the software is "ready for primetime", it will
be uploaded to all the Teslas. And that's when it's game-over.
So, who really
won the race?
Other Tesla Software
News
With the car's
latest regular software update, Tesla implemented another cool
customer-requested feature. Now when you honk your horn, 20 seconds
of the built-in dashcam footage (10 seconds on either side
of the honk) will automatically be saved to the cloud. The seven
cameras are always recording while you're driving, but the system
only keeps the last 10 minutes (anything older than that is
overwritten). You can
manually save the last ten minutes by tapping a button on the
screen, but this new feature makes it easier and quicker to save
camera footage.
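The "always recording, keep only the last 10 minutes" behavior is a classic ring buffer, and the honk feature just freezes a window around the event. A minimal sketch (the durations come from the text; the implementation itself is an illustration, not Tesla's code):

```python
# Sketch of a rolling dashcam: a ring buffer holds the last 10 minutes;
# a honk saves the 10 seconds before and after the event.
from collections import deque

BUFFER_SECONDS = 600   # "last 10 minutes"
FPS = 1                # one frame per second keeps the sketch small

class Dashcam:
    def __init__(self):
        # old frames automatically fall off the front when full
        self.buffer = deque(maxlen=BUFFER_SECONDS * FPS)

    def record(self, t, frame):
        self.buffer.append((t, frame))

    def save_around(self, honk_t, before=10, after=10):
        """Return the frames within [honk_t - before, honk_t + after]."""
        return [(t, f) for t, f in self.buffer
                if honk_t - before <= t <= honk_t + after]

cam = Dashcam()
for t in range(100):                 # 100 seconds of driving
    cam.record(t, f"frame{t}")
clip = cam.save_around(honk_t=50)
print(len(clip))  # 21 one-second frames: 10 before, the honk, 10 after
```

The ring buffer is why nothing is ever "missed": the footage from before the honk already exists in memory and only needs to be copied out before it is overwritten.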
I remember when someone Tweeted this suggestion to Elon, and he
Tweeted back, "Good idea!". And, boom, here it is! Try
getting that level of customer service from the
CEOs of GM or Toyota!
And
with the latest software update, Tesla also implemented another
customer-requested feature, and this one is great if more than
one person shares the car. Each person's keyfob is now "keyed"
to a ton of settings that can be part of a "driver profile".
No more having to choose your profile when you get into the car.
Now the car knows who you are and enacts the settings in your
driver profile, automatically.
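Conceptually, this feature is a lookup from the keyfob's identity to a saved bundle of settings. A tiny sketch (the fob IDs and setting names are made-up illustrations, not Tesla's actual profile format):

```python
# Sketch of keyfob-to-profile mapping: each fob id selects a saved
# bundle of settings. All names here are illustrative assumptions.

profiles = {
    "fob_A1": {"driver": "Alex", "seat_position": 4, "mirror_tilt": -2},
    "fob_B2": {"driver": "Sam",  "seat_position": 7, "mirror_tilt": 1},
}

def apply_profile(fob_id: str) -> dict:
    """Look up the profile keyed to the fob that unlocked the car."""
    return profiles.get(fob_id, {"driver": "guest"})  # fall back to defaults

print(apply_profile("fob_B2")["driver"])  # Sam
```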
And look at all the features and settings that can be saved...
More misrepresentation
JerryRigEverything
is a sellout. He's now being paid by Ford to talk up Ford EVs, and he
thinks this should include him trashing Teslas, which he had spoken
highly of prior to his Ford deal. Jerry had this to say about Mercedes
beating Tesla at the self-driving game...
But
there are a ton of restrictions on Mercedes' "Level 3" Drive Pilot
autonomous system that Jerry fails to mention (because if he mentioned
them, you'd likely say, "What good is this system?!")
* It only works on 0.2% of U.S. roads, and can only be used on highways
* The car's speed must be under 40 MPH (only good for stop-and-go traffic)
* You can't be the only car on the road (there must be a car in front of you)
* The weather must be good (no rain, and above 40°F)
* It must be daytime
* No construction zones (no orange cones or white bollards)
Tesla's
Full Self-Driving system has none of these restrictions.
You can use it on any road in the U.S. Because it is currently a "supervised"
system, you just need to keep your eyes on the road (the computer can
tell when you aren't). When this system is good enough to be Level 3
(no constant driver attention needed), in an apples-to-apples comparison
with Mercedes' system, it blows it away. And it's very close to reaching
Level 3 status, and when it does, all 2.5 million Teslas in the U.S.
can have it. But yes, the 9,000 Mercedes EQS EVs that have been sold
in the U.S. can have Drive Pilot now. Yep, context is important, Jerry.
Tesla's
FSD computer can read the words painted on the road
Why
is this ability important? Look at the photos below. The UPS truck was
hiding the STOP sign from view (a human driver's view and the computer's
view). You can tell the computer wasn't seeing the STOP sign because
there wasn't a STOP sign being displayed on the screen. But luckily
the word "STOP" was painted on the road, which is why the
computer stopped the car at the intersection (the word "STOP"
is circled in yellow, and it appears on the screen but it's very difficult
to see it in the photo). Would you have stopped at this
intersection? There have been lots of accidents at intersections for
this very reason: the STOP sign being hidden by a large parked truck
(which is why the UPS driver should be fired).
Notice, the car is
doing 20 MPH, no feet on pedals or hands on the wheel
And
how do you know the Tesla slowed down for the stop? Easy.
See the dark blue chevrons (arrows) within the car's Intended Route
graphic (the blue line emanating from the car)? Those chevrons mean
the car is slowing. If the chevrons were facing the other way
(forward), that would mean the car was accelerating.
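The chevron rule described in the caption is just the sign of the speed change. A tiny sketch of that logic (an illustration of the display rule, not Tesla's rendering code):

```python
# Sketch of the chevron rule: chevrons point backward when the car is
# slowing, forward when it is speeding up. Illustrative only.

def chevron_direction(prev_speed: float, curr_speed: float) -> str:
    if curr_speed < prev_speed:
        return "backward"   # dark blue chevrons: braking
    if curr_speed > prev_speed:
        return "forward"    # accelerating
    return "none"           # holding steady

print(chevron_direction(25.0, 20.0))  # backward
```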
More
about Tesla