Here's the video of a Model 3 doing FSD
https://www.youtube.com/watch?v=tlThdr3O5Qo
If you want the full details of either the hardware or the software, I'd recommend watching the full presentation. Musk had the heads of those teams give extremely detailed and technical presentations:
https://youtu.be/Ucp0TTmvqOE?t=4308
The hardware presentation starts at about 1 hour 12 minutes in.
Nvidia response:
In unveiling the specs of his new self-driving car computer at this week’s Tesla Autonomy Day investor event, Elon Musk made several things very clear to the world.
First, Tesla is raising the bar for all other carmakers.
Second, Tesla’s self-driving cars will be powered by a computer based on two of its new AI chips, each equipped with a CPU, GPU, and deep-learning accelerators. The computer delivers 144 trillion operations per second (TOPS), enabling it to collect data from a range of surround cameras, radars and ultrasonics and power deep neural network algorithms.
Third, Tesla is working on a next-generation chip, which says 144 TOPS isn’t enough.
At NVIDIA, we have long believed in the vision Tesla reiterated: self-driving cars require computers with extraordinary capabilities.
Which is exactly why we designed and built the NVIDIA Xavier SoC several years ago. The Xavier processor features a programmable CPU, GPU and deep learning accelerators, delivering 30 TOPS. We built a computer called DRIVE AGX Pegasus based on a two-chip solution, pairing Xavier with a powerful GPU to deliver 160 TOPS, and then put two sets of them on the computer, to deliver a total of 320 TOPS.
And as we announced a year ago, we’re not sitting still. Our next-generation processor Orin is coming.
That’s why NVIDIA is the standard Musk compares Tesla to—we’re the only other company framing this problem in terms of trillions of operations per second, or TOPS.
https://blogs.nvidia.com/blog/2019/04/23/tesla-self-driving/
What they aren't mentioning is that DRIVE AGX Pegasus has a 500-watt TDP, while the Tesla FSD computer is around 72 watts.
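For a rough sense of the efficiency gap, here's the back-of-the-envelope maths using the headline figures quoted above (these are TDP and peak-TOPS numbers, so real-world efficiency will differ):

```python
# Rough TOPS-per-watt comparison using the headline figures quoted above.
systems = {
    "NVIDIA DRIVE AGX Pegasus": {"tops": 320, "tdp_watts": 500},
    "Tesla FSD computer":       {"tops": 144, "tdp_watts": 72},
}

for name, spec in systems.items():
    efficiency = spec["tops"] / spec["tdp_watts"]
    print(f"{name}: {spec['tops']} TOPS / {spec['tdp_watts']} W "
          f"= {efficiency:.2f} TOPS per watt")

# Pegasus: 320 / 500 = 0.64 TOPS/W
# Tesla:   144 / 72  = 2.00 TOPS/W
```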
How does it handle rain?
If you watch the neural network part of the presentation, the presenter explains how they handle these kinds of situations; someone specifically asked him about snow. They have the ability to ask the whole Tesla fleet to send back an image whenever a car encounters something that looks like what they want, and they can then annotate those images and train the neural net against them so it can recognise those objects.
They are using that same approach to predict cut-ins, so the car can anticipate when another vehicle is about to cut in front of you, regardless of whether its indicator is on.
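Tesla hasn't published how that fleet campaign mechanism actually works, so purely to illustrate the loop described above (push a trigger to the fleet, collect matching frames, annotate, retrain), here's a sketch; every name in it is hypothetical:

```python
# Hypothetical sketch of the fleet data-collection loop described in the
# presentation. None of these names are real Tesla APIs.
from dataclasses import dataclass

@dataclass
class Trigger:
    """Describes the scenario we want the fleet to capture."""
    name: str
    confidence_threshold: float  # how sure the on-car detector must be

def deploy_trigger_to_fleet(trigger: Trigger) -> list:
    """Stand-in for an over-the-air campaign asking cars to upload matching frames."""
    return [f"{trigger.name}_frame_{i}.jpg" for i in range(3)]

def annotate(frames: list) -> list:
    """Stand-in for the human/automatic labelling step."""
    return [(frame, {"label": "snow_covered_lane_line"}) for frame in frames]

def retrain(labelled_examples: list) -> None:
    """Stand-in for adding new examples to the training set and retraining."""
    print(f"Retraining network with {len(labelled_examples)} new labelled frames")

# The loop: define a rare scenario, harvest examples, label them, retrain.
snow_trigger = Trigger(name="snow_on_road", confidence_threshold=0.7)
frames = deploy_trigger_to_fleet(snow_trigger)
examples = annotate(frames)
retrain(examples)
```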
The same way you handle rain.
Completely disregarding said rain and flying along like a madman?
The current build probably does just that. It sounds like the control side, deciding how to respond to what the car sees, is still mostly hand-written code, and they are only just starting to move pieces of vehicle control into the neural net. The plan is to shift most of it into the net so they don't have to write code by hand for everything the vision side might encounter.
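As a rough illustration of what that hybrid looks like (hand-written rules consuming a learned prediction, such as the cut-in probability mentioned earlier), here's a hypothetical sketch; none of this is Tesla's actual code:

```python
# Hypothetical hybrid controller: hand-written rules for most situations,
# with a learned model consulted for specific predictions.

def predict_cut_in_probability(camera_frames) -> float:
    """Stand-in for the neural-net output predicting an adjacent car cutting in."""
    return 0.85  # pretend the net is fairly sure the car ahead-left will cut in

def choose_speed(current_speed_mph: float, camera_frames) -> float:
    """Hand-written control logic that consumes the net's prediction."""
    cut_in_prob = predict_cut_in_probability(camera_frames)
    if cut_in_prob > 0.8:
        # Manually written rule: back off early instead of braking hard later.
        return current_speed_mph - 5.0
    return current_speed_mph

print(choose_speed(65.0, camera_frames=None))  # -> 60.0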
So it gathers data via cameras, right? How does the system respond to the images from several of those cameras being distorted by rainwater on the lens?
I doubt it's very affected by rainwater.
The cameras are recessed behind protective glass plates, and the apertures are probably large enough that random rain droplets won't do much beyond reducing contrast and introducing minor optical artifacts.
It must be able to recognise rain on the cameras, since Tesla controls the auto wipers with the vision system. The repeater cameras are probably the ones most likely to have rain issues, I guess.
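The auto-wiper idea boils down to something like the sketch below: a vision network scores rain intensity and simple logic maps that to a wiper speed. The names and thresholds are made up, just to show the shape of it:

```python
# Minimal sketch of vision-controlled wipers: a network scores rain intensity
# from the windshield camera, and a simple mapping picks a wiper speed.
# The network call is a stand-in; all names and thresholds are hypothetical.

def rain_intensity_from_camera(frame) -> float:
    """Stand-in for a neural net that outputs 0.0 (dry) .. 1.0 (downpour)."""
    return 0.4

def wiper_speed(frame) -> str:
    intensity = rain_intensity_from_camera(frame)
    if intensity < 0.1:
        return "off"
    if intensity < 0.5:
        return "intermittent"
    if intensity < 0.8:
        return "low"
    return "high"

print(wiper_speed(frame=None))  # -> "intermittent"
```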
Nice. My computer can't drive itself, let alone me drive it.