
Next Year Elon Musk Aims To End Employment As We Know It With A Humanoid Robot


A lot of people liken Tesla to Apple, and one thing that makes the comparison valid is the use of big reveal events like last year’s Battery Day, where the $25,000 Tesla was first mooted. The most recent of these, held on Thursday, was AI Day. After announcing a camera-only approach to autonomous driving, Tesla had some explaining to do about how it could still deliver on its promise of Full Self-Driving. But the event hinted at something much more significant than autonomous cars: an end to human labor as we know it.

It wasn’t the slickest presentation by Tesla standards. Musk even suggested the event itself needed some AI to run more smoothly. Unlike Battery Day, AI Day was extremely technical, and the major headlines were harder to discern through most of the slideshows. However, there is an underlying theme to Tesla’s approach that echoes arguments Hubert Dreyfus made 50 years ago, published in his 1972 book “What Computers Can’t Do” and updated 20 years later in “What Computers Still Can’t Do”: you can’t build AI from theoretical models alone; you need to develop it from a lifetime of experience, just as humans do. Tesla isn’t just relying on reality to deliver that, however.

Of course, all AI research for self-driving cars is now based on extensive real-world trials. The advantage Tesla has over competitors isn’t so much technical as one of scale. You won’t solve the problems of AI by driving one car around and inferring what you can from that. But if you have a fleet of hundreds of thousands of cars ingesting information, as Tesla does, you might. The first company to deliver true autonomous driving will be the one that amasses enough driving experience in the shortest space of time.

One of the developments Tesla described is a shift from interpreting each camera on its cars individually and then stitching the results together into a model of the world, to combining the image data first and then building the model from that joint view. This has led to a huge improvement in the accuracy with which features are interpreted. Another problem concerned time, on multiple levels. An object temporarily hidden behind another hasn’t disappeared; you just can’t see it. So the AI needs to make educated guesses about where a hidden object is, and also remember signs it saw earlier in the road that refer to where the car is now, such as which lane is for going straight ahead and which is for turning.
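To make the distinction concrete, here is a minimal sketch in Python of the difference between interpreting each camera separately (“late fusion”) and combining the camera data before building a single model (“early fusion”). The toy feature extractor, thresholds and eight-camera setup are assumptions for illustration only, not Tesla’s actual architecture.

```python
import numpy as np

# Toy illustration only: "features" are slices of random pixels and the
# "world model" is a single boolean decision.
rng = np.random.default_rng(0)
NUM_CAMERAS, FEATURE_DIM = 8, 16

frames = [rng.random((64, 64)) for _ in range(NUM_CAMERAS)]

def features(frame):
    """Stand-in for a learned per-camera feature extractor."""
    return frame.reshape(-1)[:FEATURE_DIM]

# Late fusion: each camera is interpreted on its own, then the separate
# interpretations are stitched into a world model afterwards.
per_camera_detections = [features(f).mean() > 0.5 for f in frames]
late_world_model = any(per_camera_detections)      # crude merge by voting

# Early fusion: raw per-camera features are combined first, and one model
# is built from the joint representation (roughly the shift Tesla described).
joint = np.concatenate([features(f) for f in frames])
early_world_model = joint.mean() > 0.5

print(late_world_model, early_world_model)
```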

Tesla now caches environmental features like these over time to handle such situations. It also demonstrated how its new video AI modules can use memory and inference to very closely match radar’s ability to judge the distance and velocity of objects, tacitly explaining why it thinks it can do without radar in the future.
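As a rough illustration of the idea, the sketch below caches the last observed state of an object and extrapolates it while the object is occluded. The one-dimensional world and constant-velocity assumption are ours for simplicity; Tesla’s video networks are far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    position: float      # metres ahead of the car (toy 1-D world)
    velocity: float      # metres per second, estimated from past frames
    last_seen: float     # timestamp of the last frame the object was visible

class TemporalCache:
    """Remembers objects so they can be placed even when hidden from view."""

    def __init__(self):
        self.objects = {}

    def observe(self, name, position, velocity, t):
        """Update the cache when an object is actually visible."""
        self.objects[name] = TrackedObject(position, velocity, t)

    def estimate(self, name, t):
        """Guess where a (possibly occluded) object is now."""
        obj = self.objects[name]
        return obj.position + obj.velocity * (t - obj.last_seen)

cache = TemporalCache()
cache.observe("lead_car", position=30.0, velocity=-2.0, t=0.0)  # seen at t=0
# Two seconds later the lead car is hidden behind a truck, but the cache can
# still place it roughly 26 m ahead.
print(cache.estimate("lead_car", t=2.0))
```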

But Tesla has another way of training its self-driving AI more quickly: simulation. It isn’t the only company using this method; Oxbotica in the UK employs a similar system, relying on “deepfakes”. Simulation supplements the raw imagery from Tesla’s fleet: the company creates realistic roads and traffic conditions using procedural algorithms, which can then be used to train its systems on more scenarios than the real world has so far provided.
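In spirit, procedural scenario generation looks something like the sketch below. The parameters and ranges are invented for illustration; Tesla’s simulator renders full photorealistic scenes rather than abstract descriptions like these.

```python
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    lanes: int
    curvature: float       # 1/metres, how sharply the road bends
    vehicles: int
    pedestrians: int
    weather: str

def generate_scenario(rng: random.Random) -> Scenario:
    """Produce a randomised traffic scenario from a handful of parameters."""
    return Scenario(
        lanes=rng.randint(1, 5),
        curvature=rng.uniform(-0.01, 0.01),
        vehicles=rng.randint(0, 40),
        pedestrians=rng.randint(0, 15),
        weather=rng.choice(["clear", "rain", "snow", "fog"]),
    )

# Generate a small batch of synthetic scenarios to train against.
rng = random.Random(42)
for scenario in (generate_scenario(rng) for _ in range(3)):
    print(scenario)
```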

Tesla’s switch to a camera-only approach for its self-driving raised quite a few eyebrows when it was announced, but Thursday’s presentation contained a possible answer to why it might just work. Cameras have inherent limitations, much like human eyes: they don’t see well at night, in heavy rain, or when the lens is dirty. However, Tesla’s ability to remember objects over time appears to make this less of an issue, because once an object has been seen, the memory can guess where it is even when it has been obscured. The presentation showed this dealing quite effectively with detecting moving vehicles in a snowstorm.

If Tesla has scale in raw data, it intends to have massive scale for processing its AI training, too. This will be accomplished through Project Dojo. With the help of Taiwanese chipmaking giant TSMC (which also manufactures AMD’s CPUs and GPUs), the company has developed its own 7nm chip called D1, aimed specifically at AI compute, along with a custom datacenter architecture to host it. Each D1 chip can deliver 362 TFLOPS of 16-bit floating-point compute, compared to NVIDIA’s flagship A100 Ampere GPU, which achieves 312 TFLOPS. Tesla intends to combine 3,000 of these chips into a datacenter able to deliver an unprecedented 1.1 EFLOPS of compute dedicated to developing its AI models.
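A quick back-of-the-envelope check shows those numbers hang together: multiplying the quoted per-chip figure by 3,000 chips gives roughly the claimed cluster total.

```python
# Back-of-the-envelope check on the Dojo figures quoted above.
d1_tflops = 362                                   # 16-bit TFLOPS per D1 chip
chips = 3_000                                     # chips in the planned cluster
cluster_eflops = d1_tflops * chips / 1_000_000    # 1 EFLOPS = 1,000,000 TFLOPS
print(f"{cluster_eflops:.2f} EFLOPS")             # ~1.09, i.e. roughly 1.1 EFLOPS
```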

As with the hint of the $25,000 car at last year’s Battery Day, however, Musk saved his most revolutionary announcement for a short section at the end of the other presentations. After explaining that Tesla cars are now effectively semi-sentient robots on wheels, he suggested that this naturally leads the company towards a humanoid robot that would benefit from all this AI development, as well as other Tesla strengths such as sensors and battery technology. Musk promised the first prototype in 2022.

Musk also mused on the implications of this Tesla Bot. He claimed it would be friendly and designed so that humans could overpower it, or at least run away from it, since its top speed would be 5mph. But he also discussed how it could replace human labor and potentially end the need for people to work for a living unless they wanted to. He even mentioned universal basic income in passing, under which everyone receives a livable amount of money as standard and only needs to work for luxuries above subsistence level. So not only does Musk want to move everyone over to self-driving electric cars and colonize Mars; he also wants to make the dream of an AI robot helper for every household a reality.

You can watch the whole AI Day presentation here.



