“Autonomous vehicles and autonomous driving brought with them a huge amount of publicity. Everyone thought that, by 2020 or 2021, we would see a significant number of autonomous vehicles, autonomous services, and autonomous robots. It did not happen. I think there is agreement that the reason is the lack of mature sensing technologies.”
On the face of it, that is a strange thing for Ram Machness, vice president of product at a company called Arbe Robotics, to say. After all, Arbe makes sensors that help autonomous vehicles drive. It is a bit like Apple CEO Tim Cook saying that the reason for last year’s decline in the smartphone market is that no smartphone is any good.
However, Machness has a point. And it is one that he hopes the company’s new technology, unveiled at this year’s all-virtual CES, will help to change. With its new sensing technology, the company believes it won’t be long before more autonomous driving technology hits the road. This time for real.
Machness says that the misconception made by many of the people who built self-driving cars was that the algorithms they created for autonomous driving would receive complete information about the world in which these cars were driving. That did not happen. Instead of having accurate information about the world they were moving through, they were grappling with the challenges of sensing, which needed to be solved before they could build the algorithms that would power autonomous technologies for various applications.
It is a bit like trying to teach someone to do their job in an office that keeps losing power: one problem needs to be solved before the other can be attempted. And so far, that has not been possible.
Could next-gen radar help?
Radar makes a comeback
Radar in particular has not been taken seriously as a way of getting autonomous vehicles to see the world, other than as a means of detecting the velocity of objects that other sensors have already identified. Most of the main discussion involves computer vision using standard cameras, or lidar, which refers to the bounced lasers used to measure distances. Both approaches have their positives and negatives.
Radar, which involves bounced radio waves, has a much longer range than lidar, but it also comes with some very big challenges.
As an example, Machness shows a black screen with a handful of bright orange dots scattered across its surface. It looks as though someone has flicked a small amount of colored paint at a dark wall or, perhaps, like the reflection of city lights on water at night. It is almost impossible to work out what you are seeing. This, he said, is traditional radar, a technology many cars are equipped with today for things like parking sensors, but which nobody really takes seriously for imaging. What we are “seeing” is a street scene, complete with other cars and an assortment of additional obstacles.
Machness jumps to a second video, and we are now watching a psychedelic dashcam view of a car driving down a sloping road. Apart from the fact that it looks like it was filmed in Predator-style heat vision, it is fully readable by humans, let alone machines.
The major upgrade, he said, is the number of channels the radar broadcasts and receives, which is comparable to the number of pixels in a camera image. “If I count the number of channels in today’s radars, they have 12 channels,” he said. “More advanced ones have 48 channels. We see some competitors working toward 192 channels. [We’ve developed radar with] 2,000 channels. This is the breakthrough. We are able to process them simultaneously.”
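The channel counts Machness quotes are consistent with how MIMO imaging radars generally scale: each transmit antenna paired with each receive antenna acts as one “virtual channel,” so the virtual-array size grows multiplicatively. The sketch below illustrates that arithmetic only; the specific antenna counts are hypothetical assumptions chosen to land near the figures quoted in the article, not Arbe’s published specifications.

```python
# Illustrative sketch: in a MIMO radar, every transmit/receive antenna
# pair forms one "virtual channel", so effective resolution scales with
# tx * rx. The antenna counts below are assumptions, not vendor specs.

def virtual_channels(tx: int, rx: int) -> int:
    """Virtual-array size of a MIMO radar with tx transmitters and rx receivers."""
    return tx * rx

# Hypothetical configurations spanning the channel counts quoted above:
configs = {
    "basic parking radar": (3, 4),        # ~12 channels
    "advanced ADAS radar": (6, 8),        # ~48 channels
    "competitor imaging radar": (12, 16), # ~192 channels
    "next-gen imaging radar": (48, 48),   # ~2,300 channels ("2K")
}

for name, (tx, rx) in configs.items():
    print(f"{name}: {tx} Tx x {rx} Rx -> {virtual_channels(tx, rx)} virtual channels")
```

The design point is that adding a handful of physical antennas multiplies, rather than adds to, the number of usable channels, which is how a jump from dozens to thousands of channels becomes feasible on one chip.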
As announced on January 11 at CES, Arbe’s new radar technology promises “4D” radar imaging for autonomous vehicles, with a next-gen ability to detect and separately identify objects in high resolution, thanks to radar that is 100 times more detailed than any other radar on the market. This “2K ultra-high resolution” radar technology promises to be “road ready” by 2022, a year from now.
The company is working with a number of large, but as-yet-undisclosed, partners to get this technology into future road-going vehicle platforms. “The problem that Arbe is trying to solve is to bring an imaging radar that has almost zero false alarms and very high resolution [to autonomous vehicles],” Machness said.
One of the major advantages of radar is that it can be used in inclement weather conditions. “Things that cameras and lidar are very sensitive to, such as fog, rain, or dust, are things that radar technology is significantly less sensitive to,” Machness said.
Living up to the hype?
That is not to say it is the case here, but a CES demo can, at its best, be massaged to flatter a technology. Any live demo can. (Steve Jobs famously showcased the original iPhone in 2007 on a prototype that would fail if he did not follow a precise series of steps while using it.) Demoing in the virtual era, at livestreamed shows such as CES 2021, throws up even more opportunities for misrepresentation.
When it comes to autonomous vehicles and imaging, there are a lot of question marks. Until the problem of autonomous driving is solved (and what would that actually mean?), there will be disagreement about the best way to build one. Lidar, for example, has its staunch supporters, while Tesla CEO Elon Musk has described it as “unnecessary” and “a fool’s errand.”
But these new approaches do not necessarily represent rivals; they represent breakthroughs that can become part of smart hybrid systems leveraging the best of all worlds. In this respect, Arbe is not alone in announcing autonomous-sensing breakthroughs at CES. At this year’s show, Seoul Robotics, a South Korean company, is offering its first mass-market product, a multi-industry, plug-and-play, next-gen lidar solution. Another startup, Cognata, is introducing Real2Sim, a new product that takes data recordings from drives and automatically transforms them into simulations and datasets.
It is not just self-driving cars that can benefit from this technology. Arbe, for its part, also has a focus on improving autonomous delivery robots so that they can better navigate the real world. “For the first generation, [creators went] overkill with the amount of sensors they were using,” said Machness. “But to try and reduce costs, [they are now trying to] reduce the amount of sensors, but also increase the safety of those robots and their ability to move everywhere.”
The same technology could also be used for autonomous trucks, buses, drones, and other vehicles that will appear on roads in increasing numbers over the next several years.
Autonomous vehicles have been an attention-grabbing part of CES since at least 2013. But this year, with the coronavirus having stripped away the flash of a live event, the hope is that what emerges will focus more on substance and on solving some of the problems that have kept autonomous vehicles partially in the realm of science fiction for so long.
Because who doesn’t want to see the next generation of vehicles with self-driving technology?