Friday, December 15, 2017

AEye

What was fascinating to me here is how similar the description of what was up with LiDAR during the CC is to what AEye describes below.

It's a multi-level problem -- software, logic, and hardware -- but there are a lot of people working on it all over the place.

Thx Ron & Tom


*********************

MIT Technology Review

What you see is being warped by the inner workings of your brain, prioritizing detail at the center of the scene while keeping attention on the peripheries to spot danger. Luis Dussan thinks that autonomous cars should have that ability, too.

*******************

Most autonomous cars use lidar sensors, which bounce laser beams off nearby objects, to create accurate 3-D maps of their surroundings. The best versions that are commercially available, made by market leader Velodyne, are mechanical, rapidly sweeping as many as 128 stacked laser beams in a full 360 degrees around a vehicle.

But even though they’re good, there are a couple of problems with those mechanical devices. First, they’re expensive (see “Lidar Just Got Way Better—But It’s Still Too Expensive for Your Car”). Second, they don’t offer much flexibility, because the lasers point out at predetermined angles. That means a car might capture a very detailed view of the sky as it crests a hill, say, or look too far off into the distance during low-speed city driving—and there’s no way to change it.

*************
The leading alternative, solid-state lidar, uses electronics to quickly steer a laser beam back and forth to achieve the same effect as mechanical devices. Many companies have seized on the technology because such devices can be made cheaply. But the resulting sensors, which are on offer for as little as $100, scan a regular, unvarying rectangular grid and don't offer the standard of data required for driving at highway speeds (see “Low-Quality Lidar Will Keep Self-Driving Cars in the Slow Lane”).

AEye wants to use solid-state devices a little differently, programming them to spit out laser beams in focused areas instead of a regular grid. The firm isn’t revealing detailed specifications on how accurately it can steer the beam yet, but it does say it should be able to see as far as 300 meters with an angular resolution as small as 0.1 degrees. That’s as good as market-leading mechanical devices.
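As a rough sanity check on those quoted figures, here is a back-of-envelope calculation (my own, not AEye's) of what 0.1 degrees of angular resolution means at 300 meters: using the small-angle approximation, adjacent laser returns land about half a meter apart at that range.

```python
import math

# Back-of-envelope: lateral spacing between adjacent lidar returns
# at a given range, for a given angular resolution (small-angle approx).
def point_spacing_m(range_m: float, angular_res_deg: float) -> float:
    return range_m * math.radians(angular_res_deg)

# AEye's quoted figures: 0.1 degrees at up to 300 m
spacing = point_spacing_m(300.0, 0.1)
print(f"{spacing:.2f} m between adjacent points")  # ~0.52 m
```

That half-meter spacing at maximum range is why angular resolution, not just range, is the headline spec for highway-speed perception.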

AEye’s setup doesn’t scan a whole scene in such high levels of detail all the time, though: it will scan certain areas at lower resolution, and other areas at higher resolution, depending on the priority of the car's control software.
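The idea the article describes can be sketched as a budgeting problem: a fixed number of laser shots per frame, divided among scene regions by whatever priority the control software assigns. This is my own illustrative sketch, not AEye's method; the region names and weights are assumptions.

```python
# Hypothetical sketch of priority-driven scan allocation: denser
# sampling where the control software cares most. Region names and
# the proportional-split rule are illustrative assumptions.
def allocate_points(total_points: int, regions: dict) -> dict:
    """Split a per-frame point budget across regions by priority weight."""
    total_weight = sum(regions.values())
    return {name: int(total_points * weight / total_weight)
            for name, weight in regions.items()}

budget = allocate_points(
    100_000,  # points available per frame (assumed figure)
    {"road_ahead": 5, "pedestrian_zone": 3, "sky": 1, "periphery": 1},
)
print(budget)
```

Under this split, the road ahead gets half the frame's points while the sky gets a tenth, which is exactly the kind of reallocation a fixed mechanical scan pattern cannot do.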


From Microvision CC

And third the 3D LiDAR market, where our solid state 3D sensing LiDAR technology can target emerging applications in industrial, consumer and automotive segments. Finally, we’re developing revolutionary advances to our laser beam scanning or LBS platform initially applying them to the display solution for a major technology company that could later be extended to all of the markets and engine solutions that we’re targeting. We expect this new platform and the performance it will offer for both display and 3D sensing will further distinguish us from the competition.

********************

Moving on to 3D LiDAR engine status. Our team made stellar progress last quarter and shrunk the original demo from CES by 8 times, while doubling the 3D point cloud output to 5.5 million points per second, which is one of the largest in the industry today. We will also add better software visualization tools to allow product developers to better assess our 3D point cloud.
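To put the quoted 5.5 million points per second in context, here is a quick calculation (mine, not Microvision's) of how many points land in each frame at a few illustrative frame rates:

```python
# Sanity check on the quoted point-cloud rate: at a given frame rate,
# how many points are available per frame? The frame rates below are
# illustrative assumptions, not figures from the call.
POINTS_PER_SEC = 5_500_000

for fps in (10, 30, 60):
    points_per_frame = POINTS_PER_SEC // fps
    print(f"{fps} fps -> {points_per_frame:,} points per frame")
```

At 10 fps that is 550,000 points per frame; at 60 fps it drops below 92,000, which is the tension behind the "dynamic scaling" feature discussed next.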

As a result of high customer interest in evaluating our 3D LiDAR scanning engine, we plan to accelerate availability of a development kit for this engine from the originally planned timeframe of the second quarter of 2018 to December of this year, almost six months ahead of the original schedule.

The solid state 3D LiDAR development kit will possess important attributes sought by our customers. It has a very dense 3D point cloud, as I just mentioned, small size and weight, and dynamic scaling performance that allows a tradeoff between high spatial resolution and high temporal performance. Why are these performance features considered important, you ask? 3D sensing, particularly for automotive and industrial applications, typically involves detecting and classifying various objects before decisions can be made.
If an ADAS system has the ability to increase spatial resolution on the fly to better classify a given object, that's a very desirable feature. On the other hand, if an ADAS system needs to simply detect a fast-moving object before it's been classified, then improving temporal resolution could become important. Our 3D LiDAR sensor can deliver such capabilities, which we believe gives us a real competitive advantage over others. We plan to use the development kits of the 3D LiDAR engine for exploring new product and application opportunities with OEMs in the industrial, consumer, and automotive sectors.
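The spatial-versus-temporal tradeoff described above falls directly out of a fixed point-rate budget: more points per frame means fewer frames per second, and vice versa. The 5.5M points/sec budget is from the call; the two "modes" below are my own illustrative assumptions, not Microvision's specifications.

```python
# With a fixed point-rate budget, spatial resolution (points per frame)
# and temporal resolution (frames per second) trade off directly.
POINT_BUDGET = 5_500_000  # points per second, figure quoted in the call

def frame_rate(points_per_frame: int) -> float:
    """Frames per second achievable at a given per-frame point density."""
    return POINT_BUDGET / points_per_frame

# Hypothetical "classification mode": dense frames, slower updates
print(f"{frame_rate(550_000):.0f} fps at 550k points/frame")  # 10 fps
# Hypothetical "detection mode": sparse frames, faster updates
print(f"{frame_rate(55_000):.0f} fps at 55k points/frame")    # 100 fps
```

Scaling down per-frame density by 10x buys a 10x faster update rate, which is why being able to switch on the fly matters for detecting fast-moving objects.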

****************
Let’s now switch to the other development programs that we’ve been executing on. Starting with ADAS: since last year we have been under contract with another major tech company to develop prototypes for their ADAS solution. I’m pleased to tell you that we completed all the deliverables to this customer in Q3, recognized revenue, and received the final payments. We anticipate that after evaluating our solution they will inform us about the next steps. Regarding AR: in addition to ADAS, remember we had an augmented reality project. The deliverables for another major technology company were also completed, and we received all payments. This customer is also evaluating the demonstrators we delivered.

AEye Funding
