> Most automotive lidars already operate in a "photon-starved regime", ~200-300 photons per return[0]. If you spread that over the entire scene, your SNR drops quickly.
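Rough numbers to see how fast it falls off (everything below is a back-of-the-envelope assumption: shot-noise-limited, no background light, a single pulse): if the pulse energy that gave you ~250 photons at one scanned spot is instead spread across N resolvable directions, each pixel sees ~250/N photons, and shot-noise SNR scales as the square root of that.

```python
import math

photons_per_return = 250          # ~200-300 photons per return, as quoted above
directions = [1, 1_000, 100_000]  # 1 = single scanned spot; larger = flash spread over N pixels

for n in directions:
    signal = photons_per_return / n   # same pulse energy split across n directions (assumption)
    snr = math.sqrt(signal)           # shot-noise-limited: SNR ~ sqrt(photon count)
    print(f"{n:>7} pixels -> {signal:10.4f} photons/pixel, SNR ~ {snr:6.3f}")
```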
Translating: normally you have a single large sensor per laser, which makes measurements at a very high rate. With flash lidar, you split the sensor up into pixels like an image sensor. In a normal image sensor each pixel can collect light for a long time, but if you do that with lidar you lose distance resolution: the useful signal only arrives in a nanosecond-scale window, so each pixel is sitting idle more than 99% of the time, and you pay for it in sensitivity and accuracy.
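To put a number on that idle fraction (all timing values below are illustrative assumptions, not any specific product's specs): a pixel only collects useful signal while its return pulse is actually arriving, which is nanoseconds out of a frame period of tens of milliseconds.

```python
pulse_width_s  = 5e-9    # assumed ~5 ns laser pulse (sets the useful collection window)
max_range_m    = 200     # assumed maximum range
frame_rate_hz  = 20      # assumed frame rate
c              = 3.0e8   # speed of light, m/s

round_trip_s   = 2 * max_range_m / c            # wait time for the farthest return
frame_period_s = 1 / frame_rate_hz
duty_cycle     = pulse_width_s / frame_period_s  # fraction of the frame a pixel sees signal

print(f"round trip for {max_range_m} m: {round_trip_s * 1e6:.2f} us")
print(f"useful collection window: {pulse_width_s * 1e9:.0f} ns per frame "
      f"(~{duty_cycle:.1e} of the frame)")
```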
Array sensors, MEMS, and phased arrays all struggle because they're really good at resolving small angular differences, while the whole point of scanning lidar is covering large angular differences. Maybe one day we'll start making curved dies and it'll be easier to get a really wide FOV without needing multiple sensors.
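A toy pinhole-geometry sketch of why flat dies favor small angles (the focal length is an arbitrary assumption; only the ratios matter): the die width you need grows like tan of the half-angle, and for a simple lens the edge illumination falls off roughly as cos^4, so wide-FOV flat sensors get large and dim at the edges quickly.

```python
import math

focal_length_mm = 10.0    # arbitrary assumed focal length; only the ratios matter

for half_fov_deg in (5, 20, 45, 60):
    theta = math.radians(half_fov_deg)
    half_width_mm = focal_length_mm * math.tan(theta)  # die half-width needed on a flat focal plane
    falloff = math.cos(theta) ** 4                     # classic cos^4 relative-illumination falloff
    print(f"±{half_fov_deg:>2}°: die half-width {half_width_mm:5.1f} mm, "
          f"edge illumination ~{falloff * 100:4.1f}% of center")
```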
You can actually make curved dies already: there's a company doing that for image sensors. If you thin silicon down enough, it becomes flexible.