I remember being blown away by a TED talk where "minimum snap trajectories" were planned for quadcopters to fly through hoops and slots.
It's really cool to see this happening fully autonomously and at such high speed. I wonder if the use of AI means that the approach is fundamentally different, or if it uses the same principle of minimizing snap?
https://www.ted.com/talks/vijay_kumar_robots_that_fly_and_co...
It's fundamentally different: it uses an RL-trained network that takes the drone state (position, orientation, velocity) as input and directly outputs motor commands.
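Roughly, the policy is just a small feedforward network mapping state to motor commands. A minimal sketch of that idea (layer widths, activations, and output scaling are my assumptions for illustration, not the network from the paper):

```python
import torch
import torch.nn as nn

class DronePolicy(nn.Module):
    """Small MLP: state vector in, one command per motor out (illustrative only)."""
    def __init__(self, state_dim: int = 24, num_motors: int = 4, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, hidden),
            nn.Tanh(),
            nn.Linear(hidden, num_motors),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # Squash to [0, 1] so the outputs can be scaled to throttle commands.
        return torch.sigmoid(self.net(state))

# One forward pass on a dummy 24-dim state vector.
policy = DronePolicy()
motor_cmds = policy(torch.zeros(1, 24))
print(motor_cmds.shape)  # torch.Size([1, 4])
```

A network this small is the reason it can run in the control loop on an embedded flight controller rather than needing a companion computer.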
Do you happen to know the bandwidth of the telemetry fed to the network? How many bytes/s?
Not much, since it doesn't take images as its input and it's running on an embedded MCU. Based on the papers linked elsewhere in this thread, the state vector is 24 floats and it runs at 1 kHz, so around 100 kB/s.
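The back-of-envelope arithmetic, assuming 32-bit floats (that's my assumption; the papers may use a different encoding):

```python
state_floats = 24       # size of the state vector
bytes_per_float = 4     # float32
rate_hz = 1000          # control loop frequency

bandwidth_bytes_per_s = state_floats * bytes_per_float * rate_hz
print(f"{bandwidth_bytes_per_s / 1000:.0f} kB/s")  # 96 kB/s, i.e. roughly 100 kB/s
```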