
Have you ever wondered how you can walk or jog, head bouncing up and down, while still focusing on an object either nearby or far away? Have you noticed how, while doing so, you can judge an object's distance, speed, and minute details quickly and accurately? The reason you can do this so well is that your mind uses frame bursting of images from memory, along with retinal jitter, to fill in the details while your visual cortex fills in the blanks. All of this happens in microseconds on a brain drawing barely 20 watts of power. Talk about state-of-the-art organic design and technology; impressive, my fellow human.

Of course, some animals and birds do this even better than we do, with much smaller brains. Consider, if you will, an owl, a hawk, or a bald eagle; the phrase "eagle eyes" is apropos here. Using biomimicry strategies, perhaps we can make the video imaging of our UAVs (unmanned aerial vehicles), or drones, more powerful and acute, and consider for a moment how many applications that would affect. How are we doing so far with these concepts? Three-axis gimbals are the most sought after by small drone owners, but why stop at three axes when a 4-, 5-, or 6-axis gyro-stabilized gimbal could deliver better video resolution and accuracy? That would certainly help stabilize the video camera, as do quadcopter designs, which are quite stable even in moderate turbulence.
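To make that concrete, here is a minimal sketch of the sensor fusion a single axis of a gyro-stabilized gimbal typically performs: a complementary filter blending an integrated gyro rate with an accelerometer tilt estimate. The function name, the axis convention, and the 0.98 blend factor are illustrative assumptions, not taken from any particular gimbal controller.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate one gimbal axis angle (degrees) by fusing two noisy sources."""
    gyro_angle = angle_prev + gyro_rate * dt  # integrate angular rate (drifts slowly)
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))  # gravity tilt (noisy)
    # Trust the gyro over short intervals and the accelerometer over long ones.
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle
```

Each additional gimbal axis would run its own loop like this, feeding a motor controller that counteracts the estimated motion.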

Let's talk about strategies for a moment, to get to that eagle-eye ability we see in nature. One patent, "Apparatus and methods for stabilization and vibration reduction," US 9277130 B2, states: "Currently, there exists primarily four methods of vibration dampening commonly employed in photography and videography to reduce the effects of vibration on the picture: software stabilization, lens stabilization, sensor stabilization, and overall shooting equipment stabilization."
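Of those four methods, software stabilization lends itself best to a short illustration. Below is a hedged sketch, not the patent's method, of the usual first step: estimating the camera's inter-frame motion with OpenCV feature tracking so that later stages can smooth it out. The parameter values are rough guesses for illustration.

```python
import cv2
import numpy as np

def interframe_motion(prev_gray, curr_gray):
    """Estimate (dx, dy, rotation) between two consecutive grayscale frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=30)
    if pts is None:
        return 0.0, 0.0, 0.0
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = new_pts[status.flatten() == 1]
    if len(good_new) < 2:
        return 0.0, 0.0, 0.0
    # RANSAC rejects tracks on moving objects so only camera motion remains.
    m, _ = cv2.estimateAffinePartial2D(good_old, good_new, method=cv2.RANSAC)
    if m is None:
        return 0.0, 0.0, 0.0
    dx, dy = m[0, 2], m[1, 2]
    da = np.arctan2(m[1, 0], m[0, 0])  # rotation angle in radians
    return dx, dy, da
```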

What if we also work with visual recognition systems for frame bursting, focusing only on things that meet our mission criteria "OR" are complete anomalies (out of place)? In the human mind, things out of place often trigger the N400 brain wave, evoking curiosity, nuance, or interest. We can program the same behavior using algorithms that require the video camera to investigate, identify, and act. Or, as Colonel Boyd's "OODA Loop" strategy suggests: Observe, Orient, Decide, and Act. The fighter pilot who can do that quickest should win the aerial dogfight, provided they make good use of their energy and airspeed. Good advice, even if we borrow it to discuss how best to program a UAS (unmanned aerial system) to complete a task or mission.
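As a thought experiment, the OODA loop maps quite naturally onto code. The sketch below is a hypothetical illustration, with the mission criteria, labels, and action names invented for the example, of how a drone's vision system might choose to investigate only mission targets or out-of-place anomalies:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g., "person", "vehicle" from an object detector
    confidence: float
    position: tuple     # (x, y) in frame coordinates

MISSION_CRITERIA = {"package", "intruder"}        # hypothetical mission targets
KNOWN_BACKGROUND = {"tree", "fence", "crane"}     # hypothetical expected scenery

def ooda_step(detections):
    """One Observe-Orient-Decide-Act pass over a frame's detections."""
    for det in detections:                               # Observe
        matches_mission = det.label in MISSION_CRITERIA  # Orient: mission match?
        is_anomaly = det.label not in KNOWN_BACKGROUND   # ...or out of place?
        if matches_mission or is_anomaly:                # Decide
            return ("investigate", det)                  # Act: steer camera/drone
    return ("continue_patrol", None)
```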

In one paper, "Model-based video stabilization for micro aerial vehicles in real-time," the abstract states: "The emerging branch of Micro aerial vehicles (MAVs) has attracted a great interest for their indoor navigation capabilities, but they require a high quality video for tele-operated or autonomous tasks. A common problem of on-board video quality is the effect of undesired movement, and there are different approaches for solving it with mechanical stabilizers or video stabilizer software. Very few video stabilizer software can be applied in real-time and their algorithms do not consider intentional movements of the tele-operator."

Indeed, this is a real problem if we ever hope to send drones out on autonomous missions, whether delivering a package or working as a flying security guard for, say, a commercial construction site.

That paper goes on to suggest a way to solve some of these challenges, namely: “A novel technique is introduced for real-time video stabilization with low computational cost, without generating false movements or decreasing the performance. Our proposal uses a combination of geometric transformations and outliers rejection to obtain a robust inter-frame motion estimation, and a Kalman Filter based on a dynamic model.”
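The paper's full dynamic model is not reproduced here, but the role a Kalman filter plays in stabilization is easy to sketch: smooth each estimated motion parameter so that jitter is removed while deliberate camera movement is preserved. The scalar filter below and its noise values are a simplified assumption for illustration:

```python
class ScalarKalman:
    """1-D Kalman filter to smooth one motion parameter (e.g., dx per frame)."""

    def __init__(self, q=1e-3, r=0.25):
        self.q, self.r = q, r       # process / measurement noise (tuning guesses)
        self.x, self.p = 0.0, 1.0   # state estimate and its variance

    def update(self, z):
        self.p += self.q                 # predict: uncertainty grows over time
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct toward the new measurement z
        self.p *= (1.0 - k)
        return self.x

# Warping each frame by the difference between the raw and the smoothed
# trajectory removes jitter without fighting the tele-operator's own motion.
```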

Now then, although there are folks working on these things, it is obvious that until the sensors, imaging, and equipment get better at such tasks, drones will not be able to work autonomously in the safe and efficient manner needed to deliver the benefits we expect of these technologies. I hope you will consider my thoughts here, and my recommendation that we borrow strategies from nature to accomplish such goals.

Cites:

A.) “Vision-Based Detection and Distance Estimation of Micro Unmanned Aerial Vehicles,” by Fatih Gokce, Gokturk Ucoluk, Erol Sahin, and Sinan Kalkan. Sensors 2015, 15(9), 23805-23846; doi:10.3390/s150923805.

B.) Thesis: “Accelerated Object Tracking with Local Binary Features,” by Breton Lawrence Minnehan, Rochester Institute of Technology; July 2014.

C.) “Model-based video stabilization for micro aerial vehicles in real-time,” by Wilbert G Aguilar and Cecilio Angulo.

D.) “Real-time Megapixel Multispectral Bioimaging,” by Jason M. Eichenholz, Nick Barnett, Yishung Juang, Dave Fish, Steve Spano, Erik Lindsley, and Daniel L. Farkas.

E.) “Enhanced Tracking System Based on Micro Inertial Measurements Unit to Measure Sensorimotor Responses in Pigeons,” by Noor Aldoumani, Turgut Meydan, Christopher M Dillingham, and Jonathan T Erichsen.


Source: Lance Winslow