DronePaint: A human-swarm interaction system for environment exploration and artistic painting

Researchers at the Skolkovo Institute of Science and Technology (Skoltech) in Russia have recently developed an innovative system for human-swarm interaction that allows users to directly control the movements of a team of drones in complex environments. This system, presented in a paper pre-published on arXiv, is based on an interface that recognizes human gestures and adapts the drones' trajectories accordingly.

Quadcopters, drones with four rotors that can fly for long periods of time, could have numerous valuable applications. For instance, they could be used to capture images or videos in natural or remote environments, aid search-and-rescue missions and help to deliver goods to specific locations.

So far, however, drones have rarely been deployed for these applications and have instead been primarily used for entertainment purposes. One of the reasons for this is that complex missions in unknown environments require users operating the drones to have a basic understanding of sophisticated algorithms and interfaces.

"For example, imagine yourself as a rescue team member exploring a building after a crucial natural disaster," Valerii Serpiva, one of the researchers at Skoltech who carried out the study, told TechXplore. "When you arrive at the place, you don't know its current state, floor plan, etc., so if you plan to use drones with flashlights and cameras on board, you either need to sit and program them for a long time or operate them manually, relying only on your own dexterity."

The challenges associated with the operation of drones in unknown environments have so far significantly limited their applicability. The researchers thus set out to create a system that could simplify the operation of drones on behalf of both expert and non-expert users.

"Another good example of how drones could be used is the art industry, where -based light shows and graffiti painting have recently become quite popular," Serpiva said. "In March this year, for instance, the GENESIS company deployed 3281 flashing drones in the night sky, breaking the previous world record. What could be more interesting than making such an amazing show interactive, providing spectators the ability to change swarm flight in real-time?"

The main objective of this recent work was to provide users with a simpler and more intuitive interface for controlling large-scale robot swarms in both known and unknown environments. The system created by the team, dubbed DronePaint, could also be used to realize beautiful art shows or produce artistic paintings with the support of drones.

"Our work was inspired by several previously developed systems that integrated drones in art, like DroneGraffiti and BitDrones," Serpiva said. "DronePaint, however, introduces a novel approach to generating swarm trajectories, with a straightforward idea behind it: one of the most intuitive ways to convey the desired path to the swarm could simply be to draw it in the air, the same way we draw a path in labyrinth puzzles."

The human-drone interaction system developed by the researchers has three primary modules, all based on deep neural networks (DNNs). These modules are: a human-swarm interface, a trajectory processing module and a swarm control module.

"When a human wants to deploy the swarm and give it the next command, he/she positions him/herself in front of the camera, pointing an index finger up: for DronePaint it serves a signal that it's time to record swarm trajectory," Serpiva explained. "In our work, we designed a trajectory drawing interface based on the MediaPipe Deep Neural Network, developed by the Google team and trained on our dataset."

The DronePaint trajectory drawing interface allows users to generate an input trajectory for the drone swarm. An operator can also observe the trajectory resulting from his/her drawing in real time and erase it if he/she spots a mistake.
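As a rough illustration of this kind of gesture-triggered trajectory capture, the Python sketch below uses Google's MediaPipe hand tracker, which the paper names as the basis of the interface. The "index finger up" heuristic, the landmark indices and the camera loop are our assumptions for illustration, not the authors' exact implementation.

```python
# Minimal sketch of gesture-triggered trajectory recording with MediaPipe Hands.
# The index-finger-up test below is an assumed heuristic, not the paper's code.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
INDEX_TIP, INDEX_PIP = 8, 6  # MediaPipe hand-landmark indices

def index_finger_up(landmarks) -> bool:
    # Image y grows downward, so a raised fingertip sits above its PIP joint.
    return landmarks[INDEX_TIP].y < landmarks[INDEX_PIP].y

trajectory = []  # (x, y) fingertip positions in normalized image coordinates

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            if index_finger_up(lm):            # "start/continue drawing" gesture
                trajectory.append((lm[INDEX_TIP].x, lm[INDEX_TIP].y))
        cv2.imshow("DronePaint-style capture", frame)
        if cv2.waitKey(1) & 0xFF == 27:        # Esc stops recording
            break
cap.release()
cv2.destroyAllWindows()
```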

The raw drawings produced by users cannot be applied to the drones straight away: the proposed paths first need to be corrected by the trajectory processing module. After filtering and interpolating a drawn trajectory, this module divides it into equal segments suitable for the robots and sends the resulting waypoints to the drone control module.
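A minimal sketch of such a processing step is shown below: it smooths the raw fingertip points with a moving average and then resamples them at equal arc length. The window size, segment length and choice of filter are illustrative assumptions; the paper's exact filtering and interpolation scheme may differ.

```python
# Sketch of trajectory processing: smooth raw points, then resample into
# waypoints spaced at equal arc length so drone spacing is independent of
# how fast the operator drew. Parameters are assumed, not from the paper.
import numpy as np

def process_trajectory(points: np.ndarray, window: int = 5,
                       segment_len: float = 0.05) -> np.ndarray:
    """points: (N, 2) raw fingertip positions; returns equally spaced waypoints."""
    # 1. Smooth each coordinate with a moving average to suppress hand jitter.
    kernel = np.ones(window) / window
    smooth = np.column_stack([
        np.convolve(points[:, i], kernel, mode="valid") for i in range(2)
    ])
    # 2. Compute cumulative arc length along the smoothed curve.
    deltas = np.diff(smooth, axis=0)
    dist = np.concatenate([[0.0], np.cumsum(np.linalg.norm(deltas, axis=1))])
    # 3. Linearly interpolate at equal arc-length intervals to get segments
    #    of uniform size for the swarm.
    targets = np.arange(0.0, dist[-1], segment_len)
    return np.column_stack([
        np.interp(targets, dist, smooth[:, i]) for i in range(2)
    ])
```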

"Each drone carries an LED ring onboard with retroreflective tape aimed at the image brightness, repeating the hand-drawn figure on a larger scale. To experience the light pattern in midair we use time-lapse video mode to record continuous light trajectory in mid-air" Serpiva said. "When developing DronePaint, we were focused on the core idea of the multi-mode control system, allowing us to adjust multiple swarm parameters with a limited number of hand gestures."

The system's drone control module uses the data it receives from the trajectory processing module to generate the drone commands necessary to perform a given trajectory. In addition, it ensures that these commands result in robust swarm flight with minimal delays.
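The paper does not publish this module's implementation; one common way to realize such waypoint following is a saturated proportional position controller, sketched below. The gain, velocity limit, acceptance radius and the drone interface (position, send_velocity) are hypothetical placeholders, not the authors' API.

```python
# Illustrative waypoint follower: a proportional position controller with a
# velocity limit to keep commands feasible. All parameters and the 'drone'
# object (position() -> np.ndarray, send_velocity(vx, vy)) are hypothetical.
import time
import numpy as np

K_P = 1.2          # proportional gain (assumed)
V_MAX = 0.5        # m/s velocity limit for smooth flight (assumed)
REACHED = 0.05     # waypoint acceptance radius in meters (assumed)

def follow_waypoints(drone, waypoints: np.ndarray, rate_hz: float = 50.0):
    """Drive one drone through a sequence of (x, y) waypoints."""
    for wp in waypoints:
        while True:
            error = wp - drone.position()
            if np.linalg.norm(error) < REACHED:
                break                           # waypoint reached, take the next
            v = K_P * error
            speed = np.linalg.norm(v)
            if speed > V_MAX:                   # saturate the commanded velocity
                v *= V_MAX / speed
            drone.send_velocity(*v)
            time.sleep(1.0 / rate_hz)
    drone.send_velocity(0.0, 0.0)               # stop at the final waypoint
```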

"The idea behind our research was to make the navigation of the swarm for operator as easy as possible," says Dzmitry Tsetserukou, Professor, Ph.D., Head of Intelligent Space Robotics Laboratory at Skoltech. "The reasonable question is why not to use the speech recognition. The problem is that drones generate strong noise that harms the voice perception. Gestures appeared to be the universal tool of interaction of human with the swarm of drones. Interestedly, birds such as ravens use gestures to point out things and communicate with each other. "

The swarm control interface introduced by this team of researchers at Skoltech is among the first systems that allow users to operate drones and generate trajectories for them simply by drawing paths with their hands. This could greatly simplify the operation of drones and make it easier for artists, search and rescue teams, or other non-expert users to use drones in their work.

"When designing an artistic light show, for instance, the operator can also switch from path drawing to shape correction and adjust the swarm size or shape, similar to how we adjust the brush in a graphical application," Serpiva said. "The interaction scenarios proposed in our paper (e.g., artistic painting and environment exploration) could definitely benefit from the advantages of sequential gesture control to preserve formation control while performing the intuitive drawing of swarm trajectories, inapplicable by direct teleoperation."

The DronePaint system can easily be accessed by users worldwide, as it is available as a software toolkit and does not require wearable devices or other specialized hardware. In a series of initial tests, Serpiva, Tsetserukou and their colleagues found that it could recognize gestures with high accuracy (99.75%) and successfully produce various swarm behaviors.

"There are a variety of ways in which we can broaden the research and continue improving the DronePaint technology," Serpiva said. "Let us focus on some key points though. Firstly, we will try to resolve the limitations the current version of the system might have in different lighting conditions, such as low hand detection rate or latency in pattern recognition. Further in the future, we are planning to apply a full-body gesture control to increase the variety of commands, keeping the natural and intuitive control process to the user."

Serpiva, Tsetserukou and their colleagues now plan to increase the number of drones that users will be able to operate using the system. Ultimately, this could unlock new features, for instance allowing users to draw or construct drone structures in 3D environments using the same gesture control interface.

The researchers have so far avoided integrating wearable devices for tactile feedback, such as gloves, as this would contradict the core idea of the technology they developed. They are thus currently trying to devise strategies that improve the users' perception of the controlled space and distances without relying on bulky external devices.

"In the future we are also planning to devise systems to read imagined hand gestures from posterior parietal cortex (PPC), using BMI," Tsetserukou said. "With DNN decoding of neural activity patterns we can potentially not only guide the swarm in some direction but also split the swarm formation into the pieces or decide the leading drone so that others will follow it. Dynamic behavior (speed, acceleration, jerk) of each agent can be related with the level of operator's anxiety/calm to achieve smooth drone trajectories."
