
Study: Interpreting drone images poses challenge for first responders


Oct. 12, 2017, 11:53 AM

Just how useful are images from drones for first responders in an emergency? A new study from N.C. State finds that interpreting that information can be a challenge.

“This tells us that incorporating UASs into some situations, such as emergency response, may not necessarily be as useful as one might think,” says Doug Gillan, a professor of psychology at NC State and co-author of the study, titled “Eye In The Sky: Investigating Spatial Performance Following Perspective Change.”

Gillan and Stephen Cauffman, a Ph.D. student at NCSU and lead author of the paper, report that in the study:

"Participants were shown randomly ordered image pairs of aerial and ground views of objects in a virtual city and were asked to make judgments about where a missing object was in the second image of the pair. ... The results were consistent with previous research and showed that congruent trials (aerial-aerial and ground-ground) resulted in less error and response time."

Comparing two aerial views produced the best results, while switching from an aerial view to a ground view posed the biggest challenge for study participants, according to Matt Shipman in a story written for NCSU's news service. When shown an aerial view followed by a ground view, participants took at least a second longer to estimate where the missing object was, and their estimates were four times farther from the object's correct placement than when comparing two aerial views, he reported.

“Because UASs [unmanned air systems] operate at heights that most normal aircraft do not, we are getting new aerial perspectives of our surroundings,” Cauffman said. “We wanted to know how good people are at integrating these perspectives into their perception of the real world environment – which can be relevant in situations such as security or emergency response operations.

“For example, if we’re using UASs to identify a trouble spot, how good are we at using visual information from UASs to point to the correct spot on a map?”


Study abstract: “Eye In The Sky: Investigating Spatial Performance Following Perspective Change”

Authors: Stephen J. Cauffman and Douglas J. Gillan, North Carolina State University

Presented: Annual Meeting of the Human Factors and Ergonomics Society, Oct. 9-14 in Austin, Tex.

Abstract: Unmanned Aerial Systems (UASs) are becoming more prevalent in civilian use, such as emergency response and public safety. As a result, UASs pose issues of remote perception for human users (Eyerman, 2013). The purpose of this experiment was to test the effects of combining aerial and ground perspectives on spatial judgments of object positions in an urban environment. Participants were shown randomly ordered image pairs of aerial and ground views of objects in a virtual city and were asked to make judgments about where a missing object was in the second image of the pair. Response times and error were collected with error being calculated using the Euclidean distance formula. The results were consistent with previous research and showed that congruent trials (aerial-aerial and ground-ground) resulted in less error and response time. It was also shown that there was a significant four-way interaction between stimulus image, response image, object density, and stimulus duration. The results of this study are intended to provide the basis for future work in understanding the underlying reasons behind spatial errors that might occur during use of UASs and lead to design implementations for interfaces to reduce these errors.
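The abstract notes that error was calculated using the Euclidean distance formula. As a minimal illustration (not the authors' actual analysis code, and with hypothetical coordinates), the error for a single trial could be computed in Python like this:

```python
import math

def euclidean_error(estimated, actual):
    """Straight-line distance between a participant's estimated
    object position and its true position, given as (x, y) points."""
    dx = estimated[0] - actual[0]
    dy = estimated[1] - actual[1]
    return math.sqrt(dx * dx + dy * dy)

# Hypothetical trial: participant places the object at (3, 4),
# but its true location is (0, 0).
print(euclidean_error((3, 4), (0, 0)))  # → 5.0
```

Averaging this distance across trials within each condition (aerial-aerial, aerial-ground, and so on) would yield the per-condition error scores the study compares.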

