[Image: Wisdom of crowds?]

Data analytics is valuable because it can filter huge volumes of data, pick out the bits relevant to a specific question, and set those aside for analysis. The process can be expensive, however, and it's useless when the data can't be collected in the first place, no matter how big the analytics cluster or how powerful the computers running it.

It turns out that the same models of data gathering and analysis used in so-called "Big Data" work can also be applied in a real-world context, using crowds of humans rather than computers to find or analyze data. Researchers at the University of Pennsylvania have created a research process that enlists the (presumably idle) brains of ordinary people as a giant, highly distributed bio-computing cluster, one able to accomplish simply and inexpensively analytical tasks that even supercomputers might not accomplish at all.

A new paper published in the Journal of General Internal Medicine describes how a crowdsourcing project succeeded in finding, photographing and cataloging the locations of more than 1,400 automated external defibrillators (AEDs) in 800 buildings throughout the Philadelphia area. The project, called the MyHeartMap Challenge, was launched to create a guide to defibrillator locations simple enough for people without medical training to use effectively. When bystanders use AEDs before EMTs arrive, the victim survives in 60 percent of cases, according to emergency-services statistics cited by The Wall Street Journal.

"Finding AEDs during this contest was a very hard task—many AEDs, we found, are in places people wouldn't think to look during an emergency," said MyHeartMap Challenge director Raina Merchant, an assistant professor of Emergency Medicine at UPenn's Perelman School of Medicine.

Walking around to locate specific physical objects is not a function most big-data analytics packages offer, but Merchant's research team found 21 other medical-research projects that included crowdsourced analytics, even in studies that did not involve identifying physical objects at all. "Studies we reviewed showed that the crowd can be very successful, such as solving novel complex protein structure problems or identifying malaria infected red blood cells with a similar accuracy as a medical professional," according to study co-author Benjamin Ranard, a third-year medical student at Penn.

The team concluded that crowdsourced analytics work well for four types of tasks: problem solving, data processing, surveillance/monitoring and surveying. Variation in the quality of results and the idiosyncrasies of individual crowds made successful projects difficult for other researchers to replicate, but the principle of distributing analytical effort among a large number of human volunteers remains valid, and could deliver results as good as those from computers at far lower cost, according to Merchant, who is heading a new Social Media Lab that will research how and when crowdsourcing can be effective. She also plans to extend the MyHeartMap Challenge to other cities, in order to create similar maps of defibrillators and other sources of emergency medical help.
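Neither the contest materials nor the paper, as described here, spell out how redundant or noisy volunteer reports get merged into a single map entry, but the core aggregation problem is easy to sketch. The following minimal Python sketch is purely illustrative and not the MyHeartMap method: the function names, the 25-meter clustering radius, and the two-confirmation threshold are all assumptions chosen to show how duplicate GPS sightings of the same device might be collapsed into confirmed locations.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def merge_reports(reports, radius_m=25.0, min_confirmations=2):
    """Greedily cluster reports within radius_m of a cluster's first point,
    then keep only clusters backed by min_confirmations independent reports."""
    clusters = []  # each cluster is a list of (lat, lon) reports
    for lat, lon in reports:
        for cluster in clusters:
            c_lat, c_lon = cluster[0]
            if haversine_m(lat, lon, c_lat, c_lon) <= radius_m:
                cluster.append((lat, lon))
                break
        else:
            clusters.append([(lat, lon)])
    # Average each sufficiently confirmed cluster into one candidate location.
    confirmed = []
    for cluster in clusters:
        if len(cluster) >= min_confirmations:
            avg_lat = sum(p[0] for p in cluster) / len(cluster)
            avg_lon = sum(p[1] for p in cluster) / len(cluster)
            confirmed.append((avg_lat, avg_lon, len(cluster)))
    return confirmed

if __name__ == "__main__":
    # Three reports of the same (hypothetical) AED plus one unconfirmed stray.
    reports = [
        (39.95223, -75.19326),
        (39.95225, -75.19329),
        (39.95221, -75.19322),
        (39.96100, -75.15500),
    ]
    for lat, lon, n in merge_reports(reports):
        print(f"AED candidate at ({lat:.5f}, {lon:.5f}), {n} confirmations")
```

Requiring multiple independent confirmations before a location goes on the map is one simple way a project like this could trade coverage for reliability; a real pipeline would presumably also verify the submitted photographs.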
She also plans to continue researching crowdsourced analysis as a complement to computer analysis: true analysis and modeling of data, not simply more efficient ways to send people out looking for things in the real world. "Every health field from studying chronic diseases to global health has a potential need for human computing power that crowdsourcing could fill to accelerate research," Merchant said. "Prior work has heralded crowdsourcing as a feasible method for data collection, but a clear roadmap for the types of questions crowdsourcing could answer and the ways it could be applied has been lacking."

Image: Pan Xunbin/Shutterstock.com