As a researcher, you’re probably using your smartphone as part of your work. With Elsevier’s Hivebench lab notebook, for example, scientists can prepare, conduct and analyze experiments from their smart devices.
You may also be using your smartphone for the science itself – to monitor climate fluctuations in your lab, for example, or to photograph the bacterial colonies in your Petri dish. Or you may be among the growing number of researchers harnessing the power of smartphones and citizen science for your own work.
Consumers have embraced the smartphone since Apple’s memorable iPhone launch in 2007, and scientists were quick to follow. Now the two worlds are converging, with researchers harnessing the power of citizen science through ever newer and smarter technologies.
For example, the smartphone can be a powerful device for collecting data – especially with the various sensors a modern smartphone carries. The most common built-in sensors are the accelerometer, gyroscope, magnetometer, GPS receiver, microphone and camera. Higher-end models add others, such as gravity and rotational vector sensors, and environmental sensors such as barometers, photometers, thermometers and air humidity sensors. Some of the newest models even have a built-in heart rate monitor or a pedometer to track your steps, and one Japanese model can even detect radiation levels.
While all of these are obviously built in to serve consumers’ needs, each of these sensors can also be put to good scientific use. Here, one can distinguish between passive and active sensing tasks. Passive sensing enables automated collection of smartphone-generated data, such as accelerometer readings, GPS coordinates and ambient noise levels. Active sensing tasks ask for an active user contribution, such as taking a picture, tagging a place or sending a text message.
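To make the passive pattern concrete, here is a minimal Python sketch of a typical passive-sensing step: reducing a raw accelerometer stream to a compact summary before it leaves the device, which saves bandwidth and battery. The stream here is simulated and the summary fields are illustrative; a real app would read samples from the platform’s sensor API.

```python
import math
import random

def magnitude(ax, ay, az):
    """Combine the three accelerometer axes into one orientation-free value."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def passive_summary(samples):
    """Reduce a raw sensor stream to a compact summary before upload,
    a common pattern in passive sensing."""
    mags = [magnitude(*s) for s in samples]
    return {
        "mean": sum(mags) / len(mags),
        "peak": max(mags),
        "n": len(mags),
    }

# Simulated accelerometer stream (in units of g): a phone lying still
# reads roughly (0, 0, 1) plus sensor noise.
random.seed(0)
stream = [(random.gauss(0, 0.05), random.gauss(0, 0.05), 1 + random.gauss(0, 0.05))
          for _ in range(200)]
print(passive_summary(stream))
```

The point of the sketch is the shape of the pipeline, not the statistics: continuous raw readings in, a few aggregate numbers out.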
But the smartphone’s potential goes beyond the obvious, with new applications being discovered all the time. What about searching for ultra-high energy cosmic rays with the image sensors in your smartphone camera? The scientists who proposed this idea, in a 2016 article in Astroparticle Physics, calculate that if enough users run a dedicated application on their phones, an array composed of these smartphones could start picking up air showers generated by cosmic rays. Crayfis has just launched a beta version of its app.
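The core idea can be illustrated with a toy sketch (not the project’s actual pipeline): scan a dark camera frame for isolated bright pixels, the kind of signature a charged particle can leave on a CMOS image sensor. The frame and threshold below are invented for illustration.

```python
def candidate_hits(frame, threshold=200):
    """Scan a dark camera frame (a grid of pixel brightness values)
    for pixels bright enough to be particle-hit candidates."""
    return [(r, c) for r, row in enumerate(frame)
            for c, v in enumerate(row) if v >= threshold]

# Toy 4x4 "dark" frame: low-level noise plus one hot pixel standing in
# for a particle hit.
frame = [[3, 5, 2, 4],
         [6, 1, 250, 3],
         [2, 4, 5, 1],
         [3, 2, 4, 6]]
print(candidate_hits(frame))  # [(1, 2)]
```

In practice the hard part is everything around this step – calibrating out hot pixels, running only when the camera is covered and the phone is charging, and correlating candidate hits across many phones – but the per-frame detection really is a thresholding problem at heart.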
Another example: the accelerometer in your smartphone can be used to detect seismic disturbances, helping to provide early warning of an earthquake so that people can get to safety in time. To participate, you just download the “MyShake” app and let it run in the background on your mobile device. The computers at UC Berkeley then distinguish real seismic disturbances from your normal movements and draw their conclusions.
An area where smartphones are being used frequently for science is Earth observation. Here, they are used to collect data about the Earth's physical, chemical and biological systems. While “classic” remote sensing devices such as satellites and spectrometers can analyze large areas in one go, the sensors in smartphones are suitable for making measurements near the ground’s surface – capturing things that could otherwise be easily missed due to obstructions such as trees, clouds or low vegetation, or simply because they are very small. One of the most successful examples of an Earth observation program is eBird, an ornithology app, which has gathered nearly 370 million bird observations from more than 150,000 participants since 2002. Read more in this Biological Conservation article: "The eBird enterprise: An integrated approach to development and application of citizen science" (freely available until May 31, 2017).
In the health sciences, smartphones have become a widely used source of information. The accelerometer is often used to measure and recognize physical activity; this data is combined with data from the gyroscope and the magnetometer for an even more accurate estimate of one’s physical activities. Heart rate monitoring, fall detection for the elderly and measuring sedentary time are just a few examples of other measurements now gathered by smartphones. There are also experiments in which obese volunteers take pictures of their meals. Computer-aided food identification and quantity estimation algorithms make it possible to feed information back to them about their caloric intake and help them adopt healthier eating patterns.
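As a rough illustration of accelerometer-based activity recognition (a generic sketch, not the method of any particular paper in the special issue), one can look at how much the acceleration magnitude varies within a time window: near-constant 1 g suggests sitting still, large rhythmic swings suggest movement. The threshold below is invented; real systems learn richer features and decision rules from labeled data.

```python
import math
import statistics

def classify_window(window, threshold=0.15):
    """Label a window of accelerometer magnitudes (in g) as 'active'
    or 'sedentary' by how much the signal varies. The threshold is
    illustrative, not learned."""
    return "active" if statistics.pstdev(window) > threshold else "sedentary"

# Synthetic windows: sitting is ~1 g with tiny wobble; walking shows
# large rhythmic swings around 1 g.
sitting = [1.0 + 0.01 * math.sin(i / 3) for i in range(50)]
walking = [1.0 + 0.4 * math.sin(i / 2) for i in range(50)]
print(classify_window(sitting), classify_window(walking))  # sedentary active
```

Combining this kind of feature with gyroscope and magnetometer data, as the text describes, lets a classifier tell apart activities (walking, cycling, climbing stairs) that look similar on variance alone.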
And lastly, smartphones can generate data for “smart cities.” By tracking and understanding individuals’ mobility using GPS, optimal routes can be calculated, traffic can be better forecasted, abnormal events can be detected and logistical pathways improved. Of course, there are also immediate benefits. Having difficulty finding a parking spot? Just check your smartphone and easily locate drivers who look like they are about to leave a parking spot nearby.
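Mining GPS traces for mobility patterns typically starts from the distances between consecutive fixes, computed with the standard haversine (great-circle) formula. A minimal sketch, using a hypothetical three-point trace across central Berlin:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def total_trip_km(trace):
    """Length of a GPS trace: sum of hops between consecutive fixes."""
    return sum(haversine_km(*a, *b) for a, b in zip(trace, trace[1:]))

# Hypothetical trace of (latitude, longitude) fixes across central Berlin.
trace = [(52.5200, 13.4050), (52.5163, 13.3777), (52.5145, 13.3501)]
print(round(total_trip_km(trace), 2))
```

From per-hop distances and timestamps, mobility-mining pipelines derive speeds, stay points and frequently travelled routes – the raw material for the traffic forecasting and route optimization described above.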
All these examples show that smartphones are rapidly taking a central place in the evolution of social networks, green applications, global environmental monitoring, personal and community healthcare, sensor-augmented gaming, virtual reality and smart transportation systems.
At Elsevier, we are proud to present a selection of smartphone based articles for citizen science. We have made these articles freely available until May 31, 2017. See our virtual special issue below.
Scientific research conducted or facilitated at least in part by nonprofessional scientists or the general public has become known as citizen science. One of the earliest documented cases of a citizen science project is attributed to the astronomer Denison Olmsted, as detailed by Drs. Mark Littmann and Todd Suomela in this Endeavour article: “Crowdsourcing, the great meteor storm of 1833, and the founding of meteor science.” In 1833, Olmsted witnessed what has become known as the Leonid Meteor Storm and had an article published in the local newspaper asking readers to send in their observations so he could compare them. The request was promptly picked up by the national media, leading to observations from all over the country and a very well documented phenomenon.
Famous examples of long-term citizen science projects are the “@home” programs: Einstein@Home, MilkyWay@home and SETI@home. They all ask volunteers to donate the computing capacity of their personal computers while idle. SETI@home, run by UC Berkeley, uses this combined computing power to analyze radio signals in search of signs of extraterrestrial intelligence. Given the vast amount of available radio astronomy data, this is something the researchers would never be able to do on their own. While the project has not led to any ET contacts so far, it did prove the viability and practicality of the "volunteer computing" concept.
Another highly successful project is Zooniverse, which originated from the Galaxy Zoo project set up in 2007. Galaxy Zoo intended to get “some” help from the public to classify the mind-blowing number of 900,000 galaxies the Sloan Digital Sky Survey had detected. It rapidly became so successful that more projects were launched, not only in astronomy but also in ecology, cell biology, the humanities and climate science. Today, Zooniverse comprises more than 50 citizen science projects, and the project team has published nearly 100 scientific papers based upon the results.
Any space-related Zooniverse scientific article published in an Elsevier journal is made freely accessible for 12 months after publication. We do this because we greatly sympathize with these kinds of projects and want to enable volunteers to read the papers they have contributed to.
Open science is a broad term that reflects how technology is changing the future of science, making research more open, collaborative, transparent and efficient. Elsevier is working with the wider research community to empower researchers to be more effective and better equipped to collaborate on and share their research. This includes improving how science is communicated and understood by both specialized and general audiences. For example:
- AudioSlides: Elsevier journals give authors the option to upload a short webcast-style audio presentation describing their research in their own words. These presentations are available for free to everyone through ScienceDirect.
- Science & People: In Berlin, an event series inspired by science slams and pub quizzes brings together citizens, researchers, political decision-makers, and representatives of the city’s bustling tech and startup scene. Science&People was launched in early 2016 to facilitate conversation between scientists and the public, showing why science matters and creating a hub where people can actively engage with socially-relevant research topics. It’s a joint pilot project of Stifterverband für die Deutsche Wissenschaft (the German Association for the Promotion of Humanities and Sciences), the Fraunhofer-Verbund IUK-Technologie (Fraunhofer Information and Communication Technology Group), part of the Fraunhofer Association, Wissenschaft im Dialog (Science in Dialogue), and Elsevier.
- Free access for media: Journalists are skilled at translating science for the wider public. Elsevier provides free access to ScienceDirect for credentialed media to help them write research stories. Elsevier also provides access to Wikipedia editors.
- Elsevier Connect: Our online community and news site publishes stories about the science and health research published in Elsevier journals and the technology we are developing to make it more accessible. Since its launch in 2012, over 1,200 stories have been published, reaching a monthly average of over 160,000 visitors and 4.8 million views a year.
In addition, Elsevier is the second largest open access publisher and a leading enabler of green open access. Read more about open access at Elsevier.
We also provide a variety of other access options. For example, PatientINFORM and PatientACCESS give patients and their caregivers access to important medical research. In times of global crises, disease and natural disasters, Elsevier develops online resource centers to provide free access to medical research, online tools and expert advice. These include resource centers for Zika and Ebola. We also partner with the US National Library of Medicine on the Emergency Access Initiative to provide temporary free access to full text articles to healthcare professionals, librarians and members of the public affected by disasters. And for institutions in the developing world, we provide free or low-cost access via Research4Life.
Read more about Elsevier’s programs for science and society.
Using a smartphone for science: a virtual special issue
- Colin J. Ferster and Nicholas C. Coops: A review of earth observation using mobile personal communication devices, Computers & Geosciences (February 2013)
- Michael Discher et al: Investigations of touchscreen glasses from mobile phones for retrospective and accident dosimetry, Radiation Measurements (June 2016)
- Michael N. Fienen and Christopher S. Lowry: Social.Water—A crowdsourcing tool for environmental data acquisition, Computers & Geosciences (December 2012)
- Jason G. Su et al: Integrating smart-phone based momentary location tracking with fixed site air quality monitoring for personal exposure assessment, Science of the Total Environment (February 2015)
- Audrey de Nazelle et al: Improving estimates of air pollution exposure through ubiquitous sensing technologies, Environmental Pollution (May 2013)
- Ian K. Bailiff et al: Retrospective and emergency dosimetry in response to radiological incidents and nuclear mass-casualty events: A review, Radiation Measurements (November 2016)
- John J. Guiry et al: Activity recognition with smartphone support, Medical Engineering & Physics (June 2014)
- Alireza Sahami Shirazi et al: Already up? Using mobile phones to track & share sleep behavior, International Journal of Human-Computer Studies (September 2013)
- Akram Bayat et al: A study on human activity recognition using accelerometer data from smartphones, Procedia Computer Science (August 2014)
- Shane A. Lowe and Gearóid ÓLaighin: Monitoring human health behaviour in one's living environment: A technological review, Medical Engineering & Physics (February 2014)
- Abderrahim Bourouis et al: M-Health: Skin Disease Analysis System Using Smartphone's Camera, Procedia Computer Science (June 2013)
- Daniel Whiteson et al: Searching for ultra-high energy cosmic rays with smartphones, Astroparticle Physics (June 2016)
- P. Melchior: Crowdsourcing quality control for Dark Energy Survey images, Astronomy and Computing (July 2016)
- E. Bertin: Web-based visualization of very large scientific astronomy imagery, Astronomy and Computing (April 2015)
Smart cities and society
- Stefano Chessa et al: Mobile crowd sensing management with the ParticipAct living lab, Pervasive and Mobile Computing (October 2015)
- Miao Lin and Wen-Jing Hsu: Mining GPS data for mobility patterns: A survey, Pervasive and Mobile Computing (June 2014)
- Ellie D’Hondt et al: Participatory noise mapping works! An evaluation of participatory sensing as an alternative to standard techniques for environmental monitoring, Pervasive and Mobile Computing (October 2013)
- Rajib Rana et al: Ear-Phone: A context-aware noise mapping using smartphones, Pervasive and Mobile Computing (February 2015)
- Delphine Christin et al: uSafe: A privacy-aware and participative mobile application for citizen safety in urban environments, Pervasive and Mobile Computing (October 2013)
- Kun-Chan Lan and Wen-Yuah Shih: An intelligent driver location system for smart parking, Expert Systems with Applications (April 2014)
- Adam Craig et al: Experience sampling: Assessing urban soundscapes using in-situ participatory methods, Applied Acoustics (February 2017)
- Niki Martinel: A supervised extreme learning committee for food recognition, Computer Vision and Image Understanding (July 2016)
- Stefan Gessler: PDAs as mobile WWW browsers, Computer Networks (December 1995)
- Silvia Giordano and Daniele Puccinelli: When sensing goes pervasive, Pervasive and Mobile Computing (February 2015)
- Young-Seol Lee and Sung-Bae Cho: Activity recognition with android phone using mixture-of-experts co-trained with labeled and unlabeled data, Neurocomputing (February 2014)