Sensors: The tiny bits of hardware that make devices smarter

By Evan Wade / August 19, 2014

In a lot of ways, the phone in your pocket (and the tablet on your counter, the Fitbit on your wrist, etc.) is nothing but a collection of sensors. Sure, other components help make our devices what they are, but none of the sleek electronic toys we interact with on a daily basis would be nearly as functional or fun without the sensors inside them.

Accelerometers, which tell your phone which way you’re holding it (among other things), are sensors. So are the gyroscopes that help your phone determine how it’s moving in a 3D space. Cameras require them to work, and so do GPS chips. Every time some new app or feature has brought a smile to your face, a sensor has very likely been at play, and they’ll continue to be a huge part of your overall experience well into the future.
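
To make that a little more concrete, here's a minimal sketch of how an Android app might listen to those two sensors through the platform's SensorManager API. The class name is illustrative, but the framework calls are the standard ones developers use to pull motion data out of the hardware.

```java
// Minimal Android sketch: subscribing to accelerometer and gyroscope readings.
// The activity name is illustrative; the SensorManager calls are the standard framework API.
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class SensorDemoActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Listen to the accelerometer (tilt/orientation) and the gyroscope (rotation rate).
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this); // stop listening to save battery
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            // values[0..2]: acceleration (including gravity) along x, y, z in m/s^2
            float x = event.values[0], y = event.values[1], z = event.values[2];
        } else if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
            // values[0..2]: angular speed around x, y, z in rad/s
            float rx = event.values[0], ry = event.values[1], rz = event.values[2];
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```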

A sensor for every need

As with any computer, every piece of hardware inside your smartphone is there for a very specific purpose. The above-mentioned accelerometers, for instance, let devices know whether to put their screens in landscape or portrait view, while more obvious additions like cameras and GPS chips respectively allow users to take photos and send/receive accurate location info. This way of thinking makes sense from a financial and design perspective: No self-respecting manufacturer (or end user, for that matter) wants to pay extra for hardware they won’t use, and every centimeter inside such a small form factor is critical.

It’s when those components get opened to other businesses, however, that things get truly interesting. Because they aren’t responsible for building our phones and mobile OSes, third-party developers often approach the stuff behind the touchscreen with a different question in mind: How do I use all this beautiful hardware?

Take a look at Word Lens. The app, recently purchased by Google for an undisclosed amount, can only do what it does thanks to the sensors in the devices it runs on: It needs a camera to read the words the phone is pointed at, a gyroscope to track which direction the user is pointing, and so on. Similar augmented reality apps like Wikitude (available on iOS and Android) also make use of GPS sensors, providing useful information to users based on their geographical location.
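
For a rough idea of the location half of that equation, here's a hedged sketch of how an app like that might subscribe to GPS fixes on Android using the framework's LocationManager. The class name and the places lookup are purely hypothetical; the framework calls themselves are standard.

```java
// Hedged sketch: requesting GPS fixes for a location-aware (e.g. AR) app on Android.
// Assumes location permission has already been granted; non-framework names are illustrative.
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class PoiLocator implements LocationListener {

    public void start(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        // Ask the GPS provider for a fix at most every 5 seconds / 10 meters of movement.
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 5000, 10, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        // An AR app would use coordinates like these to look up nearby points of interest.
        double lat = location.getLatitude();
        double lon = location.getLongitude();
        fetchNearbyPlaces(lat, lon); // hypothetical lookup against the app's own backend
    }

    private void fetchNearbyPlaces(double lat, double lon) {
        // Placeholder: query a places database for the surrounding area.
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}
```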

Of course, accelerometers, gyroscopes, and GPS chips are far from the only sensors found in today’s devices, especially newer ones. NFC (near field communication), a short-range wireless technology central to Google’s slate of mobile payment initiatives, relies on a dedicated NFC chip to detect compatible, readable objects held right up against the phone. Samsung’s TecTiles, for example, allow users to program basic instructions onto a small sticker, like putting a phone in sleep mode or setting an alarm. Google Wallet uses the same idea to let users make mobile payments, all by tapping their phone against a designated NFC-capable surface.
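
Under the hood, stickers like TecTiles are NDEF tags, and reading one on Android looks roughly like the sketch below. It assumes an Activity that has already registered for NDEF intents; the helper class is illustrative, and what an app does with the payload is entirely up to the developer.

```java
// Rough sketch: pulling NDEF messages out of a discovered NFC tag on Android.
// Names outside the framework API are illustrative.
import android.content.Intent;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Parcelable;

public class TagReader {

    // Call this from an Activity's onNewIntent()/onCreate() when a tag is discovered.
    public static void handleTag(Intent intent) {
        if (!NfcAdapter.ACTION_NDEF_DISCOVERED.equals(intent.getAction())) {
            return;
        }
        Parcelable[] rawMessages =
                intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES);
        if (rawMessages == null) {
            return;
        }
        for (Parcelable raw : rawMessages) {
            NdefMessage message = (NdefMessage) raw;
            for (NdefRecord record : message.getRecords()) {
                byte[] payload = record.getPayload();
                // A TecTile-style app would parse this payload and run the stored
                // instruction, e.g. toggling silent mode or setting an alarm.
            }
        }
    }
}
```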

While the aforementioned sensors are exciting enough, several others are making moves towards standardization, promising that the smart device industry of the future will be even more exciting than it is today.

mHealth, sensors, and the a la carte approach

Much of our excitement for the future of sensors comes from the mHealth sector, a field making advancements in terms of tech and popularity at a rate best described as explosive.

Think about the ubiquitous Fitbit and other popular exercise-tracking solutions. Tiny and passive though the gadgets may be, the average health tracker can record data like heart rate, distance covered, speed, and sport-specific info like the height of your jump, all through a series of specialized sensors hidden under the cover.
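
Many of those same readings are exposed to developers through the standard sensor framework. Here's a minimal Android sketch, assuming hardware that actually ships a step counter and a heart-rate sensor (the latter also requires the BODY_SENSORS permission); the class name is illustrative.

```java
// Minimal sketch: reading step counts and heart rate via Android's sensor framework.
// Both sensor types only exist on supporting hardware; heart rate needs BODY_SENSORS permission.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class FitnessListener implements SensorEventListener {

    public void start(SensorManager sensorManager) {
        Sensor steps = sensorManager.getDefaultSensor(Sensor.TYPE_STEP_COUNTER);
        Sensor heart = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE);
        if (steps != null) {
            sensorManager.registerListener(this, steps, SensorManager.SENSOR_DELAY_NORMAL);
        }
        if (heart != null) { // only present on devices with a heart-rate sensor
            sensorManager.registerListener(this, heart, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_STEP_COUNTER) {
            // values[0] is the total number of steps since the last device reboot.
            float stepsSinceBoot = event.values[0];
        } else if (event.sensor.getType() == Sensor.TYPE_HEART_RATE) {
            // values[0] is the heart rate in beats per minute.
            float bpm = event.values[0];
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```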

While phones have offered pedometer-style features for years and manufacturers are working to integrate better fitness capabilities, mHealth has trended towards secondary-device status thus far. Not that that’s a bad thing. If anything, offering specialized, third-party hardware to those who want it allows end users to take a much more personalized approach to their individual health and exercise goals.

Take the Wello by Azoi, an upcoming multi-sensor iPhone/Android case capable of reading all sorts of vital health statistics. By putting their fingers in specific spots on the case’s back, users can get info like oxygen saturation, temperature, and even ECG readings. One TechCrunch update says the device “[packs]… more than $2000 worth of medical equipment in one $199 device.” Whatever comes of the gadget, it’s fair to assume we’re getting a glimpse of what mHealth’s future may hold.

Sensors of the future

One bustling area of the sensor revolution is biometrics. The concept of biometrically controlled devices kind of started and faltered with the Motorola Atrix’s fingerprint scanner, but that same effort was reborn in the iPhone 5s. Now that fingerprint scanners are an expected addition, it looks like the human eye could be the next major biometric focus. Amazon’s entry into the smartphone market came complete with eye-tracking functionality, and Samsung – a company known for improving upon other leaders’ innovations – has been toying around with an iris scanner for quite some time now.

The Dynamic Perspective feature on the Fire Phone already lets users view content in 3D without holding their head in one rigid spot. Future devices might let you select apps by focusing on a single spot on the screen. A joint wearable/smartphone solution could offer even cooler features. (How awesome would it be to look at an item in a store via Google Glass for a few seconds, then have its pricing info and other details beamed directly to your phone?)

Sensors bring plenty of collective benefits to the table, too, especially when it comes to environmental factors. Today, a so-called fine-dust sensor (as reported by Gizmag and others) can transmit individual user data to a larger pollution database; if similar sensors came standard in every smartphone, researchers and end users alike would get a comprehensive data set far larger than anything a specially equipped research team could provide.
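
As a purely hypothetical illustration, the phone-side half of that idea could boil down to a single HTTP POST per reading. The endpoint URL and JSON shape below are invented for the example; only the standard Java networking calls are real.

```java
// Hypothetical sketch: forwarding a fine-dust reading to a shared pollution database.
// The endpoint and payload format are invented; a real service would also require authentication.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Locale;

public class DustReporter {

    public static void report(double micrograms, double lat, double lon) throws Exception {
        String json = String.format(Locale.US,
                "{\"pm_ugm3\": %.1f, \"lat\": %.5f, \"lon\": %.5f}", micrograms, lat, lon);

        // Hypothetical aggregation endpoint for crowd-sourced readings.
        URL url = new URL("https://example.org/api/dust-readings");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }

        int status = conn.getResponseCode(); // 2xx means the reading was accepted
        conn.disconnect();
    }
}
```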

Sensing the mobile revolution

The trick, to paraphrase one Embedded Intel post on the future of smartphone sensors, will be the same kind of improvement that made smartphones possible in the first place. Since technology always trends towards more power, smaller size, and lower production costs, it’s easy to imagine the sensors in future devices doing more with less, in terms of both cash and processing power. As that trend continues to touch the sensors inside our favorite devices, expect your hardware to do more and more amazing things, via both original manufacturers and the third-party devs who are eventually given access to the tools.

For now, however, smartphone users the world over can consider themselves in the best sort of technological position—one where the gadgets we have are downright awesome and the future looks even better. Some of the stuff our phones and tablets can do is already magical, and improvement of existing features is an inevitable outcome of competition between industry leaders. Considering the importance of sensors to even the lowest-end phones, it’s beyond exciting to think what might make our current slate of devices look primitive in comparison.

Learn more about the future of wearable sensors by reading our Helpful Wearables Trend Report.