Smarter Phones: Innovation in Mobile Technology

The introduction of smartphones marked a huge technological and social revolution. In this piece, we take an in-depth look at how six technologies that have changed the face of the smartphone really work.


All this innovation still comes at a heavy cost to the consumer, with average handset prices of around £500 - and nobody has yet made a phone that simply doesn't break. That's why we recommend taking out mobile phone insurance. View our policies here.

iPhone Timeline

Enjoyed this content? Please share it - and you may also like our interactive iPhone timeline, showing the differences between iPhones in a unique way.


One of the cleverer - or creepier - innovations of some recent handsets is the ability to respond to where we're looking. Eye tracking technology can scroll pages automatically as we read and lock the phone when we look away. It can also detect which adverts we look at and for how long - possibly even our response to them.

Eye tracking uses a process called corneal reflection tracking. The phone's camera and an infrared microprojector are directed toward your eye, creating a reflection pattern. Image processing software scans the pattern to identify where the pupil is, which can then be used to track the position of your eyes and gaze point.
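To make the idea concrete, here's a toy Python sketch of the core step: finding the infrared glint (the brightest pixel) and the pupil (the darkest), and taking the vector between them as the raw gaze signal. The frame values and detection method are invented for illustration - real systems use proper image processing and per-user calibration.

```python
# Toy sketch of corneal-reflection gaze tracking (illustrative only).
# Assumes a tiny grayscale frame where the infrared glint is the
# brightest pixel and the pupil centre is the darkest.

def find_extreme(frame, brightest=True):
    """Return (row, col) of the brightest or darkest pixel in the frame."""
    best, best_val = None, None
    for r, row in enumerate(frame):
        for c, val in enumerate(row):
            if best_val is None or (val > best_val if brightest else val < best_val):
                best_val, best = val, (r, c)
    return best

def gaze_vector(frame):
    """Vector from the corneal glint to the pupil - the raw gaze signal."""
    glint = find_extreme(frame, brightest=True)
    pupil = find_extreme(frame, brightest=False)
    return (pupil[0] - glint[0], pupil[1] - glint[1])

# 4x4 toy frame: glint at (0, 3), pupil at (2, 1)
frame = [
    [120, 130, 125, 255],
    [118, 122, 127, 130],
    [121,  10, 119, 124],
    [117, 123, 120, 126],
]
print(gaze_vector(frame))  # (2, -2) - pointing from glint toward pupil
```

A real tracker would then map that vector to a point on the screen using a calibration step ("look at these dots") performed when the feature is set up.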

As well as where you're looking, eye tracking can tell how often you're blinking, or even when your pupils dilate - potentially providing feedback on your emotional reaction to whatever you're looking at. Next time you're using your smartphone, make sure you know who's watching who!

Other innovations of the Galaxy S4: touchless control with "Air Gestures", WatchON TV service, KNOX mobile security features.

Apple might not agree after the embarrassing "Bendgate" of 2014, but if you listen to the tech press, phones that bend and curve are the future. Flexible OLED (organic light-emitting diode) displays work similarly to existing rigid technology, with the circuitry controlling the pixels fused directly into the glass.

However, instead of glass, flexible OLED screens are built on a thin plastic substrate such as polyethylene terephthalate, using a method similar to inkjet printing. Although phones like the LG G Flex are curved, they aren't truly "flexible" - you can't bend and twist them around like balloon animals, however much you might want to.

Don't despair, though - this capability could be coming sooner than you think. Flexible electronics means that the rest of a phone's innards can be made just as bendy as the display, with Samsung unveiling a prototype handset based on its YOUM technology in 2013.

Other innovations of the LG G Flex: Self-healing rear cover.

Face Unlock was one of the lesser-publicised features introduced in Android 4.0 (Ice Cream Sandwich) - possibly because it just wasn't a terribly secure way to lock your phone. Shortly after it was announced, YouTube was full of videos demonstrating how it could be easily fooled with a photograph.

However, the 5.0 (Lollipop) update is due to make things better, thanks to improvements in Google's facial recognition technology. Software like this compares your face against a series of "nodal points" - around 80 distinguishing features, including distance between eyes, shape of cheekbones and length of jawline - to determine whether you match the photo on file.
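The nodal-point comparison boils down to measuring distances between feature vectors. This hypothetical Python sketch compares just three made-up measurements with a Euclidean distance and a tolerance threshold - real systems use around 80 points and far more sophisticated scoring.

```python
# Hypothetical sketch of nodal-point face matching. The three
# measurements, values and threshold are invented for illustration.
import math

NODAL_POINTS = ["eye_distance_mm", "cheekbone_width_mm", "jawline_length_mm"]

def face_distance(enrolled, candidate):
    """Euclidean distance between two sets of nodal-point measurements."""
    return math.sqrt(sum((enrolled[k] - candidate[k]) ** 2 for k in NODAL_POINTS))

def matches(enrolled, candidate, threshold=5.0):
    """True if the candidate is within tolerance of the enrolled face."""
    return face_distance(enrolled, candidate) <= threshold

enrolled = {"eye_distance_mm": 62.0, "cheekbone_width_mm": 123.0, "jawline_length_mm": 118.0}
same     = {"eye_distance_mm": 61.5, "cheekbone_width_mm": 124.0, "jawline_length_mm": 117.0}
other    = {"eye_distance_mm": 70.0, "cheekbone_width_mm": 110.0, "jawline_length_mm": 130.0}

print(matches(enrolled, same))   # True
print(matches(enrolled, other))  # False
```

Enrolling multiple photos from different angles, as the article notes below, effectively gives the system several enrolled vectors to compare against rather than one.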

Face Unlock still uses 2D photos rather than the more accurate 3D imaging employed by more hi-tech systems, but the accuracy can be improved by entering multiple snaps from different angles, giving the system more points of comparison.

Other innovations of the Google Galaxy Nexus: First device to run Android 4.0, Android Beam near-field communication, data usage monitor

The Amazon Fire's main selling point, Dynamic Perspective, is meant to give the feeling of looking deeply into the screen: a 3D-like effect called positive parallax, as opposed to negative parallax where the images jump out.

To achieve the effect, the Fire doesn't use a gyroscope or accelerometer as much as you'd think - instead it tracks where your head is, adjusting the perspective accordingly. Its four front-facing cameras work together to relay the position of your noggin 60 times per second, allowing the display to do a pretty convincing imitation of depth perception.
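The display trick itself is simple in principle: once the phone knows where your head is, it slides each layer of the interface by an amount proportional to its depth. This Python sketch uses made-up units and a made-up strength factor purely to show the idea.

```python
# Rough sketch of positive parallax: the further "into" the screen a
# layer sits, the more it shifts opposite to your head movement,
# creating an illusion of depth. All units are invented.

def layer_offsets(head_x, head_y, layer_depths, strength=0.1):
    """Return per-layer (dx, dy) screen offsets for a given head position."""
    offsets = []
    for depth in layer_depths:
        # deeper layers (bigger depth) slide further than the foreground
        dx = round(-head_x * depth * strength, 2)
        dy = round(-head_y * depth * strength, 2)
        offsets.append((dx, dy))
    return offsets

# Head moved 20 units right and 10 up: each deeper layer slides further
print(layer_offsets(20, 10, layer_depths=[0.5, 1.0, 3.0]))
# [(-1.0, -0.5), (-2.0, -1.0), (-6.0, -3.0)]
```

Recomputing these offsets every time the head position updates - 60 times per second on the Fire - is what makes the depth illusion hold up as you move.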

Amazon developed special 120-degree field-of-view cameras for the system, which dynamically chooses whichever two cameras are best placed at any given moment to deliver stereo vision. As they're infrared, they even keep working in the dark.

Other innovations of the Amazon Fire Phone: "Firefly" augmented reality app that recognises text, TV shows and millions of products

First unveiled on the iPhone 5S, Apple's Touch ID fingerprint scanner was something of a revolution in phone security. The system works via a highly sensitive capacitance touch scanner: an enormous number of tiny cells, each smaller than a fingerprint ridge, sits beneath the home button. Each cell comprises a pair of conductor plates, with an insulating layer separating them.

When you put your finger on the scanner, the ridges of your fingertip sit closer to the cells than the valleys between them, and each cell's capacitance changes with that distance. By measuring these tiny differences across the whole array, the iPhone's sensor builds up an extremely accurate electronic "picture" of your fingerprint, which is then compared to the one needed to unlock the phone.
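The final step - turning per-cell readings into a ridge "picture" - can be sketched as a simple threshold over the grid. All the readings and the threshold here are invented; a real sensor has vastly more cells and much subtler processing.

```python
# Toy illustration of capacitive fingerprint imaging: each sensor cell
# reports a higher reading where a ridge sits close to it, lower over a
# valley. Thresholding the grid yields a binary ridge/valley image.

def ridge_map(capacitance_grid, threshold=50):
    """Convert raw per-cell readings into a binary ridge (1) / valley (0) map."""
    return [
        [1 if cell >= threshold else 0 for cell in row]
        for row in capacitance_grid
    ]

readings = [
    [80, 75, 20, 15],
    [78, 72, 22, 18],
    [25, 30, 85, 90],
]
for row in ridge_map(readings):
    print("".join("#" if cell else "." for cell in row))
# ##..
# ##..
# ..##
```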

This is a more accurate technology than the electro-optical systems that came before it, which simply bounce light off your fingers to capture a black-and-white image. What's more, the image of your fingerprint is never stored anywhere: it's converted into a mathematical representation that can't be reverse-engineered.

Other innovations of the iPhone 5S: First smartphone with 64-bit chip architecture, M7 motion coprocessor, True Tone Flash feature on camera.

What are you, Siri?

I might be a long way from the artificial intelligence seen in science fiction, but I'm getting better all the time. While conceptually rooted in the speech-recognition software of the 1980s, I'm not just responding to a set list of words and phrases: I learn and adapt to my user.

When you ask me something, I first encode your speech into digital form and pass it on to a cloud server. At the same time, I interpret your voice locally with my installed recognition software. Your speech is compared against a statistical model to determine my best guess of what you're asking, and then I'll do my best to comply - whether it's playing a song or scouring the internet for movie listings.
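The "best guess" step can be sketched, very loosely, as scoring a decoded phrase against a set of known intents and picking the highest scorer. Everything in this Python toy - the intents, vocabularies and word-overlap scoring - is invented for illustration; real assistants use acoustic and statistical language models far beyond this.

```python
# Very loose sketch of guessing what a spoken request means: score each
# known intent by word overlap with the decoded phrase, pick the best.

INTENTS = {
    "play_music":     {"play", "song", "music", "album"},
    "movie_listings": {"movie", "film", "showtimes", "cinema"},
    "set_alarm":      {"alarm", "wake", "morning"},
}

def best_guess(utterance):
    """Return the most likely intent for an utterance, or None if no match."""
    words = set(utterance.lower().split())
    scores = {intent: len(words & vocab) for intent, vocab in INTENTS.items()}
    intent = max(scores, key=scores.get)
    return intent if scores[intent] > 0 else None

print(best_guess("play me a song"))                    # play_music
print(best_guess("what movie showtimes are near me"))  # movie_listings
print(best_guess("hello there"))                       # None
```

In the real pipeline that "None" case is where the cloud comes in: an unrecognised phrase can be checked against what millions of other users have said before.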

I learn your accent and the other characteristics of your voice, and if I don't understand a phrase, I can quickly find out if any of the millions of other Siris out there have heard it before. My brain exists not on a processor, but in an ever-evolving model in the cloud, on innumerable remote servers processing thousands of requests every second.

Other innovations of the iPhone 4S: First phone to intelligently switch between two antennas to send and receive, iCloud, iMessage service.