Tuesday 26 June 2007

How the Apple iPhone Works

Electronic devices can use many different methods to detect a person's input on a touch-screen. Most of them use sensors and circuitry to monitor changes in a particular state. Many, including the iPhone, monitor changes in electrical current. Others monitor changes in the reflection of waves, which can be sound waves or beams of near-infrared light. A few systems use transducers to measure the vibrations caused when your finger hits the screen's surface, or cameras to monitor changes in light and shadow.

[Image: An array of touch-screen products -- the Nintendo DS, Palm Treo and Logitech Harmony Remote Control all use touch-screen technology. Image courtesy Consumer Guide Products.]

The basic idea is pretty simple -- when you place your finger or a stylus on the screen, it changes the state that the device is monitoring. In screens that rely on sound or light waves, your finger physically blocks or reflects some of the waves. Capacitive touch-screens use a layer of capacitive material to hold an electrical charge; touching the screen changes the amount of charge at a specific point of contact. In resistive screens, the pressure from your finger causes conductive and resistive layers of circuitry to touch each other, changing the circuits' resistance.
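
To make the resistive case a little more concrete, here is a minimal Python sketch of how a controller might turn two analog readings into a screen position. The read_adc_x/read_adc_y helpers, the ADC_MAX value and the screen resolution are illustrative assumptions, not any particular device's driver.

```python
# Hypothetical sketch: converting 4-wire resistive touch-screen ADC readings
# into screen coordinates. read_adc_x()/read_adc_y() and ADC_MAX are assumed
# placeholders, not a real driver API.

ADC_MAX = 1023                   # assumed 10-bit analog-to-digital converter
SCREEN_W, SCREEN_H = 320, 480    # assumed screen resolution in pixels

def read_adc_x() -> int:
    """Placeholder for a driver call that measures the voltage divider
    formed when the conductive layers touch along the X axis."""
    return 512  # stand-in value

def read_adc_y() -> int:
    """Placeholder for the equivalent Y-axis measurement."""
    return 256  # stand-in value

def read_touch_position() -> tuple[int, int]:
    # Pressure brings the layers into contact; the contact point acts as the
    # tap of a voltage divider, so the ADC ratio maps to a position on each axis.
    x = read_adc_x() * SCREEN_W // ADC_MAX
    y = read_adc_y() * SCREEN_H // ADC_MAX
    return x, y

print(read_touch_position())  # e.g. (160, 120)
```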

Most of the time, these systems are good at detecting the location of exactly one touch. If you try to touch the screen in several places at once, the results can be erratic. Some screens simply disregard all touches after the first one. Others can detect simultaneous touches, but their software can't calculate the location of each one accurately. There are several reasons for this, including:

  • Many systems detect changes along an axis or in a specific direction instead of at each point on the screen.
  • Some screens rely on system-wide averages to determine touch locations, so two fingers get merged into a single phantom point between them (see the sketch after this list).
  • Some systems take measurements by first establishing a baseline. When you touch the screen, you create a new baseline. Adding another touch causes the system to take a measurement using the wrong baseline as a starting point.
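
Here is a rough Python sketch of the averaging problem, under the assumption that the controller can only report the centroid of whatever it senses: one finger is located correctly, but two fingers produce a phantom point that neither finger is touching.

```python
# Hypothetical sketch of why a single-touch controller that averages the
# sensed signal reports a phantom point between two real touches.

def single_touch_estimate(touches):
    """Model of a controller that can only report one (x, y): the centroid
    of everything it senses."""
    xs = [x for x, _ in touches]
    ys = [y for _, y in touches]
    return sum(xs) / len(xs), sum(ys) / len(ys)

one_finger = [(100, 200)]
two_fingers = [(100, 200), (300, 400)]

print(single_touch_estimate(one_finger))   # (100.0, 200.0) -- correct
print(single_touch_estimate(two_fingers))  # (200.0, 300.0) -- a point neither finger touches
```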

[Image: Basic touch-screen technology]


The Apple iPhone is different -- many of the elements of its multi-touch user interface require you to touch multiple points on the screen simultaneously. For example, you can zoom in to Web pages or pictures by placing your thumb and finger on the screen and spreading them apart. To zoom back out, you can pinch your thumb and finger together. The iPhone's touch screen is able to respond to both touch points and their movements simultaneously.
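
The arithmetic behind pinch-to-zoom can be sketched in a few lines of Python: the zoom factor tracks the ratio of the current finger spacing to the spacing when the gesture began. The coordinates below are made up for illustration.

```python
# Hypothetical sketch of pinch-to-zoom arithmetic: the zoom factor follows the
# ratio of the current finger spacing to the spacing at the start of the gesture.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

start = [(100, 300), (220, 300)]    # thumb and finger when the gesture starts
now = [(60, 300), (260, 300)]       # fingers after spreading apart

scale = distance(*now) / distance(*start)
print(f"zoom factor: {scale:.2f}")  # > 1.0 means zoom in, < 1.0 means zoom out
```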

To allow people to use touch commands that require multiple fingers, the iPhone uses a new arrangement of existing technology. Its touch-sensitive screen includes a layer of capacitive material, just like many other touch-screens. However, the iPhone's capacitors are arranged according to a coordinate system. Its circuitry can sense changes at each point along the grid. In other words, every point on the grid generates its own signal when touched and relays that signal to the iPhone's processor. This allows the phone to determine the location and movement of simultaneous touches in multiple locations. Because of its reliance on this capacitive material, the iPhone works only if you touch it with your fingertip -- it won't work if you use a stylus or wear non-conductive gloves.
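
Here is a hedged Python sketch of that idea: each node on the grid reports its own value, so two fingers show up as two separate readings above the untouched baseline. The grid values, baseline and threshold are invented for illustration and are not taken from the iPhone's firmware.

```python
# Hypothetical sketch of multi-touch detection on a capacitive grid: every node
# generates its own signal, so two fingers appear as two distinct changes
# relative to the untouched baseline.

BASELINE = 10       # assumed per-node reading with nothing touching the screen
THRESHOLD = 40      # assumed minimum change that counts as a touch

# Simulated grid readings (rows x columns); two fingers pressing the screen.
grid = [
    [10, 11, 10, 10, 10, 10],
    [10, 80, 12, 10, 10, 10],
    [10, 12, 10, 10, 75, 11],
    [10, 10, 10, 11, 12, 10],
]

def find_touches(grid):
    touches = []
    for r, row in enumerate(grid):
        for c, value in enumerate(row):
            if value - BASELINE > THRESHOLD:
                touches.append((r, c, value - BASELINE))
    return touches

print(find_touches(grid))   # [(1, 1, 70), (2, 4, 65)] -- two distinct touch points
```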

[Image: A mutual capacitance touch-screen contains a grid of sensing lines and driving lines to determine where the user is touching.]

[Image: A self capacitance screen contains sensing circuits and electrodes to determine where a user is touching.]

The iPhone's screen detects touch through one of two methods: mutual capacitance or self capacitance. In mutual capacitance, the capacitive circuitry requires two distinct layers of material. One houses driving lines, which carry current, and the other houses sensing lines, which detect the current at nodes. Self capacitance uses one layer of individual electrodes connected with capacitance-sensing circuitry. Both of these possible setups send touch data as electrical impulses.
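
As a rough illustration of the mutual-capacitance case, the Python sketch below drives one line at a time and reads every sensing line to build a per-node map; a finger near a node weakens the coupled signal at that node. The read_node helper and all values are placeholders, not a real controller API.

```python
# Hypothetical sketch of a mutual-capacitance scan: the controller energizes one
# driving line at a time and reads every sensing line, building a per-node map
# of how much the coupled signal dropped (a finger near a node steals charge).

NUM_DRIVE, NUM_SENSE = 4, 6
UNTOUCHED = 100                  # assumed baseline coupling at every node
TOUCH_THRESHOLD = 30             # assumed drop that counts as a touch

# Simulated raw readings for this frame: a finger near node (1, 2) weakens
# the coupling between drive line 1 and sense line 2.
def read_node(drive, sense):
    return 55 if (drive, sense) == (1, 2) else UNTOUCHED

def scan_frame():
    touches = []
    for d in range(NUM_DRIVE):           # energize one driving line at a time
        for s in range(NUM_SENSE):       # read every sensing line
            drop = UNTOUCHED - read_node(d, s)
            if drop > TOUCH_THRESHOLD:
                touches.append((d, s, drop))
    return touches

print(scan_frame())   # [(1, 2, 45)] -- touch located at drive line 1, sense line 2
```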


The iPhone's processor and software are central to correctly interpreting input from the touch-screen. The capacitive material sends raw touch-location data to the iPhone's processor. The processor uses software located in the iPhone's memory to interpret the raw data as commands and gestures. Here's what happens:

  1. Signals travel from the touch screen to the processor as electrical impulses.
  2. The processor uses software to analyze the data and determine the features of each touch, including the size, shape and location of the affected area on the screen. If necessary, the processor arranges touches with similar features into groups. If you move your finger, the processor calculates the difference between the starting point and ending point of your touch.

[Image: iPhone touch sensing]

  3. The processor uses its gesture-interpretation software to determine which gesture you made. It combines your physical movement with information about which application you were using and what the application was doing when you touched the screen (the sketch after this list illustrates this step).
  4. The processor relays your instructions to the program in use. If necessary, it also sends commands to the iPhone's screen and other hardware. If the raw data doesn't match any applicable gestures or commands, the iPhone disregards it as an extraneous touch.
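
Here is a simplified Python sketch of that interpretation step, assuming the processor already knows the start and end position of each tracked touch: a single touch that barely moves reads as a tap, a single moving touch reads as a swipe, and two touches whose spacing grows or shrinks read as a spread or pinch. The thresholds and gesture names are illustrative, not Apple's actual software.

```python
# Hypothetical sketch of gesture interpretation from per-touch start/end positions.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_gesture(starts, ends, move_threshold=10):
    """starts/ends: lists of (x, y) for each touch, in the same order."""
    if len(starts) == 1:
        moved = distance(starts[0], ends[0])
        return "tap" if moved < move_threshold else "swipe"
    if len(starts) == 2:
        before = distance(starts[0], starts[1])
        after = distance(ends[0], ends[1])
        if after > before + move_threshold:
            return "spread (zoom in)"
        if after < before - move_threshold:
            return "pinch (zoom out)"
        return "two-finger drag"
    return "unrecognized"   # input that matches no gesture is simply ignored

print(classify_gesture([(50, 50)], [(52, 51)]))                              # tap
print(classify_gesture([(100, 300), (220, 300)], [(60, 300), (260, 300)]))   # spread (zoom in)
```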
[Image: The iPhone's processor and sensor]


All these steps happen in an instant -- the screen responds to your input almost immediately. This process allows you to access and use all of the iPhone's applications with your fingers. We'll look at these programs and the iPhone's other features in more detail in the next section, as well as how the iPhone's cost measures up to its abilities.
