Apple Patent Takes iPhone Picture Quality to the Next Level
A recently published Apple (NASDAQ:AAPL) patent uncovered by Apple Insider reveals that the iPhone maker has developed an optical image stabilization method for creating “super-resolution” images. As Apple notes in the patent’s background section, many of today’s digital cameras include an optical image stabilization, or OIS, system. OIS is typically used to compensate for a user’s involuntary hand shake, which can cause captured images to appear blurry or otherwise distorted. OIS systems achieve this stabilization either by moving the lens or by moving the entire lens-and-sensor camera module.
However, while OIS systems are usually used only to compensate for user movement, Apple proposed combining an OIS system with an image processor to create extremely high-resolution images. This is achieved by weaving together one or more supplemental images captured in rapid succession. As noted in the patent abstract, after the initial image is captured, the OIS processor “adjusts the optical path to the electronic image sensor by a known amount” and captures a second image that is offset from the first “by no more than a sub-pixel offset.”
This process can be repeated numerous times with varying sub-pixel offsets. Finally, the sample images are combined into one super-resolution image. As Apple notes, this technology can be implemented in multiple types of image-capturing devices, including “a digital camera or a mobile multifunction device such as a cellular telephone, a personal data assistant, a mobile entertainment device, a tablet computer, or the like.”
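The patent does not publish an algorithm, but the simplest version of this multi-frame idea, often called “shift-and-add” super-resolution, can be sketched in a few lines of Python. Everything below (the function name, the 2x factor, the exact half-pixel offsets, and the assumption that the offsets are known exactly and noise-free) is illustrative, not taken from the patent:

```python
import numpy as np

def super_resolve_2x(frames):
    """Combine four low-resolution frames into one 2x-resolution image.

    Each frame is assumed to be offset from the first by half a pixel:
    (0, 0), (0, 1/2), (1/2, 0), (1/2, 1/2) in (row, col) order.
    Shift-and-add: each frame's samples are interleaved onto the
    corresponding positions of a grid twice as fine.
    """
    h, w = frames[0].shape
    hi = np.zeros((2 * h, 2 * w), dtype=frames[0].dtype)
    # Offsets expressed in half-pixel units on the fine grid.
    offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
    for frame, (dy, dx) in zip(frames, offsets):
        hi[dy::2, dx::2] = frame  # drop samples into the fine grid
    return hi
```

In practice an OIS-based implementation would also have to estimate the actual optical shift, register the frames, and handle noise, but the sketch shows why sub-pixel offsets matter: each shifted capture samples scene detail that falls between the pixels of the first capture.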
While some industry watchers have accused Apple of falling behind in the smartphone camera race by not boosting the iPhone camera’s pixel count, the Cupertino-based company has traditionally favored other methods for improving overall image quality. Although Apple’s flagship smartphone still uses the 8-megapixel rear camera sensor that was first introduced in the iPhone 4S, it has developed various other technologies to help improve image quality, including the True Tone dual LED flash that was implemented in the iPhone 5S. Apple also recently acquired SnappyLabs, the company behind the SnappyCam app that enables the iPhone’s camera to take twenty to thirty high-resolution photos per second.
On the other hand, rival Samsung (SSNLF.PK) has consistently boosted the pixel count for its Galaxy smartphones with each generation. The Galaxy S4 features a 13-megapixel camera, while the newly released Galaxy S5 features a 16-megapixel camera.
Although it is not known if Apple has any near-future plans to use this technology in its iPhones and iPads, it should be noted that the “Super-Resolution Based on Optical Image Stabilization” patent appears to confirm a rumor reported by Taiwan’s China Post earlier this year. According to unnamed Nomura Securities analysts cited by the China Post, Apple will concentrate on “improved optical image stabilization” technology for the iPhone 6, rather than increasing the pixel count, in order to maintain the device’s slim design.
Follow Nathanael on Twitter (@ArnoldEtan_WSCS)