Fascinated by how much time and attention Apple puts into the camera system that’s become an integral part of each year’s new iPhone release? Then you’re not alone. As Chris Welch reports for The Verge, Apple was the subject of a recent (and much-discussed) 60 Minutes segment, in which the company shed some light on its camera development operation and revealed that it has 800 employees working on the iPhone’s camera.
This team of “engineers and other specialists” is led by Graham Townsend, who took Charlie Rose on a tour of the company’s camera testing lab. Townsend explained that the iPhone’s camera module includes “over 200 separate individual parts,” and demonstrated how Apple can simulate a wide range of lighting conditions to test out a camera system’s performance.
Welch notes that while many of Apple’s competitors certainly conduct the same tests, the sheer size of the team Apple has working on the iPhone’s camera reveals what a big priority the camera has become. Welch notes, “Apple has built entire ad campaigns around the iPhone’s camera, and always makes it a point to highlight improvements with each new iPhone revision.”
So for Apple fans looking for clues about what that 800-person team is working on next, the logical question becomes: What should we expect from the camera that Apple’s building for next year’s iPhone 7? Luckily, there are already enough rumors and reports to start piecing together what Apple’s likely planning for the iPhone 7’s camera — or at least what we’re hoping comes next for the iPhone’s camera.
1. Optical image stabilization
Apple introduced optical image stabilization in the iPhone 6 Plus, but didn’t implement the technology in the smaller iPhone 6. It did the same with the iPhone 6s and iPhone 6s Plus, so anyone who wanted the feature, which this time around stabilizes video as well as photos, had to opt for the larger iPhone 6s Plus over the standard iPhone 6s.
It’s only logical that with next year’s iPhone models, Apple would make optical image stabilization standard in both the iPhone 7 and the iPhone 7 Plus. The feature, which has appeared not only in high-end but also in mid-range Android smartphones, is beginning to register on mobile photographers’ radars, and not everyone wants to spend more money (and carry a bigger phone) to feel confident that their photos and videos will be as clear as possible.
2. Dual lenses or multiple sensors
As Lisa Eadicicco reported for Business Insider before the release of the iPhone 6s, those with an eye on what Apple is planning next have long speculated that the company will eventually make an iPhone camera with dual lenses. A report released by a Taiwanese publication over the summer claimed that Apple has been working on such a camera system for three years.
In that time, Cupertino has reportedly run into technical issues and supply chain problems with a potential dual lens system. But the company has also acquired a camera technology startup — one called LinX, with expertise in multiple sensor camera systems — that could help solve the technical issues, and has aligned itself with a supplier, Largan Precision, that has expanded its production capacity and could offer a fix for the supply chain problems.
If Apple applies LinX’s technology to the iPhone 7’s camera system, that could result in better low-light performance, reduced noise, and perhaps even 3D object modeling. And the camera might not even have to protrude from the back of the phone, as it does on the iPhone 6 and iPhone 6s.
Despite the appearance of Apple patents on dual-sensor cameras and the rumors that preceded the release of the iPhone 6s, the latest iPhone didn’t feature a camera system with two lenses. So speculation is mounting that a dual-lens setup will make an appearance on next year’s iPhone 7. If it does, the iPhone 7 wouldn’t be the first smartphone to feature dual lenses, and other smartphone makers, including Samsung and HTC, are reportedly working on handsets that integrate such systems.
3. Six-element lenses
Also ahead of the iPhone 6s’s announcement, Business Insider learned that the phone’s camera system uses so-called five-element lenses — and that Apple was already ordering six-element lenses for the camera it’s designing for the iPhone 7. So what are the elements in these lenses, and why are they so interesting when you’re trying to figure out what goes into an iPhone’s camera system?
An element is a layer of plastic that makes up a part of the lens, and each element actually functions as a standalone lens. (They’re made of high-quality plastic, instead of the traditional glass, because plastic elements are easier to fit into a smartphone and are cheaper to produce than glass.) Combining five — or six — elements into a single lens enables the camera to capture more complex and detailed information.
The six-element lens that Apple is reportedly considering for the iPhone 7 would bring Apple’s lenses up to speed with what its competitors are offering. With the iPhone 6s, Apple upgraded the camera system’s sensor from 8MP to 12MP. A six-element lens would help it more closely match, for instance, the OnePlus 2’s six-element lens and 13MP sensor. Even the Samsung Galaxy S5 had a six-element lens, which can absorb more light than a five-element lens would.
Business Insider reports that adding more elements to a camera system’s lens also enables a manufacturer to increase a camera’s maximum aperture. In basic terms, the larger the aperture, expressed by photographers as a smaller f-stop number, the more light the lens lets into the camera. The amount of light that the lens gathers and focuses determines how well the camera performs in low-light conditions, as well as the depth of field of the camera’s focus.
4. Improved maximum aperture
On a related note, Ashraf Eassa reports for The Motley Fool that Apple is likely to improve the maximum aperture of the iPhone 7’s camera. He notes that recent Android flagship phones — the logical point of comparison for Apple’s latest iPhones — have had significantly larger maximum apertures than Apple’s iPhone. The Samsung Galaxy S6 and the LG G4 have integrated cameras with maximum apertures of f/1.9 and f/1.8, respectively. But the iPhone 6s, like the iPhone 6 before it, has a maximum aperture of f/2.2.
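For a rough sense of what those f-numbers mean in practice: light gathered scales approximately with the square of the inverse f-number, all else (sensor, focal length) being equal. A quick illustrative calculation, not anything from Apple’s or the manufacturers’ spec sheets:

```python
# Illustrative sketch: relative light-gathering ability by f-number.
# Light admitted scales roughly with the square of the aperture
# diameter, i.e. with (1 / f_number) ** 2, all else being equal.

def relative_light(f_number: float, baseline: float = 2.2) -> float:
    """Light gathered relative to a baseline f/2.2 lens (iPhone 6s)."""
    return (baseline / f_number) ** 2

for f in (2.2, 1.9, 1.8):
    print(f"f/{f}: {relative_light(f):.2f}x the light of f/2.2")
```

By this back-of-the-envelope measure, the Galaxy S6’s f/1.9 lens admits roughly a third more light than the iPhone 6s’s f/2.2, and the LG G4’s f/1.8 roughly half again as much, which is why the gap matters most in dim conditions.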
The next iPhone, the iPhone 7, is very likely to integrate a lens with a larger maximum aperture. It’s a logical step if the company wants the iPhone to compete with other top smartphones on image quality, and a particularly likely one if Apple wants to continue honing the low-light capabilities of the iPhone’s camera system.
5. 4K video recording at 60fps
Eassa reports that another area where the iPhone 7 is likely to be playing catch-up with the top Android phones is video recording. The iPhone 6s can record 4K video at 30 frames per second, but Eassa notes that the Android flagships rolling out in the first half of 2016 are likely to support 4K recording at 60 frames per second.
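Doubling the frame rate isn’t a trivial bump: it doubles the amount of image data the camera pipeline has to capture and encode every second. A back-of-the-envelope sketch of the uncompressed data rate (illustrative figures only, assuming standard UHD dimensions and 8-bit 4:2:0 chroma subsampling, not anything Apple has published):

```python
# Back-of-the-envelope sketch (not Apple's figures): raw data rate of
# 4K video before compression, showing why 60fps is demanding.

WIDTH, HEIGHT = 3840, 2160   # standard UHD "4K" frame dimensions
BYTES_PER_PIXEL = 1.5        # 8-bit 4:2:0 chroma subsampling

def raw_rate_mb_per_s(fps: int) -> float:
    """Uncompressed video data rate in megabytes per second."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * fps / 1e6

print(f"30 fps: {raw_rate_mb_per_s(30):.0f} MB/s uncompressed")
print(f"60 fps: {raw_rate_mb_per_s(60):.0f} MB/s uncompressed")
```

Hardware encoders compress that stream down by orders of magnitude before it hits storage, but the sensor readout and encoding throughput still have to keep pace with the raw rate, which is where 60fps support gets expensive.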
Given that Apple wants the iPhone to stay competitive with the top tier of rivals’ smartphones, particularly as the company tries to maintain and grow its large share of the premium smartphone market, 4K video recording at 60 frames per second seems very likely to make an appearance in the iPhone 7.
6. Light splitter cube
As Dave Smith reported for Business Insider early in the year, a technology that’s actually not new at all could push the camera of the next iPhone ahead of the cameras on competing smartphones. Cupertino secured a patent for a camera with a “light splitter cube,” a miniaturized version of the three-sensor (three-CCD) systems that have appeared in many high-end video cameras.
The system excels at capturing light and color clearly and accurately, while negating the blur that would otherwise result from any shaking on the part of the user. In the iPhone’s current camera system, pixels that capture red, green, and blue light are interleaved across a single image sensor. But the cube would split the incoming light so that each color would have a sensor to itself, enabling more accurate colors and better results in low-light situations.
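To see why a sensor per color helps, consider how a conventional single sensor divides its pixel sites among colors. The sketch below assumes a standard RGGB Bayer mosaic, the usual single-sensor layout; Apple doesn’t disclose the iPhone’s exact filter arrangement, so this is illustrative rather than a description of the iPhone’s sensor:

```python
# Illustrative sketch, assuming a standard RGGB Bayer mosaic (the usual
# single-sensor layout; not a disclosed detail of the iPhone's sensor).
# On one shared sensor, each color is sampled at only a fraction of the
# pixel sites; with one sensor per color, every site samples every color.

from collections import Counter

def bayer_coverage(width: int, height: int) -> Counter:
    """Count how many pixel sites sample each color in an RGGB mosaic."""
    counts = Counter()
    for y in range(height):
        for x in range(width):
            if y % 2 == 0:
                counts["R" if x % 2 == 0 else "G"] += 1
            else:
                counts["G" if x % 2 == 0 else "B"] += 1
    return counts

sites = 8 * 8
cov = bayer_coverage(8, 8)
for color in "RGB":
    print(f"{color}: sampled at {cov[color] / sites:.0%} of sites "
          f"(vs. 100% with a dedicated sensor)")
```

With the mosaic, red and blue are each sampled at only a quarter of the sites and green at half, and the missing values are interpolated from neighbors; a three-sensor split measures every color at every site, which is the accuracy and low-light advantage the patent is chasing.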