Samsung revealed its flagship Galaxy S7 and S7 Edge smartphones at MWC 2016 on Sunday. The new devices carry much the same design language as the Galaxy S6 and S6 Edge; the main improvements lie inside. Among the bumped-up specs, one technology that grabbed everyone's attention was the new Dual Pixel system in the rear camera.
Dual Pixel is essentially a natural evolution of the phase-detection autofocus (PDAF) technology used by many other smartphones. First, Samsung moved from the 16MP sensor in the S6 to a 12MP sensor in the S7, trading resolution for per-pixel quality: the 12MP sensor's 1.4µm pixels offer roughly 56% more light-gathering area than the S6's smaller pixels.
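That 56% figure follows directly from the pixel pitch, since light-gathering area scales with the square of the pitch. A quick sanity check (the S6's 1.12µm pitch is an assumption based on its 16MP sensor, not stated above):

```python
# Illustrative arithmetic: per-pixel area scales with the square of the
# pixel pitch, so a 1.4um pixel vs a 1.12um pixel (assumed S6 pitch)
# gives (1.4 / 1.12)^2 = 1.5625, i.e. about 56% more area.
s7_pitch_um = 1.4    # Galaxy S7 pixel pitch
s6_pitch_um = 1.12   # assumed Galaxy S6 pixel pitch
area_gain = (s7_pitch_um / s6_pitch_um) ** 2 - 1
print(f"{area_gain:.0%}")  # prints "56%"
```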
To understand Dual Pixel, one first needs to wrap their head around PDAF. Traditional PDAF equips fewer than 10% of the sensor's pixels with focusing photodiodes, placed strategically across the sensor area, which means locking focus on a subject takes longer. Dual Pixel, on the other hand, uses 100% of the pixels for extremely fast and precise focusing, and because every pixel does double duty for imaging and focusing, no light is sacrificed to dedicated focus pixels.
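The core idea behind phase detection can be sketched in a few lines. This is a toy illustration, not Samsung's actual pipeline: assume each dual pixel yields a "left" and a "right" sample, and the lateral shift between the two signals tells the camera how far, and in which direction, focus is off.

```python
def phase_shift(left, right, max_shift=5):
    """Return the shift of `right` relative to `left` that minimises the
    mean absolute difference; a shift of 0 means the subject is in focus."""
    best_shift, best_cost = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        # Compare the overlapping samples at this trial shift.
        pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# Out-of-focus example: the "right" view is the "left" view displaced
# by two samples, so the detector reports a shift of 2.
left = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]
print(phase_shift(left, right))  # prints 2
```

Because every pixel contributes such a left/right pair, a Dual Pixel sensor can compute this shift from the whole frame in one shot, which is why focusing is so much faster than with sparse PDAF points.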
Combined with a wider f/1.7 aperture (the S6 had f/1.9), this lets in a lot more light, 25% more to be accurate, resulting in sharp, colourful low-light captures. The Dual Pixel image sensor splits every single pixel into two photodiodes for on-chip phase detection, promising vastly improved autofocus performance. Much like a pair of human eyes, the two photodiodes in each pixel receive light from the lens separately, and comparing the two views lets the camera adjust focus.
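The 25% figure also checks out arithmetically: the amount of light a lens admits scales inversely with the square of the f-number, so going from f/1.9 to f/1.7 gives (1.9/1.7)² ≈ 1.25 times the light.

```python
# Illustrative arithmetic: light admitted scales as 1 / f_number^2,
# so the S7's f/1.7 lens vs the S6's f/1.9 lens gathers
# (1.9 / 1.7)^2 ~= 1.249x the light, i.e. about 25% more.
light_gain = (1.9 / 1.7) ** 2 - 1
print(f"{light_gain:.0%}")  # prints "25%"
```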