Now, in 2019, other companies are trying to match Google by shipping smartphones that take excellent pictures in low light. Even as they do, Google is picking up speed and extending its lead with the introduction of a revolutionary new camera mode. It is called Google Night Sight.
A couple of years ago, Google released the first Pixel smartphone, and that device raised the bar for image quality. After the Pixel's release, people could actually get images out of their phones that were far better than anything earlier phones could produce.
The company calls the feature Google Night Sight. It is a night mode for the Google Camera app on Pixel phones, and it lets the phone's camera see in the dark without requiring additional hardware or cost. Simply put, think of Google Night Sight as a new camera mode for Pixel phones.
The basic concept is to keep the Pixel's camera shutter open longer than normal, which lets more light reach the sensor and produces a less dark image. Google is also going further than that, in a way that is baffling a whole lot of people: the company is bringing machine learning in to do some magic on the Night Sight results.
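As a rough back-of-the-envelope illustration (the baseline and numbers here are our own, not Google's), the light a sensor gathers scales roughly linearly with shutter time, which is why a longer exposure means a brighter picture:

```python
# A minimal sketch of the core idea: light gathered scales roughly
# linearly with shutter time. The 1/30 s baseline is an illustrative
# assumption for a typical handheld daytime exposure, not Google's figure.

def relative_brightness(shutter_seconds: float, baseline: float = 1 / 30) -> float:
    """Return how much more light a given shutter time gathers
    compared with the assumed 1/30 s baseline exposure."""
    return shutter_seconds / baseline

print(relative_brightness(1.0))    # a 1 s exposure gathers ~30x the light
print(relative_brightness(1 / 3))  # a 1/3 s exposure gathers ~10x the light
```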
What this software can do is truly unprecedented and awe-inspiring: the release Google is serving up to all Pixel models keeps the quality high while making the mode easier to access. Believe it or not, Night Sight is momentous because it is a software change that delivers a leap in performance that many companies cannot measure up to. In previous times, only new hardware could bring this type of improvement.
From the beginning, Night Sight was not made merely to add a long-exposure mode to your phone. Google set out to create something more intelligent than a brute-force long exposure.
How Google Is Achieving This Feat
In the past, it was a tedious affair: you had to mount your camera on a tripod to stabilize it, gather multiple seconds' worth of light, and only then get a night image brighter than what the human eye can see.
Now, Google is able to accomplish a similar result with a handheld Pixel by segmenting the exposure into a burst of consecutively captured frames, which are then reassembled into a single image by the company's algorithmic magic.
This magic is a clear evolution of the HDR+ processing pipeline used in the main Pixel camera, with some unique upgrades added on top.
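Here is a simplified sketch of the burst idea, not Google's actual HDR+ pipeline: capture several short frames, align them, and average them. The merge gathers the same total light as one long exposure while averaging away random sensor noise:

```python
import numpy as np

# A simplified sketch (not Google's production HDR+ code) of burst
# merging: averaging N aligned frames cuts random sensor noise by roughly
# sqrt(N), which is why 15 short frames can stand in for one long exposure.

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Merge pre-aligned frames into one low-noise image.

    Frames are assumed to be same-shaped float arrays that have already
    been aligned; a real pipeline must also handle motion between frames
    and reject regions that fail to align.
    """
    return np.stack(frames, axis=0).mean(axis=0)

# Example: 15 noisy captures of the same dim scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.1)                       # a dim, flat scene
burst = [scene + rng.normal(0, 0.05, scene.shape)  # per-frame sensor noise
         for _ in range(15)]
merged = merge_burst(burst)
print(np.std(burst[0] - scene), np.std(merged - scene))  # noise drops ~sqrt(15)
```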
How It Works
Before you snap a picture, the Night Sight camera runs a series of calculations. Using what the firm calls motion metering, the Pixel takes into account its own movement (or lack of it), the movement of objects in the scene, and the amount of light available, and uses all of this to decide how many exposures to take and how long each should be.
Google explains that Night Sight photos take up to six seconds and up to 15 frames to capture one image. However, the company caps each exposure at one second if the phone is perfectly still, or a third of a second if it is held in a person's hand.
In practice, that means you could get six one-second exposures with a Pixel on a tripod, or up to 15 briefer exposures when holding the phone in your hand. Either way, all of them are merged into one final photo by the time you are done.
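A hypothetical sketch of that scheduling logic might look like the following. The limits mirror the published figures above (six-second budget, 15-frame cap, one second per frame on a tripod, a third of a second handheld), but the function itself is our invention for illustration, not Google's motion-metering code:

```python
# Hypothetical exposure planner. The caps come from the article's figures;
# the decision logic is an illustrative assumption, not the real algorithm.

def plan_exposures(handheld: bool, total_budget_s: float = 6.0,
                   max_frames: int = 15) -> tuple[int, float]:
    """Return (frame_count, per_frame_exposure_s) under Night Sight's limits."""
    per_frame_cap = 1 / 3 if handheld else 1.0
    frames = min(max_frames, max(1, round(total_budget_s / per_frame_cap)))
    return frames, per_frame_cap

print(plan_exposures(handheld=False))  # (6, 1.0): six 1 s frames on a tripod
print(plan_exposures(handheld=True))   # (15, 0.333...): fifteen 1/3 s frames
```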
How Google Makes Night Sight Work Better
To judge white balance in Night Sight, Google uses a sophisticated learning-based algorithm that has been trained to discount and discard the tints cast by unnatural light.
Google's photo experts fed the algorithm loads of images in both a tinted state and with corrected white balance, teaching it to prefer the corrected version automatically.
The software looks at how the log-chrominance histogram of each photo shifts with varying tints. Google has produced whiteboard animations on the subject in a bid to enlighten the outside world, and its message emphasizes that the company is taking a fresh approach to separating true color from tint.
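To make the histogram idea concrete, here is a minimal sketch, assuming simple per-channel tints; it illustrates the log-chrominance representation only, not Google's trained model. The useful property is that a global tint merely translates this 2D histogram, so a learner only has to find the shift:

```python
import numpy as np

# Illustrative sketch of a log-chrominance histogram. A uniform tint
# multiplies each channel by a constant, which adds a constant to the
# log-ratios below, translating the histogram without reshaping it.

def log_chroma_histogram(rgb: np.ndarray, bins: int = 64) -> np.ndarray:
    """Build a 2D histogram over (log R/G, log B/G) for an (H, W, 3) image."""
    eps = 1e-6  # guard against log(0)
    u = np.log(rgb[..., 0] + eps) - np.log(rgb[..., 1] + eps)  # log(R/G)
    v = np.log(rgb[..., 2] + eps) - np.log(rgb[..., 1] + eps)  # log(B/G)
    hist, _, _ = np.histogram2d(u.ravel(), v.ravel(),
                                bins=bins, range=[[-2, 2], [-2, 2]])
    return hist / hist.sum()  # normalize so histograms are comparable

# A warm tint scales the channels; the histogram shifts but keeps its shape.
img = np.random.default_rng(1).uniform(0.1, 1.0, (32, 32, 3))
tinted = img * np.array([1.4, 1.0, 0.6])
h_plain, h_tinted = log_chroma_histogram(img), log_chroma_histogram(tinted)
```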
Moreover, the model is learning more than just colors; the experts claim it is picking up something inherent to pictures. Google is not yet confident enough in this alternative approach to deploy it as the default on the Pixel camera, but the firm is happy with how it works in night photos.
A Night Sight photo does not just brighten the conventional Pixel image; it also cleans up a ton of ugly noise in the sky and brings out color that would otherwise go unseen.
Limitations of Night Sight
Of course, every good thing has its downsides, and when you use Google Night Sight, you will sometimes find that your pictures do not look like they were taken at night. That is one painful limitation of this software. It can also be a deliberate choice, however, as the technology can be adjusted in various ways to achieve the look the user wants.
The Good Side
In fact, every aspect of Google's Night Sight is dynamic and automatic. If the phone detects that a scene is dark enough, it suggests trying night mode; once the user taps the suggestion, the software does the rest.
There are a few controls, such as tap-to-focus and the usual exposure slider. What you cannot do is tell the camera how many frames to capture or set your own shutter speed.
The software is also poor at capturing anything in motion. For instance, it will blur cars passing by, and it does not deal well with bright light sources.
But no matter what, Google Night Sight is the biggest thing to happen to taking pictures in poor light.