Phone cameras can record more light than the human eye. That’s why low-light events, like the Northern Lights, often look better through your phone’s camera.

Smartphone cameras have improved significantly in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Photos of the Northern Lights, or aurora borealis, provide a particularly striking example.

If you saw the Northern Lights during the May 2024 geomagnetic storms, you may have noticed that your smartphone photos made the lights look even more vivid than they appeared in person.

Auroras, known as the Northern Lights (aurora borealis) or Southern Lights (aurora australis), occur when the solar wind disrupts the Earth’s magnetic field. They appear as streaks of color across the sky.

[Image: two views of the Northern Lights, labeled ‘eye’ and ‘camera’. The ‘eye’ image is darker, with more muted colors.]

What makes photos of these events even more striking than the view you get in person? As a professor of computational photography, I’ve seen how the latest smartphone features overcome the limitations of human vision.

Your eyes in the dark

Human eyes are remarkable. They allow you to see footprints in a sun-drenched desert and to drive vehicles at high speed. In low light, however, your eyes perform far less impressively.

Human eyes contain two types of cells that respond to light: rods and cones. Rods are numerous and much more sensitive to light. Cones process color, but require more light to function. As a result, our night vision is highly rod-dependent and lacks color.

[Diagram: a human eye, with a zoomed-in panel showing rod and cone receptors. The rods are cylindrical, while the cones are conical.]

The result is like wearing dark sunglasses when watching a movie. At night, the colors look washed out and muted. Likewise, under a starry sky, the vibrant hues of the Northern Lights are present, but often too faint for your eyes to see clearly.

In low light, your brain prioritizes motion detection and shape recognition to help you navigate. This trade-off means that the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to bring them out clearly.

Taking the perfect photo

Smartphones have revolutionized the way people capture the world around them. These compact devices use multiple cameras and advanced sensors to collect more light than the human eye can, even in low light. They achieve this with longer exposure times – how long the camera lets light in – larger apertures and higher ISO settings, which control how strongly the sensor amplifies the light signal.
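The combined effect of these three settings can be sketched with a back-of-envelope model: photographers count brightness in “stops,” where each doubling of exposure time, aperture area or ISO adds one stop (a factor of two) of brightness. This is a simplified illustration, not any particular phone’s algorithm:

```python
import math

def exposure_stops(time_ratio=1.0, aperture_area_ratio=1.0, iso_ratio=1.0):
    """Back-of-envelope model: each doubling of exposure time, aperture
    area, or ISO adds one 'stop' (a factor of two) of brightness."""
    return math.log2(time_ratio * aperture_area_ratio * iso_ratio)

# Example: 8x longer exposure at 4x the ISO is 5 stops brighter,
# i.e. 2**5 = 32 times the recorded signal.
print(exposure_stops(time_ratio=8, iso_ratio=4))  # 5.0
```

This is why night modes hold the shutter open for seconds at a time: a few extra stops is the difference between a black frame and a visible aurora.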

But smartphones do more than adjust these settings. They also use computational photography to enhance your images using digital techniques and algorithms. Image stabilization reduces camera shake and exposure settings optimize the amount of light the camera captures.

Stacking multiple images on top of one another and processing them together produces a cleaner photo than any single frame. A setting called Night Mode can balance colors in low light, while LiDAR capabilities in some phones keep your images precisely in focus.

[Diagram: a stack of grainy images, smoothed into one clear image.]
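The stacking idea in the diagram above can be demonstrated with a toy simulation: averaging many noisy readings of the same pixel shrinks the random noise, roughly in proportion to the square root of the number of frames. The numbers and noise model here are illustrative assumptions, not real sensor data:

```python
import random
import statistics

def capture_frame(true_brightness, noise_sigma, rng):
    """Simulate one noisy pixel reading from a single short exposure."""
    return true_brightness + rng.gauss(0.0, noise_sigma)

def stack_frames(true_brightness, n_frames, noise_sigma, rng):
    """Average several readings of the same pixel, as in burst stacking."""
    readings = [capture_frame(true_brightness, noise_sigma, rng)
                for _ in range(n_frames)]
    return sum(readings) / n_frames

rng = random.Random(0)
truth, sigma = 100.0, 10.0

single_errors = [abs(capture_frame(truth, sigma, rng) - truth)
                 for _ in range(2000)]
stacked_errors = [abs(stack_frames(truth, 16, sigma, rng) - truth)
                  for _ in range(2000)]

print(statistics.mean(single_errors))   # typical error of one frame
print(statistics.mean(stacked_errors))  # roughly 4x smaller: sqrt(16) = 4
```

This is why a phone that quietly captures a burst of 10 to 20 frames can show aurora colors that each individual grainy frame buries in noise.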

LiDAR stands for light detection and ranging, and phones with this setting emit laser pulses to quickly calculate distances to objects in the scene in any kind of light. LiDAR generates a depth map of the environment to improve focus and make objects stand out in your photos.

[Image pair: a person dancing, labeled ‘optical’ and ‘depth’. The ‘optical’ image shows the person as they would normally appear in a photo, while the ‘depth’ image shows their silhouette in white against a black background.]
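A depth map like the one pictured makes it easy to separate a subject from its background. A minimal sketch of that idea, using a hypothetical tiny depth map and a simple distance threshold (real phones use far more sophisticated segmentation):

```python
def foreground_mask(depth_map, threshold):
    """Mark pixels closer than `threshold` (here, meters) as foreground (1)
    and everything farther as background (0) - a crude subject split."""
    return [[1 if d < threshold else 0 for d in row] for row in depth_map]

# Toy 3x4 depth map in meters: a near subject against a far wall.
depth = [
    [5.0, 5.0, 5.0, 5.0],
    [5.0, 1.2, 1.3, 5.0],
    [5.0, 1.1, 1.2, 5.0],
]
print(foreground_mask(depth, threshold=2.0))
# [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0]]
```

With a mask like this, the camera knows which pixels belong to the subject, so it can keep them sharp while blurring or darkening the rest.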

Artificial intelligence tools in your smartphone’s camera can further enhance your photos by optimizing settings, controlling the flash, and using super-resolution techniques to recover very fine details. They can even identify faces in your photos.

AI processing in your smartphone’s camera

While you can do a lot with a smartphone camera, regular cameras have larger sensors and superior optics, giving you more control over the images you take. Camera manufacturers such as Nikon, Sony and Canon typically avoid tampering with the image, instead allowing the photographer to take creative control.

These cameras give photographers the flexibility to shoot in raw format, allowing you to retain more data from each image for editing and often producing higher quality results.

Unlike dedicated cameras, modern smartphone cameras use AI both during and after you take a photo to improve its quality. As you take a photo, AI tools analyze the scene you’re pointing the camera at and adjust settings such as exposure, white balance and ISO, while recognizing the subject you’re photographing and stabilizing the image. These adjustments ensure that you get a great photo the moment you press the button.

You can often find features that use AI, such as high dynamic range, night mode and portrait mode, enabled by default or accessible through your camera settings.

AI algorithms further enhance your photos by refining details, reducing blur, and applying effects like color correction after you take the photo.

All these features help your camera take photos in low light, and they contributed to the stunning aurora photos you may have captured with your phone camera.

While the human eye struggles to fully appreciate the otherworldly hues of the Northern Lights at night, modern smartphone cameras overcome this limitation. Using AI and computational photography techniques, your devices let you see the bold colors of solar storms in the atmosphere, amplifying the colors and capturing otherwise invisible details that even the sharpest eye would miss.

This article is republished from The Conversation, an independent nonprofit organization providing facts and trusted analysis to help you understand our complex world. It was written by: Douglas Goodwin, Scripps College

Douglas Goodwin receives funding from the Fletcher Jones Foundation through Scripps College.
