Back in September, Google Street View vehicles received their first major upgrade in eight years, one that should result in higher-resolution images. Today, Google Research detailed a new algorithm that addresses a common flaw in current Street View panoramas.
Looking at Street View, it’s not too difficult to spot a misalignment in the 360-degree panoramas that are created by stitching multiple photos together. Most result in odd visual quirks like jagged or misaligned surfaces, but more seriously, they can render text on signs unreadable.
These errors are primarily due to miscalibration of the camera rig, timing differences between adjacent cameras, and parallax. While recalibration and existing algorithms are used to tackle these issues, others like “visible seams in image overlap regions” can still occur.
However, Google now has a two-stage solution to this issue:
The idea is to subtly warp every input image such that the image content lines up inside regions of overlap. This needs to be done carefully to avoid introducing new kinds of visual artifacts. The approach should even be robust to variable scene geometry, lighting conditions, calibration quality, and lots of other conditions.
The first stage, called optical flow, finds corresponding pixel locations in a pair of overlapping images and then tries to align them. This method is also used by the PhotoScan app to digitize printed photos with a smartphone camera.
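Google hasn’t published its implementation, but the core idea of optical flow can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration that estimates a single global shift between two overlapping images using a first-order (Lucas-Kanade-style) model; the real system solves for much richer per-pixel correspondences.

```python
import numpy as np

def estimate_translation(a, b):
    """Estimate the (dx, dy) shift mapping image a onto image b.

    First-order model: b(x, y) ~ a(x + dx, y + dy), so the difference
    b - a ~ Ix * dx + Iy * dy, which we solve by least squares.
    This is a toy sketch, not Google's actual algorithm.
    """
    Iy, Ix = np.gradient(a)           # gradients along rows (y) and columns (x)
    It = b - a                        # difference between the overlapping pair
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, It.ravel(), rcond=None)
    return d                          # [dx, dy]

# Toy usage: a smooth synthetic image and a copy shifted by (0.5, 0.3) pixels.
ys, xs = np.mgrid[0:80, 0:80].astype(float)
a = np.sin(0.1 * xs) + np.cos(0.07 * ys)
b = np.sin(0.1 * (xs + 0.5)) + np.cos(0.07 * (ys + 0.3))
dx, dy = estimate_translation(a, b)   # recovers roughly (0.5, 0.3)
```

In practice this kind of estimate is computed per region (or per pixel) rather than globally, and only within the areas where adjacent camera images overlap.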
Meanwhile, the second stage, global optimization, warps all of the images to “simultaneously align all of the corresponding points from overlap regions.” The new algorithm is already in use and is currently restitching existing panoramas retroactively to improve their quality.
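To see why a global, simultaneous solve matters, consider a simplified stand-in problem: several cameras arranged in a ring, with a measured offset between each adjacent pair. Aligning pairs one at a time lets errors pile up at the final seam; solving all constraints at once spreads the loop-closure error evenly. The sketch below (my own illustration, not Google’s method) does this with a single least-squares system.

```python
import numpy as np

def globally_align(measured, n):
    """Solve for per-image corrections t_i given pairwise offset measurements.

    measured[i] is the observed offset between image i and image (i + 1) % n.
    All n ring constraints t_j - t_i = measured[i] are solved jointly by least
    squares, with image 0 pinned at zero, so any inconsistency around the loop
    is distributed evenly instead of accumulating at one seam.
    """
    rows, rhs = [], []
    for i in range(n):
        j = (i + 1) % n
        row = np.zeros(n)
        row[j] += 1.0                 # constraint: t_j - t_i = measured[i]
        row[i] -= 1.0
        rows.append(row)
        rhs.append(measured[i])
    pin = np.zeros(n)
    pin[0] = 1.0                      # fix the gauge: t_0 = 0
    rows.append(pin)
    rhs.append(0.0)
    t, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return t

# The four measured offsets sum to -0.4, so the ring doesn't close exactly;
# the solver spreads that 0.4 loop-closure error evenly across all four seams.
t = globally_align([1.0, 2.0, -1.0, -2.4], 4)
```

The real Street View pipeline optimizes dense warps over thousands of corresponding points rather than one offset per image, but the principle is the same: satisfy all overlap constraints at once rather than chaining pairwise alignments.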