Project 6: Image Warping and Mosaicing

Jacob Holesinger, cs194-agj

Overview

The aim of this project was to learn about image warping in order to create mosaics and panoramas. A panorama is captured by taking multiple photos from the same center of projection. A simple affine transformation cannot capture the change in perspective between shots, but a homography, which adds extra degrees of freedom, can warp one image to match another. Below are some examples of homographies used to rectify images.


Source image
Rectified
Source image
Rectified

The homography gives a correspondence between the original image and its warped version. The only problem is that images in the computer are discrete samplings of the continuous world, and as a result we may stretch the space between the samples during the transformation, leaving holes. We can solve this by working in the opposite direction: iterating over pixels in the would-be warped image and tracing back their correspondences in the source image. At that point in the source image, we sample an interpolated version to continuously blend between the discrete pixels.
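The inverse-warping idea above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: it assumes a 3x3 homography H mapping source coordinates to output coordinates, and the function and variable names are my own.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample img at continuous coordinates (x, y) by blending the
    four surrounding pixels; returns 0 outside the image."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = x0 + 1, y0 + 1
    if x0 < 0 or y0 < 0 or x1 >= img.shape[1] or y1 >= img.shape[0]:
        return 0.0
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x1] +
            (1 - dx) * dy * img[y1, x0] +
            dx * dy * img[y1, x1])

def inverse_warp(src, H, out_shape):
    """Fill each output pixel by tracing it back through H^-1 into the
    source image and sampling there, avoiding holes in the result."""
    Hinv = np.linalg.inv(H)
    out = np.zeros(out_shape)
    for y in range(out_shape[0]):
        for x in range(out_shape[1]):
            sx, sy, w = Hinv @ np.array([x, y, 1.0])
            out[y, x] = bilinear_sample(src, sx / w, sy / w)
    return out
```

In practice the double loop would be vectorized, but the per-pixel version makes the backward mapping explicit.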

By Hand Correspondence Panorama Results

A homography has 8 degrees of freedom, and each point pair contributes two equations, so we can solve for one given at least four pairs of (x, y) points. After defining correspondences between two images, we can compute a homography and apply a warp before finally blending the results to produce a panorama. Below are some examples of images, their correspondence points, and the resulting composites.
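The solve described above can be written as a small least-squares problem. This is a sketch of the standard setup with the bottom-right entry of H fixed to 1 (so 8 unknowns); the names are mine, not from the project.

```python
import numpy as np

def compute_homography(pts1, pts2):
    """Solve for the 3x3 H mapping pts1 to pts2. Each of the n >= 4
    correspondences contributes two linear equations in the 8 unknowns."""
    A, b = [], []
    for (x, y), (u, v) in zip(pts1, pts2):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)
```

With more than four correspondences the system is overdetermined and least squares averages out small clicking errors.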

These were taken from inside of VLSB. Below is an example of the point correspondences I entered for each panorama.

Below is the final panorama. There is an artifact of interpolation going off of the edge of the source image on the left.

This is from the Valles Caldera in New Mexico

These pictures were taken at Indian Rock

Another one from New Mexico

Automatic Features

While defining the correspondences by hand can work, it is a bit tedious, especially when trying to make larger panoramas composed of many shots. The first step in creating automatic correspondences is finding parts of the image likely to match between warped versions. Corners work incredibly well for this because they don't change much under warps. To pick out the corners we use a Harris filter and threshold on the result's magnitude. This gives the initial crop of points to pick from. Below is an example with all of the Harris detector points overlayed.

Source image
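The Harris response behind those detections can be sketched with plain numpy. The project likely used a library routine; the uniform smoothing window and k = 0.05 here are my own choices.

```python
import numpy as np

def harris_response(img, k=0.05, win=3):
    """Harris corner response R = det(M) - k * trace(M)^2, where M is
    the gradient structure tensor averaged over a win x win window."""
    Iy, Ix = np.gradient(img.astype(float))

    def smooth(a):
        # Average over a win x win neighborhood (edge-padded).
        out = np.zeros_like(a)
        pad = np.pad(a, win // 2, mode='edge')
        for dy in range(win):
            for dx in range(win):
                out += pad[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out / (win * win)

    Ixx, Iyy, Ixy = smooth(Ix * Ix), smooth(Iy * Iy), smooth(Ix * Iy)
    return Ixx * Iyy - Ixy ** 2 - k * (Ixx + Iyy) ** 2
```

The response is large and positive where gradients point in two different directions (a corner), near zero in flat regions, and negative along straight edges, which is what makes thresholding it effective.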

In order to prune out some of the feature points, we use Adaptive Non-Maximal Suppression (ANMS), which assigns each point the radius to its nearest significantly stronger Harris corner and keeps only the points with the largest radii. This suppresses weaker points clustered around the strongest corners while keeping the detections spread across the image. Below is the image after running ANMS.

Source image
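The ANMS step can be sketched as below. The robustness constant c_robust = 0.9 follows the usual formulation; the function and variable names are my own.

```python
import numpy as np

def anms(coords, strengths, n_keep, c_robust=0.9):
    """Return indices of the n_keep points with the largest suppression
    radii, i.e. those farthest from any sufficiently stronger point."""
    coords = np.asarray(coords, dtype=float)
    strengths = np.asarray(strengths, dtype=float)
    radii = np.full(len(coords), np.inf)
    for i in range(len(coords)):
        # Points whose scaled strength dominates point i.
        stronger = c_robust * strengths > strengths[i]
        if stronger.any():
            d = np.linalg.norm(coords[stronger] - coords[i], axis=1)
            radii[i] = d.min()
    return np.argsort(-radii)[:n_keep]
```

Because points are ranked by radius rather than raw strength, a moderately strong corner in an otherwise empty region survives while the tenth-strongest corner in a dense cluster does not.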

The next step is to find correspondences between the detected points in the two images. This is done by examining windows of pixels centered around the points left over. Each point is described by taking its surrounding 40x40 pixel window, downsampling it to 8x8, and bias/gain normalizing the result to get an overall description of that area. Below are a couple examples of the 8x8 descriptors.
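The descriptor extraction amounts to a few lines. This sketch downsamples naively by taking every 5th pixel (a blur before subsampling would be more faithful); the names are my own.

```python
import numpy as np

def describe(img, y, x):
    """8x8 bias/gain-normalized descriptor for the 40x40 window
    centered at (y, x); assumes the window lies inside the image."""
    patch = img[y - 20:y + 20, x - 20:x + 20]  # 40x40 window
    small = patch[::5, ::5]                     # downsample to 8x8
    return (small - small.mean()) / small.std() # zero mean, unit variance
```

The bias/gain normalization makes the descriptor invariant to overall brightness and contrast changes between the two shots.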

The correspondences are found by first choosing each point's nearest neighbor in feature space as a candidate match, then tossing out candidates whose second nearest neighbor is nearly as close as the first. The reasoning is that if the decision is too close to call, this probably isn't a good pair of points to use in the final image. Below is the image after removing those points that your Russian grandmother might warn you against choosing.

Source image
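The ratio test described above can be sketched as follows; the 0.6 threshold is my assumption, and the names are mine.

```python
import numpy as np

def match_features(desc1, desc2, ratio=0.6):
    """desc1, desc2: arrays with one flattened descriptor per row.
    Returns (i, j) index pairs that pass the nearest-neighbor ratio test."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is clearly closer than the runner-up.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

Ambiguous points, such as those on repeated texture where several windows look alike, fail the test and are discarded before the homography is fit.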

Summary

In this project, I learned about a cool warping trick that can be used to rectify images and match perspective. I'm surprised by how natural the warped images seem despite being extrapolations of another image. Additionally, I learned about a really cool way to describe images and their relations to one another using automatically detected features.