Project 5 - [Auto]Stitching Photo Mosaics

5A Part 1 - Image Warping and Mosaics

By Bernard Zhao for CS194-26 Spring 2020

All code can be found in P5.ipynb

Shooting the Pictures

These photos were shot from the same spot, just a short walk from my house. I shot enough for a whole 180 degree view, but stitching all of them together non-cylindrically produces an image that is far too big.

im2 im1

Recovering Homographies

I handpicked 6 correspondences in Photoshop, then wrote them down in the notebook. I used the formulation below and solved with least squares for the eight unknowns, then appended the fixed entry 1 and reshaped the result into the 3x3 homography matrix.

\begin{bmatrix} x_{1,1} & y_{1,1} & 1 & 0 & 0 & 0 & -x_{1,1} x_{2,1} & -y_{1,1} x_{2,1} \\ 0 & 0 & 0 & x_{1,1} & y_{1,1} & 1 & -x_{1,1} y_{2,1} & -y_{1,1} y_{2,1} \\ & & & & \vdots & & & \\ x_{1,n} & y_{1,n} & 1 & 0 & 0 & 0 & -x_{1,n} x_{2,n} & -y_{1,n} x_{2,n} \\ 0 & 0 & 0 & x_{1,n} & y_{1,n} & 1 & -x_{1,n} y_{2,n} & -y_{1,n} y_{2,n} \\ \end{bmatrix} \begin{bmatrix} a \\ b \\ c \\ d \\ e \\ f \\ g \\ h \end{bmatrix} = \begin{bmatrix} x_{2,1} \\ y_{2,1} \\ \vdots \\ x_{2,n} \\ y_{2,n} \end{bmatrix}

Check out computeH to see the implementation.
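Here's roughly what that solve looks like in numpy (a minimal sketch; the argument names are mine and not necessarily those in the notebook):

```python
import numpy as np

def computeH(pts1, pts2):
    """Fit a homography mapping pts1 -> pts2 by least squares.

    pts1, pts2: (n, 2) arrays of (x, y) correspondences, n >= 4.
    """
    A, b = [], []
    for (x1, y1), (x2, y2) in zip(pts1, pts2):
        # Two rows of the system per correspondence, as in the matrix above.
        A.append([x1, y1, 1, 0, 0, 0, -x1 * x2, -y1 * x2])
        A.append([0, 0, 0, x1, y1, 1, -x1 * y2, -y1 * y2])
        b.extend([x2, y2])
    h, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    # Append the fixed scale entry h_33 = 1, then reshape to 3x3.
    return np.append(h, 1).reshape(3, 3)
```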

Warping the Images

I used an inverse warp to project im1 onto im2:

im1_warped

Check out warpImage to see the implementation.
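The idea: forward-map the corners of im1 to find the output bounding box, then pull every output pixel back through H⁻¹ and sample im1. A sketch using nearest-neighbor sampling for brevity (the notebook may interpolate differently):

```python
import numpy as np

def warpImage(im, H):
    """Inverse-warp im by the homography H; returns the warped image
    and the (x, y) offset of its top-left corner on the big canvas."""
    h, w = im.shape[:2]
    # Forward-map the corners to find the output bounding box.
    corners = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]]).T
    warped = H @ corners
    warped = warped[:2] / warped[2]
    xmin, ymin = np.floor(warped.min(axis=1)).astype(int)
    xmax, ymax = np.ceil(warped.max(axis=1)).astype(int)
    # Grid of output pixel coordinates, pulled back through H^-1.
    xs, ys = np.meshgrid(np.arange(xmin, xmax), np.arange(ymin, ymax))
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    src = np.linalg.inv(H) @ pts
    src = (src[:2] / src[2]).round().astype(int)
    # Keep only samples that land inside the source image.
    valid = (0 <= src[0]) & (src[0] < w) & (0 <= src[1]) & (src[1] < h)
    out = np.zeros((ymax - ymin, xmax - xmin) + im.shape[2:], dtype=im.dtype)
    rows = ys.ravel()[valid] - ymin
    cols = xs.ravel()[valid] - xmin
    out[rows, cols] = im[src[1][valid], src[0][valid]]
    return out, (xmin, ymin)
```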

Image Rectification

The same warp works for rectification as well: hand-pick the four corners of a planar surface and compute a homography that maps them to an axis-aligned rectangle. Check out my quarantine setup where I was doing this project:

before after

Now you can (sorta) see my screen from my original perspective!
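In code, rectification is just computeH with typed-in target corners instead of points picked from a second image. A hypothetical example (the coordinates here are made up for illustration):

```python
import numpy as np

# Four hand-picked corners of the monitor in the photo, as (x, y),
# mapped to an axis-aligned 16:9 rectangle.
screen = np.array([[412, 310], [955, 285], [970, 640], [430, 690]])
target = np.array([[0, 0], [1280, 0], [1280, 720], [0, 720]])

H = computeH(screen, target)
rectified, offset = warpImage(im, H)  # im: the original photo
```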

Mosaic Blending

Now using an alpha channel, we can put im2 and im1_warped together:

combined
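A minimal sketch of that weighted average, assuming both images have already been placed on a common canvas (using the offsets from warpImage) with per-pixel alpha weights feathered toward each image's border:

```python
import numpy as np

def blend(im_a, alpha_a, im_b, alpha_b):
    """Per-pixel weighted average of two aligned, same-size canvases.

    alpha_* are 2-D weight maps: 1 well inside an image, feathered
    toward 0 at its edges, and 0 where it has no coverage.
    """
    total = alpha_a + alpha_b
    total[total == 0] = 1  # avoid 0/0 where neither image has coverage
    return (im_a * alpha_a[..., None] + im_b * alpha_b[..., None]) / total[..., None]
```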

What did I learn?

This was a surprisingly tough project because of how tedious the offset calculations became. Hopefully this makes the second part easier. I've gotten very familiar with numpy array indexing, which is honestly some magic as well. It was fun to play with my camera, manually tweaking the settings. It also was interesting to see the effects of my lens hood in the images, as the imperfect blending makes the difference in lens flare very obvious.

5B Part 2 - Automatic Stitching

Harris Corners

Since my images had such a high resolution, I had to tweak min_distance up to 20 in the starter code to end up with around 10,000 points:

harris_corners
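This is roughly what the detection boils down to with skimage (the starter code's harris.py wraps something similar; its exact parameters may differ):

```python
import numpy as np
from skimage.feature import corner_harris, corner_peaks

def get_harris(im_gray, min_distance=20):
    """Harris response map plus peak coordinates spaced at least
    min_distance pixels apart."""
    h = corner_harris(im_gray)
    coords = corner_peaks(h, min_distance=min_distance)  # (row, col) pairs
    return h, coords
```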

Adaptive Non-Maximal Suppression

Then, using ANMS with a c_robust value of 0.9, I brought this down to 500 points, all nicely spread out.

r_i = \min_{j} | x_i - x_j |, \text{ s.t. } f(x_i) < c_{\text{robust}} f(x_j), \ x_j \in \mathcal{I}

anms
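A vectorized sketch of the suppression-radius computation (note the dense pairwise-distance matrix gets memory-hungry at 10,000 points, so chunking may be needed in practice):

```python
import numpy as np

def anms(coords, h, n_keep=500, c_robust=0.9):
    """Adaptive non-maximal suppression.

    coords: (n, 2) corner (row, col) coordinates; h: Harris response map.
    Keeps the n_keep corners with the largest suppression radius r_i.
    """
    strength = h[coords[:, 0], coords[:, 1]]
    # Pairwise squared distances between all corners (O(n^2) memory).
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    # Neighbor j suppresses i only when f(x_i) < c_robust * f(x_j).
    suppresses = c_robust * strength[None, :] > strength[:, None]
    d2 = np.where(suppresses, d2, np.inf)
    radii = d2.min(axis=1)  # r_i: distance to the nearest suppressing neighbor
    return coords[np.argsort(radii)[::-1][:n_keep]]
```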

Feature Descriptor Extraction

Then, for each of the 500 points, I sampled the 40 by 40 window surrounding it and rescaled it down to an 8 by 8 descriptor:

example_descriptor
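A sketch of the extraction, skipping corners whose window would fall off the image edge:

```python
import numpy as np
from skimage.transform import resize

def extract_descriptors(im_gray, coords, patch=40, out=8):
    """Sample a patch x patch window around each (row, col) corner and
    downscale to out x out; returns the kept corners and flat descriptors."""
    half = patch // 2
    kept, descs = [], []
    for r, c in coords:
        if r < half or c < half:
            continue  # too close to the top/left border
        window = im_gray[r - half:r + half, c - half:c + half]
        if window.shape != (patch, patch):
            continue  # too close to the bottom/right border
        descs.append(resize(window, (out, out), anti_aliasing=True).ravel())
        kept.append((r, c))
    return np.array(kept), np.array(descs)
```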

Feature Matching

Then, using those feature descriptors, we can match each one to its nearest neighbor by distance, but only keep the match if the ratio between the smallest and second-smallest distance is less than our threshold of 0.3 (Lowe's ratio test).

matched1 matched2

You can already see the points lining up correspondingly in this step, overlaying each other in the general area of the images. However, you can also see some points that don't have a match in the other image.
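A sketch of the ratio-test matching (here the ratio is taken on squared SSD distances; the notebook's exact convention may differ):

```python
import numpy as np

def match_features(desc1, desc2, ratio=0.3):
    """Nearest-neighbor matching with Lowe's ratio test.

    desc1, desc2: (n1, d) and (n2, d) descriptor arrays.
    Returns (m, 2) index pairs into desc1 and desc2.
    """
    # Pairwise squared distances between every descriptor pair.
    d = ((desc1[:, None, :] - desc2[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d, axis=1)[:, :2]  # two nearest neighbors in desc2
    best = d[np.arange(len(desc1)), nn[:, 0]]
    second = d[np.arange(len(desc1)), nn[:, 1]]
    keep = best / second < ratio  # confident only if far better than 2nd best
    return np.stack([np.nonzero(keep)[0], nn[keep, 0]], axis=1)
```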

RANSAC

To fix that, we run RANSAC, which I ran for 100 iterations with an inlier threshold of 0.5.

ransac1 ransac2

These points all match nicely.
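A sketch of the 4-point RANSAC loop, reusing computeH from Part 1 (treating the 0.5 threshold as a reprojection distance is my assumption):

```python
import numpy as np

def ransac_homography(pts1, pts2, n_iters=100, eps=0.5):
    """Largest consensus set over random 4-point samples,
    then a final least-squares homography fit on all inliers."""
    best_inliers = np.zeros(len(pts1), dtype=bool)
    p1_h = np.hstack([pts1, np.ones((len(pts1), 1))])  # homogeneous im1 points
    for _ in range(n_iters):
        idx = np.random.choice(len(pts1), 4, replace=False)
        H = computeH(pts1[idx], pts2[idx])
        proj = H @ p1_h.T
        proj = (proj[:2] / proj[2]).T  # projected into im2's frame
        inliers = np.linalg.norm(proj - pts2, axis=1) < eps
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on every inlier for the final answer.
    return computeH(pts1[best_inliers], pts2[best_inliers]), best_inliers
```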

Results

Auto: mosaic1
By Hand: mosaic1
Auto: mosaic2
By Hand: mosaic1
Auto: mosaic3
By Hand: mosaic1

Coolest thing I learned

I learned that implementing a research paper (albeit an easier version) isn't nearly as intimidating as I thought!