This project consisted of taking pictures from the same position in space but facing different directions, recovering the homographies between the pictures (first from manually chosen correspondences, then automatically with feature detection), and finally warping the images with those homographies to create a panorama.
I took a few sets of pictures inside and outside my house and of a painting on my wall.
I annotated the four corners in each image using matplotlib's ginput:
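The annotation step can be sketched roughly as follows. The function names here (`annotate_corners`, `order_corners`) are my own illustrative choices, not necessarily what the project used; the only real API involved is matplotlib's `ginput`. Sorting the clicks into a consistent order is a small trick that makes the correspondences line up even if the corners are clicked in different orders in the two images:

```python
import numpy as np

def annotate_corners(img, n=4):
    """Show the image and collect n clicked points interactively."""
    import matplotlib.pyplot as plt  # imported here: only needed interactively
    plt.imshow(img)
    pts = plt.ginput(n, timeout=0)   # blocks until n points are clicked
    plt.close()
    return np.asarray(pts)           # shape (n, 2), columns are (x, y)

def order_corners(pts):
    """Sort four points into TL, TR, BR, BL order so correspondences
    between two annotated images line up regardless of click order."""
    pts = np.asarray(pts, float)
    s = pts.sum(axis=1)          # x + y: smallest = top-left, largest = bottom-right
    d = pts[:, 0] - pts[:, 1]    # x - y: largest = top-right, smallest = bottom-left
    return np.array([pts[s.argmin()], pts[d.argmax()],
                     pts[s.argmax()], pts[d.argmin()]])
```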
I computed the homography matrix between the two sets of points and obtained:
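A minimal sketch of the homography solve, assuming the standard least-squares formulation: with the scale fixed by setting h33 = 1, each point correspondence gives two linear equations in the remaining eight unknowns, so four or more correspondences can be solved with `np.linalg.lstsq`. The helper names are hypothetical:

```python
import numpy as np

def compute_homography(src, dst):
    """Solve for H (with H[2,2] = 1) such that dst ~ H @ src in
    homogeneous coordinates, via least squares on 2 equations per point."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pts):
    """Map (n, 2) points through H and divide out the homogeneous scale."""
    ph = np.column_stack([np.asarray(pts, float), np.ones(len(pts))]) @ H.T
    return ph[:, :2] / ph[:, 2:]
```

With more than four correspondences the overdetermined system is solved in the least-squares sense, which is what makes the later, automatically detected (and noisier) matches usable.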
Warping one image to match the other gave a decent result:
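The warp itself can be sketched as an inverse warp: for each output pixel, map its coordinates backwards through H⁻¹ and sample the source image there, so no holes appear in the output. This version uses nearest-neighbor sampling for brevity (interpolation would look smoother); the function name and signature are illustrative:

```python
import numpy as np

def warp_image(img, H, out_shape):
    """Inverse-warp img by homography H into an output of shape (h, w).
    Nearest-neighbor sampling; out-of-bounds pixels stay zero."""
    Hinv = np.linalg.inv(H)
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    sx, sy, sw = Hinv @ coords          # source locations, homogeneous
    sx = np.round(sx / sw).astype(int)  # divide out scale, then round
    sy = np.round(sy / sw).astype(int)
    out = np.zeros(out_shape + img.shape[2:], dtype=img.dtype)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out.reshape(h * w, -1)[valid] = img[sy[valid], sx[valid]].reshape(valid.sum(), -1)
    return out
```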
After manually defining a rectangle with the proper dimensions, I used the resulting homography matrix to rectify both of the images:
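The rectification step amounts to mapping the four clicked corners to a rectangle whose aspect ratio matches the real object. A sketch, using the SVD form of the direct linear transform (the homography is the null vector of the stacked constraint matrix); the corner coordinates and painting dimensions below are made up for illustration:

```python
import numpy as np

def homography_svd(src, dst):
    """DLT via SVD: two constraint rows per correspondence; the
    homography is the right singular vector of the smallest singular value."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Hypothetical clicked corners (TL, TR, BR, BL) and a target rectangle
# whose aspect ratio matches the physical painting, e.g. 60cm x 40cm.
corners = [(120, 80), (540, 110), (560, 430), (100, 400)]
rect = [(0, 0), (600, 0), (600, 400), (0, 400)]
H = homography_svd(corners, rect)
```

Warping with this H then fronto-parallelizes the painting.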
Manually chosen correspondences actually gave a pretty bad panorama, because the clicked points were slightly misaligned:
After running Harris corner detection on one of my images, I got so many corners that it was difficult to tell what was actually a corner. Coloring the detected corner pixels red just turned most of the image red:
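A minimal Harris response sketch, assuming `scipy.ndimage` for the gradient and smoothing steps (the parameter values are conventional defaults, not necessarily what I used). Thresholding this response map naively is exactly what produces the flood of corners described above:

```python
import numpy as np
from scipy import ndimage

def harris_response(img, sigma=1.0, k=0.05):
    """Harris corner response R = det(M) - k * trace(M)^2, where M is the
    Gaussian-weighted second-moment matrix of the image gradients."""
    img = img.astype(float)
    Ix = ndimage.sobel(img, axis=1)             # horizontal gradient
    Iy = ndimage.sobel(img, axis=0)             # vertical gradient
    Ixx = ndimage.gaussian_filter(Ix * Ix, sigma)
    Iyy = ndimage.gaussian_filter(Iy * Iy, sigma)
    Ixy = ndimage.gaussian_filter(Ix * Iy, sigma)
    det = Ixx * Iyy - Ixy ** 2
    trace = Ixx + Iyy
    return det - k * trace ** 2                  # high at corners, low on edges
```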
After implementing everything from the MOPS (Multi-Scale Oriented Patches) paper, I got good results. For the last image, I combined three different viewpoints to create a bigger panorama.
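One key piece of the MOPS pipeline that tames the Harris flood is adaptive non-maximal suppression: keep only the corners whose suppression radius (distance to the nearest significantly stronger corner) is largest, which spreads the kept corners evenly over the image. A sketch, with the O(n²) loop kept for clarity; the function name and the `c_robust` default follow the paper's convention but the exact interface is my own:

```python
import numpy as np

def anms(coords, strengths, n_keep=500, c_robust=0.9):
    """Adaptive non-maximal suppression (MOPS): for each corner, find the
    squared distance to the nearest corner that is sufficiently stronger
    (strength > s / c_robust), then keep the n_keep largest radii."""
    coords = np.asarray(coords, float)
    strengths = np.asarray(strengths, float)
    radii = np.full(len(coords), np.inf)   # global maximum keeps radius = inf
    for i, (p, s) in enumerate(zip(coords, strengths)):
        stronger = strengths > s / c_robust
        if stronger.any():
            radii[i] = np.sum((coords[stronger] - p) ** 2, axis=1).min()
    keep = np.argsort(radii)[::-1][:n_keep]
    return coords[keep]
```

Spatially even corners matter because a homography fit from corners clustered in one region extrapolates badly across the rest of the overlap.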
The coolest thing I learned from this project was how to be a numpy pro! I spent so long fiddling with numpy arrays.