This project explores how to warp, interpolate, and morph faces. We apply Delaunay triangulation, affine transformations, interpolation, and cross-dissolution to transform one face into another.

To define correspondence points, I labeled 82 points in Photoshop, following several well-known face templates (Face++/Mod++). I then added pseudo-points at the corners of the images to help with triangulation. Finally, I averaged the two point arrays and computed a Delaunay triangulation on the averaged points. Here is an image of the labeled points, along with the triangulation overlaid on the image.
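The triangulation step can be sketched as follows. This is a minimal example: the point arrays here are random placeholders standing in for the hand-labeled coordinates, and the 86-point count (82 labeled + 4 corners) follows the description above.

```python
import numpy as np
from scipy.spatial import Delaunay

# Placeholder point arrays: 82 labeled points plus 4 corner pseudo-points
# per image (86 total). Real code would load the hand-labeled coordinates.
rng = np.random.default_rng(0)
pts_im1 = rng.uniform(0, 500, size=(86, 2))
pts_im2 = rng.uniform(0, 500, size=(86, 2))

# Triangulate on the average shape so both images share one triangulation.
avg_pts = (pts_im1 + pts_im2) / 2.0
tri = Delaunay(avg_pts)

# tri.simplices holds index triples into avg_pts, one row per triangle.
print(tri.simplices.shape[1])  # 3 vertices per triangle
```

Triangulating the averaged shape (rather than either image's own points) is what lets the same triangle indices be reused for both images during warping.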

This section was by far the most difficult to understand conceptually. To find the midway face, we first need to warp and interpolate from the source shape to the destination shape (our face image's points and the averaged points, respectively). To do this, we loop through each triangle in the triangulation, compute the affine transformation from the source triangle to the destination triangle, rasterize a polygon mask of the destination triangle, and apply the inverse of the affine transformation to the masked pixel coordinates to find where each destination pixel comes from in the source image. Afterwards, we interpolate values at those source coordinates within each color channel. I probably spent upwards of 10 hours trying to figure out how to interpolate. Finally, we blend the two warped images with $$\alpha \cdot warped_{im1} + (1-\alpha) \cdot warped_{im2}$$
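The per-triangle inverse warp described above can be sketched like this. This is a simplified grayscale version using nearest-neighbor sampling (the write-up's per-channel interpolation is a drop-in upgrade), and the function names are my own, not the author's.

```python
import numpy as np
from skimage.draw import polygon

def compute_affine(src_tri, dst_tri):
    # Solve for the 3x3 matrix A such that A @ [x, y, 1]^T maps each
    # source vertex to its destination vertex (homogeneous coordinates).
    src = np.hstack([src_tri, np.ones((3, 1))])
    dst = np.hstack([dst_tri, np.ones((3, 1))])
    return np.linalg.solve(src, dst).T

def warp_triangle(src_im, src_tri, dst_tri, out_im):
    A_inv = np.linalg.inv(compute_affine(src_tri, dst_tri))
    # Rasterize the destination triangle to get its pixel mask.
    rr, cc = polygon(dst_tri[:, 1], dst_tri[:, 0], out_im.shape[:2])
    # Inverse-map destination pixels back into the source image.
    dst_h = np.stack([cc, rr, np.ones_like(rr)])  # homogeneous (x, y, 1)
    src_x, src_y, _ = A_inv @ dst_h
    # Nearest-neighbor sampling; bilinear interpolation works the same way.
    src_x = np.clip(np.round(src_x).astype(int), 0, src_im.shape[1] - 1)
    src_y = np.clip(np.round(src_y).astype(int), 0, src_im.shape[0] - 1)
    out_im[rr, cc] = src_im[src_y, src_x]
```

Inverse warping (destination pixel → source location) avoids the holes that forward-mapping source pixels would leave in the output.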

To morph, I generated 20 frames, each with a different value of alpha. My morph sequence is below.
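A sketch of the per-frame schedule and the cross-dissolve, using the blend equation above (the alpha range and variable names here are assumptions; each frame would also warp both images to the per-frame intermediate shape before blending):

```python
import numpy as np

# Alpha runs from 1 (pure image 1) to 0 (pure image 2) over 20 frames,
# matching the blend equation alpha*warped_im1 + (1-alpha)*warped_im2.
num_frames = 20
alphas = np.linspace(1.0, 0.0, num_frames)

def blend(warped_im1, warped_im2, alpha):
    # Cross-dissolve two images already warped to the same
    # intermediate shape (mid_pts = alpha*pts1 + (1-alpha)*pts2).
    return alpha * warped_im1 + (1 - alpha) * warped_im2
```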

For this section, I used the Danes IMM face database. In addition to the fiducial points it provides, I added four more at the corners to help with triangulation.

I then averaged the points across the entire database to get the average template shape. To compute the mean face, I applied Delaunay triangulation to this average template and used it as the destination shape. I looped through each triangle and each image, warping/interpolating each source shape to the destination. I then blended the warped images by averaging them pixel-wise. The mean face is shown below.
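The mean-face computation can be sketched as below. `warp_image` here is a hypothetical helper (warping one image from its own points to a destination point set, triangle by triangle, as described earlier), not a function from the write-up.

```python
import numpy as np

def mean_face(images, pts_all, warp_image):
    # pts_all: (num_images, num_points, 2) array of fiducial + corner points.
    # warp_image(im, src_pts, dst_pts): hypothetical triangle-by-triangle
    # warp of one image into the destination shape.
    avg_pts = pts_all.mean(axis=0)                 # average template shape
    warped = [warp_image(im, pts, avg_pts)
              for im, pts in zip(images, pts_all)]
    return np.mean(warped, axis=0)                 # pixel-wise average
```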

Here are some examples of faces being morphed to the mean face. The top row is the original image and the bottom row is the morphed image.

Then I attempted to warp my face to the mean face. Instead of re-annotating my points, I noticed that all of the points in the Danes template were contained in my annotation, so I subselected those points and moved the corner points of the mean image inward to match my corner points. I additionally added white space on either side of my face so its image size matched the mean face.

My face warped to the mean face shape is below. Unfortunately, it looks a little distorted, possibly because I was looking down in my picture. My face is also turned slightly to the left.

I then attempted to morph the mean face to my face shape. The results are less distorted, but you can definitely see a change in head tilt, face shape, and head direction.

To make a caricature, I extrapolated away from the population mean, scaling the deviation of my face's points from the mean by an alpha greater than 1.
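The extrapolation step is a one-liner on the point sets; the function name and default alpha here are my own for illustration.

```python
import numpy as np

def caricature_points(face_pts, mean_pts, alpha=1.5):
    # Extrapolate past the population mean: alpha > 1 exaggerates the
    # ways this face deviates from the mean; alpha = 1 returns the face.
    return mean_pts + alpha * (face_pts - mean_pts)
```

The resulting point set is then used as the destination shape for the same warp as before.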

I attempted to morph my partner's face into the average female space for the Danes template. I did this by selecting only the female faces, averaging their points, and computing the mean female image in the same way as before.

Unfortunately, there was distortion again. This may have to do with image spacing issues.

I then tried an average female face image found online and hand-annotated it to match my original annotation scheme. Again, I found a lot of distortion, which also seems to be caused by image size/spacing issues.