CS 194-26, Project 4: Face Morphing

Midway Face

This part of the project laid the groundwork for all the other parts and was thus the hardest to implement. To compute the midway face between two faces, we first find the average geometry by averaging corresponding points defined manually (or taken from predefined geometry). From this, we compute a Delaunay triangulation, and then compute warps between the average shape and each face's full shape. Finally, we morph each pixel by finding which triangle it lies in, using that triangle's inverse affine transformation matrix to map it back into the source geometry, and then weighting each warped image by 0.5 in the cross-dissolve. Results were as follows:
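The per-triangle mapping above reduces to solving for a 3x3 affine matrix that carries one triangle's vertices onto another's. A minimal sketch of that step (function name is my own, not from the project code):

```python
import numpy as np

def compute_affine(tri_src, tri_dst):
    """Solve for the 3x3 affine matrix A such that A @ [x, y, 1]^T maps
    each vertex of tri_src onto the corresponding vertex of tri_dst.
    tri_src, tri_dst: (3, 2) arrays of triangle vertices."""
    # Homogeneous coordinates; each column is one vertex.
    src = np.vstack([tri_src.T, np.ones(3)])  # (3, 3)
    dst = np.vstack([tri_dst.T, np.ones(3)])  # (3, 3)
    # A @ src = dst  =>  A = dst @ inv(src)
    return dst @ np.linalg.inv(src)
```

Inverting this matrix gives the inverse warp used to look up each destination pixel in the source image.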

This process was modularized so that, in future parts, we could reuse the warping logic without duplicating code; in particular, methods were created to warp a face given an image, its set of points, and a target geometry.
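The reusable warp helper described above might look like the following sketch: an inverse warp driven by barycentric coordinates rather than explicit matrix inversion per pixel (an equivalent formulation; the function name and exact signature are assumptions, not the project's actual code):

```python
import numpy as np
from scipy.spatial import Delaunay

def warp(img, src_pts, dst_pts, simplices):
    """Inverse-warp img so its control points src_pts land on dst_pts.
    simplices: (t, 3) vertex-index triangles, e.g. from a Delaunay
    triangulation of the average geometry. Points are in (x, y) order."""
    h, w = img.shape[:2]
    flat = img.reshape(h * w, -1)
    out = np.zeros_like(flat)
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])  # homogeneous
    for s in simplices:
        dst_tri = np.vstack([dst_pts[s].T, np.ones(3)])  # columns = vertices
        src_tri = np.vstack([src_pts[s].T, np.ones(3)])
        # Barycentric coordinates of every pixel w.r.t. the dest triangle.
        bary = np.linalg.solve(dst_tri, pix)
        inside = np.all(bary >= -1e-9, axis=0)
        # The same barycentric weights applied to the source triangle give
        # each pixel's pre-image; sample with nearest neighbour.
        src_xy = src_tri @ bary[:, inside]
        sx = np.clip(np.round(src_xy[0]).astype(int), 0, w - 1)
        sy = np.clip(np.round(src_xy[1]).astype(int), 0, h - 1)
        out[inside] = flat[sy * w + sx]
    return out.reshape(img.shape)
```

Because the triangulation is computed once on the shared (average) geometry and only the vertex coordinates change, the same simplex list can drive warps of either image toward any intermediate shape.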

Morph Sequences

Using the same logic as in the previous part, we were able to create full morph-sequence GIFs by varying the blending weight over the course of 45 frames. Instead of the flat 0.5 * image1_points + 0.5 * image2_points, frame k (out of N total frames) uses
(1 - k/(N-1)) * image1_points + (k/(N-1)) * image2_points. The images were also cross-dissolved at the same rate using the same weights. Compiling the frames into one GIF using imageio's mimsave, we get the following GIFs for the same source images as above. I had to compress the GIFs, so they contain roughly half of the full 45 frames.
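A minimal sketch of the frame loop, assuming a warp_fn(img, src_pts, dst_pts, simplices) helper like the one described in the midway-face part (the helper's name and signature are assumptions):

```python
import numpy as np

def morph_sequence(im1, im2, pts1, pts2, simplices, warp_fn, num_frames=45):
    """Build cross-dissolved morph frames between two annotated faces."""
    frames = []
    for k in range(num_frames):
        t = k / (num_frames - 1)             # 0 on the first frame, 1 on the last
        mid_pts = (1 - t) * pts1 + t * pts2  # intermediate geometry
        warped1 = warp_fn(im1, pts1, mid_pts, simplices)
        warped2 = warp_fn(im2, pts2, mid_pts, simplices)
        # Cross-dissolve with the same weight used for the geometry.
        frames.append(((1 - t) * warped1 + t * warped2).astype(np.uint8))
    return frames

# Writing the animated GIF:
# imageio.mimsave("morph.gif", frames)
```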

"Mean Face"

Using the same methods as for the midway face, we can compute averages over a whole population by weighting each face by 1/(number of faces) rather than 1/2. Using the Dane dataset, we compute the mean face. We can then morph individual population faces onto the mean and vice versa. I also morphed my face onto the population mean and vice versa, which didn't turn out too well. I believe this is because of the angle I took the photo at and my positioning relative to the camera: my head was a little too close, so my forehead came out oversized, and since no correspondence points in the dataset constrain forehead placement, the poor result is consistent with this explanation.
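The 1/N generalization can be sketched as follows, again assuming the hypothetical warp_fn helper from the earlier parts:

```python
import numpy as np

def population_mean(images, point_sets, warp_fn, simplices):
    """Average geometry first, then average appearance: each face is
    warped to the mean shape before the 1/N cross-dissolve."""
    n = len(images)
    mean_pts = sum(point_sets) / n  # 1/N weighting instead of 1/2
    warped = [warp_fn(im, pts, mean_pts, simplices)
              for im, pts in zip(images, point_sets)]
    mean_face = sum(w.astype(float) for w in warped) / n
    return mean_face, mean_pts
```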

Population mean

Each face warped to mean

My face warped to mean

Mean warped to my face


In this part, we extrapolate from the mean: instead of warping my face directly to the mean, we scale the difference between my face's geometry and the mean geometry by a scalar. As the scalar grows more positive or more negative, the resulting images accentuate or suppress the ways my face differs from the mean.
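One formulation of that extrapolation, chosen to be consistent with the scalars shown (an assumption on my part; at -1 the shape lands exactly on the mean, at 0 it is my own face, and above 0 the differences are exaggerated):

```python
import numpy as np

def extrapolate_shape(my_pts, mean_pts, alpha):
    """Caricature geometry: scale my face's offset from the mean by alpha.
    alpha = -1 -> the mean shape; alpha = 0 -> my shape; alpha > 0 ->
    exaggerated differences."""
    return my_pts + alpha * (my_pts - mean_pts)
```

The extrapolated point set is then used as the target geometry for the usual triangle warp.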

-1 -0.5 0 0.5 1 1.5 2

Bells/Whistles: Male/Female Changing

I tried essentially the same extrapolation method to "change the gender" of my picture, extrapolating from the average of the female faces instead of the average of the male faces (so -1 should be the most female version and +2 the most male version). However, given that the extrapolation didn't work especially well, there is not much difference between those results and these images. You can notice, though, that as the scalar increases, the size of the nose, lips, etc. grows, so one could reasonably guess that the lips of the female average are rather small (or at least smaller than the lips in my picture).

-1 -0.5 0 0.5 1 1.5 2