Project 3: Face Morphing

Chelsea Ye, cs194-26-agb


Overview

In this project I produce a 'morph' animation of my face into someone else's face, compute the mean of a population of faces, and extrapolate from the population mean to create a caricature of myself. The morphing process consists of warping both images into a "mean" shape and cross-dissolving the colors between the two images.

Defining Correspondences

The warp is controlled by defining a correspondence between the two pictures. The correspondence should map eyes to eyes, mouth to mouth, chin to chin, ears to ears, etc., to get the smoothest transformation possible. I start by defining 40 pairs of corresponding points on the two images by hand, using the same ordering. I then produce a set of mean points by averaging the two point sets and compute a Delaunay triangulation of the mean points. A Delaunay triangulation is a good choice because it avoids overly skinny triangles.
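A minimal sketch of this step in Python, assuming the hand-picked points are stored as (x, y) NumPy arrays; the file names and variable names here are my own placeholders, not the project's actual code:

    import numpy as np
    from scipy.spatial import Delaunay

    im1_pts = np.load("my_face_pts.npy")     # hypothetical file of 40 (x, y) points
    im2_pts = np.load("baekhyun_pts.npy")    # the same 40 points, same ordering

    mean_pts = (im1_pts + im2_pts) / 2.0     # average shape of the two point sets
    tri = Delaunay(mean_pts)                 # triangulate the mean shape
    # tri.simplices is a (num_triangles, 3) array of indices into mean_pts;
    # the same index triples define the corresponding triangles in both images.

Triangulating the mean shape (rather than either original shape) keeps the triangulation reasonable for both images at once.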


40 correspondence points on my face
Computing the "Mid-way Face"

The set of mean points from the previous part defines a mean shape of the two faces. The next step is to warp both faces into this mean shape. I implement an affine warp for each triangle in the triangulation, from the original images into the new shape. In computeAffine(tri1_pts, tri2_pts) I compute the affine transformation matrix A between two triangles defined by the vertices tri1_pts and tri2_pts. Finally, I linearly interpolate (cross-dissolve) the colors of the two warped images.
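One way computeAffine might look, assuming each triangle is a (3, 2) array of (x, y) vertices in corresponding order (a sketch, not necessarily the exact project code):

    import numpy as np

    def computeAffine(tri1_pts, tri2_pts):
        # tri1_pts, tri2_pts: (3, 2) arrays of (x, y) triangle vertices in
        # corresponding order. Returns the 3x3 homogeneous matrix A such that
        # A @ [x, y, 1]^T maps triangle 1 onto triangle 2.
        src = np.vstack([np.asarray(tri1_pts, dtype=float).T, np.ones(3)])  # 3x3
        dst = np.vstack([np.asarray(tri2_pts, dtype=float).T, np.ones(3)])  # 3x3
        return dst @ np.linalg.inv(src)

In practice the warp is usually done as an inverse warp: for each pixel inside a destination (mean-shape) triangle, apply the inverse of A to find where to sample the source image, which avoids holes in the output.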
Here I morph my face into Korean singer Baekhyun's face and Chinese actress Ju Jingyi's face.


My face

Baekhyun's face

Mean face

My face

Ju Jingyi's face

Mean face
The Morph Sequence

I implement morph(im1, im2, im1_pts, im2_pts, tri) to compute the warp and color dissolve at a parameter t, then produce 45 intermediate shape/color frames by sweeping t through [0, 1]. The frames are used to construct an animation of the morphing process.
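A sketch of the frame loop, assuming morph() takes the fraction t as a final argument and returns a float image in [0, 1]; imageio is my choice for writing the GIF and is not necessarily what the project used:

    import numpy as np
    import imageio

    # 45 frames: t = 0 gives image 1, t = 1 gives image 2.
    frames = [morph(im1, im2, im1_pts, im2_pts, tri, t) for t in np.linspace(0, 1, 45)]
    imageio.mimsave("morph.gif", [(np.clip(f, 0, 1) * 255).astype(np.uint8) for f in frames])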



The "Mean face" of a population

In this part I compute the average face of a group of Danish computer scientists from an online face dataset, split into subsets by gender and facial expression. Then I warp my face into the average geometry and warp the average face into my geometry.
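Roughly, the population mean is computed by averaging the annotated shapes, warping every face to that mean shape, and averaging the warped pixels. The sketch below assumes shapes is an (N, num_pts, 2) array and warp_to_shape is a stand-in for the triangle-by-triangle affine warp from the mid-way-face section:

    import numpy as np

    def population_mean_face(images, shapes, warp_to_shape):
        # shapes: (N, num_pts, 2) array of annotated points, one row per face.
        # warp_to_shape(image, src_pts, dst_pts) is a placeholder for the
        # triangle-by-triangle affine warp described above.
        mean_shape = np.mean(shapes, axis=0)
        warped = [warp_to_shape(im, pts, mean_shape) for im, pts in zip(images, shapes)]
        return np.mean(warped, axis=0), mean_shape

"My face in average geometry" reuses the same warp with (my points -> mean shape), and the reverse warp gives the average face in my geometry.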


Average male face with no facial expressions

Average female face with no facial expressions

Average female face with smile

My face in average female geometry

Average female face in my geometry
Caricatures: Extrapolating from the mean

In this part I produce a caricature of my face by extrapolating from the population mean of female computer scientists with no facial expression. The extrapolation works by adding a scaled difference between my face shape and the mean face shape back to my face shape. The following example images are computed with different ratios.
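A sketch of the extrapolation, where my_pts and mean_pts are the corresponding point sets and alpha is the ratio quoted in the captions below (names are my own placeholders):

    import numpy as np

    def caricature_shape(my_pts, mean_pts, alpha):
        # alpha = 0 returns my own shape; larger alpha exaggerates the ways
        # my geometry differs from the population mean.
        return np.asarray(my_pts) + alpha * (np.asarray(my_pts) - np.asarray(mean_pts))

My face is then warped from my own shape into the extrapolated shape.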


0.3x extrapolation

0.5x extrapolation

0.8x extrapolation
Bells and Whistles

  • We made a face morphing music video with some students in the class. Check it out here!
  • I change the gender of my face to male by using the population mean faces from Part 4: I compute the geometric difference between the female and male mean faces and add it to my face geometry (see the sketch after the images below).

Me with 0.5x male

Me with 0.8x male

Me with 1x male
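A sketch of the gender-swap geometry, assuming female_mean_pts and male_mean_pts come from the Part 4 subsets and beta is the blend ratio (0.5, 0.8, 1.0 above); the names are my own placeholders:

    import numpy as np

    def gender_swap_shape(my_pts, female_mean_pts, male_mean_pts, beta):
        # Move my geometry along the female-to-male mean direction by a ratio beta.
        return np.asarray(my_pts) + beta * (np.asarray(male_mean_pts) - np.asarray(female_mean_pts))

My face is then warped from my own shape into the swapped shape.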