Modeling Urban Dance Textures

Jason Zou

[Report] [Github]

Skilled dancers are often characterized by a unique style of movement that distinguishes them from others, suggesting that a computational representation of these audio-dependent movement patterns may be possible. For this project, we tackle a small slice of that bigger picture: using motion graphs to generate novel motions in the style of a target subject, independently of audio.


Top-Level Approach

[2] [3] [5]
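The page doesn't spell out the construction, but in the standard motion-graph formulation from the cited works, each frame keeps its natural playback edge to the next frame, and transition edges are added between sufficiently similar poses. A minimal sketch under those assumptions, treating each frame as a flattened joint-position vector (the distance function and threshold here are illustrative, not the project's actual parameters):

```python
import math

def pose_distance(a, b):
    # Euclidean distance between two flattened pose vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build_motion_graph(frames, threshold):
    # Nodes are frame indices. Every frame keeps its natural playback
    # edge; transition edges connect non-adjacent similar poses.
    graph = {i: [] for i in range(len(frames))}
    for i in range(len(frames) - 1):
        graph[i].append(i + 1)  # natural playback edge
    for i in range(len(frames)):
        for j in range(len(frames)):
            if abs(i - j) > 1 and pose_distance(frames[i], frames[j]) < threshold:
                graph[i].append(j)  # transition edge between similar poses
    return graph

# Toy example: four "frames" of 2-D poses; frames 0 and 3 are near-identical,
# so the graph gains transition edges 0 -> 3 and 3 -> 0.
frames = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [0.1, 0.0]]
print(build_motion_graph(frames, threshold=0.5))
# {0: [1, 3], 1: [2], 2: [3], 3: [0]}
```

The transition edges are what let a traversal leave the original frame order and splice clips together into novel motion.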

Sample Results

Original Data
All rights belong to the original creators.

Generated Clips
The top clip shows a standard shortest-path solution between the start and end frames, while the bottom clip shows a "creativity"-influenced solution. Because the dataset, and therefore the motion graph, is small, the only difference between the two methods on this particular traversal (with randomness controlled) is that the wiggle at the beginning of the motion is completed. Otherwise, the generated motion is exactly as seen in ETA.
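The write-up doesn't detail how the "creativity" influence works, so the following is one plausible sketch, not the project's actual method: plain Dijkstra for the shortest path, versus a variant that discounts each edge by a seeded random amount so that unusual transitions become cheaper and the solver occasionally takes a longer detour. The cost model, discount scheme, and parameter names are all illustrative assumptions:

```python
import heapq
import random

def find_path(graph, costs, start, goal, creativity=0.0, seed=0):
    # Dijkstra over the motion graph. With creativity > 0, each edge
    # cost is reduced by a seeded random discount (clamped at zero),
    # biasing the search toward less obvious transitions.
    rng = random.Random(seed)
    discount = {edge: rng.random() for edge in costs}
    frontier = [(0.0, start, [start])]
    settled = {}
    while frontier:
        dist, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in settled and settled[node] <= dist:
            continue
        settled[node] = dist
        for nxt in graph.get(node, []):
            c = max(0.0, costs[(node, nxt)] - creativity * discount[(node, nxt)])
            heapq.heappush(frontier, (dist + c, nxt, path + [nxt]))
    return None

# Toy graph: a cheap direct hop 0 -> 3 vs. a longer detour 0 -> 1 -> 2 -> 3.
graph = {0: [1, 3], 1: [2], 2: [3]}
costs = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (0, 3): 2.5}
print(find_path(graph, costs, 0, 3))                        # [0, 3]
print(find_path(graph, costs, 0, 3, creativity=0.5))        # [0, 1, 2, 3]
```

With a small graph, most discounts leave the shortest path unchanged, which is consistent with the near-identical clips above.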

Sample Failure Case

Shortest Path
"Creative" Path
Here the two methods produce identical paths, likely because the small dataset limits the connectivity of the motion graph: with few alternative transitions available, the "creative" traversal has nowhere to deviate.


Sample Non-Constrained Traversals

True Random Walk
Pseudo-Random Walk
Examples of non-constrained motion graph traversal, loosely corresponding to "freestyle". The pseudo-random walk corresponds to our "creativity" method applied in a non-constrained setting.
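The two traversal modes can be sketched as follows. The true random walk picks a uniformly random outgoing edge at every step; for the pseudo-random walk, one plausible reading of the "creativity" bias (an assumption, not the project's documented method) is to avoid recently visited frames whenever a fresh alternative exists. The `memory` window size is likewise illustrative:

```python
import random

def random_walk(graph, start, steps, rng):
    # True random walk: uniformly random outgoing edge each step.
    path = [start]
    for _ in range(steps):
        options = graph.get(path[-1])
        if not options:
            break  # dead end: no outgoing edges
        path.append(rng.choice(options))
    return path

def pseudo_random_walk(graph, start, steps, rng, memory=4):
    # Pseudo-random walk: prefer frames not seen in the last
    # `memory` steps, falling back to any edge at a dead end of
    # fresh options (illustrative "creativity" bias).
    path = [start]
    for _ in range(steps):
        options = graph.get(path[-1])
        if not options:
            break
        recent = set(path[-memory:])
        fresh = [n for n in options if n not in recent]
        path.append(rng.choice(fresh or options))
    return path

# Toy graph: a fully connected triangle of frames.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(random_walk(g, 0, 5, random.Random(1)))
print(pseudo_random_walk(g, 0, 5, random.Random(1)))
```

Avoiding recent frames discourages the tight loops a uniform walk tends to fall into, which is one way to make "freestyle" output feel less repetitive.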