What is it about?

Computer games and simulations are getting closer to creating environments that look and feel like the real world. Developers use motion capture to make the movements of computer-generated models realistic: a computer model imitates the actions of people recorded in the real world. However, this method is limited to the kinds of actions that have been recorded. This paper presents a data-driven method that generates realistic movements for players in a basketball game. Using motion-capture data, the model learns how two basketball players interact with each other and with the ball. It then produces movements that reflect the game's offensive and defensive moves. These moves are mapped to a game controller, allowing a player to perform detailed basketball skills with in-game characters. The method can also produce motions that involve multiple contacts, such as sitting on chairs, opening doors, and carrying objects.
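To give a feel for what "mapped to a game controller" means, here is a minimal, purely illustrative sketch. The button names, move labels, and lookup-table structure are all assumptions for illustration; in the paper's system, controller signals are fed into a learned neural network rather than a fixed table.

```python
# Hypothetical controller-to-move mapping (illustration only, not the
# paper's actual interface). Each button requests a basketball skill,
# and the analog stick supplies a movement direction.
BUTTON_TO_MOVE = {
    "A": "shoot",
    "B": "crossover_dribble",
    "X": "pass",
    "Y": "steal",
}

def select_move(button, stick_x, stick_y):
    """Combine a button press with the stick direction to pick a
    target action and a movement direction for the character."""
    action = BUTTON_TO_MOVE.get(button, "idle_dribble")
    direction = (stick_x, stick_y)
    return action, direction

# Example: press A while pushing the stick forward.
move = select_move("A", 0.0, 1.0)
```

In the actual system, such control signals condition a model trained on motion-capture data, so the resulting animation adapts smoothly to the character's current pose and the ball's position rather than playing a canned clip.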


Why is it important?

A basketball player's movements change constantly as they dribble or intercept the ball. Capturing the many simultaneous contacts between the body, the ball, and the ground is a complex task. Rather than describe the whole body with a single animation signal, this system tracks the motion of each body part separately and learns how the parts interact with external objects. With this approach, characters can move in many ways in scenarios that demand fast, dynamic movements, such as a basketball game.

KEY TAKEAWAY: The method simulates a wide range of movements for in-game characters, demonstrated primarily in a basketball game, and it generalizes to other movements involving multiple contacts. The animation system can be used to create characters for VR sports training and video games.
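The per-body-part idea can be sketched with a toy "local phase" signal, in the spirit of the paper's title. This is a simplified assumption, not the paper's actual fitting procedure: here each body part's phase simply advances from 0 to 2π between its successive contact events (e.g. a hand touching the ball) and is encoded as (sin, cos) so it stays smooth and continuous.

```python
import math

def local_phase(contact_frames, total_frames):
    """Toy local-phase signal for one body part (hypothetical helper).
    The phase angle grows linearly from 0 to 2*pi between consecutive
    contact events, giving each limb its own independent timing."""
    phases = []
    for t in range(total_frames):
        # Find the contact events surrounding frame t.
        prev = max((c for c in contact_frames if c <= t), default=0)
        nxt = min((c for c in contact_frames if c > t), default=total_frames)
        theta = 2 * math.pi * (t - prev) / max(nxt - prev, 1)
        # Encode as (sin, cos) so the signal has no discontinuities.
        phases.append((math.sin(theta), math.cos(theta)))
    return phases

# Example: a hand contacting the ball at frames 0, 10, and 20
# of a 30-frame dribbling clip.
phases = local_phase([0, 10, 20], 30)
```

Because each body part gets its own phase, the hand dribbling the ball and the feet striking the ground can cycle at different rates, which is what lets the model handle fast, asynchronous contacts.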

Read the Original

This page is a summary of: Local motion phases for learning multi-contact character movements, ACM Transactions on Graphics, August 2020, ACM (Association for Computing Machinery),
DOI: 10.1145/3386569.3392450.