Bringing on-screen avatars to life is a time-consuming task, but a new AI advancement could make characters behave more realistically than ever before.
Researchers at Carnegie Mellon University, working with DeepMotion Inc., recently developed a system that could cut production times for animations such as dribbling in basketball video games.
Credit: CMU Computer Science / YouTube
Normally, these movements are captured from humans, often the basketball players themselves, wearing motion-capture suits, a process that is both time-consuming and costly.
The new system does not need people in suits, just minimal video input. It learns, via deep reinforcement learning, from motion capture of people dribbling basketballs. It's a trial-and-error process that requires millions of attempts, but the result is noticeably more plausible ball movement.
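To give a rough sense of what that trial-and-error loop involves, here is a minimal, purely illustrative sketch in Python: a toy policy is nudged at random and kept only when it tracks a reference, motion-capture-style trajectory more closely. The environment, the reward, and every name are assumptions made for illustration, not the CMU/DeepMotion code, and the deep-RL update is replaced by simple random search to keep the sketch short.

```python
import random

# Toy "reference motion": target hand heights over one dribble cycle,
# standing in for real motion-capture data (illustrative values only).
reference = [1.0, 0.7, 0.4, 0.7, 1.0, 0.7, 0.4, 0.7]

def rollout(policy):
    """Simulate one trial: here the policy (a list of joint targets) IS the motion."""
    return policy

def imitation_reward(motion):
    """Higher reward when the simulated motion stays close to the reference."""
    return -sum((m - r) ** 2 for m, r in zip(motion, reference))

# Start from a flat, lifeless pose and improve by trial and error.
policy = [0.5] * len(reference)
best = imitation_reward(rollout(policy))

for trial in range(100_000):          # the real system needs millions of trials
    candidate = [p + random.gauss(0, 0.01) for p in policy]
    reward = imitation_reward(rollout(candidate))
    if reward > best:                 # keep changes that imitate the data better
        policy, best = candidate, reward

print(f"final imitation reward: {best:.4f}")
```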
“Once the skills are learned, new motions can be simulated much faster than real-time,” said Jessica Hodgins, a professor of computer science and robotics at Carnegie Mellon.
The program learns the skills in two stages: first it learns to control the arms and master locomotion in general, and then it learns to use them to control the motion of the ball. The result is more realistic, fluid movement.
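That two-stage idea is essentially a training curriculum: lock in locomotion and arm control first, then train ball handling on top of the already-learned controller. The hypothetical sketch below shows how such staging might be wired up; the class and method names are assumptions for illustration, not the researchers' actual API.

```python
class Skill:
    """Placeholder for a learned controller (e.g., a neural-network policy)."""
    def __init__(self, name):
        self.name = name
        self.trained = False

    def train(self, episodes, conditioned_on=None):
        # In the real system this would run deep-RL episodes in a physics
        # simulator; here we only record that training happened.
        self.trained = True
        basis = f" on top of '{conditioned_on.name}'" if conditioned_on else ""
        print(f"trained '{self.name}' for {episodes} episodes{basis}")

# Stage 1: arm control and locomotion, learned without the ball.
locomotion = Skill("arms + locomotion")
locomotion.train(episodes=1_000_000)

# Stage 2: ball control, learned while reusing the stage-1 controller.
dribbling = Skill("ball control")
dribbling.train(episodes=1_000_000, conditioned_on=locomotion)
```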
Obviously, further work is needed to ensure these movements don't throw off the character's balance, and even more to extend the system to other sports, such as soccer, where maneuvers are tightly tied to balance. Nonetheless, the technology is heading in the right direction, and the games of tomorrow might look so much like the real thing that we'll have trouble telling them apart. Honestly? I can't wait.