Transdisciplinary workshop on human motion analysis and synthesis
June 26, 2015, Inria Rennes, Markov Room


9h00 - 9h15 Introduction, Julien Pettré

9h15 - 11h15 Session 1 

9h15 - 9h55 How does the sole geometry of goal oriented locomotor trajectories reveal the role of perception?

Jean-Paul Laumond, LAAS-CNRS, Toulouse, France

Summary: Goal-oriented human locomotor trajectories live in the 3-dimensional space R2 x S1. From a control viewpoint, the control space is spanned by three vector fields, corresponding respectively to forward, crab-steering and rotational velocities. At first glance, the geometry of locomotor trajectories may be explained by an optimality principle whose cost function is expressed only in that vector-field basis. This hypothesis may be explored by inverse optimal control techniques [1]. However, such a hypothesis necessarily implies some symmetry properties. We discuss an experimental protocol that invalidates the statement [2]: goal-oriented locomotor trajectories are not symmetric in R2 x S1. We then show how another cost function, which includes bearing-angle minimization, accounts for the geometry of the paths.
[1] K. Mombaur, A. Truong, J.P. Laumond, From human to humanoid locomotion: an inverse optimal control approach, Autonomous Robots, Vol. 28, N. 3, 2010.
[2] M. Sreenivasa, K. Mombaur, J.P. Laumond, Walking paths to and from a goal differ: On the role of bearing angle in the formation of human locomotion paths, PLoS ONE, Vol. 10(4), 2015.
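The cost-function hypothesis can be made concrete. Below is a minimal sketch of how such a cost might be evaluated on a discretized trajectory, combining quadratic body-frame velocity terms with a bearing-angle term; the weights are illustrative placeholders, not the fitted values from [1] or [2]:

```python
import numpy as np

def trajectory_cost(x, y, theta, dt, goal, w=(1.0, 1.0, 1.0, 1.0)):
    """Cost of a discretized locomotor trajectory in R2 x S1.

    Body-frame velocities (forward, lateral "crab", rotational) are
    penalized quadratically, plus a term penalizing the bearing angle
    (angle between the body heading and the direction to the goal).
    Weights w are illustrative, not values fitted to human data.
    """
    vx = np.gradient(x, dt)                  # world-frame velocities
    vy = np.gradient(y, dt)
    omega = np.gradient(np.unwrap(theta), dt)
    # rotate world velocities into the body frame
    v_fwd = np.cos(theta) * vx + np.sin(theta) * vy
    v_lat = -np.sin(theta) * vx + np.cos(theta) * vy
    # bearing angle toward the goal, wrapped to (-pi, pi]
    to_goal = np.arctan2(goal[1] - y, goal[0] - x)
    bearing = np.arctan2(np.sin(to_goal - theta), np.cos(to_goal - theta))
    integrand = (w[0] * v_fwd**2 + w[1] * v_lat**2
                 + w[2] * omega**2 + w[3] * bearing**2)
    return np.sum(integrand) * dt
```

Inverse optimal control then searches for the weights w that make observed human trajectories optimal; the asymmetry result in [2] is what motivates the bearing term.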

Bio: Jean-Paul Laumond, IEEE Fellow, is a roboticist. He is Directeur de Recherche at LAAS-CNRS (team Gepetto) in Toulouse, France. His research is devoted to robot motion. In the 90's, he coordinated two European Esprit projects, PROMotion (Planning RObot Motion) and MOLOG (Motion for Logistics), both dedicated to robot motion planning and control. In the early 2000's he created and managed Kineo CAM, a spin-off company from LAAS-CNRS devoted to developing and marketing motion planning technology. Kineo CAM was awarded the French Research Ministry prize for innovation and enterprise in 2000 and the third IEEE-IFR prize for Innovation and Entrepreneurship in Robotics and Automation in 2005. Siemens acquired Kineo CAM in 2012. In 2006, he launched the research team Gepetto, dedicated to human motion studies along three perspectives: artificial motion for humanoid robots, virtual motion for digital actors and mannequins, and natural motion of human beings. He teaches robotics at Ecole Normale Supérieure in Paris. His works are published in Robotics, Computer Science, Automatic Control and, recently, Neuroscience. He was the 2011-2012 recipient of the Chaire Innovation technologique Liliane Bettencourt at Collège de France in Paris. His current project, Actanthrope (ERC-ADG 340050), is devoted to the computational foundations of anthropomorphic action.

9h55 - 10h35 Behavioral Dynamics Approach to Pedestrian and Crowd Behavior

William H. Warren, Brown University, Providence, USA

Summary: Behavioral dynamics seeks to explain stable, adaptive behavior as emerging from the interaction between an agent and their environment, under physical and informational constraints [1].  To this end, we are developing a perceptually-grounded model of human locomotor behavior to account for pedestrian and crowd dynamics.
Taking a local-to-global approach, we derive a pedestrian model based on human experiments in VR, including components for steering, obstacle avoidance, interception, and pedestrian interactions such as following [2-4]. Elementary behaviors are modeled as nonlinear dynamical systems, which are linearly combined to generate more complex behavior. Multi-agent simulations are then used to predict global patterns of crowd behavior. Currently, we are investigating the local coupling between a pedestrian and multiple neighbors in a virtual crowd, which appears to be additive and to decay linearly with distance, yielding a metric neighborhood.
Reciprocally, taking a global-to-local approach, we collect motion-capture data on human crowds in key scenarios.  Patterns of crowd behavior are analyzed to estimate the local coupling and test the model.  Key scenarios such as Swarm, in which 20 participants veer left and right while staying together as a group, and Counterflow, in which two groups pass through each other, can be simulated with just a few model components.
The results support the view that pedestrian and crowd behavior emerges from local interactions, without internal models or plans, consistent with principles of self-organization.
1.  Warren W.H. 2006 The dynamics of perception and action. Psychological Review 113, 358-389.
2.  Warren W.H., Fajen B.R. 2008 Behavioral dynamics of visually-guided locomotion. In Coordination: Neural, behavioral, and social dynamics (eds. Fuchs A., Jirsa V.). Heidelberg, Springer.
3.  Rio K., Rhea C., Warren W.H. 2014 Follow the leader: Visual control of speed in pedestrian following. Journal of Vision 14(2), 4:1-16.
4.  Dachner G., Warren W.H. 2014 Behavioral dynamics of heading alignment in pedestrian following. Transportation Research Procedia 2, 69-76.
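As an illustration of the elementary behaviors described above, here is a minimal sketch of a goal-only steering dynamic in the spirit of [2]: heading behaves like a damped angular spring attracted to the goal direction, with stiffness modulated by goal distance. The parameter values and function signature are illustrative, not the model's fitted constants:

```python
import numpy as np

def steer_to_goal(pos, phi, goal, T=10.0, dt=0.01,
                  b=3.25, k_g=7.5, c1=0.4, c2=0.4, speed=1.0):
    """Euler integration of a goal-only steering dynamic: the angular
    acceleration of heading phi is a damped spring toward the goal
    direction psi_g, with stiffness decaying with goal distance d_g.
    Obstacle and neighbor terms would be added linearly in the same way.
    """
    pos = np.asarray(pos, float)
    goal = np.asarray(goal, float)
    dphi = 0.0                                   # angular velocity
    for _ in range(int(T / dt)):
        d_g = np.hypot(*(goal - pos))            # distance to goal
        psi_g = np.arctan2(goal[1] - pos[1], goal[0] - pos[0])
        # heading error wrapped to (-pi, pi]
        err = np.arctan2(np.sin(phi - psi_g), np.cos(phi - psi_g))
        ddphi = -b * dphi - k_g * err * (np.exp(-c1 * d_g) + c2)
        dphi += ddphi * dt
        phi += dphi * dt
        pos = pos + speed * dt * np.array([np.cos(phi), np.sin(phi)])
    return pos, phi
```

The linear combination of such terms (goal attraction, obstacle repulsion, neighbor coupling) is what lets multi-agent simulations of this pedestrian model generate crowd-level patterns.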

Bio: William Warren is Chancellor’s Professor of Cognitive, Linguistic, and Psychological Sciences and Director of the Virtual Environment Navigation Lab (VENLab) at Brown University.  He earned his undergraduate degree at Hampshire College (1976), a Ph.D. in Experimental Psychology from the University of Connecticut (1982), did post-doctoral work at the University of Edinburgh (1983), and has been a professor at Brown ever since, including serving as department chair (2002-10).  He uses virtual reality techniques to investigate the visual control of human action within a dynamical systems framework, with funding from NIH, NSF, and the VA. Warren is author of over 100 research articles and chapters, the editor of two volumes, and the recipient of a Fulbright Research Fellowship, an NIH Research Career Development Award, and Brown's Elizabeth Leduc Teaching Award for Excellence in the Life Sciences. 

10h35 - 11h15 Visual servoing, an intuitive motion control approach

Francois Chaumette, Inria, Rennes, France

Summary: The talk will present the basic aspects of visual servoing, that is, a closed-loop control scheme that uses visual data to control the motion of robotic systems. Applications in humanoid robotics will also be described.
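For readers unfamiliar with the topic, the canonical image-based form of the closed loop is the control law v = -lambda * L+ * e, where e is the visual feature error and L+ the pseudoinverse of the interaction matrix. A minimal sketch for point features, assuming the point depths Z are known (real systems estimate or approximate them):

```python
import numpy as np

def ibvs_velocity(features, desired, depths, lam=0.5):
    """One step of the classical image-based visual servoing law
    v = -lambda * pinv(L) @ e, for normalized image points.

    features, desired : lists of (x, y) current and desired points
    depths            : corresponding point depths Z (assumed known)
    Returns the 6-dof camera velocity (vx, vy, vz, wx, wy, wz).
    """
    L_rows, err = [], []
    for (x, y), (xd, yd), Z in zip(features, desired, depths):
        err += [x - xd, y - yd]
        # interaction (image Jacobian) matrix rows for a point feature
        L_rows.append([-1/Z, 0.0, x/Z, x*y, -(1 + x*x), y])
        L_rows.append([0.0, -1/Z, y/Z, 1 + y*y, -x*y, -x])
    L = np.array(L_rows)
    e = np.array(err)
    return -lam * np.linalg.pinv(L) @ e
```

Iterating this law drives the feature error exponentially toward zero, which is what makes the scheme intuitive: the camera simply moves so that what it sees converges to what it should see.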

Bio: François Chaumette, IEEE Fellow, is an Inria Senior Scientist, currently head of the Lagadic group at Irisa in Rennes.

11h15 - 12h15 Demonstrations @ Immersia 3

12h15 - 13h45 Lunch (Salle Sein)

13h45 - 16h25 Session 2 

13h45 - 14h25 Crowds for real-time applications

Nuria Pelechano, Universitat Politecnica de Catalunya, Barcelona, Spain

Summary: The simulation of large numbers of autonomous avatars can be a big challenge for real-time applications. In this talk I will cover some of the most relevant areas of this topic, including local movement, planning, navigation meshes and synthesis of animations. Local movement can be achieved through a combination of physical forces and rules given by psychological and geometrical factors. At the higher level, local movement needs to be driven by a navigation technique, such as multi-domain planning based on different representations of space, time and actions. By using plans in one domain to focus the search in finer domains, we can speed up the overall process of having agents navigate complex virtual environments. This talk will briefly cover how navigation meshes can be automatically generated for path finding with the NEOGEN system. Finally, I will describe techniques to synthesize animations based on footstep trajectories for scenarios that may require careful foot positioning.
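The combination of physical forces for local movement can be sketched as a social-force-style steering rule: a driving force relaxing the agent toward its desired velocity, plus exponential repulsion from nearby agents. All constants and the function itself are illustrative, not those of any particular system discussed in the talk:

```python
import numpy as np

def local_steering_force(pos, vel, goal, neighbors,
                         v0=1.3, tau=0.5, A=2.0, B=0.3, radius=0.4):
    """Social-force-style local movement for one agent.

    pos, vel  : agent position and velocity (2D arrays)
    goal      : goal position
    neighbors : positions of nearby agents
    Driving term: relax toward desired speed v0 along the goal
    direction with time constant tau. Repulsion: exponential in the
    gap between agents, pushing away along the separation direction.
    """
    e_goal = (goal - pos) / (np.linalg.norm(goal - pos) + 1e-9)
    force = (v0 * e_goal - vel) / tau
    for other in neighbors:
        diff = pos - other
        d = np.linalg.norm(diff)
        if d > 1e-9:
            force += A * np.exp((2 * radius - d) / B) * diff / d
    return force
```

Psychological and geometrical rules (e.g. preferred personal space, wall avoidance) enter the same way, as additional force terms, which is what keeps the local model cheap enough for real-time crowds.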

Bio: Nuria Pelechano is an Associate Professor at the Universitat Politecnica de Catalunya. She obtained her Engineering degree from the Universitat de Valencia, her Masters degree from University College London, and her PhD from the University of Pennsylvania as a Fulbright Scholar in 2006. During her post-doc she worked with the Architecture Department at UPenn on several technology transfer projects on crowd evacuation. Nuria is co-author of the books “Virtual Crowds: Methods, Simulation, and Control” (Morgan & Claypool, 2008) and “Virtual Crowds: Steps toward Behavioral Realism” (Morgan & Claypool, 2015). Nuria has over 30 publications in journals and international conferences on Computer Graphics and Animation. She has participated in projects funded by the EU, the Spanish Government, and several USA institutions. Her research interests include simulation, animation and rendering of crowds, generation of navigation meshes, real-time 3D graphics, and human-avatar interaction in virtual environments.

14h25 - 15h05 Assessing Agency and Self-Location in Embodied Interactions

Ronan Boulic, EPFL, Switzerland

Summary: The talk will present recent activities of the Immersive Interaction research Group on real-time interaction with virtual environments through full-body movements. We focus primarily on ensuring the Sense of Agency, i.e. making the user feel in control of the movement of the displayed avatar, but we also explore the importance of the Sense of Self-Location, i.e. having the real and virtual bodies coincide through a first-person viewpoint. The intended applications are training in virtual environments, rehabilitation and performance animation.

Bio: Ronan Boulic is a Senior Scientist and PhD Advisor at the EPFL (Ecole Polytechnique Fédérale de Lausanne). He currently leads the Immersive Interaction research group (IIG) in the School of Computer and Communication Sciences. He received the PhD degree in Computer Science in 1986 from the University of Rennes, France, and the Habilitation degree in Computer Science from the University of Grenoble, France, in 1995. He is a senior member of IEEE and of ACM, and a member of Eurographics. Ronan Boulic has co-authored more than 140 refereed publications, including 36 in ISI-indexed journals, and contributed to 10 books. He was paper co-chair of the Eurographics/SIGGRAPH Symposium on Computer Animation 2004 in Grenoble, and general chair of the same symposium in 2012 in Lausanne. He was also paper co-chair of the joint Virtual Reality conference JVRC in Madrid in 2012. He has served on over 50 program committees of key conferences in computer graphics, computer animation and virtual reality. He is Associate Editor of Wiley CAVW, Elsevier Computers & Graphics and IEEE TVCG (2011-14).

15h05 - 15h45 Motion synthesis and planning by spatial relationship descriptors

Taku Komura, University of Edinburgh, Scotland

Summary: In the areas of computer animation and robotics, synthesizing movements such as tangling limbs, passing through constrained environments, or wrapping and winding cloth or ropes around objects is considered a difficult problem. Making use of descriptors based on spatial relationships is essential for synthesizing such movements. In this talk, I will describe the research done in our group on synthesizing complex movements using spatial relationship descriptors. This includes synthesizing winding and knotting movements using Gauss linking numbers, synthesizing wrapping movements using electrostatic flux, retargeting movements using Laplacian coordinates, and classifying and recognizing scenes using the medial axis. I will further discuss our recent work on abstracting the geometry of the environment and planning movements in the abstracted space.
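The Gauss linking number mentioned above admits a simple discrete approximation: evaluate the double Gauss integral over segment midpoints of two closed polylines. The sketch below illustrates the descriptor itself, not the authors' implementation:

```python
import numpy as np

def gauss_linking_number(curve1, curve2):
    """Approximate Gauss linking number of two closed polylines.

    Discretizes Lk = (1/4pi) * oint oint (dr1 x dr2) . (r1 - r2) / |r1 - r2|^3
    with a midpoint rule over the segments of each curve. Accurate for
    reasonably dense sampling of curves that do not touch.
    """
    def segs(c):
        nxt = np.roll(c, -1, axis=0)
        return (c + nxt) / 2.0, nxt - c      # midpoints, tangent * dt
    m1, t1 = segs(np.asarray(curve1, float))
    m2, t2 = segs(np.asarray(curve2, float))
    total = 0.0
    for p, u in zip(m1, t1):
        r = p - m2                            # from curve2 midpoints to p
        d3 = np.linalg.norm(r, axis=1) ** 3
        total += np.sum(np.einsum('ij,ij->i', np.cross(u, t2), r) / d3)
    return total / (4 * np.pi)
```

Because the value is a topological invariant (close to an integer for linked loops, near zero for unlinked ones), it makes a robust descriptor of winding and knotting relationships between limbs, ropes and objects.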

Bio: Taku Komura is a Reader at the Institute of Perception, Action and Behavior, School of Informatics, University of Edinburgh. He has also been awarded a Royal Society Industry Fellowship to work in collaboration with Disney Research. As the group leader of the Computer Animation and Visualization Unit, his research has focused on data-driven character animation, physically-based character animation, crowd simulation, cloth animation, geometry processing and robotics.

15h45 - 16h25 Guiding style in optimized control of character locomotion, balance, and multi-finger manipulation

Paul Kry, McGill University, Montreal, Canada

Summary: Control of physically based characters is a challenging task because it requires not only managing the functional aspects that lead to successful completion of the desired task, but also ensuring that the resulting movement is visually appealing and meets the quality requirements of the application. Crafting controllers to generate desirable behaviours is difficult because the specification of the final outcome is indirect and often at odds with the functional control of the task. In this talk I will present methods for optimizing controllers where there also exists a mechanism for guiding the style of the motion. I present and discuss a collection of examples involving locomotion, balance, and multi-finger manipulation.

Bio: Paul G. Kry is an associate professor in the School of Computer Science at McGill University, where he heads the Computer Animation and Interaction Capture Laboratory. He received his B.Math. in computer science with electrical engineering electives in 1997 from the University of Waterloo, and his M.Sc. and Ph.D. in computer science from the University of British Columbia in 2000 and 2005. He spent time as a visitor at Rutgers during most of his Ph.D., and did postdoctoral work at INRIA Rhône-Alpes and the LNRS at Université René Descartes. His research interests are in physically based animation, including deformation, contact, motion editing, and simulated control of locomotion, grasping, and balance. He co-chaired the ACM/EG Symposium on Computer Animation in 2012 and Graphics Interface in 2014, and has served on numerous program committees, including ACM SIGGRAPH, Eurographics, ACM/EG Symposium on Computer Animation, Pacific Graphics, and Graphics Interface. He is currently president of the Canadian Human Computer Communications Society, the organization which sponsors the annual Graphics Interface conference.

16h25 - 17h00 Conclusion, Julien Pettré