The aim of this research project is to equip social robots with nonverbal behaviour similar to that used by people. Nonverbal behaviour refers to the gestures, body language and facial expressions we use when interacting with others. It is often said to account for as much as 55% of our communication, and when it is lacking, wrong or ill-timed it can seriously hinder communication. Social robots are robots that interact with us through the same communication channels that people use, so they too need to express appropriate nonverbal behaviour. If a robot's nonverbal behaviour is optimised, we expect interaction between people and robots to become not only more pleasant but also more effective.

In this project, instead of programming nonverbal behaviour by hand, we let the robot learn it by watching people talk and interact. Because nonverbal behaviour is coordinated with verbal behaviour, we will learn the relationship between the two using state-of-the-art Deep Neural Networks. To train the networks, we show the robot video clips of people speaking at TEDx events and of people engaged in conversation, for which we collaborate with the Max Planck Institute for Psycholinguistics.

Once the networks are trained, we will assess the robot's performance in a series of studies measuring how effective it is at getting a message across and how persuasive it is.
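To illustrate the kind of mapping such networks learn, the sketch below shows an untrained toy recurrent network that turns a sequence of per-frame speech features into a sequence of gesture parameters. All dimensions, weights and variable names here are hypothetical placeholders for illustration; the project's actual models and feature representations are not specified in this summary.

```python
import math
import random

random.seed(0)

def random_matrix(rows, cols, scale=0.5):
    """A small random weight matrix (an untrained stand-in for learned weights)."""
    return [[random.uniform(-scale, scale) for _ in range(cols)] for _ in range(rows)]

def rnn_step(x, h, Wxh, Whh, Why):
    """One step of a simple Elman-style recurrent network.

    x: speech feature vector for the current video frame
    h: hidden state carrying context from earlier frames
    returns: (new hidden state, gesture parameter vector for this frame)
    """
    h_new = [math.tanh(sum(Wxh[i][j] * x[j] for j in range(len(x))) +
                       sum(Whh[i][j] * h[j] for j in range(len(h))))
             for i in range(len(h))]
    y = [sum(Why[k][i] * h_new[i] for i in range(len(h_new)))
         for k in range(len(Why))]
    return h_new, y

# Toy dimensions: 4 speech features in, 8 hidden units, 3 gesture parameters out.
SPEECH_DIM, HIDDEN_DIM, GESTURE_DIM = 4, 8, 3
Wxh = random_matrix(HIDDEN_DIM, SPEECH_DIM)
Whh = random_matrix(HIDDEN_DIM, HIDDEN_DIM)
Why = random_matrix(GESTURE_DIM, HIDDEN_DIM)

# A short "utterance": one speech feature vector per video frame.
speech_frames = [[random.uniform(-1, 1) for _ in range(SPEECH_DIM)]
                 for _ in range(5)]

h = [0.0] * HIDDEN_DIM
gesture_frames = []
for x in speech_frames:
    h, y = rnn_step(x, h, Wxh, Whh, Why)
    gesture_frames.append(y)

print(len(gesture_frames), len(gesture_frames[0]))  # → 5 3
```

Training would fit the weight matrices to video data so that predicted gesture parameters match the gestures people actually produce alongside their speech; the recurrent hidden state is what lets the output at each frame depend on the preceding speech context, reflecting the coordination of verbal and nonverbal behaviour mentioned above.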