DeepMind
Research Engineer, Robotics
Job Description
At Google DeepMind, we value diversity of experience, knowledge, backgrounds and perspectives and harness these qualities to create extraordinary impact. We are committed to equal employment opportunity regardless of sex, race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, pregnancy, or related condition (including breastfeeding) or any other basis as protected by applicable law. If you have a disability or additional need that requires accommodation, please do not hesitate to let us know.
Snapshot
Artificial Intelligence could be one of humanity’s most useful inventions. At Google DeepMind, we’re a team of scientists, engineers, machine learning experts and more, working together to advance the state of the art in artificial intelligence. We use our technologies for widespread public benefit and scientific discovery, and collaborate with others on critical challenges, ensuring safety and ethics are the highest priority.
About Us
In the Robotics team, we believe that a truly general AI needs to be able to act in the physical world. The aim of the Robotics team is to embody Artificial Intelligence: to endow it with the ability to learn how to interact with objects, people, and other robots in order to perform complex and useful tasks. At Google DeepMind, experts in different scientific fields (e.g. quantum chemistry, neuroscience, game theory) collaborate with research scientists and software engineers to solve grand challenges that could enable multiple scientific breakthroughs. The Robotics Lab is a group of pioneering research scientists, research engineers and software engineers uniquely specialised in robotics at DeepMind.
The role
Our Robotics Research Team at Google DeepMind focuses on Machine Learning approaches to robotics, with an emphasis on scalable and general techniques. We’re looking for Robotics Research Engineers to work on the development and deployment of new models and algorithms for robot perception, planning and control.
Google is uniquely positioned to offer AI technology to billions of users, and we’re particularly interested in candidates who are passionate about bringing technology into the real world. The ideal candidate will have expertise in both the modern AI stack and the theory and practice of robot control, and be comfortable communicating with external product teams.
The work that we do presents interesting and unique engineering challenges, and as a member of the Research Engineering team, you’ll contribute to enhancing the performance of both our research infrastructure and our learning algorithms.
Key responsibilities:
- Work with a small team to develop and iterate on new techniques
- Write and maintain robotics R&D infrastructure
- Design and run experiments
- Evaluate results and communicate findings effectively to the wider team
About you
In order to set you up for success as a Research Engineer at Google DeepMind, we look for the following skills and experience:
- Modelling, planning and control for hybrid dynamical systems (i.e. systems that exhibit both continuous and discrete dynamic behaviour), with application to robotic manipulation
- Machine learning (supervised, unsupervised and reinforcement learning) applied to simulated and real robots with continuous state and action spaces
- Middleware and software development for robotics; experience with real-time middleware for robotics is an advantage
- Robot perception (e.g. object pose estimation and matching, instance segmentation, dense prediction, grasp detection, planning, end-to-end and self-supervised grasping, depth and ego-motion estimation); experience with deep learning for perception is an advantage
In addition, the following would be an advantage:
- Experience with enterprise software development
- Experience with deploying technology in the real world, e.g. on product teams, in startups, or through open-source contributions
Applications will close on Friday 29th March at 6pm BST and will be reviewed on a rolling basis.