MetaPhase Consulting

Research Scientist Intern, Computer Vision and Machine Learning (PhD)

16 October 2024
£50000 - £100000 / year

Job Description

The Reality Labs org aims to connect people from all over the world through world-class AR/VR hardware and software. Our XR Spatial AI team explores, develops, and delivers cutting-edge technologies that serve as the foundation of current and future Meta products. Specifically, our engineering and research topics cover broad Scene Understanding and Rendering, from core perception technologies (Depth Inference, Semantic Segmentation, Semantic Keypoints) to end-to-end features such as Passthrough, Occlusion, Room Layout Estimation, and 3D Object Detection. Our team focuses on taking new technologies from early concept to product level while iterating, prototyping, and realizing developer and user value and the new experiences they enable.

Our internships are twelve (12) to twenty-four (24) weeks long, and we have various start dates throughout the year.

Responsibilities

  • Participate in cutting-edge Computer Vision & Machine Learning research and development for VR/AR/MR related topics
  • Develop technology that will be part of mass-market shipping consumer products
  • Document and present research progress and aim to publish findings at high-impact CV/ML conferences

Minimum Qualifications

  • Currently in the process of obtaining a PhD degree in Computer Science, Electrical Engineering, or Electrical and Computer Engineering in the field of computer vision, machine learning, computer graphics, or robotics.
  • 3+ years experience with Python and/or C++
  • 2+ years experience with modern deep learning frameworks like PyTorch
  • Knowledge of Computer Vision in one or more of the following domains: Scene Understanding (3D Object Detection, Semantic Segmentation, Semantic Keypoints), Monocular/Stereo Depth Estimation, Multi-View Geometry (e.g., Structure-from-Motion, Multi-View Stereo, SLAM), Semi-/Weakly-/Self-supervised Machine Learning, Computational Geometry
  • Excellent interpersonal skills, cross-group and cross-culture collaboration
  • Ability to communicate complex research in a clear, precise, and actionable manner
  • Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment

Preferred Qualifications

  • Proven track record of achieving significant results as demonstrated by publications in top computer vision conferences (e.g., CVPR, ICCV, ECCV, or SIGGRAPH) or journals (e.g., IEEE Transactions on Pattern Analysis and Machine Intelligence, International Journal of Computer Vision, Pattern Recognition, or IEEE Transactions on Image Processing)
  • Demonstrated software engineering experience via an internship, work experience, coding competitions, open source contributions, or research
  • Intent to return to a degree program after the completion of the internship/co-op
  • Experience working and communicating cross-functionally in a team environment

About Meta

Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today, beyond the constraints of screens, the limits of distance, and even the rules of physics.

Meta is committed to providing reasonable support (called accommodations) in our recruiting processes for candidates with disabilities, long-term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support. If you need support, please reach out to [email protected].