Oculus VR
Research Scientist - Multimodal Sensing and Social Behavior
Oculus VR, Redmond, Washington, United States, 98052
Meta’s Reality Labs Research (RL-R) brings together a team of researchers, developers, and engineers to create the future of Mixed Reality (MR), Augmented Reality (AR), and Wearable Artificial Intelligence (AI). Within RL-R, the ACE team solves complex challenges in behavioral inference from sparse information. We leverage multimodal, egocentric data and cutting-edge machine learning to deliver robust, efficient models that serve everyone. Our research provides core building blocks to unlock intuitive and helpful Wearable AI, empowering everyone to harness the superpowers of this emerging technology in their daily lives.

In this role, you will work closely with Research Scientists and Engineers from across RL-R to develop novel, state-of-the-art algorithms for wearables that incorporate social behavior dynamics and multimodal sensing platforms. You will design and implement data collection strategies, benchmarks, and metrics to validate and improve model efficiency, scalability, and stability. Your expertise in psychology and human-human interaction will be crucial in developing AI algorithms that can infer human behavior patterns from wearable devices.
Responsibilities:
- Characterize human behavior in the wild to derive behavioral signals for user states, in the form of quantitative insights from ethnographic observations
- Identify use cases and experiences that leverage behavioral signals to provide user value in wearable AI assistance
- Design and implement data collection strategies, benchmarks, and metrics to validate and improve model interpretability, scalability, and stability
- Provide research results that accelerate the development and application of state-of-the-art AI algorithms to infer human behavior patterns from wearable devices
- Translate results of human data collection into datasets that can be effectively leveraged by ML tools, and into language readily interpretable by foundation models
- Collaborate with researchers and engineers across broad disciplines through all stages of project development
- Contribute to research that can eventually be applied to Meta products and services
- Create tools, infrastructure, and documentation to accelerate research
- Learn constantly, dive into new areas with unfamiliar technologies, and embrace the ambiguity of Augmented Reality/Virtual Reality problem solving

Minimum Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent practical experience
- PhD degree in Informatics, Social/Behavioral Sciences, Computer Science, Human-Computer Interaction, or a related field, plus 2+ years of research scientist experience in industry
- Documented understanding of social behavior dynamics, including expertise in psychology and human-human interaction
- 2+ years of post-PhD research scientist experience designing field experiments and data campaigns, with observation skills for human behavior
- Proven track record of solving complex challenges with multimodal ML, as demonstrated through grants, fellowships, patents, or publications in top journals or at conferences such as CVPR, NeurIPS, CHI, or equivalent
- 2+ years of documented experience with multimodal sensing platforms, data collection, and multimodal data processing and analysis

Preferred Qualifications:
- PhD degree in Behavioral Science, Computer Science, or a related field, plus 3+ years of experience with biosignals, behavioral signals, or egocentric data from wearable sensors
- 2+ years of coding experience (e.g., Python, C++, PyTorch) documented in publications or open-source repositories (e.g., GitHub)
- Experience with Large Language Models
- Experience working in Wearables or Augmented Reality/Virtual Reality
- Experience with Multimodal Deep Learning approaches and research

Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.