Talking Papers Podcast
Welcome to the Talking Papers Podcast: Where Research Meets Conversation
Are you ready to explore the fascinating world of cutting-edge research in computer vision, machine learning, artificial intelligence, graphics, and beyond? Join us on this podcast by researchers, for researchers, as we venture into the heart of groundbreaking academic papers.
At Talking Papers, we've reimagined the way research is shared. In each episode, we engage in insightful discussions with the main authors of academic papers, offering you a unique opportunity to dive deep into the minds behind the innovation.
Structure That Resembles a Paper
Just like a well-structured research paper, each episode takes you on a journey through the academic landscape. We provide a concise TL;DR (abstract) to set the stage, followed by a thorough exploration of related work, approach, results, conclusions, and a peek into future work.
Peer Review Unveiled: "What Did Reviewer 2 Say?"
But that's not all! We bring you an exclusive bonus section where authors candidly share their experiences in the peer review process. Discover the insights, challenges, and triumphs behind the scenes of academic publishing.
Join the Conversation
Whether you're a seasoned researcher or an enthusiast eager to explore the frontiers of knowledge, Talking Papers Podcast is your gateway to in-depth, engaging discussions with the experts shaping the future of technology and science.
Tune In and Stay Informed
Don't miss out on the latest in research and innovation.
Subscribe and stay tuned for our enlightening episodes. Welcome to the future of research dissemination: welcome to Talking Papers Podcast!
Enjoy the journey!
#TalkingPapersPodcast #ResearchDissemination #AcademicInsights
HMD-NeMo - Sadegh Aliakbarian
Join us on this exciting episode of the Talking Papers Podcast as we sit down with the talented Sadegh Aliakbarian to explore his groundbreaking ICCV 2023 paper "HMD-NeMo: Online 3D Avatar Motion Generation From Sparse Observations". Our guest takes us on a journey through this pivotal research, which addresses a crucial aspect of immersive mixed reality experiences.
The quality of these experiences hinges on generating plausible and precise full-body avatar motion, a challenge given the limited input signals provided by Head-Mounted Devices (HMDs), typically the 6-DoF poses of the head and hands. While recent approaches have made strides in generating full-body motion from such inputs, they assume full hand visibility. This assumption doesn't hold in scenarios without motion controllers, where egocentric hand tracking takes over and the HMD's limited field of view can leave the hands only partially visible.
๐ง "HMD-NeMo" presents a groundbreaking solution, offering a unified approach to generating realistic full-body motion even when hands are only partially visible. This lightweight neural network operates in real-time, incorporating a spatio-temporal encoder with adaptable mask tokens, ensuring plausible motion in the absence of complete hand observations.
Sadegh is currently a senior research scientist at Microsoft Mixed Reality and AI Lab-Cambridge (UK), where he's at the forefront of Microsoft Mesh and avatar motion generation. He holds a PhD from the Australian National University, where he specialized in generative modeling of human motion. His research journey includes internships at Amazon AI, Five AI, and Qualcomm AI Research, focusing on generative models, representation learning, and adversarial examples.
We first crossed paths during our time at the Australian Centre for Robotic Vision (ACRV), where Sadegh was pursuing his PhD and I was embarking on my postdoctoral journey. During this time, I had the privilege of collaborating with another co-author of the paper, Fatemeh Saleh, who also happens to be Sadegh's life partner. It's been incredible to witness their continued growth.
Join us as we uncover the critical advancements brought by "HMD-NeMo" and their implications for the future of mixed reality experiences. Stay tuned for the episode release!
All links and resources are available in the blogpost: https://www.itzikbs.com/hmdnemo
Subscribe on your favourite podcast app: https://talking.papers.podcast.itzikbs.com
Subscribe to our mailing list: http://eepurl.com/hRznqb
Follow us on Twitter: https://twitter.com/talking_papers
YouTube Channel: https://bit.ly/3eQOgwP