Pooyan Fazli is an assistant professor and the director of the People and Robots Laboratory (PeRL) in the School of Arts, Media, and Engineering (AME) and the Media and Immersive eXperience (MIX) Center at Arizona State University. He received his Ph.D. in computer science from the University of British Columbia. Previously, he was a postdoctoral fellow in the CORAL Research Group at Carnegie Mellon University and in the Laboratory for Computational Intelligence at the University of British Columbia.
Center Affiliations
I am affiliated with the Center for Human, Artificial Intelligence, and Robot Teaming (CHART).
Research Interests
Artificial Intelligence, Autonomous Robots, Multi-Robot Systems, Human-Robot Teaming, Robot Learning, Vision and Language, Multimodal Learning, Video Understanding
Schedule
Please check my calendar before proposing a meeting time.
Prospective Postdocs, Students, Visitors, and Interns
If you are interested in a position in our lab, please fill out this form.
Sponsors
My work is generously supported by the National Science Foundation (NSF), the National Institutes of Health (NIH), Google, Amazon, the Ability Central Foundation, and ASU.
News
10/2024           | HIRBI Grant ($10,000) |
10/2024           | PIT-UN Grant ($25,000) |
04/2024           | We launched ViDScribe, a platform to empower blind and low vision users by providing AI-generated audio descriptions. |
03/2024           | Paper accepted at NAACL 2024 |
11/2023           | Featured in ASU News |
10/2023           | Paper accepted at ICSR 2023 |
09/2023           | Our work on video accessibility was featured on ABC 15. |
07/2023           | NIH R01 grant (~$3.2M) on automated video description for blind and low vision users. The project is a collaboration with the University of Rochester, University of California, Santa Cruz, Columbia University, and Vista Center for the Blind and Visually Impaired. |
06/2023           | ASU CHART grant ($10,012) on superhuman performance in autonomous robot teaming applications (SPARTA) |
11/2022           | Paper accepted in ACM Transactions on Human-Robot Interaction (THRI) |
08/2022           | NSF grant ($94,997; grand total $599,974) to develop an edge-based approach to robust multi-robot systems in dynamic environments |
04/2022           | ASEE CyBR-MSI grant ($10,000) |
02/2022           | Our work on democratizing AI was featured on Google TensorFlow's blog. |
11/2021           | Grant award ($6,500) from Google TensorFlow on learning responsible AI for social impact |
10/2021           | RSCA grant ($17,885) on safe and resilient autonomous navigation for service robots |
08/2021           | exploreCSR award ($32,000) from Google Research on democratizing AI and promoting AI fairness, accountability, transparency, and ethics: https://democratizeai.org/ |
05/2021           | I am a Faculty in Residence at Google in Mountain View, CA. |
01/2021           | Grant award ($20,000) from Amazon on human-aware robot navigation in indoor environments |
11/2020           | Grant award ($99,948) from the Ability Central Foundation on video accessibility for blind and low vision individuals |
10/2020           | I am a visiting faculty member in the Department of Computer Science at the University of Copenhagen. |
08/2020           | NSF grant ($999,987) to promote diversity in the AI workforce and encourage students from underrepresented groups to pursue research and careers in AI |
08/2020           | NSF grant ($749,304) to develop safe and secure autonomous robots |
12/2019           | Grant award ($102,500) from the Ability Central Foundation on video accessibility for blind and low vision individuals |
04/2019           | Speaker and panelist at the Deep Humanities and Arts Symposium in San Jose, CA |
02/2019           | Grant award ($6,000) from the Center for Computing in Life Sciences on learning to navigate like humans |
12/2018           | Our paper titled "Online Learning of Human Navigational Intentions" was a finalist for the Best Paper Award at ICSR 2018. |
12/2018           | Our paper titled "Predicting the Target in Human-Robot Manipulation Tasks" was a finalist for the Best Interactive Paper Award at ICSR 2018. |