After decades in the lab, artificial intelligence is becoming part of users’ everyday lives. Neural networks run every time you unlock your iPhone with your face or interact with Siri, the first mainstream intelligent assistant. This talk will describe the research advances that were needed to improve the user experience.
Alex Acero is Sr. Director at Apple, leading speech recognition, speech synthesis, language understanding, and dialog for Siri, Apple’s personal assistant for iPhone, iPad, Apple Watch, Apple TV, CarPlay, Macintosh, and HomePod. Prior to joining Apple in 2013, he spent 20 years at Microsoft Research managing teams in speech, audio, multimedia, computer vision, natural language processing, machine translation, machine learning, and information retrieval. His team at Microsoft Research built Bing Translator, contributed to Xbox Kinect, and pioneered the use of deep learning in large-vocabulary speech recognition. From 1991 to 1993 he managed the speech team for Spain’s Telefónica. His first stint at Apple started in 1990. He is Affiliate Faculty at the University of Washington.
The LTI Colloquium is generously sponsored by Abridge.
In-person and Zoom participation. See announcement.