This book provides a comprehensive introduction to conversational spoken language understanding and surveys recent advances in conversational AI. It guides the reader through the history, current advancements, and future of natural language understanding (NLU) in human-computer interactions. To this end, the book is structured in seven chapters: Introduction to Natural Language Understanding lays the foundation by tracing the evolution of NLU from early human communication to modern human-computer interactions. Prerequisites and Glossary for Natural Language Understanding then serves as a foundational resource, consolidating essential prerequisites and key terminology used across the book. Single-Turn Natural Language Understanding focuses on tasks that involve interpreting and processing user inputs in a single interaction, while Multi-Turn Natural Language Understanding moves on to systems that sustain extended interactions with users, exploring techniques for managing dialogues, using context, and integrating external knowledge bases. Next, Evaluating Natural Language Understanding discusses the annotation of datasets and various performance assessment methods, covering different levels of understanding from intent recognition to slot filling and domain classification. Applications and Case Studies in Natural Language Understanding subsequently presents real-world applications of NLU in finance, medicine, and law. Finally, Challenges, Conclusions and Future Directions explores the core obstacles hindering the advancement of NLU, including ambiguity, domain adaptation, data scarcity, and ethical concerns. By examining these challenges, this chapter highlights the ongoing work needed to advance NLU.
This book mainly targets researchers, PhD students, and professionals who are entering the field and looking for a state-of-the-art introduction to NLU as applied in conversational systems such as chatbots, large language models, or educational systems.
Caren Han is a senior lecturer at the University of Melbourne, an honorary academic at both the University of Sydney and the University of Edinburgh, and an adjunct professor at POSTECH. She co-directs the Australian Deep Learning NLP Group. Since completing her PhD in 2017, she has received several teaching and research awards, including the Australian Young Achiever Certificate (Teaching Excellence), Teacher of the Year 2020, Supervisor of the Year 2021, Best Research Paper Awards at top-tier international artificial intelligence conferences, and the Early Career Research Award 2023. She currently supervises 23 research students, and her research interests include Natural Language Processing with Deep Learning. Henry Weld holds PhDs in both Computer Science and Mathematics from The University of Sydney and is a member of the Australian Deep Learning NLP Group. His research focuses on Natural Language Understanding, particularly multi-turn NLU, and the use of NLU methods