This gentle introduction to the most important techniques in natural language processing uses a unified mathematical and algorithmic framework and gradually increases in complexity. Topics covered range from n-gram language models to large language models (LLMs), from the perceptron to deep learning, from text classification to structured prediction (e.g., sequence labelling, segmentation, and parsing) and generation, and from discrete to neural representations of linguistic structures. The book provides a comprehensive overview of NLP, making it ideal for upper-undergraduate and graduate students in computer science and a valuable reference for researchers and engineers. Exercises of varying difficulty are provided, as well as teaching slides and tutorial videos. The new edition features three new chapters on pre-trained language models and large language models, as well as a new preliminary chapter that presents data and models as a unifying framework for NLP methods.