NATURAL LANGUAGE PROCESSING

Course objectives

- General goals: the fundamentals of Natural Language Processing.
- Specific goals: Natural Language Processing at the morphological, part-of-speech tagging, syntactic, semantic, and pragmatic levels; machine translation.
- Knowledge and understanding: knowledge and understanding of algorithmic and machine learning techniques for Natural Language Processing.
- Applying knowledge and understanding: ability to apply Natural Language Processing techniques through homework assignments and a project.
- Critical and judgmental abilities: ability to understand and identify effective solutions to Natural Language Processing problems.
- Communication skills: ability to present the project developed by the student.
- Learning ability: ability to learn and apply new NLP techniques, based either on those illustrated in the course or on innovative approaches.

Channel 1
IACOPO MASI Lecturers' profile


Course program
Please see https://github.com/iacopomasi/NLP
- Introduction to Natural Language Processing
- N-gram models; smoothing; interpolation; backoff
- Part-of-speech tagging (including multilingual POS tagging)
- Syntactic analysis: statistical and neural techniques
- Computational semantics and lexical semantics
- Computational lexicon: WordNet
- Multilingual semantic networks: BabelNet
- Word Sense Disambiguation; Entity Linking
- Multilingualism in Natural Language Processing
- Neural networks, word and sense embeddings, and deep learning
- Neural semantic role labeling and semantic parsing
- Statistical machine translation
- Natural Language Generation and Question Answering
- Neural language models (Transformers, BERT, XLM-RoBERTa, GPT)
- Neural machine translation
- Advanced multimodal NLP applications
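As a taste of the first topics above, here is a minimal sketch of a bigram language model with Laplace (add-one) smoothing in Python. The toy corpus and probabilities are illustrative only, not course material:

```python
from collections import Counter

# Toy corpus for illustration; real models are trained on large text collections.
corpus = "the cat sat on the mat the cat ate".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def p_laplace(w_prev, w):
    # Add-one smoothed bigram probability:
    # P(w | w_prev) = (count(w_prev, w) + 1) / (count(w_prev) + V)
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)

print(round(p_laplace("the", "cat"), 3))  # seen bigram: 0.333
print(round(p_laplace("cat", "on"), 3))   # unseen bigram still gets mass: 0.125
```

Smoothing redistributes probability mass so that bigrams never seen in training (such as "cat on" above) receive a small nonzero probability instead of zero; interpolation and backoff, also covered in the course, are alternative ways to achieve this.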
Prerequisites
Probability Theory, Statistics, Linear Algebra, and Programming skills in Python.
Books
- Jurafsky and Martin. Speech and Language Processing, Prentice Hall, third edition, in preparation.
- Jacob Eisenstein. Introduction to Natural Language Processing, MIT Press, 2019.
- Yoav Goldberg. Neural Network Methods for Natural Language Processing, Morgan & Claypool, 2017.
- Dive into Deep Learning: https://d2l.ai/
Teaching mode
We expect to return to normal in-person teaching, with the option of attending remotely (via Zoom or Meet).
Frequency
Attendance is optional, but students are strongly advised to attend classes.
Exam mode
The exam tests both theory (a written exam) and practice (the practical part may take the form of homework assignments or a project). The practical part assesses the ability to implement a basic or advanced NLP system.
STEFANO FARALLI Lecturers' profile


Course program
- Introduction to Natural Language Processing
- N-gram models; smoothing; interpolation; backoff
- Part-of-speech tagging (including multilingual POS tagging)
- Syntactic analysis: statistical and neural techniques
- Computational semantics and lexical semantics
- Computational lexicon: WordNet
- Multilingual semantic networks: BabelNet
- Word Sense Disambiguation; Entity Linking
- Multilingualism in Natural Language Processing
- Neural networks, word and sense embeddings, and deep learning
- Neural semantic role labeling and semantic parsing
- Statistical machine translation
- Natural Language Generation and Question Answering
- Neural language models (Transformers, BERT, XLM-RoBERTa, GPT)
- Neural machine translation
- Advanced multimodal NLP applications
Prerequisites
Probability Theory, Statistics, Machine Learning, and Programming skills in Python.
Books
- Jurafsky and Martin. Speech and Language Processing, Prentice Hall, third edition, in preparation.
- Jacob Eisenstein. Introduction to Natural Language Processing, MIT Press, 2019.
- Yoav Goldberg. Neural Network Methods for Natural Language Processing, Morgan & Claypool, 2017.
The scientific articles that make up the bibliography are provided together with the slides presented during the course.
Frequency
Attendance is optional, but students are strongly advised to attend classes.
Exam mode
The exam consists of written tests and practical exercises.
Lesson mode
We expect to return to normal in-person teaching, with the option of attending remotely (via Zoom or Meet).
  • Lesson code: 1038141
  • Academic year: 2025/2026
  • Course: Computer Science
  • Curriculum: Single curriculum
  • Year: 1st year
  • Semester: 2nd semester
  • SSD: INF/01
  • CFU: 6