Stanford’s Open Course on Natural Language Processing (NLP)

If you are interested in taking Stanford’s Open Course on Natural Language Processing (NLP), Coursera (coursera.org) has made the full course available on YouTube as 101 video lessons.

The full Stanford NLP Open Course can be found via the following YouTube playlist: https://www.youtube.com/playlist?list=PL4LJlvG_SDpxQAwZYtwfXcQr7kGnl9W93

Presented by Professors Dan Jurafsky and Chris Manning (nlp-class.org), the Natural Language Processing (NLP) course contains the following lessons:

1 – 1 – Course Introduction
2 – 1 – Regular Expressions
2 – 2 – Regular Expressions in Practical NLP
2 – 3 – Word Tokenization
2 – 4 – Word Normalization and Stemming
2 – 5 – Sentence Segmentation
3 – 1 – Defining Minimum Edit Distance
3 – 2 – Computing Minimum Edit Distance
3 – 3 – Backtrace for Computing Alignments
3 – 4 – Weighted Minimum Edit Distance
3 – 5 – Minimum Edit Distance in Computational Biology
4 – 1 – Introduction to N-grams
4 – 2 – Estimating N-gram Probabilities
4 – 3 – Evaluation and Perplexity
4 – 4 – Generalization and Zeros
4 – 5 – Smoothing: Add-One
4 – 6 – Interpolation
4 – 7 – Good-Turing Smoothing
4 – 8 – Kneser-Ney Smoothing
5 – 1 – The Spelling Correction Task
5 – 2 – The Noisy Channel Model of Spelling
5 – 3 – Real-Word Spelling Correction
5 – 4 – State of the Art Systems
6 – 1 – What is Text Classification?
6 – 2 – Naive Bayes
6 – 3 – Formalizing the Naive Bayes Classifier
6 – 4 – Naive Bayes: Learning
6 – 5 – Naive Bayes: Relationship to Language Modeling
6 – 6 – Multinomial Naive Bayes: A Worked Example
6 – 7 – Precision, Recall, and the F Measure
6 – 8 – Text Classification: Evaluation
6 – 9 – Practical Issues in Text Classification
7 – 1 – What is Sentiment Analysis?
7 – 2 – Sentiment Analysis: A Baseline Algorithm
7 – 3 – Sentiment Lexicons
7 – 4 – Learning Sentiment Lexicons
7 – 5 – Other Sentiment Tasks
8 – 1 – Generative vs. Discriminative Models
8 – 2 – Making Features from Text for Discriminative NLP Models
8 – 3 – Feature-Based Linear Classifiers
8 – 4 – Building a Maxent Model: The Nuts and Bolts
8 – 5 – Generative vs. Discriminative Models: The Problem of Overcounting Evidence
8 – 6 – Maximizing the Likelihood
9 – 1 – Introduction to Information Extraction
9 – 2 – Evaluation of Named Entity Recognition
9 – 3 – Sequence Models for Named Entity Recognition
9 – 4 – Maximum Entropy Sequence Models
10 – 1 – What is Relation Extraction?
10 – 2 – Using Patterns to Extract Relations
10 – 3 – Supervised Relation Extraction
10 – 4 – Semi-Supervised and Unsupervised Relation Extraction
11 – 1 – The Maximum Entropy Model Presentation
11 – 2 – Feature Overlap / Feature Interaction
11 – 3 – Conditional Maxent Models for Classification
11 – 4 – Smoothing / Regularization / Priors for Maxent Models
12 – 1 – An Intro to Parts of Speech and POS Tagging
12 – 2 – Some Methods and Results on Sequence Models for POS Tagging
13 – 1 – Syntactic Structure: Constituency vs. Dependency
13 – 2 – Empirical / Data-Driven Approach to Parsing
14 – 1 – Instructor Chat
15 – 1 – CFGs and PCFGs
15 – 2 – Grammar Transforms
15 – 3 – CKY Parsing
15 – 4 – CKY Example
15 – 5 – Constituency Parser Evaluation
16 – 1 – Lexicalization of PCFGs
16 – 2 – Charniak’s Model
16 – 3 – PCFG Independence Assumptions
16 – 4 – The Return of Unlexicalized PCFGs
16 – 5 – Latent Variable PCFGs
17 – 1 – Dependency Parsing Introduction
17 – 2 – Greedy Transition-Based Parsing
17 – 3 – Dependencies Encode Relational Structure
18 – 1 – Introduction to Information Retrieval
18 – 2 – Term-Document Incidence Matrices
18 – 3 – The Inverted Index
18 – 4 – Query Processing with the Inverted Index
18 – 5 – Phrase Queries and Positional Indexes
19 – 1 – Introducing Ranked Retrieval
19 – 2 – Scoring with the Jaccard Coefficient
19 – 3 – Term Frequency Weighting
19 – 4 – Inverse Document Frequency Weighting
19 – 5 – TF-IDF Weighting
19 – 6 – The Vector Space Model
19 – 7 – Calculating TF-IDF Cosine Scores
19 – 8 – Evaluating Search Engines
20 – 1 – Word Senses and Word Relations
20 – 2 – WordNet and Other Online Thesauri
20 – 3 – Word Similarity and Thesaurus Methods
20 – 4 – Word Similarity: Distributional Similarity I
20 – 5 – Word Similarity: Distributional Similarity II
21 – 1 – What is Question Answering?
21 – 2 – Answer Types and Query Formulation
21 – 3 – Passage Retrieval and Answer Extraction
21 – 4 – Using Knowledge in QA
21 – 5 – Advanced: Answering Complex Questions
22 – 1 – Introduction to Summarization
22 – 2 – Generating Snippets
22 – 3 – Evaluating Summaries: ROUGE
22 – 4 – Summarizing Multiple Documents
23 – 1 – Instructor Chat II

[Stanford NLP Open Course video playlist: https://www.youtube.com/playlist?list=PL4LJlvG_SDpxQAwZYtwfXcQr7kGnl9W93]