made by https://cneuralnets.netlify.app/

This is the last part of the NLP series. Please read the earlier two parts first:

Basics of NLP - 1 [Week 5], Basics of NLP - 2 [Week 6]

Let’s continue from where we left off.

This blog is broken into three pieces.

Hidden Markov Models

In such models, we work with two kinds of variables: hidden and observed. As the name suggests, the hidden states are never observed directly; we learn about them through the observed variables, which are emitted by the hidden states. Confusing, right? Let's use an example.
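Abstractly, this means an HMM is pinned down by three probability tables: an initial distribution over hidden states, a transition table between hidden states, and an emission table from hidden states to observations. As a rough sketch (the container and the toy numbers below are my own, not from any library), the ingredients look like this:

```python
from dataclasses import dataclass

@dataclass
class HMM:
    """Bare-bones container for the three ingredients of a Hidden Markov Model."""
    initial: dict     # P(first hidden state)
    transition: dict  # P(next hidden state | current hidden state)
    emission: dict    # P(observation | current hidden state)

# A toy two-state model, just to show the shapes involved
# (states "A"/"B" and observations "x"/"y" are made up for illustration):
toy = HMM(
    initial={"A": 0.6, "B": 0.4},
    transition={"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}},
    emission={"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}},
)
```

Each row of each table is a probability distribution, so it sums to 1.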

The Intuition

Meet my friend Aakash 👨‍🔬. He is a peculiar one: he decides what he wants to eat based on what the weather is. I can't observe what he wants to eat, but I can clearly observe the weather outside. So,

Hidden Variable - What Aakash is going to eat

Observed Variable - What the weather is

Let’s define the variables

Hidden          Observed
Chai (C)        Warm (W)
Ice Cream (I)   Frigid (F)
Biriyani (B)    Rainy (R)
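To make the setup concrete, here is a minimal sketch of the model as code. The transition and emission numbers are placeholders I invented purely for illustration; the actual tables are built in the next section.

```python
import random

# Hidden states: what Aakash eats (never observed directly)
hidden_states = ["C", "I", "B"]   # Chai, Ice Cream, Biriyani
# Observed variables: the weather outside
observations = ["W", "F", "R"]    # Warm, Frigid, Rainy

# Hypothetical probabilities, for illustration only
transition = {
    "C": {"C": 0.5, "I": 0.2, "B": 0.3},
    "I": {"C": 0.3, "I": 0.4, "B": 0.3},
    "B": {"C": 0.4, "I": 0.3, "B": 0.3},
}
emission = {
    "C": {"W": 0.2, "F": 0.3, "R": 0.5},
    "I": {"W": 0.7, "F": 0.1, "R": 0.2},
    "B": {"W": 0.3, "F": 0.2, "R": 0.5},
}

def sample_sequence(start_state, n_steps, seed=0):
    """Walk the hidden chain, emitting one observation per step."""
    rng = random.Random(seed)
    state, hidden_seq, obs_seq = start_state, [], []
    for _ in range(n_steps):
        hidden_seq.append(state)
        # The hidden state emits an observation...
        weights = [emission[state][o] for o in observations]
        obs_seq.append(rng.choices(observations, weights=weights)[0])
        # ...then transitions to the next hidden state.
        weights = [transition[state][s] for s in hidden_states]
        state = rng.choices(hidden_states, weights=weights)[0]
    return hidden_seq, obs_seq

hidden_seq, obs_seq = sample_sequence("C", 5)
```

Note the direction of the arrows: we only ever see `obs_seq` (the weather); the whole point of the algorithms that follow is to reason backwards about `hidden_seq` (the food) from it.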


The Probability Tables

The next step is to create two tables: the State Transition Table and the Emission Table.

State Transition Table