Probabilistic Graphical Models - Day 1
# Probability and Bayesian Theory

## Additive and Multiplicative Rules of Probability

Let $A$ and $B$ be two events, and let $P$ denote the probability of occurrence of an event.
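For two events $A$ and $B$, the standard additive and multiplicative rules are:

$$ P(A \cup B) = P(A) + P(B) - P(A \cap B) $$

$$ P(A \cap B) = P(A)\,P(B \mid A) = P(B)\,P(A \mid B) $$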

# Maximum Likelihood Estimation (MLE)

* A model hypothesizes a relation between the unknown parameter(s) and the observed data.
* The goal of a statistical analysis is to estimate the unknown parameter(s) in the hypothesized model.
* The likelihood function is a popular method for estimating unknown parameters; a minimal worked example follows this list.
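As a sketch of the idea, the snippet below fits a Bernoulli model to a small illustrative dataset (the data and variable names are assumptions of this example, not from the course text). It maximizes the log-likelihood numerically and checks the result against the closed-form MLE $k/N$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative data: 5 coin tosses, four heads (1) and one tail (0).
data = np.array([1, 1, 1, 1, 0])

def neg_log_likelihood(theta):
    """Negative Bernoulli log-likelihood of the observed tosses at parameter theta."""
    return -np.sum(data * np.log(theta) + (1 - data) * np.log(1 - theta))

# Maximize the likelihood (i.e., minimize its negative) over theta in (0, 1).
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")

print(f"numerical MLE  : {result.x:.4f}")     # ~0.8000
print(f"closed form k/N: {data.mean():.4f}")  # 4/5 = 0.8
```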

# Conjugate Priors & The Beta Distribution

Consider a scenario where we count the number of heads in a sequence of coin tosses, each of which follows a Bernoulli distribution. Suppose the number of samples is low, say 5 coin tosses in which we obtain 4 heads and 1 tail, or all 5 heads; the Maximum Likelihood Estimate (MLE) of the Bernoulli parameter is then 0.8 or 1.0, respectively. However, we know that an unbiased coin has a heads probability of 0.5, so the MLE is not an accurate estimate and is in fact far from it. Therefore, we must look for a probability distribution over the coin-toss probability so that our final estimate is more realistic.

One way to obtain accurate estimates is to conduct more experiments, in our case more coin tosses, so that the estimated probability converges to its true value. Alternatively, we can assume that the probability of heads itself arises from a distribution. Such a prior distribution is known as a conjugate prior when it has the same functional form as the posterior; for the binomial likelihood this is the Beta distribution, which has several convenient properties. We can now give this probability a Bayesian treatment. With $\theta$ the heads probability of the coin and $x$ the data, the posterior probability of $\theta$ after observing $k$ heads in $N$ tosses is given by:

$$ p(\theta \mid x) = \cfrac{p(x \mid \theta)\, p(\theta)}{\int_0^1 p(x \mid \theta')\, p(\theta')\, d\theta'} $$
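A minimal numeric sketch of the conjugate update (the Beta(2, 2) prior below is an illustrative assumption): with a Beta$(a, b)$ prior and $k$ heads in $N$ tosses, conjugacy gives the posterior Beta$(a + k,\, b + N - k)$ in closed form, so the integral in the denominator never has to be evaluated:

```python
from scipy.stats import beta

a, b = 2.0, 2.0   # illustrative Beta prior, mildly concentrated around 0.5
k, N = 4, 5       # observed data: 4 heads in 5 tosses

# Conjugacy: Beta prior + binomial likelihood => Beta posterior, no integration needed.
posterior = beta(a + k, b + (N - k))

print(f"MLE           : {k / N:.3f}")             # 0.800
print(f"posterior mean: {posterior.mean():.3f}")  # (a + k) / (a + b + N) = 0.667
print(f"95% credible interval: {posterior.interval(0.95)}")
```

The posterior mean $\tfrac{a+k}{a+b+N} \approx 0.667$ sits between the MLE of $0.8$ and the prior mean of $0.5$, giving the more realistic small-sample estimate the text asks for.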

# Bayesian Theory Problems

### Binary Communication System

Consider a binary communication system where the input is a 0 with probability $p$ and a 1 with probability $1 - p$. The receiver has an error with probability $\epsilon$, which means that the received bit gets flipped [Alberto-1]. This can be illustrated as a channel diagram with crossover probability $\epsilon$ (figure not reproduced in this excerpt).
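A sketch of the Bayes'-rule computation this problem sets up (the function name and the convention that the input is 0 with probability $p$ are assumptions of this sketch, not fixed by the excerpt):

```python
def posterior_sent_one(p: float, eps: float) -> float:
    """P(sent = 1 | received = 1) for a binary channel.

    Assumes the input is 0 with probability p and 1 with probability 1 - p,
    and each transmitted bit is flipped with probability eps.
    """
    # Total probability: P(received 1) = P(flip)P(sent 0) + P(no flip)P(sent 1)
    p_received_one = eps * p + (1 - eps) * (1 - p)
    # Bayes' rule: P(sent 1 | received 1) = P(received 1 | sent 1) P(sent 1) / P(received 1)
    return (1 - eps) * (1 - p) / p_received_one

print(posterior_sent_one(p=0.5, eps=0.1))  # 0.9: errors are rare, so a received 1 was likely sent as 1
```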