About the Course: ML: Code First Basics category

The ML bootcamp for those who know Python. Tight. Intense. Fun.

This is the forum for the course. Please post your questions here.


Will we be having assignments for the python-prep, like last time?
Thanks

Sir, in today’s lecture you introduced terms like entropy and cross entropy. Why do we use entropy in probability, and what does it mean? You also used the term “neuron” — why do we use it in machine learning? Are there any good resources on this terminology for students like me who come from a social sciences background?

Sir, what is the difference between positive likelihood (if it exists) and negative likelihood? As I understand it, the likelihood is maximized over parameters such as lambda (in the case of Poisson) or p (in the case of Bernoulli or binomial). So why do we need a positive or negative likelihood?

The negative of the (log) likelihood becomes a loss and is minimized, instead of maximizing the likelihood. These are equivalent approaches.
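A minimal sketch of that equivalence for Bernoulli data (the coin-flip data and the grid of candidate p values here are made up for illustration, not from the lecture):

```python
import numpy as np

# Hypothetical data: 7 heads out of 10 tosses.
data = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])

# Candidate values of the Bernoulli parameter p.
ps = np.linspace(0.01, 0.99, 99)

# Log-likelihood of the data at each candidate p.
log_lik = np.array([np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))
                    for p in ps])

# Negative log-likelihood: the same curve flipped, treated as a loss.
nll = -log_lik

# The p that maximizes the likelihood is the p that minimizes the NLL.
p_max_lik = ps[np.argmax(log_lik)]
p_min_nll = ps[np.argmin(nll)]
```

Both searches land on the same p (the sample mean, 0.7), which is why frameworks phrase everything as minimizing a loss: it is the same optimization, just with the sign flipped.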

We do, it’s in the homework today :-).

One other thing you might want to do is the resampling part of the Regression lab, which we left out for time reasons…
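For anyone attempting it, here is a generic bootstrap-resampling sketch (this is not the lab’s actual code — the data and the statistic are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up sample; in the lab this would be your regression data.
x = rng.normal(loc=5.0, scale=2.0, size=100)

# Bootstrap: resample with replacement, recompute the statistic each time.
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(2000)])

# Percentile confidence interval for the mean.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The same pattern works for any statistic (a slope, an R², …): replace `.mean()` with whatever you are estimating and look at the spread of the resampled values.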