Backward Elimination for Feature Selection in Machine Learning
Data Science by Sunny Srinidhi - November 11, 2019

When we're building a machine learning model, it is very important that we select only those features or predictors that are necessary. Suppose we have 100 features or predictors in our dataset. That doesn't necessarily mean our model needs all 100 of them, because not every feature will have a significant influence on the model. Then again, this isn't true in every case; it depends entirely on the data we have in hand. Here is more info about why we need feature selection. There are various ways in which you can find out which features have little impact on the model and which ones you can remove from your…
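Before the full walkthrough, here is a minimal sketch of the idea, assuming pandas and statsmodels; the generated dataset, the feature names, and the 0.05 threshold are all hypothetical. The loop fits a model on the remaining predictors, drops the one with the highest p-value above the threshold, and repeats.

```python
# A minimal sketch of backward elimination (hypothetical data and threshold).
# Uses statsmodels' OLS p-values to drop the weakest feature each round.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_elimination(X: pd.DataFrame, y: pd.Series, alpha: float = 0.05) -> pd.DataFrame:
    """Repeatedly drop the predictor with the highest p-value above alpha."""
    features = list(X.columns)
    while features:
        # Fit OLS with an intercept on the remaining features.
        model = sm.OLS(y, sm.add_constant(X[features])).fit()
        pvalues = model.pvalues.drop("const")  # ignore the intercept's p-value
        worst = pvalues.idxmax()
        if pvalues[worst] <= alpha:
            break  # every remaining feature is significant; stop
        features.remove(worst)
    return X[features]

# Hypothetical usage: 100 noisy features, only the first 3 actually matter.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 100)), columns=[f"f{i}" for i in range(100)])
y = 2 * X["f0"] - 3 * X["f1"] + X["f2"] + rng.normal(size=200)
print(backward_elimination(X, y).columns.tolist())
```

Note that the intercept's p-value is excluded when picking the worst feature, so the constant term itself is never dropped.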
Null Hypothesis and the P-Value
Data Science by Sunny Srinidhi - November 8, 2019

When you're starting your machine learning journey, you'll come across the null hypothesis and the p-value. At a certain point in that journey, it becomes quite important to know what these mean so you can make meaningful decisions while designing your machine learning models. So in this post, I'll try to explain what these two things mean, and hopefully help you understand them. Now, if you don't have a background in statistics, the definitions of the null hypothesis and the p-value will make no sense to you. They're just gibberish going way over your head. That's what happened to me the first few times I tried to understand them. It took me a good couple of days to get an idea of what they mean. I…
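To make the two terms concrete before the definitions, here is a small sketch assuming scipy and entirely made-up data: we test the null hypothesis that a sample's mean is 50, and the p-value tells us how surprising the observed sample would be if that null were true.

```python
# A minimal sketch of testing a null hypothesis (hypothetical data).
# Null hypothesis: the population mean equals 50.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(loc=52.0, scale=5.0, size=40)  # made-up measurements

# One-sample t-test against the hypothesized mean of 50.
t_stat, p_value = stats.ttest_1samp(sample, popmean=50.0)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

if p_value < 0.05:  # 0.05 is a conventional significance level
    print("Reject the null hypothesis: the mean likely differs from 50.")
else:
    print("Fail to reject the null hypothesis.")
```

A p-value below the chosen significance level is taken as evidence against the null hypothesis; it is not the probability that the null hypothesis is true.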