## Introduction to Linear Regression (Part 2)

In the previous part of the Introduction to Linear Regression, we discussed simple linear regression. Simple linear regression is a basic model with just two variables: an independent variable x and a dependent variable y, based […]

## Optimal k in K-means

A major challenge in the K-means algorithm is choosing the optimal value of k. Selecting the right value of k is tricky yet crucial, as it can impact the performance of the […]

## Introduction to K-means

K-means clustering is one of the simplest unsupervised learning algorithms that solves the well-known clustering problem. Before we venture into K-means, let’s first understand what clustering is. What is clustering? The idea behind clustering is […]

## Regression Evaluation Metrics

Once we build our regression model, how can we measure the goodness of fit? We have various regression evaluation metrics to measure how well our model fits the data. In this article, we will see some […]

## Introduction to Linear Regression

Before we venture into linear regression, let’s first try to understand what regression analysis is. What is regression? Regression is a statistical approach used for predicting real values such as age, weight, or salary. In […]

## Data Science Life Cycle

The data science life cycle consists of 7 phases. In this post, we will go through each of them briefly. The following infographic depicts different phases in the data science life cycle. PROBLEM DEFINITION: This phase […]

## Performance Measures for Classification (Part 2)

INTRODUCTION: In the last part, we saw what a confusion matrix is, along with some other metrics like TPR, TNR, FPR, and FNR, which are based on the confusion matrix. In this post, we will look at […]

## Performance Measures for Classification

Why do we need performance measures at all? After we have developed our classification model, we need to assess its performance. Obviously, we can use accuracy as a […]

## Dropout for Regularization

INTRODUCTION: When we have deep neural networks, the biggest problem is overfitting. We can say a neural network overfits when it performs excellently on the training data but poorly on unseen data. […]