If you are a software engineer or a programmer, you have probably used Stack Overflow at least once. But have you ever wondered how Stack Overflow predicts the tags for a given question? In this blog, I will discuss the Stack Overflow tag predictor case study.

Contents

  1. Overview of the Stack Overflow Dataset.
  2. Exploratory Data Analysis.
  3. Data Preprocessing.
  4. Downscaling of data.
  5. Train-Test split.
  6. Text Featurization using Tfidf Vectorizer.
  7. Hyper Parameter Tuning.
  8. Logistic Regression with OneVsRest Classifier
  9. OneVsRestClassifier with SVM
  10. Conclusions.
  11. Enhancements.
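
To make steps 6 and 8 of the contents concrete, here is a minimal sketch of the modeling part of the pipeline, assuming scikit-learn and a tiny set of hypothetical, already-preprocessed questions (not the actual case study data): TF-IDF featurization followed by a One-vs-Rest logistic regression over binarized tags.

```python
# Minimal sketch: TF-IDF features + One-vs-Rest logistic regression for multi-label tags.
# The questions and tag lists below are illustrative placeholders, not the real dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression

questions = [
    "how to parse json in python",
    "center a div horizontally with css",
    "read a csv file with pandas in python",
    "css flexbox vs grid for layout",
]
tags = [["python", "json"], ["css", "html"], ["python", "pandas"], ["css"]]

# Multi-label targets: one binary column per tag.
mlb = MultiLabelBinarizer()
y = mlb.fit_transform(tags)

# Step 6: TF-IDF featurization (unigrams + bigrams).
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(questions)

# Step 8: one logistic regression per tag via One-vs-Rest.
clf = OneVsRestClassifier(LogisticRegression(C=1.0, max_iter=1000))
clf.fit(X, y)

pred = clf.predict(vectorizer.transform(["how do i merge two dataframes in pandas"]))
print(mlb.inverse_transform(pred))
```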


Contents

  1. Overview of Dataset.
  2. Data Preprocessing.
  3. Train-Test split.
  4. Text Featurization using Bag of Words.
  5. Hyper Parameter Tuning.
  6. Model Building using the Naive Bayes algorithm.
  7. Performance Metrics.
  8. Model deployment into a web app using the Flask API.
  9. Putting the model into production on the Heroku platform.
  10. Results.
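
To make steps 4, 6 and 8 above concrete before diving into the dataset, here is a minimal sketch, assuming scikit-learn, joblib and Flask, with made-up reviews and file names: a Bag-of-Words + Multinomial Naive Bayes pipeline that is persisted and then served from a small Flask endpoint.

```python
# Illustrative sketch of steps 4, 6 and 8: Bag of Words + Multinomial Naive Bayes,
# then saving the fitted pipeline so a Flask app can serve predictions.
# Reviews, labels and file names are placeholders, not the actual case study code.
import joblib
from flask import Flask, request, jsonify
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

reviews = ["the coffee tasted great", "stale chips, very disappointed",
           "loved this tea", "awful packaging and bland flavour"]
labels = [1, 0, 1, 0]  # 1 = positive review, 0 = negative review

pipeline = Pipeline([
    ("bow", CountVectorizer(ngram_range=(1, 2))),  # step 4: Bag of Words features
    ("nb", MultinomialNB(alpha=1.0)),              # step 6: Naive Bayes model
])
pipeline.fit(reviews, labels)
joblib.dump(pipeline, "model.pkl")  # persisted for the web app

# Step 8: a minimal Flask endpoint that loads the saved pipeline and predicts.
app = Flask(__name__)
model = joblib.load("model.pkl")

@app.route("/predict", methods=["POST"])
def predict():
    text = request.get_json()["review"]
    return jsonify({"prediction": int(model.predict([text])[0])})

if __name__ == "__main__":
    app.run()  # on Heroku this would typically run behind gunicorn instead
```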

First, let's understand what Amazon Fine Food Review Analysis is.

This dataset consists of reviews of fine foods from Amazon. The data span a period of more than 10 years, including all ~500,000 reviews up to October 2012. Reviews include product and user information, ratings, and a plain-text review. We also have reviews from all other Amazon categories.

Amazon reviews are often the most publicly visible reviews of consumer…


In this blog, we’ll try to understand one of the most important algorithms in machine learning: the Random Forest algorithm. We will look at what makes Random Forest so special and implement it on a real-world dataset.

Contents

  1. What Are Ensembles?
  2. Types of Ensemble Learning.
  3. Bagging.
  4. Random Forest and Construction.
  5. Best and Worst cases of Random Forest.
  6. Boosting.
  7. Types of Boosting.
  8. Gradient Boosting.
  9. AdaBoost (Adaptive Boosting).
  10. XGBoost.
  11. Stacking Classifier.
  12. Cascading Classifier.
  13. Random Forest and XGBoost with Amazon Food Reviews.
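
As a quick illustration of items 3 and 4 above, the sketch below compares a single decision tree with a bagged random forest on a synthetic dataset (assuming scikit-learn; this is not the Amazon Food Reviews pipeline itself).

```python
# Illustrative sketch of bagging in practice: a single decision tree vs. a random forest
# on synthetic data. The dataset and hyperparameters here are placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, n_informative=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# A single, fully grown tree: low bias but high variance.
tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# A random forest: many trees trained on bootstrap samples with random feature subsets,
# aggregated by majority vote, which reduces the variance.
forest = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)

print("single tree  :", accuracy_score(y_test, tree.predict(X_test)))
print("random forest:", accuracy_score(y_test, forest.predict(X_test)))
```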

What Are Ensembles?

Commonly, an individual model suffers from high bias or high variance, and that’s why we need ensemble…


Decision trees are a popular supervised learning method for a variety of reasons. Their benefits include that they can be used for both regression and classification, they are easy to interpret, and they don’t require feature scaling. They also have several flaws, including being prone to overfitting.

Contents

  1. What are Decision Trees?
  2. Geometric Intuition of Decision Trees.
  3. Entropy.
  4. Information Gain.
  5. Gini impurity.
  6. Play Tennis Dataset Example of Decision Tree.
  7. Steps to Constructing a Decision Tree.
  8. Decision Tree Regression.
  9. Real-world cases of Decision Tree.
  10. Best and Worst cases of Decision Tree Algorithm.
  11. Decision Tree with Amazon Food Reviews.
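
To ground items 3, 4 and 5 above, here is a small, self-contained computation of entropy, Gini impurity, and the information gain of a candidate split, using made-up class counts in the spirit of the Play Tennis example:

```python
# Small illustration of the impurity measures listed above (entropy, information gain,
# Gini impurity) on hypothetical class counts; not code from the original post.
import numpy as np

def entropy(counts):
    """Shannon entropy of a class distribution given raw counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def gini(counts):
    """Gini impurity of a class distribution given raw counts."""
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    return 1.0 - np.sum(p ** 2)

# Hypothetical parent node with 9 positive / 5 negative samples.
parent = [9, 5]
# A candidate split producing two child nodes.
left, right = [6, 1], [3, 4]

n = sum(parent)
weighted_child_entropy = (sum(left) / n) * entropy(left) + (sum(right) / n) * entropy(right)
info_gain = entropy(parent) - weighted_child_entropy

print("entropy(parent) =", round(entropy(parent), 3))
print("gini(parent)    =", round(gini(parent), 3))
print("information gain of the split =", round(info_gain, 3))
```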

What are decision trees?

Decision trees…


SVM is a supervised machine learning algorithm that is used in many classification and regression problems. It remains one of the most widely used and robust prediction methods and can be applied to many classification use cases.

Contents

  1. Geometric Intuition Of Support Vector Machines.
  2. Mathematical Formulation of Support Vector Machines.
  3. Loss Minimization Interpretation of SVMs.
  4. Dual Form of Support Vector Machines.
  5. Kernel Trick in Support Vector Machines.
  6. Train and Runtime Complexities of SVMs.
  7. Support Vector Machines — Regression (SVR)
  8. Best and Worst cases of Support Vector Machines Algorithm.
  9. Support Vector Machines with Amazon Food Reviews.
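
As a minimal illustration of the ideas in the contents above, the sketch below trains soft-margin SVMs with a linear and an RBF kernel on synthetic data (assuming scikit-learn; illustrative only, not the Amazon Food Reviews experiment):

```python
# Minimal sketch of a soft-margin SVM with linear and RBF kernels on toy data.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Linear kernel: a single separating hyperplane, with margin softness controlled by C.
linear_svm = SVC(kernel="linear", C=1.0).fit(X_train, y_train)

# RBF kernel: the kernel trick implicitly maps points to a higher-dimensional space,
# letting the model fit a non-linear decision boundary.
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

print("linear kernel accuracy:", linear_svm.score(X_test, y_test))
print("RBF kernel accuracy   :", rbf_svm.score(X_test, y_test))
```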

Key Idea of SVM

Logistic Regression doesn’t care whether…


Logistic regression is a classification algorithm used to assign observations to a discrete set of classes. There are many classification problems out there, but logistic regression is a common and useful method for solving binary classification problems.

Contents

  1. Geometric Intuition of Logistic Regression
  3. Probabilistic Interpretation of Logistic Regression
  4. Loss Minimization Interpretation of Logistic Regression
  5. Implementation of Logistic Regression…
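
To tie items 1 and 3 above together, here is a tiny sketch showing that a fitted logistic regression's predicted probability is just the sigmoid of the linear score w·x + b (assuming scikit-learn and synthetic data):

```python
# A small sketch of logistic regression: the sigmoid squashes a linear score into a
# probability, and scikit-learn fits the weights by minimizing log loss. Data is synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def sigmoid(z):
    """Map a raw linear score w.x + b to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

X, y = make_classification(n_samples=300, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)

# Probability from the fitted model vs. the same probability computed manually.
x0 = X[:1]
manual = sigmoid(x0 @ clf.coef_.T + clf.intercept_)
print("predict_proba  :", clf.predict_proba(x0)[0, 1])
print("sigmoid(w.x + b):", manual.ravel()[0])
```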


An optimization problem can be solved by several different methods. The user can move along the surface or curve from an initial point toward the optimal or critical point, which can be observed on the plotted function.

Contents

  1. Single Value Differentiation
  2. Minima and Maxima
  3. Gradient descent algorithm
  4. Steps for Gradient descent algorithm
  5. Types of Gradient Descent algorithms
  6. Implementation of Stochastic Gradient Descent
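
To make items 3 and 4 concrete, here is a tiny gradient descent loop on the one-dimensional function f(w) = (w - 3)^2, whose derivative is f'(w) = 2(w - 3); the numbers are purely illustrative:

```python
# A tiny gradient descent loop: minimize f(w) = (w - 3)^2 using its derivative.
def f_prime(w):
    """Derivative of f(w) = (w - 3)^2."""
    return 2.0 * (w - 3.0)

w = 0.0              # initial point on the curve
learning_rate = 0.1  # step size

for step in range(50):
    w = w - learning_rate * f_prime(w)  # move against the gradient

print(w)  # converges towards the minimum at w = 3
```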

Differentiation is very important for optimization problems, so let’s look at some math.

Single Value Differentiation

Differentiation allows us to find rates of change. …


Contents

  1. Geometric Intuition for Linear Regression
  3. Assumptions of Linear Regression
  4. Implementation of Linear Regression using Python

What is Regression?

Regression analysis is a predictive modeling technique that investigates the relationship between a dependent variable and one or more independent variables.
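
For example, a minimal least-squares fit of a line to noisy synthetic points (assuming scikit-learn; illustrative data only) looks like this:

```python
# Minimal least-squares example: fit a line y = w*x + b to noisy synthetic points.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))               # independent variable
y = 2.5 * X.ravel() + 1.0 + rng.normal(0, 1, 100)   # dependent variable with noise

model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("prediction at x = 4:", model.predict([[4.0]])[0])
```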

Geometric Intuition for Linear Regression

Linear regression is perhaps one of the most well-known and well-understood algorithms in statistics and machine learning. …


Naive Bayes is a statistical classification technique based on Bayes’ Theorem. It is one of the simplest supervised learning algorithms, and Naive Bayes classifiers are fast, accurate, and reliable, even on large datasets.

To understand the Naive Bayes algorithm, we first need to know some basic concepts of probability.

Contents

  1. Probability
  3. Independent Events
  4. Mutually Exclusive Events
  5. Bayes Theorem
  6. Naive Bayes Algorithm
  7. Toy Example using Naive Bayes
  8. Naive Bayes Algorithm on Text data
  9. Laplace (or) Additive Smoothing
  10. Log-Probabilities and Numerical Stability
  11. Bias-variance…
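
To ground items 5, 9 and 10 above, here is a tiny worked sketch of Bayes’ theorem on word counts with Laplace (additive) smoothing, computed in log space for numerical stability; the counts are made up:

```python
# Toy sketch: Bayes' theorem on word counts with Laplace (additive) smoothing,
# computed in log space for numerical stability. All counts are hypothetical.
import math

# Hypothetical training counts: how often each word appears in each class.
word_counts = {
    "positive": {"great": 8, "tasty": 6, "bad": 1},
    "negative": {"great": 1, "tasty": 1, "bad": 7},
}
class_priors = {"positive": 0.5, "negative": 0.5}
vocab = {"great", "tasty", "bad"}
alpha = 1.0  # Laplace smoothing constant

def log_posterior(words, cls):
    """log P(class) + sum over words of log P(word | class), with additive smoothing."""
    counts = word_counts[cls]
    total = sum(counts.values())
    score = math.log(class_priors[cls])
    for w in words:
        score += math.log((counts.get(w, 0) + alpha) / (total + alpha * len(vocab)))
    return score

review = ["great", "tasty"]
scores = {cls: log_posterior(review, cls) for cls in word_counts}
print(scores)                       # unnormalized log posteriors
print(max(scores, key=scores.get))  # predicted class
```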


Sachin D N

Trained in Data Science and Machine Learning at @6benches
