Machine Learning has made significant contributions to the IT industry, making applications more scalable. Large organizations have enhanced their efficiency by eliminating large numbers of errors from the final product. Organizations continue to explore new areas where Machine Learning can benefit them, and this has created demand for professionals who are trained to understand and work with Machine Learning.
IIHT’s Machine Learning course focuses on upskilling employees to build robust, high-performing machine learning applications. Candidates will learn machine learning concepts and techniques including supervised and unsupervised learning, the underlying mathematics, and linear regression, among many other topics. A combination of in-depth teaching and programming exercises gives them the opportunity to refine, test, and apply their knowledge. Our machine learning training programme is ideal for programmers and organisations who want to ensure proficiency in Machine Learning and Python and streamline their data modelling process.

**Duration:** 40 hours

**Module 1: Introduction**

**1. Machine Learning Introduction**

- Applications of Machine Learning
- Types of problems and tasks
- Features, models, and design of an ML study
- Supervised, unsupervised and reinforcement learning
- Classification and Regression
- Clustering and Anomaly Detection
- Recommendation systems
- Various techniques and tools
- System requirements
- Revisiting programming language and essentials
- Tools and software installations
- Case studies

**Module 2: Working with TensorFlow**

**2. Introduction to TensorFlow**

- Installing TensorFlow with R/Python
- Configuring with/without GPU
- The programming model of TensorFlow
- TensorBoard
- Representing tensors
- Creating operators
- Executing operators with sessions
- Writing code in Jupyter
- Using variables & Placeholders
- Saving and loading variables
- Visualizing data using TensorBoard
- Understanding code as a graph
- Implementing a moving average
- Visualizing a moving average
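
The moving-average exercise in this module boils down to a simple recurrence; here is a minimal plain-Python sketch of the same update (the TensorFlow version wires this rule into a computation graph — the function name `ema` and the default `alpha` are illustrative, not from the course material):

```python
def ema(values, alpha=0.1):
    # Exponential moving average: avg_new = alpha * value + (1 - alpha) * avg_old.
    avg = values[0]
    out = [avg]
    for v in values[1:]:
        avg = alpha * v + (1 - alpha) * avg
        out.append(avg)
    return out
```

Smaller `alpha` values give a smoother but slower-reacting average.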

**Module 3: Linear Regression**

**3. Linear Regression**

- Regression Problem Analysis
- Mathematical modelling of Regression Model
- Gradient Descent Algorithm
- Programming Process Flow
- Use cases
- Programming Using python/R
- Building simple Univariate Linear Regression Model
- Multivariate Regression Model
- Normal Equation Non-invertibility
- Model specification
- Apply data transformations
- Identify and treat multicollinearity in data
- Identify Heteroscedasticity
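
The gradient descent algorithm covered above can be sketched for the univariate case in plain Python (an illustrative sketch only; the function name and hyperparameter defaults are ours, not the course's):

```python
def fit_linear(xs, ys, lr=0.02, epochs=5000):
    # Fit y = w*x + b by batch gradient descent on mean squared error.
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On data generated by y = 2x + 1, the fit converges to w ≈ 2 and b ≈ 1.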

**Module 4: Linear Regression – Practice & Dive Deeper**

**4. L1 & L2 Regularization**

- Linear Regression Implementation with Python & R
- Linear Regression with TensorFlow
- Predicting server downtime duration in minutes using a server dataset
- Share Market Prediction using a dataset from Quandl
- Best Fit Line and Linear Regression
- Do’s & don’ts
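
The effect of L2 regularization is easiest to see in the univariate, no-intercept case, where the penalized least-squares solution has a closed form (a sketch under those simplifying assumptions; the function name is ours):

```python
def ridge_univariate(xs, ys, lam=1.0):
    # L2-regularized slope for y = w*x: minimizing
    # sum((w*x - y)^2) + lam * w^2 gives w = sum(x*y) / (sum(x^2) + lam).
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)
```

Increasing `lam` shrinks the slope toward zero, which is the essence of L2 regularization.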

**Module 5: Logistic Regression**

**5. Logistic Regression**

- Assumptions
- Reason for the Logit Transform
- Logit Transformation
- Hypothesis
- Variable and Model Significance
- Maximum likelihood Concept
- Log Odds and Interpretation
- Null Vs Residual Deviance
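
The logit transform and its inverse, central to this module, can be written in a few lines (illustrative function names):

```python
import math

def sigmoid(z):
    # Logistic function: maps log-odds z to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def log_odds(p):
    # The logit transform: inverse of the sigmoid, log(p / (1 - p)).
    return math.log(p / (1.0 - p))
```

A coefficient in logistic regression is interpreted on this log-odds scale, which is why the logit transform is needed.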

**Module 6: Logistic Regression – Practice & Dive Deeper**

- Chi Square Test
- ROC Curve
- Model Specification
- Cost Function Formation
- Mathematical Modelling
- Use Cases
- Digit Recognition using Logistic Regression
- Working with datasets using R/Python

**Module 7: Model Evaluation**

- Metrics for regression
- Metrics for classification
- Accuracy, Precision, Recall, F1 Score, Confusion Matrix
- Metrics for probabilistic predictions
- Feature importance
- Non-parametric vs. parametric analysis
- Asymptotic approximation property
- Streamlining workflows with pipelines
- Using K fold cross validation to assess model performance
- The holdout method and K fold cross validation
- Debugging Algorithms with learning and validation curves
- Fine Tuning Machine Learning models via grid search
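
The classification metrics listed above all derive from the four cells of the confusion matrix; a minimal sketch (function name ours):

```python
def classification_metrics(tp, fp, fn, tn):
    # Standard metrics computed from confusion-matrix counts.
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1
```

F1 is the harmonic mean of precision and recall, so it penalizes models that trade one off heavily against the other.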

**Module 8: Productization of Machine Learning**

- Embedding a Machine Learning model into a web application
- Dumping the model into a pickle file
- Serializing fitted estimator/model
- Developing a web application
- From validation to rendering
- Turning a classifier into a web application
- Deploying the web application to a public server
- Updating the web application
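
Serializing a fitted estimator with `pickle`, as covered in this module, looks like the following (the `TrivialModel` class is a hypothetical stand-in for any fitted, picklable model):

```python
import pickle

class TrivialModel:
    """Stand-in for a fitted estimator (hypothetical; any picklable object works)."""
    def __init__(self, threshold):
        self.threshold = threshold
    def predict(self, x):
        return int(x >= self.threshold)

model = TrivialModel(threshold=0.5)
blob = pickle.dumps(model)       # serialize ("dump") the fitted model
restored = pickle.loads(blob)    # deserialize, e.g. inside the web application
```

In a real deployment the bytes would be written to a `.pkl` file at training time and loaded once when the web application starts.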

**Module 9: Decision Tree**

**6. Decision Trees**

- Forming a Decision Tree
- Components of Decision Tree
- Mathematics of Decision Tree
- Decision Tree Evaluation
- Practical examples & case study
- CART
- C4.5
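
The mathematics behind tree construction in algorithms such as C4.5 centers on entropy and information gain; a small sketch (function names ours):

```python
import math

def entropy(labels):
    # Shannon entropy of a label list, used to score decision-tree splits.
    n = len(labels)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(parent, left, right):
    # Entropy reduction achieved by splitting parent into left and right.
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted
```

A split that separates the classes perfectly achieves the maximum possible gain, which equals the parent's entropy.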

**Module 10: Decision Tree – Practice & Dive Deeper**

- Variance analysis
- Chi Square based analysis
- CART for Regression
- Working with real time problems
- Random Forest
- Application of Random Forest
- Implementation

**Module 11: Naïve Bayes**

**7. Naïve Bayes**

- Bayesian Theorem
- Probabilities – The Prior and Posterior Probabilities
- Conditional and Joint Probabilities Notion
- Traditional approach – Extract important features
- Naive Approach – Independence of Features Assumption
- Data Processing – Discretization of Features
- Practical Examples & Case Study
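
Bayes' theorem, which relates the prior and posterior probabilities discussed above, can be computed directly (the spam-filter numbers below are made up for illustration):

```python
def posterior(prior, likelihood, evidence):
    # Bayes' theorem: P(class | data) = P(data | class) * P(class) / P(data).
    return likelihood * prior / evidence

# Hypothetical spam filter: P(spam) = 0.2, P(word | spam) = 0.6, P(word) = 0.3
p_spam_given_word = posterior(0.2, 0.6, 0.3)
```

The naive assumption is that feature likelihoods are independent given the class, so the joint likelihood is just their product.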

**Module 12: Artificial Neural Networks**

**8. Artificial Neural Networks**

- Neurons, ANN & Working
- Single Layer Perceptron Model
- Multilayer Neural Network
- Feed Forward Neural Network
- Cost Function Formation
- Applying Gradient Descent Algorithm
- Back propagation algorithm & mathematical modelling
- Programming flow for back propagation algorithm
- Use Cases of ANN
- Programming SLNN using Python
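
The single-layer perceptron programming exercise amounts to the classic perceptron learning rule; a minimal two-input sketch (function names and hyperparameters are ours):

```python
def train_perceptron(data, lr=0.1, epochs=20):
    # data: list of ((x1, x2), label) with labels in {0, 1}.
    # Perceptron learning rule: w += lr * (target - prediction) * x.
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), t in data:
            pred = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = t - pred
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

def predict(weights, x1, x2):
    w1, w2, b = weights
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
```

Trained on the AND truth table (which is linearly separable), the perceptron converges to a correct decision boundary.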

**Module 13: Artificial Neural Networks — Practice & Dive Deeper**

- Programming MLNN using Python/R
- Digit recognition using MLNN
- XOR Logic using MLNN & Back propagation
- Diabetes Data Predictive Analysis using ANN

**Module 14: Support Vector Machine**

**9. Support Vector Machine**

- Concept and working principle
- Maximum Margin Intuition
- Mathematical Modelling
- Optimization Function Formation
- The Kernel Method and non-linear Hyperplanes
- Use cases
- Programming SVM using Python/R

**Module 15: Support Vector Machine – Practice & Dive Deeper**

- Character recognition using SVM
- Regression problem using SVM
- Wisconsin Cancer Detection using SVM

**Module 16: K Nearest Neighbors**

- Classification with distance measurements
- Analysing the scatter plot
- Using KNN for handwritten digit classification
- Implementation using Python/R
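
Classification by distance measurement, as in this module, is a few lines of Python (illustrative sketch using Euclidean distance; the function name is ours):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of (point, label) pairs; point is a tuple of floats.
    # Classify by majority vote among the k nearest neighbours.
    nearest = sorted(train, key=lambda pl: math.dist(pl[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]
```

For handwritten digits, each `point` would be a flattened pixel vector; the algorithm itself is unchanged.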

**Module 17: Image Processing**

**10. Image Processing with OpenCV**

- Image acquisition and manipulation using OpenCV
- Video processing
- Edge detection
- Corner detection
- Face detection
- Image scaling for ANN
- Training ANN with images
- Character recognition

**Module 18: Unsupervised Learning – Clustering**

**11. Clustering**

- Hierarchical Clustering
- K Means Clustering
- Optimization Objective
- Random Initialization
- Choosing number of clusters
- Fuzzy C means Clustering
- DBSCAN Clustering
- Use Cases for K Means Clustering
- Programming for K Means using Python/R
- Image Color Quantization using K Means Clustering Technique
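
K Means alternates between assigning points to their nearest centroid and recomputing centroids as cluster means; a one-dimensional sketch of that loop (function name and defaults are ours):

```python
def kmeans_1d(points, centroids, iters=10):
    # Lloyd's algorithm on scalars: assign each point to its nearest
    # centroid, then move each centroid to the mean of its points.
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids
```

Image color quantization applies the same loop in RGB space: pixel colors are the points, and each pixel is replaced by its cluster's centroid color.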

**Module 19: Anomaly Detection**

- Anomaly Detection
- Motivation and Concepts
- Gaussian Distribution
- Anomaly Detection using the Multivariate Gaussian Distribution
- One-class SVM
- Isolation Forest
- Case Study: Anomaly Detection in Time Series
- Modelling the background
- Detecting seasonality with Fourier Transforms
- Detrending z-Score
- Moving-Window Averages
- Including windowed data in model
- Bayesian Change points
- Python/R Implementation
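
The z-score approach from this module flags points that sit far from the modelled background; a minimal sketch assuming a roughly Gaussian background (function name and threshold default are ours):

```python
import statistics

def zscore_anomalies(series, threshold=3.0):
    # Flag points whose z-score exceeds the threshold.
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return [x for x in series if abs(x - mu) / sigma > threshold]
```

For time series with trend or seasonality, the series is first detrended (e.g. via moving-window averages or Fourier analysis) before scoring.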

**Module 20: Principal Component Analysis**

**12. Principal Component Analysis**

- Dimensionality Reduction and Data Compression
- Concept and mathematical modelling
- Covariance Matrix Method
- Singular Value Decomposition Method
- Reconstruction from Compressed Representation
- Choosing the Number of Principal Components
- Use cases
- Programming using Python/R
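
For two-dimensional data the covariance matrix method has a closed form: the leading eigenvector of a symmetric 2×2 matrix lies at angle 0.5·atan2(2·Sxy, Sxx − Syy). A sketch of that special case (function name ours; real PCA libraries handle arbitrary dimensions):

```python
import math

def first_principal_component(points):
    # Leading eigenvector of the 2x2 covariance matrix, in closed form.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)
```

Projecting onto this direction compresses 2-D data to 1-D with the least reconstruction error, which is the data-compression view of PCA.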

**Module 21: Recommendation Systems**

**13. Collaborative Filtering & Recommendation System**

- Content based filtering and Collaborative filtering
- Collaborative filtering Algorithm
- Vectorization: Low Rank Matrix Factorization
- Implementational Detail: Mean Normalization

**14. User-User Collaborative Filtering Recommenders**

- User-User Collaborative Filtering
- Configuring User-User Collaborative Filtering
- Influence Limiting and Attack Resistance
- Trust-Based Recommendation
- Impact of Bad Ratings
- Programming User-User Collaborative Filtering
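
User-user collaborative filtering starts from a similarity measure between users' rating vectors; cosine similarity is one common choice (a simplified sketch: real systems compare only co-rated items and often mean-center the ratings first):

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two users' rating vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)
```

A target user's unseen-item score is then a similarity-weighted average of the ratings given by the most similar users.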

**15. Item-Item Collaborative Filtering Recommenders**

- Introduction to Item-Item Collaborative Filtering
- Item-Item Algorithm
- Item-Item on Unary Data
- Item-Item Hybrids and Extensions
- Strengths and Weaknesses of Item-Item Collaborative Filtering

**Module 22: Reinforcement Learning**

**16. Reinforcement Learning**

- OpenAI Introduction
- Q Learning Introduction
- Markov Decision Process
- Monte Carlo methods
- TD Lambda
- Policy Gradient Methods
- Q Learning in TensorFlow
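
Tabular Q Learning, the starting point before the TensorFlow version, is just the update Q(s) ← Q(s) + α·(r + γ·max Q(s′) − Q(s)). A sketch on a tiny deterministic chain with a single action (environment and names are invented for illustration):

```python
def q_learning_chain(episodes=200, alpha=0.5, gamma=0.9):
    # Chain 0 -> 1 -> 2; reward 1.0 on reaching terminal state 2.
    Q = {0: 0.0, 1: 0.0}
    for _ in range(episodes):
        s = 0
        while s != 2:
            s_next = s + 1
            r = 1.0 if s_next == 2 else 0.0
            # Terminal states contribute no future value.
            target = r + gamma * (0.0 if s_next == 2 else Q[s_next])
            Q[s] += alpha * (target - Q[s])
            s = s_next
    return Q
```

The values converge to Q(1) = 1 and Q(0) = γ·Q(1) = 0.9, the discounted return from each state.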

**Module 23: Ensemble Methods and Boosting Techniques**

- Learning with ensembles
- Implementing a simple majority vote classifier
- Combining different algorithms for majority votes
- Evaluating and tuning the ensemble classifier
- Bagging and Boosting
- Leveraging weak learners via Adaptive Boosting
- Gradient Boosting
- XGBoost, CatBoost
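
The simple majority vote classifier from this module combines the predictions of several base classifiers per sample (a minimal sketch; the function name is ours):

```python
from collections import Counter

def majority_vote(predictions):
    # predictions: one list of labels per base classifier, aligned by sample.
    # Returns the per-sample majority label across all classifiers.
    result = []
    for votes in zip(*predictions):
        result.append(Counter(votes).most_common(1)[0][0])
    return result
```

Ensembling works because independent classifiers that are each right more than half the time are, in aggregate, right more often than any single one.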

**Case study & Project (part A)**

**Any one of**

1. Building a Recommendation system for Healthcare Products

2. Building a Recommendation System for a supermarket that recommends new products at billing time, based on the items currently being purchased

**Case study & Project (part B)**

1. Classification of usefulness in user-submitted content using supervised learning algorithms.