ML-DL Research Bootcamp


Work on impactful ML/DL research, present at top-tier conferences, and publish research papers. Build neural networks from scratch using Python, NumPy, and scikit-learn.

30 Topics · Python · Research Projects

Instructors from

MIT
IIT Madras
Purdue University
Hear From Our Co-Founder

Hear From Dr. Sreedath Panat (MIT PhD)

Three Core Objectives

The Foundations of ML and Deep Learning

We teach three interconnected objectives that take you from Python fundamentals to building and training neural networks from scratch. Each builds on the previous, creating a rigorous foundation for ML research.

Python Foundations for ML

Start with Python fundamentals tailored for machine learning: variables, data types, matrix multiplication from scratch, object-oriented programming, and data visualization with Matplotlib, Seaborn, and Plotly. Build a solid coding foundation using NumPy and Pandas.

Classical Machine Learning

Master the core algorithms that power modern ML: linear classifiers, the perceptron, logistic regression with cross-entropy loss, gradient descent optimization, L1/L2 regularization, and decision trees with Gini impurity. Build every algorithm from scratch before using scikit-learn.

Neural Networks from Scratch

Build neural networks layer by layer using only NumPy: code neurons, forward passes, activation functions, cross-entropy loss, and full backpropagation. Master optimizers (SGD, RMSProp, Adam), regularization (dropout, K-fold CV), and train on real datasets like Fashion MNIST and California Housing.

Optimization and Training

Understand gradient descent at a deep mathematical level: the chain rule, matrix gradients, and how weights are updated during backpropagation. Learn to diagnose overfitting, apply regularization strategies, and build complete training pipelines.

Hands-on Projects and Interviews

Each module includes hands-on projects and interview-oriented recaps. Build classifiers, regression models, decision trees, and neural networks on real datasets. The bootcamp is designed to prepare you for both research and industry ML roles.

Research and Publication

Work on industry-level ML/DL research projects aimed at publication. Learn to formulate research problems, design experiments, validate hypotheses, and write scientific papers for conferences and journals.

Visual Framework

How Machine Learning and Deep Learning Work

Publication-quality diagrams illustrating the core algorithms and architectures you will master in this bootcamp.

The ML-DL Landscape

A high-level overview of the bootcamp curriculum: from Python foundations and data visualization, through classical ML algorithms (regression, decision trees), to building neural networks from scratch with backpropagation and optimizers.

ML-DL curriculum landscape showing Python foundations, classical ML algorithms, and neural networks

Neural Network Training Pipeline

The complete training loop: forward pass through multiple layers with activation functions, cross-entropy loss computation, backward pass with gradient propagation via the chain rule, and weight updates using SGD, RMSProp, or Adam optimizers.

Neural network training pipeline with forward pass, backpropagation, and optimizer comparison

Decision Tree Construction

Decision tree construction using Gini impurity for feature selection, recursive binary splitting, pruning strategies, and the resulting rectangular decision boundaries for classification and regression tasks.

Decision tree construction with Gini impurity and decision boundaries
Who Is This For

Designed for Beginners, Engineers, and Researchers

Whether you are new to programming or an experienced developer looking to build a rigorous ML foundation, this bootcamp teaches you to implement every algorithm from scratch before using libraries.

Students and Beginners

Undergraduate and graduate students who want a rigorous, ground-up understanding of machine learning and deep learning. No prior ML experience required: we start from Python basics.

Undergrads · Graduate Students · Career Changers

Software Engineers

Developers looking to transition into ML/AI roles. Build every algorithm from scratch before using frameworks, giving you the deep understanding that separates ML engineers from API callers.

Backend Engineers · Full-Stack Developers · DevOps

Data Scientists

Analysts and data practitioners who use ML libraries but want to understand what happens under the hood. Master the mathematics and implementation behind regression, trees, and neural networks.

Data Analysts · Business Analysts · Applied Scientists

Aspiring Researchers

Students aiming for graduate programs or research careers in AI. The research project component and publication pathway strengthen applications to top PhD programs.

PhD Aspirants · Research Scholars · Academic Professionals
Curriculum

A Guided Journey from Python to Neural Networks

30 topics across 6 weeks, covering Python foundations, machine learning algorithms, and deep learning from scratch. Phase 1 (Weeks 1 through 6) is entirely self-paced: all lectures are pre-recorded and available for lifetime access, so you learn at your own speed.

Python Foundations (Topics 1-5)
Session 1

Python Basics, Variables, Data Types

  • Python fundamentals for machine learning: variables, data types, control flow, and functions
EXERCISE: Write basic Python programs for data manipulation
Session 2

Matrix Multiplication from Scratch

  • Implementing matrix multiplication using pure Python, understanding computational foundations of ML
EXERCISE: Build matrix multiplication without NumPy
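To give a flavor of this exercise, here is a minimal sketch of matrix multiplication in pure Python (nested lists, no NumPy). Names and the example matrices are illustrative, not the bootcamp's reference solution:

```python
def matmul(A, B):
    """Multiply two matrices given as nested lists, without NumPy."""
    n, k = len(A), len(A[0])
    if len(B) != k:
        raise ValueError("inner dimensions must match")
    m = len(B[0])
    # C[i][j] is the dot product of row i of A with column j of B
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

C = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
print(C)  # [[19, 22], [43, 50]]
```

Writing this triple loop by hand is what makes NumPy's vectorized `A @ B` feel earned rather than magical.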
Session 3

Classes and Objects in ML

  • Object-oriented programming patterns used in ML codebases: classes, inheritance, and encapsulation
EXERCISE: Build a reusable ML model class structure
Session 4

Intro to NumPy and Pandas

  • NumPy arrays, vectorized operations, broadcasting, and Pandas DataFrames for data loading and exploration
EXERCISE: Load, clean, and explore a real-world dataset
Session 5

Data Visualization: Matplotlib, Seaborn, Plotly

  • Creating publication-quality visualizations: line plots, histograms, scatter plots, heatmaps, and interactive charts
EXERCISE: Visualize dataset distributions and correlations
ML Foundations (Topics 6-10)
Session 6

What is ML, Types of ML Models

  • Supervised, unsupervised, and reinforcement learning paradigms, with real-world examples of each
EXERCISE: Classify ML problems by type from real-world scenarios
Session 7

The 6 Steps of an ML Project

  • End-to-end ML workflow: data collection, preprocessing, feature engineering, model selection, training, and evaluation
EXERCISE: Design an ML pipeline for a given problem statement
Session 8

Linear Classifiers and the Perceptron

  • Decision boundaries, the perceptron algorithm, convergence theorem, and limitations of linear classifiers
EXERCISE: Implement the perceptron from scratch on a 2D dataset
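A sketch of the kind of perceptron you will implement in this session. The update rule is the classic one; the toy dataset and variable names are our own:

```python
import numpy as np

def perceptron(X, y, epochs=20):
    """Perceptron on labels y in {-1, +1}; returns weights w and bias b."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when a point is misclassified (or on the boundary)
            if yi * (xi @ w + b) <= 0:
                w += yi * xi
                b += yi
    return w, b

# Linearly separable 2D toy data
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
print(np.sign(X @ w + b))  # matches y on this separable set
```

The convergence theorem covered in the session guarantees this loop terminates with a separating boundary whenever one exists.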
Session 9

NumPy, Scikit-learn, Jupyter

  • Setting up the ML development environment: Jupyter notebooks, scikit-learn API patterns, and NumPy for computation
EXERCISE: Train a scikit-learn model end-to-end in a Jupyter notebook
Session 10

Build Random Linear Classifier

  • Building a random linear classifier from scratch, understanding decision boundaries and classification accuracy
EXERCISE: Implement and visualize a random linear classifier
Regression (Topics 11-15)
Session 11

Logistic Regression Intuition and Coding

  • Sigmoid function, probability interpretation, decision boundaries, and implementing logistic regression from scratch
EXERCISE: Build logistic regression from scratch with gradient descent
Session 12

Cross Entropy Loss and Gradient Descent

  • Deriving cross-entropy loss, gradient computation, learning rate selection, and convergence analysis
EXERCISE: Implement gradient descent optimization step by step
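The two sessions above fit together as in this minimal sketch: logistic regression trained by gradient descent on the mean cross-entropy loss. The toy dataset and hyperparameters are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.5, steps=2000):
    """Binary logistic regression minimizing cross-entropy via gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        # Gradient of mean cross-entropy: residual (p - y) times the inputs
        grad_w = X.T @ (p - y) / n
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = train_logreg(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds)  # [0 0 1 1]
```

The clean `(p - y)` gradient is exactly the sigmoid-plus-cross-entropy simplification derived in Session 12.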
Session 13

Regularization: L1/L2

  • Lasso (L1) and Ridge (L2) regularization: mathematical formulation, effect on weights, and preventing overfitting
EXERCISE: Compare L1 and L2 regularization on an overfitting dataset
Session 14

Linear and Ridge Regression

  • Ordinary least squares, closed-form solution, Ridge regression with regularization, and model evaluation metrics
EXERCISE: Implement linear and Ridge regression from scratch
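As a taste of this session, here is a sketch of the closed-form Ridge solution, w = (XᵀX + λI)⁻¹Xᵀy, which recovers ordinary least squares at λ = 0. The example data (a perfect line with a bias column) is illustrative:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form Ridge regression: solve (X^T X + lam * I) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# y = 3x, with a column of ones for the intercept
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 3.0, 6.0, 9.0])
print(ridge_fit(X, y, lam=0.0))   # ~[0, 3]: plain least squares
print(ridge_fit(X, y, lam=10.0))  # penalty shrinks the weight vector
```

Using `solve` rather than explicitly inverting the matrix is the numerically preferred habit the session reinforces.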
Session 15

Interview-Oriented Recap

  • Comprehensive review of regression concepts with ML interview-style questions and problem-solving strategies
EXERCISE: Solve ML interview problems on regression topics
Decision Trees (Topics 16-20)
Session 16

Gini Impurity and Tree Construction

  • Gini impurity measure, information gain, recursive tree construction, and feature selection for splits
EXERCISE: Compute Gini impurity by hand and implement tree splitting
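The hand computation in this exercise boils down to two small functions, sketched here with illustrative label lists:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_gain(parent, left, right):
    """Impurity decrease from splitting parent into left and right children."""
    n = len(parent)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(parent) - weighted

print(gini([0, 0, 1, 1]))                        # 0.5: maximally mixed
print(gini([1, 1, 1, 1]))                        # 0.0: pure node
print(split_gain([0, 0, 1, 1], [0, 0], [1, 1]))  # 0.5: a perfect split
```

Tree construction then amounts to choosing, at each node, the feature and threshold that maximize this gain.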
Session 17

Pruning Trees, Full Code Walkthrough

  • Pre-pruning and post-pruning strategies, cost-complexity pruning, and complete decision tree implementation
EXERCISE: Build a pruned decision tree from scratch
Session 18

Regression Trees with Multiple Features

  • Extending decision trees to regression problems, handling continuous features, and multi-feature splitting
EXERCISE: Build a regression tree for a multi-feature dataset
Session 19

Hands-on: Build Regression and Classification Trees

  • End-to-end project building both regression and classification trees on real-world datasets
EXERCISE: Train and evaluate trees on real datasets
Session 20

Interview Prep: Trees Summary

  • Decision tree interview questions, comparison with other algorithms, and when to use trees vs. other methods
EXERCISE: Solve ML interview problems on tree-based methods
Neural Networks: Part I (Topics 21-25)
Session 21

Coding Neurons and Layers using NumPy

  • Implementing single neurons, dense layers, and multi-layer architectures using only NumPy arrays
EXERCISE: Build a 2-layer neural network from scratch
Session 22

Forward Pass, Activation Functions

  • Computing forward passes through layers, ReLU, sigmoid, tanh, and softmax activation functions
EXERCISE: Implement and visualize different activation functions
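A sketch of the three activations this session implements, in plain NumPy (the max-subtraction trick in softmax is the standard stability fix you will code yourself):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))                  # [0. 0. 2.]
print(np.round(sigmoid(z), 3))  # values squashed into (0, 1)
print(np.round(softmax(z), 3))  # a probability distribution summing to 1
```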
Session 23

Loss Functions: Cross Entropy

  • Binary and categorical cross-entropy, MSE loss, and choosing the right loss function for your problem
EXERCISE: Implement multiple loss functions and compare gradients
Session 24

Backpropagation (Single Neuron + Full Layer)

  • Deriving backpropagation for a single neuron, extending to full layers, and computing weight gradients
EXERCISE: Implement backpropagation step by step on a toy network
Session 25

Chain Rule and Matrix Gradients

  • The chain rule for composite functions, Jacobian matrices, and efficient matrix gradient computation for deep networks
EXERCISE: Derive and verify gradients using numerical differentiation
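The verification half of this exercise looks roughly like the following sketch: a central-difference estimate compared against an analytic gradient, using a toy loss of our own choosing:

```python
import numpy as np

def f(w):
    """Toy loss L(w) = sum(w^2), whose analytic gradient is 2w."""
    return np.sum(w ** 2)

def numerical_grad(f, w, eps=1e-6):
    """Central-difference estimate of the gradient of f at w."""
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

w = np.array([1.0, -2.0, 0.5])
print(2 * w)                 # analytic gradient
print(numerical_grad(f, w))  # should agree closely
```

The same check applied to a network's loss function is the standard way to catch bugs in hand-derived backpropagation code.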
Neural Networks: Part II (Topics 26-30)
Session 26

Complete Backprop Pipeline

  • Assembling the full training loop: forward pass, loss computation, backward pass, and weight update in a unified pipeline
EXERCISE: Build an end-to-end training loop for a multi-layer network
Session 27

Optimizers: GD, RMSProp, Adam

  • Vanilla gradient descent, momentum, RMSProp adaptive learning rates, and Adam optimizer implementation
EXERCISE: Implement and compare three optimizers on the same dataset
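The three update rules compared in this session can be sketched as single-step functions. The quadratic test function and hyperparameters below are illustrative defaults, not course material:

```python
import numpy as np

def sgd_step(w, g, lr=0.1):
    """Plain gradient descent: step against the gradient."""
    return w - lr * g

def rmsprop_step(w, g, state, lr=0.01, beta=0.9, eps=1e-8):
    """Scale each step by a running average of squared gradients."""
    state["v"] = beta * state["v"] + (1 - beta) * g ** 2
    return w - lr * g / (np.sqrt(state["v"]) + eps)

def adam_step(w, g, state, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """Momentum (first moment) plus RMSProp-style scaling (second moment)."""
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])  # bias correction
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Minimize f(w) = w^2 (gradient 2w) with all three optimizers
results = {}
for name, step, state in [
    ("SGD", lambda w, s: sgd_step(w, 2 * w), None),
    ("RMSProp", lambda w, s: rmsprop_step(w, 2 * w, s), {"v": 0.0}),
    ("Adam", lambda w, s: adam_step(w, 2 * w, s), {"m": 0.0, "v": 0.0, "t": 0}),
]:
    w = 1.0
    for _ in range(200):
        w = step(w, state)
    results[name] = w
    print(name, round(w, 6))
```

All three drive w toward the minimum at 0; the adaptive methods differ in how they scale the step per parameter, which is what the side-by-side comparison in the exercise makes visible.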
Session 28

Overfitting: Dropout, Regularization, K-fold CV

  • Diagnosing overfitting with learning curves, dropout regularization, weight decay, and K-fold cross-validation
EXERCISE: Apply dropout and K-fold CV to reduce overfitting on a real dataset
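Dropout itself is only a few lines. Here is a sketch of the standard "inverted dropout" form covered in this session, with an illustrative all-ones activation array:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(a, p=0.5, training=True):
    """Inverted dropout: zero each activation with probability p during
    training and rescale survivors by 1/(1-p), so the layer's expected
    output is unchanged; at evaluation time it is a no-op."""
    if not training or p == 0.0:
        return a
    mask = (rng.random(a.shape) >= p) / (1.0 - p)
    return a * mask

a = np.ones(10000)
out = dropout(a, p=0.5)
print(out.mean())                       # close to 1.0 despite the zeros
print(dropout(a, training=False) is a)  # True: identity at evaluation time
```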
Session 29 (3-Hour Mega Project)

Projects: MNIST Fashion + California Housing

  • End-to-end classification on Fashion MNIST using your neural network from scratch
  • Regression on the California Housing dataset with your custom training pipeline
  • Model evaluation, hyperparameter tuning, and results visualization
MEGA PROJECT: Train and evaluate neural networks on both datasets
Session 30 (3-Hour Mega Project)

Final Recap: Neural Network in 100 Minutes

  • Comprehensive review: from a single neuron to a full deep network
  • Key concepts consolidated: forward pass, loss, backprop, optimizers, regularization
  • Interview preparation and next steps for advanced deep learning
MEGA PROJECT: Build a complete neural network from scratch in one session
What You Get

Research-Grade Deliverables

Everything you need to go from ML beginner to building neural networks from scratch and publishing research.

Complete Python Codebase

Production-ready Python code for every session, including from-scratch implementations of every algorithm: regression, decision trees, and neural networks.

  • All lecture code files and Jupyter notebooks
  • Homework assignments with solutions
  • Research project starter templates
  • Fully documented ML pipelines

Lecture Notes and Videos

Lifetime access to all session recordings and comprehensive lecture notes covering every ML and DL concept from Python basics to neural networks.

  • HD video recordings of all sessions
  • Detailed lecture notes in PDF format
  • Annotated code walkthroughs
  • Reference material and reading lists

Research Project Portfolio

Industry-level ML/DL projects including neural network classifiers, regression models, and decision tree systems ready for your portfolio or publication.

  • Neural network trained on Fashion MNIST
  • Regression pipeline for California Housing
  • Decision tree classifier implementation
  • Publication-ready research results

Community and Mentorship

Join the Vizuara ML-DL community on Discord for ongoing collaboration, doubt clearance, and research partnerships.

  • Discord community access
  • Student collaboration opportunities
  • Assignment checking and doubt clearance
  • Free access to all ML webinars
Your Instructors

Learn from MIT and Purdue AI PhDs

Our instructors are co-founders of Vizuara AI Labs and published researchers in Machine Learning and Deep Learning, with expertise spanning neural networks, optimization, and applied ML.

Dr. Sreedath Panat
MIT PhD
Python Foundations, Neural Network Training, Optimizers, and Regularization


Co-founder, Vizuara AI Labs

PhD from MIT, B.Tech from IIT Madras. 10+ years of research experience. Dr. Panat brings deep technical expertise from both academia and industry to make complex AI concepts accessible and practical.

MIT
IIT Madras
Dr. Raj Dandekar
MIT PhD
ML Foundations, Linear Classifiers, Decision Trees, and Interview Prep


Co-founder, Vizuara AI Labs

PhD from MIT, B.Tech from IIT Madras. Dr. Raj specializes in building LLMs from scratch, including DeepSeek-style architectures. His expertise spans AI agents, scientific machine learning, and end-to-end model development.

MIT
IIT Madras
Dr. Rajat Dandekar
Purdue PhD
Regression, Backpropagation, Neural Network Architecture, and Loss Functions


Co-founder, Vizuara AI Labs

PhD from Purdue University, B.Tech and M.Tech from IIT Madras. Dr. Rajat brings deep expertise in reinforcement learning and reasoning models, focusing on advanced AI techniques for real-world applications.

Purdue University
IIT Madras
Build a DeepSeek Model (From Scratch)

Manning #1 Best-Seller


By Dr. Raj Dandekar, Dr. Rajat Dandekar, Dr. Sreedath Panat & Naman Dwivedi

Credentials

Learn from MIT PhD Researchers

Our lead instructor Dr. Sreedath Panat holds a PhD from MIT, where he conducted research in applied AI and scientific computing. Our team brings deep expertise in machine learning, neural networks, and applied AI research.

MIT Certificate of Dr. Raj Dandekar
Our Research

Sample Papers From Our Research

A selection of papers from our research over the past few years. Students in the Industry Professional plan work on similar projects aimed at publication.

Pricing

Enroll in the Bootcamp

Choose the plan that matches your goals, from self-paced learning to intensive research mentorship with MIT PhDs.

Most Popular

Researcher Plan

Rs 95,000

Save 24%. Originally Rs 1,25,000. MIT and Purdue PhDs as your research mentors.

Enroll Now
  • Lifetime access to all videos, code files, and homework assignments
  • Access to bootcamp community on Discord
  • Assignment checking and doubt clearance
  • Free access to all ML webinars throughout the year
  • Access to open list of research problems in ML/DL
  • 4-month personalized research mentorship
  • Support for publishing your research in conferences/journals
  • Guidance on applying ML and DL to real-world industries
FAQ

Frequently Asked Questions

Everything you need to know about the ML-DL Research Bootcamp.

Ready to Master Machine Learning?

Join hundreds of students and engineers who have built neural networks from scratch and launched ML research careers. Start building every algorithm from the ground up.