Codes
These notebooks are view-only. To edit and run them, in Colab go to File -> Save a Copy in Drive. This saves a copy to your Google Drive, where you can edit and execute your own copy of the code.
Session 3: MNIST Classification, Feedforward NN
Session 5: Informed Search Methods (GA Example)
Session 5: GA to train a Perceptron
Session 6: Probability and Statistics for AI & Machine Learning
Session 7: Document Classification using a Naive Bayes Classifier
Session 8: Principal Component Analysis
Session 11: Gradient Descent Method for Optimization
Session 11: Linear Regression, Gradient Descent without Feature Scaling
Session 11: Linear Regression, Gradient Descent with Feature Scaling
Session 11: Linear Regression with Stochastic Gradient Descent with Feature Scaling
Session 11: Linear Regression with Mini-Batch Gradient Descent with Feature Scaling
Session 12: Implementing an AND gate with a Perceptron
Session 12: Implementing an XOR gate with a Perceptron (Mission NOT Accomplished!)
Session 12: Implementing an XOR gate with a Feedforward Neural Network
Session 13: Maximum Likelihood Estimation, the main idea
Session 14: Grid Search for Tuning Hyperparameters
Session 14: Grid Search for Designing a Deep Feedforward Network
Session 17: Feedforward NN: Overfitting and Underfitting
Session 17: Feedforward NN: Overfitting and Underfitting (IMDB Reviews)
Session 18: DNNs are not robust to rotation or shifts of inputs.
Session 19: CNNs: overfitting, data augmentation, and transfer learning
Session 20: Homework: Regression with DNN, Advanced Optimization (beyond SGD)
Session 21: Implementing a basic RNN
Session 21: Simple RNN for text classification
Session 22: The Simpsons homework (Convolutional Neural Network)