Hello fellow learners!

This is a space for learners, by a learner. Inspired by Professor Feynman, I use the Feynman technique of learning in an attempt to simplify concepts that I have previously or recently struggled with. I try to build an intuitive understanding before getting into the technicalities.

I hope my notes come in handy for you. Happy Learning!!

  • Neural Networks – Feedforward Math
    In this post, we will do the math on our dummy dataset and calculate the feedforward steps by hand. We will take the parameters of our first instance, i.e. the first house, as the input vector, along with arbitrarily chosen random weights. Our dummy dataset (dummy housing data) was as follows:

        INSTANCE  SQFT  NUM_BED  NUM_BATH
        house 1   1500  2        2
        house 2   1700  3        2
        house 3   1750  3        3

    Let's recall how the feature vector and the weights are multiplied to get the inputs to the hidden layer, which will become the values of our hidden nodes. Let's see what goes into the …
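The multiplication described above can be sketched in a few lines of NumPy. This is only an illustration, not the post's actual worked example: the hidden-layer size (2 nodes) and the random weights are assumptions chosen here for demonstration.

```python
import numpy as np

# First instance from the dummy dataset: house 1 -> [SQFT, NUM_BED, NUM_BATH]
x = np.array([1500.0, 2.0, 2.0])

# Arbitrarily chosen random weights (assumed shape: 2 hidden nodes x 3 inputs)
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))

# Weight matrix times feature vector gives the inputs to the hidden layer
hidden_in = W @ x
print(hidden_in.shape)  # one value per hidden node
```

Each entry of `hidden_in` is the dot product of one row of `W` with the feature vector, i.e. one hidden node's pre-activation value.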
  • Neural Networks – Backpropagation
    Notes on intuitive understanding of backpropagation.
  • Neural Networks – Feedforward process
    What happens when the input goes into the model? It makes its way through the network architecture to the output layer. This is called the feed-forward process, and it produces the prediction or classification for the problem we're building the model for. We will begin with a brief overview and then peel back the layers as we proceed. Things to know: vectors, matrices, matrix multiplication / dot product. Components of a feed-forward process: input layer, weight matrices, activation functions, bias. The process, in brief: your data goes into the model in the form of an input vector. This vector is then …
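The components listed above (input vector, weight matrices, activation functions, bias) can be combined into a minimal feed-forward sketch. The layer sizes, sigmoid activation, and random parameters here are all assumptions for illustration, not the post's actual architecture.

```python
import numpy as np

def sigmoid(z):
    """A common activation function, used here as an example."""
    return 1.0 / (1.0 + np.exp(-z))

# Input vector: one instance with three features
x = np.array([1500.0, 2.0, 2.0])

# Hypothetical parameters: 3 inputs -> 2 hidden nodes -> 1 output
rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(2)  # weights and bias, hidden layer
W2, b2 = rng.normal(size=(1, 2)), np.zeros(1)  # weights and bias, output layer

# Feed-forward: each layer multiplies, adds bias, then applies the activation
h = sigmoid(W1 @ x + b1)  # hidden-layer activations
y = sigmoid(W2 @ h + b2)  # output: the model's prediction
print(y.shape)
```

The same pattern (multiply by weights, add bias, activate) repeats once per layer, however deep the network is.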