Dive into articles that blend technology insights with personal experiences in the tech world.
How do neural networks learn? Let's explore backpropagation and gradient descent, and the critical roles they play in updating weights to minimize loss.
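As a taste of the core idea, here is a minimal sketch of a single gradient-descent weight update, assuming NumPy and made-up values rather than the article's own example:

```python
import numpy as np

# Hypothetical weights and the gradient of the loss with respect to them
weights = np.array([0.2, -0.5, 1.0])
grad_loss_wrt_weights = np.array([0.1, -0.3, 0.05])

learning_rate = 0.01

# Gradient descent step: nudge the weights against the gradient to reduce the loss
weights = weights - learning_rate * grad_loss_wrt_weights
print(weights)  # [ 0.199  -0.497   0.9995]
```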
Unlock the fundamentals of Multi-Layer Perceptrons (MLPs) with a hands-on approach, working through a seven-layer network example and stepping through forward propagation.
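For a sense of what forward propagation looks like in code, here is a minimal sketch that pushes one input through a small stack of dense layers, assuming NumPy and randomly initialized weights rather than the seven-layer example from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes for a small, illustrative MLP (input -> two hidden -> output)
layer_sizes = [3, 4, 4, 2]

# Randomly initialized weights and zero biases for each layer
weights = [rng.normal(size=(n_in, n_out)) * 0.1
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

def forward(x):
    """Forward propagation: each layer's output becomes the next layer's input."""
    activation = x
    for W, b in zip(weights, biases):
        activation = np.maximum(0.0, activation @ W + b)  # dense layer + ReLU
    return activation

print(forward(np.array([1.0, 2.0, 3.0])))
```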
Learn how neural networks handle batches of inputs, using three input vectors to show how a single set of weights is applied across the whole batch for efficient computation.
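As a rough sketch of the idea (assuming NumPy and numbers of my own choosing, not necessarily the article's), a batch of three input vectors can be run through a layer with a single matrix product:

```python
import numpy as np

# A batch of three input vectors (rows), each with four features
inputs = np.array([[1.0,  2.0,  3.0,  2.5],
                   [2.0,  5.0, -1.0,  2.0],
                   [-1.5, 2.7,  3.3, -0.8]])

# Weights for a layer of three neurons (one row per neuron) and their biases
weights = np.array([[0.2,   0.8,  -0.5,  1.0],
                    [0.5,  -0.91,  0.26, -0.5],
                    [-0.26, -0.27, 0.17,  0.87]])
biases = np.array([2.0, 3.0, 0.5])

# One matrix product applies every neuron's weights to every sample in the batch
layer_outputs = inputs @ weights.T + biases
print(layer_outputs.shape)  # (3, 3): three samples, three neuron outputs each
```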
Dive deeper into neural network architecture by adding a hidden layer and understanding the computations it involves, laying the groundwork for more complex networks.
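As a loose sketch of what adding a hidden layer means computationally (again assuming NumPy and arbitrary illustrative values), the output of the first layer simply becomes the input to the second:

```python
import numpy as np

inputs = np.array([[1.0, 2.0, 3.0, 2.5],
                   [2.0, 5.0, -1.0, 2.0]])

# First (hidden) layer: 4 inputs -> 3 neurons
weights1 = np.array([[0.2,   0.8,  -0.5,  1.0],
                     [0.5,  -0.91,  0.26, -0.5],
                     [-0.26, -0.27, 0.17,  0.87]])
biases1 = np.array([2.0, 3.0, 0.5])

# Second layer: 3 inputs (the hidden layer's outputs) -> 3 neurons
weights2 = np.array([[0.1,  -0.14,  0.5],
                     [-0.5,  0.12, -0.33],
                     [-0.44, 0.73, -0.13]])
biases2 = np.array([-1.0, 2.0, -0.5])

hidden_outputs = inputs @ weights1.T + biases1   # hidden layer computation
outputs = hidden_outputs @ weights2.T + biases2  # hidden outputs feed the next layer
print(outputs)
```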
Elevate your neural network knowledge by transitioning from the simplicity of a single neuron to the intricate dynamics of a four-neuron layer, complete with ReLU activation insights.
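A minimal sketch of what a four-neuron layer with ReLU might look like, assuming NumPy and illustrative numbers rather than the article's own:

```python
import numpy as np

inputs = np.array([1.0, -2.0, 3.0])  # one sample with three features

# Four neurons, each with its own weight vector and bias
weights = np.array([[0.2,   0.8,  -0.5],
                    [0.5,  -0.91,  0.26],
                    [-0.26, -0.27, 0.17],
                    [0.9,   0.1,  -0.4]])
biases = np.array([2.0, 3.0, 0.5, -1.0])

pre_activation = weights @ inputs + biases  # one value per neuron
outputs = np.maximum(0.0, pre_activation)   # ReLU clamps negatives to zero
print(outputs)  # four activations, one per neuron
```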
Embark on a deep dive into the workings of a single neuron and the ReLU function, foundational concepts in neural network design.
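For context, a single neuron with a ReLU activation boils down to a weighted sum plus a bias, passed through max(0, x). A tiny sketch with assumed values (the article's own example may differ):

```python
inputs = [1.0, 2.0, 3.0]
weights = [0.2, 0.8, -0.5]
bias = 2.0

# Weighted sum of inputs plus the bias
output = sum(w * x for w, x in zip(weights, inputs)) + bias

# ReLU: pass positive values through, clamp negatives to zero
activated = max(0.0, output)
print(activated)  # 2.3
```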
Dive into my journey of unraveling matrix multiplication, a cornerstone of neural network computations, and how it propels deep learning forward.
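As a small refresher on the operation itself (a sketch with arbitrary matrices, not the article's example), each output entry is the dot product of a row of the first matrix with a column of the second:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Each entry C[i, j] is the dot product of row i of A with column j of B
C = A @ B
print(C)
# [[19 22]
#  [43 50]]
```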