# Tag Archives: Calculus

# Integral Calculus Course on Brilliant is chugging along

# Formula for e

I’ve worked out a formula for *e*!

This came up when I was looking for an antiderivative. If *n* isn’t equal to 1:

$$\int \frac{dx}{x^n} = \frac{x^{1-n}}{1-n} + C$$

If *n* is equal to 1, then it’s suddenly a natural log!

$$\int \frac{dx}{x} = \ln|x| + C$$

But I’ve realized that if I change it only a tiny bit, it becomes a really famous existing formula for *e*.

Still impressive that you have worked it out all on your own, Simon!
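Simon’s exact formula isn’t reproduced here, but the most famous formula for *e* reachable from this kind of route is the limit definition, which is easy to check numerically (this snippet is my addition, not Simon’s code):

```python
import math

def e_approx(n):
    """The best-known limit formula for e: (1 + 1/n)**n as n grows."""
    return (1 + 1 / n) ** n

for n in (10, 1_000, 1_000_000):
    print(n, e_approx(n))
print("math.e =", math.e)
```

The larger the `n`, the closer the value creeps to `math.e`.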

# Science on the Balcony: Position of a Pendulum

Simon: “The direct formula for the position of a pendulum is not what you might think”.

Simon’s code for spring and graph: https://editor.p5js.org/simontiger/sketches/mWp6gQLxz

Simon’s code for pendulum with directed fields: https://editor.p5js.org/simontiger/sketches/U__pD4iZL

Simon’s code for simple movable pendulum: https://editor.p5js.org/simontiger/sketches/koPHNu670

Simon originally got inspired to work on this project by the myPhysicsLab platform. He was also inspired by 3Blue1Brown’s video *Differential equations, studying the unsolvable* and Brilliant’s Calculus Fundamentals course.
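Simon’s point about the “direct formula” is that the exact pendulum equation, θ″ = −(g/L)·sin θ, has no elementary closed-form solution, so his sketches integrate it numerically. A minimal Python version of that idea (Simon’s actual sketches are in p5.js; the names and parameters here are my own):

```python
import math

def simulate_pendulum(theta0, length=1.0, g=9.81, dt=0.001, steps=2000):
    """Integrate theta'' = -(g/length) * sin(theta) with semi-implicit Euler."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        omega += -(g / length) * math.sin(theta) * dt
        theta += omega * dt
    return theta

# For small swings the motion is close to theta0 * cos(sqrt(g/length) * t),
# but for large swings that simple formula breaks down, hence the simulation.
t = 2000 * 0.001
print(simulate_pendulum(0.1), 0.1 * math.cos(math.sqrt(9.81) * t))
```

At a small starting angle the two numbers nearly agree; try `theta0 = 2.0` to see them diverge.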

# Math for Neural Networks and Calculus Fundamentals via Brilliant.org

A little over a month ago, Simon picked up neural networks again (something he had tried a while ago but couldn’t grasp intuitively). He started the Artificial Neural Networks course on Brilliant.org, covered vectors, matrices, optimisation, perceptrons and multilayer perceptrons fairly quickly, and even built his first perceptron in Python from scratch (a video about this project is coming shortly). As soon as he reached the chapter on backpropagation, however, he realised his current knowledge of Calculus wasn’t enough. That is how Simon, completely on his own, decided to get back to studying Calculus (something he had lost interest in last year). After gulping down several chapters of the Calculus Fundamentals course, Simon told me he was now ready to do backpropagation (nearly done now). On to the convolutional neural networks (the next chapter in the course)!
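A from-scratch perceptron of the kind described might look like this minimal sketch (my reconstruction of the classic perceptron learning rule, not Simon’s actual code):

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Classic perceptron learning rule on (inputs, label) pairs, labels in {0, 1}."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = label - pred  # -1, 0 or +1
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Learn logical AND, a linearly separable function a single perceptron can handle:
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
print([predict(w, b, x) for x, _ in and_data])
```

A single perceptron can only learn linearly separable functions, which is exactly why the course moves on to multilayer perceptrons and backpropagation.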

As of today, these are his progress stats:

Below are some impressions of doing Calculus Fundamentals.

# Simon differentiating 3tan(2x)

To solve the problem, Simon chooses not to look up what the derivative of a tangent is, but to work everything out from scratch. He generally doesn’t like rote learning and prefers to gain a deep understanding of how and why.
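Worked out by the chain rule, the answer is d/dx [3 tan(2x)] = 6 sec²(2x); a quick numeric sanity check (my snippet, not Simon’s working):

```python
import math

def f(x):
    """The function being differentiated: 3 * tan(2x)."""
    return 3 * math.tan(2 * x)

def f_prime(x):
    """Chain rule: d/dx 3*tan(2x) = 3 * sec(2x)**2 * 2 = 6 / cos(2x)**2."""
    return 6 / math.cos(2 * x) ** 2

# Compare the worked-out derivative against a central finite difference:
h = 1e-6
for x in (0.1, 0.3, 0.5):
    numeric = (f(x + h) - f(x - h)) / (2 * h)
    print(x, numeric, f_prime(x))
```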

# Impressions on Newton’s Mechanics

“Are you impressed?” – Simon asks, laughing, and I can see it must be a pun. We are in bed, reading up on Newton’s laws of motion that talk of forces being “impressed” upon bodies.

Simon continues: “Newton’s mechanics says that the speed limit is infinite, which says that matter doesn’t exist, which says that Physics doesn’t exist, which says that Newton’s mechanics doesn’t exist. Newton’s mechanics contradicts itself!”

The book we are reading (*17 Equations that Changed the World* by Ian Stewart) goes on to describe how in Newton’s laws, calculus peeps out from behind the curtains, and how the second law of motion specifies the relation between a body’s position and the forces that act on it in the form of a differential equation: second derivative of position = force/mass. To find the position, the book says, we have to solve this equation, defusing the position from its second derivative.

“Do you get it?” – I ask, “Because I don’t think I do”. — “I’ll need a piece of paper for this”, – Simon quickly comes back dragging his oversized sketchbook. Then he writes down the differential equation (where the *x* is the position) to explain to me what the second derivative is. And then he solves it:
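For the simplest case, a constant force, the equation can be solved by integrating twice, which is roughly the standard derivation (here $x_0$ and $v_0$ denote the initial position and velocity):

$$\frac{d^2x}{dt^2} = \frac{F}{m} \quad\Rightarrow\quad \frac{dx}{dt} = v_0 + \frac{F}{m}\,t \quad\Rightarrow\quad x(t) = x_0 + v_0\,t + \frac{F}{2m}\,t^2$$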

# Geometric Definition of e

The idea comes from a video by Mathologer. Simon sketches a geometric definition of Euler’s number (e) using integrals. He messed up the integral notation a little, but corrected it later (after we stopped filming). Please see the photos below:
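The geometric definition from Mathologer’s video characterises e as the unique number for which the area under y = 1/x from 1 to e is exactly 1, i.e. $\int_1^e \frac{dx}{x} = 1$. A numeric check (my sketch, not Simon’s):

```python
import math

def area_under_reciprocal(a, b, steps=100_000):
    """Midpoint-rule estimate of the area under y = 1/x between a and b."""
    width = (b - a) / steps
    return sum(width / (a + (i + 0.5) * width) for i in range(steps))

# e is the unique number for which this area, starting from 1, equals 1:
print(area_under_reciprocal(1, math.e))
```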

# Introducing Siraj Raval

Simon has been watching a lot of Siraj Raval’s videos on neural networks lately, brushing up on his Python syntax and derivatives. He has even been trying out the great Jupyter editor, where one can build one’s own neural network and install libraries with pretrained networks: https://try.jupyter.org/

Just like with Daniel Shiffman’s videos, the remarkable thing about Siraj’s (very challenging) courses is that they also touch upon so many subjects outside programming (like art, music and the stock market) and are presented with a sublime sense of humour.

# Simon’s Changes to the Lorenz Attractor Coding Challenge

Simon explains why he slightly changed Daniel Shiffman’s Lorenz Attractor Coding Challenge: he used four variables instead of eleven.

The **Lorenz system** is a system of ordinary differential equations, first studied by Edward Lorenz. The **Lorenz attractor** is a set of chaotic solutions of the Lorenz system which, when plotted, resemble a butterfly or figure eight.
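In symbols, the system is dx/dt = σ(y − x), dy/dt = x(ρ − z) − y, dz/dt = xy − βz. A minimal Euler-integration sketch in Python (the original coding challenge is in Processing/p5.js; the step size and the classic parameter values here are my choices):

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8 / 3):
    """One Euler step of the Lorenz system with its classic parameters."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

# Trace a trajectory starting from a point near the origin:
x, y, z = 0.1, 0.0, 0.0
points = []
for _ in range(5000):
    x, y, z = lorenz_step(x, y, z)
    points.append((x, y, z))
# The trajectory stays bounded (on the butterfly-shaped attractor)
# but never settles into a fixed point or a repeating loop.
```

Plotting `points` in 3D produces the butterfly shape the post describes.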

In the next video, Simon’s completed version of the challenge, including rainbow colors: