Simon was preoccupied with vector functions for most of the day on Saturday, compiling what, at first sight, looked like monstrously excessive code in Processing (he recycled some of it from the Processing forum, wrote some of it himself and translated the rest from Codea). Originally, he was going to use the code to expand the 3D vector animation he had made into a roller-coaster project he saw on Codea and wanted to recreate in Processing, but got stuck with the colors. What happened next was fascinating. In the evening I suddenly saw Simon writing in a new online editor, Repl.it – he was translating the vector code into… Python! He hadn’t used Python for quite a while. I don’t know what triggered it, maybe Daniel Shiffman noting during last night’s live stream session that “normal people use Python for machine learning”. Simon also said he had done some reading about Python at Python for Beginners and Treehouse!
He isn’t done with his project in Python yet, but here is the link to it online: https://repl.it/JAeQ/13
Here Simon explains what he is writing in Python:
Simon did the 2D, 3D and 4D classes but eventually got stuck with the matrix class in Python. He then opened his old Xamarin IDE and wrote the 2D, the 3D and the 4D classes in C#. In the video below he briefly shows his C# sketch and talks about Cross Product in general:
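For readers curious about what such a class looks like, here is a minimal sketch (my own illustration in Python, not Simon's actual code) of a 3D vector class with the operations he mentions, including the cross product:

```python
# A minimal 3D vector class: magnitude, dot product and cross product.
# This is an illustrative sketch, not Simon's code.
import math

class Vector3D:
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def mag(self):
        # length of the vector
        return math.sqrt(self.x**2 + self.y**2 + self.z**2)

    def dot(self, other):
        return self.x * other.x + self.y * other.y + self.z * other.z

    def cross(self, other):
        # the cross product is specific to 3D: it returns a vector
        # perpendicular to both inputs
        return Vector3D(self.y * other.z - self.z * other.y,
                        self.z * other.x - self.x * other.z,
                        self.x * other.y - self.y * other.x)

a = Vector3D(1, 0, 0)
b = Vector3D(0, 1, 0)
c = a.cross(b)
print(c.x, c.y, c.z)  # → 0 0 1 (the x and y axes cross into the z axis)
```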
And this is a video he recorded about vector functions (in Processing, Java) the day before:
Simon seems to be taking his first gentle steps in Calculus, once again thanks to Daniel Shiffman.
In the videos below, Simon talks about minimizing and maximizing functions, the power rule, the chain rule and partial derivatives. Simon’s videos are his summaries of tutorial 3.5 (on Calculus) and a few other related tutorials from Daniel Shiffman’s “Intelligence and Learning” course, a Spring 2017 course that Daniel Shiffman taught at ITP, the Interactive Telecommunications Program, a graduate programme at NYU’s Tisch School of the Arts.
On Saturday morning Simon also chose to do Calculus with his math teacher (via Khan Academy), instead of the Algebra and polynomial expressions they had actually been doing for the past weeks.
Daniel Shiffman’s videos on Calculus:
Power Rule: https://youtu.be/IKb_3FJtA1U
Chain Rule: https://youtu.be/cE6wr0_ad8Y
Partial Derivative: https://youtu.be/-WVBXXV81R4
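A nice way to see that these rules really work is to check them numerically. Here is a small sketch (my addition, not from the videos) that approximates a derivative with a finite difference and compares it against what the power rule and the chain rule predict:

```python
# Numeric sanity check of the power rule and the chain rule,
# using a central finite-difference approximation of the derivative.

def derivative(f, x, h=1e-6):
    # slope of f at x, approximated by a tiny symmetric step
    return (f(x + h) - f(x - h)) / (2 * h)

# Power rule: d/dx x^3 = 3x^2, so at x = 2 the slope should be 12.
print(round(derivative(lambda x: x**3, 2.0), 4))          # → 12.0

# Chain rule: d/dx (2x + 1)^2 = 2*(2x + 1)*2, so at x = 1 it is also 12.
print(round(derivative(lambda x: (2 * x + 1)**2, 1.0), 4))  # → 12.0
```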
Simon has been working on a very complicated topic for the past couple of days: Linear Regression. In essence, it is the math behind machine learning.
Simon was watching Daniel Shiffman’s tutorials on Linear Regression that form session 3 of his Spring 2017 ITP “Intelligence and Learning” course (ITP stands for Interactive Telecommunications Program and is a graduate programme at NYU’s Tisch School of the Arts).
Daniel Shiffman’s current weekly live streams are also largely devoted to neural networks, so in a way, Simon has been preoccupied with related material for weeks now. This time around, however, he decided to make his own versions of Daniel Shiffman’s lectures (a whole Linear Regression playlist), has been busy with in-camera editing, and has written a summary of one of the Linear Regression tutorials (he actually sat there transcribing what Daniel said) in the form of an interactive webpage! This Linear Regression webpage is online at: https://simon-tiger.github.io/linear-regression/ and the Gradient Descent addendum Simon made later is at: https://simon-tiger.github.io/linear-regression/gradient_descent/interactive/ and https://simon-tiger.github.io/linear-regression/gradient_descent/random/
And here come the videos from Simon’s Linear Regression playlist, the first one being an older video you may have already seen:
Here Simon shows his interactive Linear Regression webpage:
A lecture on Anscombe’s Quartet (something from statistics):
Then comes a lecture on the Scatter Plot and the Residual Plot, as well as on combining the Residual Plot with Anscombe’s Quartet, based upon video 3.3 of Intelligence and Learning. Simon made a mistake graphing the residual plot but corrected himself in an addendum (at the end of the video):
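The idea behind a residual plot can be shown in a few lines of code. Here is a small sketch (mine, not Simon's) computing residuals, the vertical distances between each data point and a fitted line:

```python
# Residuals: for each point, how far above or below the fitted line
# y = m*x + b it sits. Plotting these against x gives a residual plot;
# a visible pattern in the residuals hints the linear model is a poor fit
# (which is exactly what Anscombe's Quartet demonstrates).

def residuals(xs, ys, m, b):
    return [y - (m * x + b) for x, y in zip(xs, ys)]

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]
# illustrative fitted line y = 2x (made-up values, not a real fit)
print(residuals(xs, ys, 2.0, 0.0))
```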
And finally, Linear Regression with the Gradient Descent algorithm and how the learning works. Based upon Daniel Shiffman’s tutorial 3.4 of Intelligence and Learning:
Simon made a mistake in the formula using the sigma operator; he corrected it later. The index should start at i=1 (not i=0).
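For reference, the core of linear regression with gradient descent fits into a dozen lines. This is a compact sketch in the spirit of the tutorial (the learning rate, step count and data here are my own illustrative choices, not Simon's or Daniel's):

```python
# Linear regression via gradient descent: nudge the slope m and
# intercept b downhill along the gradient of the mean squared error.

def gradient_descent(xs, ys, lr=0.05, steps=2000):
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # partial derivatives of the mean squared error w.r.t. m and b
        dm = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        db = (-2 / n) * sum((y - (m * x + b)) for x, y in zip(xs, ys))
        m -= lr * dm
        b -= lr * db
    return m, b

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]          # these points lie exactly on y = 2x + 1
m, b = gradient_descent(xs, ys)
print(round(m, 2), round(b, 2))  # → 2.0 1.0
```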
Here Simon explains how to calculate the magnitude of a 3D vector. This is something he partially figured out on his own and partially learned from Daniel Shiffman’s tutorial on Trigonometry and Polar Coordinates.
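I don't know Simon's exact derivation, but one common way to get there is to apply the Pythagorean theorem twice, and it translates directly into code (my sketch):

```python
# Magnitude of a 3D vector via the Pythagorean theorem applied twice:
# first find the length of the vector's "shadow" in the xy-plane,
# then combine that with the z component.
import math

def magnitude(x, y, z):
    xy = math.sqrt(x**2 + y**2)     # length of the shadow in the xy-plane
    return math.sqrt(xy**2 + z**2)  # combine the shadow with z

print(round(magnitude(2, 3, 6), 6))  # → 7.0
```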
In the end, he got tired of writing all the coordinates for the terrain vertices, but he did get quite far.
Applying Box2D to translate from pixels into mm:
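Box2D does its physics in real-world units rather than pixels, so sketches that use it need a conversion back and forth. A bare-bones sketch of the idea (the scale factor here is an assumption for illustration, not the one Simon used):

```python
# Converting between screen pixels and millimetres with a fixed scale.
# SCALE is an assumed value: 10 pixels per millimetre.
SCALE = 10.0

def pixels_to_mm(px):
    return px / SCALE

def mm_to_pixels(mm):
    return mm * SCALE

print(pixels_to_mm(250))   # → 25.0
print(mm_to_pixels(25.0))  # → 250.0
```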
Almost every evening, before going to bed, we read books, and Simon mostly prefers math adventures. Russian author Vladimir Levshin (1904-1984) published several books about geometry, algebra and math history, with numbers and letters as the leading characters. Most chapters contain complicated riddles that we solve along the way. Sometimes, Simon gets up to fetch some paper and pencils to write down what he thinks the formula or the geometrical pattern should be for a particular story. And because Levshin’s books often mention famous mathematicians of the past, I see Simon learn about history through math. What he knows about Ancient Greece or the 1970s mainly comes from his interest in early math and geometry or the dawn of computer science.
Here Simon was drawing his impressions of Cantor’s set theory, inspired by a passage about him in Levshin’s book:
Levshin’s book that we’re reading now:
Passage on Boole and Cantor:
Another book by Levshin we have recently read, about Algebra:
A chapter from that book talking about finding a sum of all the members of an arithmetic progression:
Simon stormed out of the bedroom and came back with a sheet of paper on which he had written down the formula, before we read about it in the book (he often tries to come up with his own formulas):
The same formula in the book:
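The formula is easy to check in code, too. A quick sketch (my addition) of the classic sum-of-an-arithmetic-progression formula, n times the average of the first and last members:

```python
# Sum of an arithmetic progression: n * (first + last) / 2.
# Pairing the first member with the last, the second with the
# second-to-last, and so on, gives n/2 pairs with equal sums.

def arithmetic_sum(first, last, n):
    return n * (first + last) // 2

# The classic Gauss story: 1 + 2 + ... + 100
print(arithmetic_sum(1, 100, 100))  # → 5050
```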
Funny how, even when practising some pretty straightforward (and boring) arithmetic or Dutch reading, Simon tries to introduce more complex notions, like here,
the floor, ceiling and round functions while solving a simple arithmetic word problem:
and lexicographic order, while sequencing Dutch story sentences:
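Both notions are easy to demonstrate in a few lines of Python (my examples, not from Simon's worksheets):

```python
# The three rounding functions, and lexicographic order.
import math

print(math.floor(3.7))  # → 3   (always rounds down)
print(math.ceil(3.2))   # → 4   (always rounds up)
print(round(3.5))       # → 4   (rounds to the nearest integer;
                        #        Python rounds halves to the nearest even number)

# Lexicographic order: strings are compared character by character,
# the same way words are ordered in a dictionary.
print(sorted(["banana", "apple", "cherry"]))  # → ['apple', 'banana', 'cherry']
```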
I heard Simon give his Russian grandparents a lecture in the playroom, via FaceTime. When I came in, this is what I saw on the whiteboard. Simon proudly said he had figured out how to calculate the arc-tangent. Why, what do you talk to your Grandmom about?
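I don't know which method ended up on the whiteboard, but one classic way to calculate the arc-tangent by hand is its Taylor series, atan(x) = x - x³/3 + x⁵/5 - …, which converges for |x| ≤ 1. A quick sketch of that approach (my assumption, not necessarily Simon's):

```python
# Approximating atan(x) with its Taylor series and comparing
# against the math library's answer.
import math

def arctan(x, terms=50):
    # sum of (-1)^k * x^(2k+1) / (2k+1) for k = 0 .. terms-1
    total = 0.0
    for k in range(terms):
        total += (-1)**k * x**(2 * k + 1) / (2 * k + 1)
    return total

print(round(arctan(0.5), 6), round(math.atan(0.5), 6))  # → 0.463648 0.463648
```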
Simon wrote a modulus counting program in Processing after we discussed why 1 % 2 = 1 and why 2 % 4 = 2. He told me he simply knew that 1 % 60 equals 1, but he didn’t know why. We realized later that this seemingly strange result comes from the fact that if one number (the divisor) fits into another number (the dividend) zero times, then, in modulus counting, the whole dividend becomes the remainder.
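The pattern is easy to see by printing a few cases (a few lines of Python rather than Simon's Processing program):

```python
# When the divisor is larger than the dividend, it fits in zero times,
# so the remainder is the whole dividend.
for a, b in [(1, 2), (2, 4), (1, 60), (7, 3)]:
    print(f"{a} % {b} = {a % b}")
# prints:
# 1 % 2 = 1
# 2 % 4 = 2
# 1 % 60 = 1
# 7 % 3 = 1
```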