Back to Python (and C#)!

Simon was preoccupied with vector functions for most of the day on Saturday, compiling what, at first sight, looked like a monstrously excessive amount of code in Processing (he recycled some of it from the Processing forum, wrote some of it himself and translated the rest from Codea). Originally, he was going to use the code to expand the 3D vector animation he had made into a roller-coaster project he saw on Codea and wanted to recreate in Processing, but he got stuck with the colors. What happened next was fascinating. In the evening I suddenly saw Simon writing in a new online editor, Repl.it – he was translating the vector code into… Python! He hadn’t used Python for quite a while. I don’t know what triggered it, maybe Daniel Shiffman noting during last night’s live stream that “normal people use Python for machine learning”. Simon also said he had done some reading about Python at Python for Beginners and Treehouse!

He isn’t done with his project in Python yet, but here is the link to it online: https://repl.it/JAeQ/13

Here Simon explains what he is writing in Python:

Simon did the 2D, 3D and 4D classes but eventually got stuck with the matrix class in Python. He then opened his old Xamarin IDE and wrote the 2D, 3D and 4D classes in C#. In the video below he briefly shows his C# sketch and talks about the cross product in general:
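To give an idea of what such a class involves, here is a minimal sketch of a 3D vector class with dot and cross products — my own reconstruction in Python, not Simon’s actual code:

```python
# A minimal 3D vector class with the two products Simon talks about.
class Vector3:
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def dot(self, other):
        # Dot product: a single number measuring how aligned two vectors are.
        return self.x * other.x + self.y * other.y + self.z * other.z

    def cross(self, other):
        # Cross product: a new vector perpendicular to both inputs.
        return Vector3(
            self.y * other.z - self.z * other.y,
            self.z * other.x - self.x * other.z,
            self.x * other.y - self.y * other.x,
        )

a = Vector3(1, 0, 0)
b = Vector3(0, 1, 0)
c = a.cross(b)
print(c.x, c.y, c.z)  # the x and y unit vectors cross into the z unit vector: 0 0 1
```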

And this is a video he recorded about vector functions (in Processing, Java) the day before:

Treading into Calculus

Simon seems to be taking his first gentle steps in Calculus, once again thanks to Daniel Shiffman.

DSC_0570

In the videos below, Simon talks about minimizing and maximizing functions, the power rule, the chain rule and partial derivatives. Simon’s videos are his summaries of tutorial 3.5 on Calculus from Daniel Shiffman’s “Intelligence and Learning” course, and of a few other related tutorials. “Intelligence and Learning” is a Spring 2017 course that Daniel Shiffman taught at ITP, the Interactive Telecommunications Program, a graduate programme at NYU’s Tisch School of the Arts.
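As a quick illustration of the rules Simon discusses (my own example, not from his videos): the power rule says d/dx xⁿ = n·xⁿ⁻¹, and the chain rule handles functions inside functions. Both can be sanity-checked numerically with a finite-difference approximation:

```python
# Numerically approximate a derivative and compare it with what the rules predict.
def derivative(f, x, h=1e-6):
    # Central difference: (f(x+h) - f(x-h)) / 2h approaches f'(x) as h shrinks.
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 3            # power rule predicts f'(x) = 3x^2
g = lambda x: (x ** 2 + 1) ** 3 # chain rule predicts g'(x) = 3(x^2+1)^2 * 2x

x = 2.0
print(derivative(f, x), 3 * x ** 2)                      # both close to 12
print(derivative(g, x), 3 * (x ** 2 + 1) ** 2 * 2 * x)   # both close to 300
```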


On Saturday morning Simon also chose to do Calculus with his math teacher (via Khan Academy), instead of the Algebra and polynomial expressions they had actually been doing for the past few weeks.

DSC_0574

DSC_0576

DSC_0577

Daniel Shiffman’s videos on Calculus:
Power Rule: https://youtu.be/IKb_3FJtA1U
Chain Rule: https://youtu.be/cE6wr0_ad8Y
Partial Derivative: https://youtu.be/-WVBXXV81R4

Simon gets serious with Linear Regression (Machine Learning)

Simon has been working on a very complicated topic for the past couple of days: Linear Regression. In essence, it is one of the basic mathematical building blocks of machine learning.

Simon was watching Daniel Shiffman’s tutorials on Linear Regression that form session 3 of his Spring 2017 ITP “Intelligence and Learning” course (ITP stands for Interactive Telecommunications Program and is a graduate programme at NYU’s Tisch School of the Arts).

Daniel Shiffman’s current weekly live streams are also largely devoted to neural networks, so in a way, Simon has been preoccupied with related stuff for weeks now. This time around, however, he decided to make his own versions of Daniel Shiffman’s lectures (a whole Linear Regression playlist), has been busy with in-camera editing, and has written a summary of one of the Linear Regression tutorials (he actually sat there transcribing what Daniel said) in the form of an interactive webpage! This Linear Regression webpage is online at https://simon-tiger.github.io/linear-regression/ and the Gradient Descent addendum Simon made later is at https://simon-tiger.github.io/linear-regression/gradient_descent/interactive/ and https://simon-tiger.github.io/linear-regression/gradient_descent/random/
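For readers unfamiliar with the topic: simple linear regression finds the straight line that best fits a set of points. A minimal sketch of the least-squares version (variable names are mine, not Simon’s):

```python
# Fit y = m*x + b to a set of points using ordinary least squares.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: sum of (x - x̄)(y - ȳ) over sum of (x - x̄)^2.
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # The best-fit line always passes through the point of means.
    b = mean_y - m * mean_x
    return m, b

m, b = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
print(m, b)  # 2.0 0.0 — these points lie exactly on y = 2x
```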

And here come the videos from Simon’s Linear Regression playlist, the first one being an older video you may have already seen:

Here Simon shows his interactive Linear Regression webpage:

A lecture on Anscombe’s Quartet (something from statistics):

Then comes a lecture on the Scatter Plot and the Residual Plot, as well as on combining the Residual Plot with Anscombe’s Quartet, based upon video 3.3 of Intelligence and Learning. Simon made a mistake graphing the residual plot but corrected himself in an addendum (at the end of the video):

Polynomial Regression:

And finally, Linear Regression with the Gradient Descent algorithm and how the learning works, based upon tutorial 3.4 of Daniel Shiffman’s Intelligence and Learning:
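The gradient descent idea can be sketched in a few lines: instead of computing the best line directly, you start with a guess and nudge the slope and intercept a little after every point, in the direction that reduces the error. This is my own illustration of the technique (the learning rate and step count are my choices, not from the tutorial):

```python
# Learn y = m*x + b by gradient descent on point-by-point error.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [1.0, 1.5, 2.0, 2.5, 3.0]   # these points lie exactly on y = 2x + 1

m, b = 0.0, 0.0
learning_rate = 0.1
for _ in range(5000):
    for x, y in zip(xs, ys):
        guess = m * x + b
        error = y - guess
        # Nudge m and b proportionally to the error ("learning").
        m += error * x * learning_rate
        b += error * learning_rate

print(round(m, 3), round(b, 3))  # approaches 2.0 and 1.0
```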

DSC_0557


Simon explains Linear Regression (Machine Learning)

In the two videos below Simon writes a JavaScript program using Linear Regression in Atom and gives a whiteboard lecture on the Linear Regression algorithm, both following a tutorial on Linear Regression by Daniel Shiffman.

Simon made a mistake in the formula using the sigma operator. He corrected it later. It should be i=1 (not i=0).
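For reference, assuming Simon is using the standard least-squares slope formula from the tutorial, the corrected version with the sigma starting at i = 1 reads:

```latex
m = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},
\qquad b = \bar{y} - m\,\bar{x}
```

The index must start at 1 because the sum runs over all n data points, numbered 1 through n.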

DSC_0548

DSC_0549

DSC_0551

 

Magnitude of a 3D vector

Here Simon explains how to calculate the magnitude of a 3D vector. This is something he partially figured out on his own and partially learned from Daniel Shiffman’s tutorial on Trigonometry and Polar Coordinates.
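The trick, roughly as one can figure it out from 2D, is to apply the Pythagorean theorem twice: once to get the length of the vector’s shadow in the xy-plane, and once more to combine that with z. A small sketch (my own, not Simon’s code):

```python
import math

def magnitude(x, y, z):
    # Step 1: length of the vector's shadow in the xy-plane.
    xy = math.sqrt(x * x + y * y)
    # Step 2: Pythagoras again with z; equivalent to sqrt(x^2 + y^2 + z^2).
    return math.sqrt(xy * xy + z * z)

print(magnitude(2, 3, 6))  # ≈ 7, since 4 + 9 + 36 = 49
```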

 

DSC_0540

Translating Car On Terrain project from Phaser.io into Processing

Today Simon spent hours translating this Car On Terrain project from Phaser.io (where it appears in JavaScript) into Processing (Java). He loved doing it in the form of a lesson for me, while I was filming him and asking questions about loops, arrays, fixtures, shapes and bodies (and there are many things I don’t understand). Simon also spoke about “the three most important properties: density, friction and restitution”. The project involved a lot of physics, used many Box2D sub-libraries and required translating between pixels and mm.

In the end, he got tired of writing all the coordinates for the terrain vertices, but he did get quite far.


Applying Box2D to translate from pixels into mm:
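The reason this translation is needed: Box2D simulates in real-world units, while Processing draws in pixels, so every coordinate is scaled on the way in and out of the physics engine. A sketch of the idea (the scale factor of 10 pixels per unit is my example, not the one Simon used):

```python
# Convert between screen pixels and Box2D world units with a fixed scale.
SCALE = 10.0  # pixels per world unit (an assumed example value)

def to_world(pixels):
    return pixels / SCALE

def to_pixels(world):
    return world * SCALE

print(to_world(250))    # 25.0 world units
print(to_pixels(25.0))  # 250.0 pixels
```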

CarOnTerrain translating from pixels into mm

Old men from the 19th century

Almost every evening, before going to bed, we read books, and Simon mostly prefers math adventures. Russian author Vladimir Levshin (1904-1984) published several books about geometry, algebra and math history, with numbers and letters as the leading characters. Most chapters contain complicated riddles that we solve along the way. Sometimes, Simon gets up to fetch some paper and pencils to write down what he thinks the formula or the geometrical pattern should be for a particular story. And because Levshin’s books often mention famous mathematicians of the past, I see Simon learn about history through math. What he knows about Ancient Greece or the 1970s mainly comes from his interest in early math and geometry or the dawn of computer science.

A couple days ago we were reading about George Boole, yet another example of someone way ahead of his time (200 years to be precise), the inventor of Boolean algebra. Simon was so excited when he recognized his name, and the name of Georg Cantor, a German mathematician, whose work was just as shocking to his contemporaries as Boole’s work was. Simon recognized both of their names because of his programming. This way, a connection was traced in his mind between these two 19th century men and today’s cutting edge projects in Java and JavaScript.

Here Simon was drawing his impressions of Cantor’s set theory, inspired by a passage about him in Levshin’s book:

DSC_0466

 

Levshin’s book that we’re reading now:

DSC_0467

Passage on Boole and Cantor:

DSC_0468

DSC_0469

Another book by Levshin we have recently read, about Algebra:

DSC_0474

A chapter from that book about finding the sum of all the members of an arithmetic progression:

DSC_0470

DSC_0471

Simon stormed out of the bedroom and came back with a sheet of paper on which he had written down the formula, before we read about it in the book (he often tries to come up with his own formulas):

DSC_0473

The same formula in the book:

DSC_0472
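Presumably the formula in question is the classic one: the sum of an arithmetic progression equals the number of terms times the average of the first and last term. A quick check of it against Gauss’s famous schoolboy example:

```python
# Sum of an arithmetic progression: n terms, averaging the first and last.
def arithmetic_sum(first, last, n):
    return n * (first + last) / 2

# Gauss's example: 1 + 2 + ... + 100 = 100 * (1 + 100) / 2
print(arithmetic_sum(1, 100, 100))  # 5050.0
```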

Looking for math everywhere

Funny how, even when practising some pretty straightforward (and boring) arithmetic or Dutch reading, Simon tries to introduce more complex notions, like here,

the floor, ceiling and round functions, while solving a simple arithmetic word problem:

DSC_0450

DSC_0452

and lexicographic order, while sequencing Dutch story sentences:

DSC_0457
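For the curious, the three rounding functions Simon brought into the word problem behave like this (my own illustration):

```python
import math

x = 2.6
print(math.floor(x))  # 2 — floor always rounds down
print(math.ceil(x))   # 3 — ceiling always rounds up
print(round(x))       # 3 — round goes to the nearest integer
```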

Modulus Counting in Processing

Simon wrote a modulus counting program in Processing after we were discussing why 1 % 2 = 1 and why 2 % 4 = 2. He basically told me he simply knew that 1 % 60 equals 1, but he didn’t know why. We realized later that this seemingly strange result comes from the fact that if one number (the divisor) fits into another number (the dividend) zero times, then the remainder is simply the dividend itself.
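The same reasoning, written out in a few lines (my illustration, not Simon’s Processing sketch):

```python
# When the divisor is larger than the dividend, it fits zero times,
# so the remainder is the dividend itself.
print(1 % 2)   # 1
print(2 % 4)   # 2
print(1 % 60)  # 1

# The general identity behind modulus: dividend = quotient * divisor + remainder.
a, b = 1, 60
print(a == (a // b) * b + (a % b))  # True
```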

 

DSC_0407