In for a shower? Simon made a beautiful Cloud Simulation in Processing (Java). He wrote this code himself. This is the final version of the project:
The whole project is on GitHub; you can download it at: https://github.com/simon-tiger/rain
The videos below show Simon creating the cloud simulation step by step:
Simon was preoccupied with vector functions for most of the day on Saturday, compiling what, at first sight, looked like monstrously excessive code in Processing (he recycled some of it from the Processing forum, wrote some of it himself and translated the rest from Codea). Originally, he was going to use the code to expand his 3D vector animation into a roller-coaster project he had seen on Codea and wanted to recreate in Processing, but he got stuck with the colors. What happened next was fascinating. In the evening I suddenly saw Simon writing in a new online editor, Repl.it – he was translating the vector code into… Python! He hadn’t used Python for quite a while. I don’t know what triggered it; maybe it was Daniel Shiffman noting last night during the live stream session that “normal people use Python for machine learning”. Simon also said he had done some reading about Python at Python for Beginners and Tree House!
He isn’t done with his project in Python yet, but here is the link to it online: https://repl.it/JAeQ/13
Here Simon explains what he is writing in Python:
Simon did the 2D, 3D and 4D classes but eventually got stuck with the matrix class in Python. He then opened his old Xamarin IDE and wrote the 2D, 3D and 4D classes in C#. In the video below he briefly shows his C# sketch and talks about the cross product in general:
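The cross product Simon talks about can be sketched in a few lines of Python (my own minimal version for illustration, not Simon's C# code):

```python
# Cross product of two 3D vectors: the result is a vector
# perpendicular to both inputs.
def cross(a, b):
    ax, ay, az = a
    bx, by, bz = b
    return (ay * bz - az * by,
            az * bx - ax * bz,
            ax * by - ay * bx)

# Example: the x and y unit vectors give the z unit vector.
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```

Swapping the order of the arguments flips the result's sign, which is why the order matters in graphics code.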
And this is a video he recorded about vector functions (in Processing, Java) the day before:
Simon seems to be taking his first gentle steps in Calculus, once again thanks to Daniel Shiffman.
In the videos below, Simon talks about minimizing and maximizing functions, the power rule, the chain rule and partial derivatives. Simon’s videos are his summaries of tutorial 3.5 and a few other related Calculus tutorials from Daniel Shiffman’s “Intelligence and Learning” course. “Intelligence and Learning” is a Spring 2017 course that Daniel Shiffman taught at ITP, the Interactive Telecommunications Program, a graduate programme at NYU’s Tisch School of the Arts.
On Saturday morning Simon also chose to do Calculus with his math teacher (via Khan Academy), instead of the Algebra and polynomial expressions they had been doing for the past weeks.
Daniel Shiffman’s videos on Calculus:
Power Rule: https://youtu.be/IKb_3FJtA1U
Chain Rule: https://youtu.be/cE6wr0_ad8Y
Partial Derivative: https://youtu.be/-WVBXXV81R4
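As a quick illustration of the rules covered in these videos, here is a small Python check (my own sketch, not from the tutorials) that compares the power rule and the chain rule against a numerical derivative:

```python
# Power rule: d/dx x^n = n * x^(n-1)
# Chain rule: d/dx f(g(x)) = f'(g(x)) * g'(x)

def numeric_deriv(f, x, h=1e-6):
    # Central difference: a numerical approximation of the derivative.
    return (f(x + h) - f(x - h)) / (2 * h)

# Power rule check: f(x) = x^3, so f'(x) = 3x^2 and f'(2) = 12.
f = lambda x: x ** 3
assert abs(numeric_deriv(f, 2.0) - 12.0) < 1e-4

# Chain rule check: h(x) = (x^2 + 1)^3,
# so h'(x) = 3(x^2 + 1)^2 * 2x  (outer derivative times inner derivative).
h_fn = lambda x: (x ** 2 + 1) ** 3
h_prime = lambda x: 3 * (x ** 2 + 1) ** 2 * 2 * x
assert abs(numeric_deriv(h_fn, 2.0) - h_prime(2.0)) < 1e-2
```

If either rule were applied wrongly, the assertions would fail, so this is a handy way to double-check a derivative by hand.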
Simon has been working on a very complicated topic for the past couple of days: Linear Regression. In essence, it is the math behind machine learning.
Simon was watching Daniel Shiffman’s tutorials on Linear Regression that form session 3 of his Spring 2017 ITP “Intelligence and Learning” course (ITP stands for Interactive Telecommunications Program and is a graduate programme at NYU’s Tisch School of the Arts).
Daniel Shiffman’s current weekly live streams are also largely devoted to neural networks, so in a way, Simon has been preoccupied with related material for weeks now. This time around, however, he decided to make his own versions of Daniel Shiffman’s lectures (a whole Linear Regression playlist), has been busy with in-camera editing, and has written a summary of one of the Linear Regression tutorials (he actually sat there transcribing what Daniel said) in the form of an interactive webpage! This Linear Regression webpage is online at: https://simon-tiger.github.io/linear-regression/ and the Gradient Descent addendum Simon made later is at: https://simon-tiger.github.io/linear-regression/gradient_descent/interactive/ and https://simon-tiger.github.io/linear-regression/gradient_descent/random/
And here come the videos from Simon’s Linear Regression playlist, the first one being an older video you may have already seen:
Here Simon shows his interactive Linear Regression webpage:
A lecture on Anscombe’s Quartet (something from statistics):
Then comes a lecture on the Scatter Plot and the Residual Plot, as well as combining the Residual Plot with Anscombe’s Quartet, based upon video 3.3 of “Intelligence and Learning”. Simon made a mistake graphing the residual plot but corrected himself in an addendum (at the end of the video):
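A residual plot simply graphs the vertical differences between the data points and the fitted line. A minimal sketch of that idea in Python (with made-up data, not Simon's):

```python
# Residuals: vertical distance from each data point (x, y)
# to the fitted line y = m*x + b.
def residuals(xs, ys, m, b):
    return [y - (m * x + b) for x, y in zip(xs, ys)]

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]
# For the line y = 2x, the residuals hover around zero,
# a sign that the line fits the data well.
print(residuals(xs, ys, 2, 0))
```

When the residuals show a clear pattern instead of random scatter, a straight line is the wrong model, which is exactly what Anscombe's Quartet demonstrates.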
And finally, Linear Regression with the Gradient Descent algorithm and how the learning works, based upon Daniel Shiffman’s tutorial 3.4 of “Intelligence and Learning”:
Yesterday Simon got a parcel from the US: Simon’s hero, NYU professor Daniel Shiffman sent him a beautiful gift – a Coding Train shirt! Coding Train is Daniel Shiffman’s channel on YouTube where he records tutorials, coding challenges and live streams. Basically, Coding Train has been Simon’s main learning source in Programming, Math and Physics (and English!) for months.
Simon made a mistake in the formula using the sigma operator. He corrected it later. It should be i=1 (not i=0).
Simon’s creative “remix” of example 2.7 from Daniel Shiffman’s The Nature of Code, Chapter 2 (Forces).
Here Simon explains how to calculate the magnitude of a 3D vector. This is something he partially figured out on his own and partially learned from Daniel Shiffman’s tutorial on Trigonometry and Polar Coordinates.
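The formula Simon explains is the Pythagorean theorem extended to three dimensions. A minimal sketch in Python (my illustration, not Simon's code):

```python
import math

# Magnitude (length) of a 3D vector: the Pythagorean theorem,
# applied once in the xy-plane and once more with z.
def magnitude(x, y, z):
    return math.sqrt(x * x + y * y + z * z)

print(magnitude(2, 3, 6))  # 7.0, since 4 + 9 + 36 = 49
```

The same pattern keeps working in higher dimensions, which is why Simon's 4D vector class could use it too.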
The project is available on Simon’s page in Codepen:
In the two videos below Simon explains what the bug was (he had forgotten a “break” statement). He insisted I include both videos, but actually only the second one is informative:
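To give a general sense of the kind of bug this was (a generic illustration, not Simon's actual game code): without a `break`, a loop keeps running after it has already found what it was looking for.

```python
# A loop that should stop at the first match; forgetting the `break`
# makes it collect every match instead of just the first one.
def first_match(values, predicate):
    matches = []
    for v in values:
        if predicate(v):
            matches.append(v)
            break  # remove this line to reproduce the bug
    return matches

print(first_match([1, 4, 6, 8], lambda v: v % 2 == 0))  # [4]
```

In a game, that sort of bug can mean one bullet "hitting" several enemies at once, or a collision being handled more times than intended.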
Simon still needs to add explosions to this game (make the enemies explode), so there will probably be a follow-up on this one.
Visited the reopened MuHKA, the museum of contemporary art in Antwerp, this afternoon. Simon enjoyed a few graphical pieces, especially when he was allowed to take photos of them with my mobile.
Three pictures taken by Simon:
Definitions of “the Truth”:
In the children’s “Salon”, we loved the survival-on-the-Moon game: you had to answer questions about which items would help you survive on the Moon.