Simon gets serious with Linear Regression (Machine Learning)

Simon has been working on a very complicated topic for the past couple of days: Linear Regression. In essence, it is one of the simplest examples of the math behind machine learning.

Simon was watching Daniel Shiffman’s tutorials on Linear Regression that form session 3 of his Spring 2017 ITP “Intelligence and Learning” course (ITP stands for Interactive Telecommunications Program and is a graduate programme at NYU’s Tisch School of the Arts).

Daniel Shiffman’s current weekly live streams are also largely devoted to neural networks, so in a way, Simon has been preoccupied with related material for weeks now. This time around, however, he decided to make his own versions of Daniel Shiffman’s lectures (a whole Linear Regression playlist), has been busy with in-camera editing, and has written a summary of one of the Linear Regression tutorials (he actually sat there transcribing what Daniel said) in the form of an interactive webpage! This Linear Regression webpage is online at https://simon-tiger.github.io/linear-regression/ and the Gradient Descent addendum Simon made later is at https://simon-tiger.github.io/linear-regression/gradient_descent/interactive/ and https://simon-tiger.github.io/linear-regression/gradient_descent/random/

And here come the videos from Simon’s Linear Regression playlist, the first one being an older video you may have already seen:

Here Simon shows his interactive Linear Regression webpage:

A lecture on Anscombe’s Quartet (a famous example from statistics: four datasets with nearly identical summary statistics that look completely different when plotted):

Then comes a lecture on the Scatter Plot and the Residual Plot, as well as on combining the Residual Plot with Anscombe’s Quartet, based upon video 3.3 of Intelligence and Learning. Simon made a mistake graphing the residual plot but corrected himself in an addendum (at the end of the video):

Polynomial Regression:

And finally, Linear Regression with the Gradient Descent algorithm and how the learning works, based upon Daniel Shiffman’s tutorial 3.4 on Intelligence and Learning:
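For anyone who wants to follow the idea, here is a minimal sketch (my own illustration in JavaScript, not Simon’s code) of the kind of gradient-descent update the tutorial covers: for every data point, the slope m and the intercept b are nudged in the direction that reduces the error, scaled by a small learning rate.

```js
// One pass of gradient descent for the line y = m * x + b (illustrative sketch).
const learningRate = 0.05;
let m = 0;
let b = 0;

function train(points) {
  for (const p of points) {
    const guess = m * p.x + b;
    const error = p.y - guess;        // how far off the current line is
    m += error * p.x * learningRate;  // adjust the slope proportionally to x
    b += error * learningRate;        // adjust the intercept
  }
}
```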


Simon explains Linear Regression (Machine Learning)

In the two videos below Simon writes a JavaScript program using Linear Regression in Atom and gives a whiteboard lecture on the algorithm, both following a Linear Regression tutorial by Daniel Shiffman.

Simon made a mistake in the formula with the sigma operator; he corrected it later. The sum should start at i = 1 (not i = 0).
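For reference, the formula in question is the ordinary least-squares fit: the slope m is the sum over all points of (xᵢ − x̄)(yᵢ − ȳ) divided by the sum of (xᵢ − x̄)², with the sum running from i = 1 to n, and the intercept is b = ȳ − m·x̄. A minimal JavaScript sketch of the same calculation (my own illustration, not Simon’s code) looks like this:

```js
// Least-squares linear regression (illustrative sketch).
function linearRegression(xs, ys) {
  const n = xs.length;
  const xMean = xs.reduce((a, b) => a + b, 0) / n;
  const yMean = ys.reduce((a, b) => a + b, 0) / n;

  let num = 0; // sum of (x - xMean) * (y - yMean)
  let den = 0; // sum of (x - xMean)^2
  for (let i = 0; i < n; i++) {       // JS arrays start at 0, but this still covers i = 1..n of the sigma
    num += (xs[i] - xMean) * (ys[i] - yMean);
    den += (xs[i] - xMean) ** 2;
  }

  const m = num / den;         // slope
  const b = yMean - m * xMean; // intercept
  return { m, b };             // the fitted line is y = m * x + b
}
```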


Simon solved the bug in his Bit Invader game!

Simon actually managed to solve the bug in his Bit Invader code! This is a game he was translating from Codea into JavaScript; we have already published a blog post about it here.

The project is available on Simon’s page in Codepen:

https://codepen.io/simontiger/project/editor/AdyVmr/

In the two videos below Simon explains what the bug was (he had forgotten a “break” statement). He insisted I include both videos, but actually only the second one is informative:
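As a rough illustration of why a forgotten break matters (a simplified sketch with placeholder data and a hypothetical hits() helper, not Simon’s actual game code): when a bullet hits an enemy and both are removed from their arrays, the inner loop has to stop, otherwise it keeps checking bullets against an enemy that is no longer at that index.

```js
// Simplified collision check (illustrative placeholders, not the real game).
const enemies = [{ x: 50, y: 20 }, { x: 120, y: 20 }];
const bullets = [{ x: 52, y: 22 }, { x: 300, y: 200 }];

function hits(bullet, enemy) {
  // hypothetical hit test: close enough counts as a hit
  return Math.abs(bullet.x - enemy.x) < 10 && Math.abs(bullet.y - enemy.y) < 10;
}

for (let i = enemies.length - 1; i >= 0; i--) {
  for (let j = bullets.length - 1; j >= 0; j--) {
    if (hits(bullets[j], enemies[i])) {
      enemies.splice(i, 1);
      bullets.splice(j, 1);
      break; // without this break, the loop keeps testing against an enemy that was just removed
    }
  }
}
```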


Simon still needs to add explosions to this game (make the enemies explode), so there will probably be a follow-up on this one.


Translating Car On Terrain project from Phaser.io into Processing

Today Simon spent hours translating this Car On Terrain project from Phaser.io (where it appears in JavaScript) into Processing (Java). He loved doing it in the form of a lesson for me, while I was filming him and asking questions about loops, arrays, fixtures, shapes and bodies (and there are many things I don’t understand). Simon also spoke about “the three most important properties: density, friction and restitution”. The project involved a lot of physics, using many Box2D sub-libraries and translating between pixels and mm.
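For readers who don’t know Box2D, here is roughly what those three properties describe (a generic illustration written as plain JavaScript rather than Processing, not the project’s code):

```js
// Illustrative fixture settings; in Box2D a fixture carries the material properties of a body.
const fixtureSettings = {
  density: 1.0,     // mass per unit area: determines how heavy the body is
  friction: 0.5,    // resistance to sliding along another surface (0 = ice, higher = grippier)
  restitution: 0.2  // bounciness of collisions (0 = no bounce, 1 = perfectly elastic)
};
```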

In the end, he got tired of writing all the coordinates for the terrain vertices, but he did get quite far.


Applying Box2D to translate from pixels into mm:

CarOnTerrain translating from pixels into mm
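To give an idea of what that conversion involves (a generic sketch with an assumed scale factor, not the actual project code): Box2D simulates in real-world units while the screen works in pixels, so every position and size has to be scaled back and forth.

```js
// Illustrative pixel <-> physics-world conversion; the scale factor is an assumption.
const PIXELS_PER_UNIT = 30;

function pixelsToWorld(px) {
  return px / PIXELS_PER_UNIT;    // e.g. a 60 px wide wheel becomes 2 world units
}

function worldToPixels(units) {
  return units * PIXELS_PER_UNIT; // convert back when drawing the body on screen
}
```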

Translating Bit Invader from Codea into JavaScript

Simon tried to reconstruct the Bit Invader game (from Codea.io) in JavaScript, but got stuck at a certain point when he was programming the enemy to recognize the hero and the bullets. Here is how far he got. The project is available on Simon’s page in Codepen:

https://codepen.io/simontiger/project/editor/AdyVmr/


Diffusion-Limited Aggregation translated into Codea

Simon translated Daniel Shiffman’s Diffusion-Limited Aggregation Coding Challenge into Codea. The coding challenge explores the generative algorithm “Diffusion-Limited Aggregation”, whose visual pattern is generated by random walkers clustering around a seed point (or a set of seed points).
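In rough outline the algorithm works like this (a bare-bones sketch of the general idea in JavaScript, not Simon’s Codea/Lua code): a walker wanders randomly until it touches something that is already part of the cluster, and then it sticks there.

```js
// Bare-bones diffusion-limited aggregation sketch (illustrative).
const stuck = [{ x: 200, y: 200 }]; // the seed point
const radius = 4;                   // particle radius

function touchesCluster(w) {
  return stuck.some(s => (w.x - s.x) ** 2 + (w.y - s.y) ** 2 < (radius * 2) ** 2);
}

function addWalker() {
  // spawn a walker at a random position (real versions usually spawn at the edges)
  const w = { x: Math.random() * 400, y: Math.random() * 400 };
  while (!touchesCluster(w)) {
    w.x += Math.random() * 2 - 1;   // one step of the random walk
    w.y += Math.random() * 2 - 1;
  }
  stuck.push(w);                    // the walker freezes onto the cluster
}
```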

Unfortunately, every time the iPad falls asleep the application seems to stop, so we never got a sizable tree.


Error with Genetic Algorithm. What is wrong?

Simon was almost done translating Smart Rockets example no. 2 (Smart Rockets Superbasic) from Daniel Shiffman’s The Nature of Code from Processing (Java) into JavaScript in Cloud9, when he got an error in the genetic algorithm: the dna seems to be undefined, even though Simon did define the mom and dad dna.
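Without claiming this is where Simon’s bug is, a crossover step in this kind of example usually looks roughly like the sketch below (my own illustration in JavaScript); one common way to end up with an undefined dna is a crossover function that fills a genes array but never returns the new DNA object.

```js
// Illustrative crossover between two parents' DNA (not Simon's actual code).
class DNA {
  constructor(genes) {
    this.genes = genes || [];
  }

  crossover(partner) {
    const childGenes = [];
    const midpoint = Math.floor(Math.random() * this.genes.length);
    for (let i = 0; i < this.genes.length; i++) {
      // take genes from one parent before the midpoint and from the other after it
      childGenes[i] = i < midpoint ? this.genes[i] : partner.genes[i];
    }
    return new DNA(childGenes); // forgetting this return would leave the child's dna undefined
  }
}
```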

This is Simon’s translation online in Cloud9: https://ide.c9.io/simontiger/smart-rockets#openfile-README.md

This is when he first discovered the bug and tried different solutions:

And this is the same project before he introduced the genetic algorithm:

In the next video Simon boasts he has found two errors in his code and hopes that the problem is solved, but alas, the rockets still vanish from the canvas after a few seconds:

Simon is officially stuck here.

On the positive side, this project did get us to read more about the actual human DNA and the way it works.


Drawing Object Trails in Processing

Simon’s translation of the Drawing Object Trails p5.js tutorial into Processing. In the tutorial, Daniel Shiffman looked at how an object can store a history of positions, which makes it possible to render the object’s trail while keeping background() in the draw() loop. Simon turned p5.Vector and createVector() into PVector, and the push() and splice() functions from JavaScript into add() and remove() in Processing.
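For context, a minimal version of the idea looks roughly like this in p5.js (an illustration that simply follows the mouse, not the tutorial’s exact code); the Processing translation keeps the same structure but stores PVector objects in an ArrayList and uses add() and remove() instead of push() and splice().

```js
// p5.js-style sketch: draw an object's trail from a history of positions (illustrative).
let history = [];

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(51);                             // background stays inside draw()

  history.push(createVector(mouseX, mouseY)); // remember the current position
  if (history.length > 50) {
    history.splice(0, 1);                     // drop the oldest position to limit the trail
  }

  noStroke();
  fill(255);
  for (const pos of history) {
    ellipse(pos.x, pos.y, 8, 8);              // draw the stored positions as the trail
  }
}
```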