Coding, JavaScript, Lingua franca, live stream, Machine Learning, Milestones, Murderous Maths, neural networks, Notes on everyday life, Python

Magic around New Year’s Eve

This magical time of the year, Simon’s craziest, most daring dreams come true! First, his guru from New York University, Daniel Shiffman, sends Simon his book, and the words he writes in it are the most beautiful words anyone has ever said to him. Then, on the last day of the awesome year 2017, Simon’s other hero, the glamorous knight of AI Siraj Raval, materialises in our living room, straight from YouTube! Happy New Year full of miracles and discoveries, everyone!

Daniel Shiffman’s book “The Nature of Code”, which Simon had already largely read online and now also reads before bed. It comforted him recently when he was in pain: he cuddled up on the sofa with this big friendly tome on his lap.

Daniel Shiffman signed the book for Simon:

Siraj Raval stepped out of the YouTube screen straight into our Antwerp apartment on December 31. Simon has been following Siraj’s channel for months, learning about the types of neural networks and the math behind machine learning. It is thanks to Siraj’s explanations that Simon has been able to build his first neural nets:


Coding, Machine Learning, Milestones, Murderous Maths, neural networks, Python, Simon teaching

The Neural Nets are here!

Simon has started building neural networks in Python! For the moment, he has succeeded in making two working neural nets (a Perceptron and a Feed Forward neural net), using the sigmoid activation function for both. The code is partially derived from Siraj Raval’s “The Math of Intelligence” tutorials.

ML Perceptron 10 Dec 2017
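Simon’s own perceptron code is only visible in the screenshot, but the idea behind it can be sketched in a few lines of Python. This is not his actual code, just a minimal illustration (with made-up names) of a single sigmoid neuron trained by gradient descent, here on the OR function:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Perceptron:
    def __init__(self, n_inputs):
        random.seed(1)  # fixed seed so the example is reproducible
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.bias = random.uniform(-1, 1)

    def predict(self, inputs):
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return sigmoid(total)

    def train(self, inputs, target, lr=0.5):
        out = self.predict(inputs)
        # gradient of the squared error, passed back through the sigmoid
        grad = (target - out) * out * (1 - out)
        self.weights = [w + lr * grad * x
                        for w, x in zip(self.weights, inputs)]
        self.bias += lr * grad

# teach it the OR function
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
p = Perceptron(2)
for _ in range(5000):
    for x, t in data:
        p.train(x, t)
```

A single perceptron can only learn linearly separable functions like OR and AND, which is part of why the feed-forward net was the tougher build.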

The FF was tougher to build:
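Again as a hedged illustration rather than Simon’s actual code: a feed-forward net adds a hidden layer, and the deltas have to be backpropagated through it. A minimal sketch with one hidden sigmoid layer, tried on XOR (the classic function a lone perceptron cannot learn):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class FeedForward:
    """One hidden layer, sigmoid activations on both layers."""

    def __init__(self, n_in, n_hidden):
        random.seed(2)  # fixed seed so the example is reproducible
        self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [random.uniform(-1, 1) for _ in range(n_hidden)]
        self.w2 = [random.uniform(-1, 1) for _ in range(n_hidden)]
        self.b2 = random.uniform(-1, 1)

    def forward(self, x):
        # hidden activations are stored for use during backpropagation
        self.hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                       for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(w * h for w, h in zip(self.w2, self.hidden))
                       + self.b2)

    def train(self, x, target, lr=0.5):
        out = self.forward(x)
        # delta at the output: squared-error gradient through the sigmoid
        d_out = (target - out) * out * (1 - out)
        # hidden deltas, backpropagated through the (old) output weights
        d_hidden = [d_out * w * h * (1 - h)
                    for w, h in zip(self.w2, self.hidden)]
        self.w2 = [w + lr * d_out * h for w, h in zip(self.w2, self.hidden)]
        self.b2 += lr * d_out
        for j, dh in enumerate(d_hidden):
            self.w1[j] = [w + lr * dh * xi for w, xi in zip(self.w1[j], x)]
            self.b1[j] += lr * dh

net = FeedForward(2, 4)
# one training step on a single example should nudge its error down
err_before = (1 - net.forward([0, 1])) ** 2
net.train([0, 1], 1)
err_after = (1 - net.forward([0, 1])) ** 2

# with enough epochs (and some luck with the random init) the outputs
# for XOR approach 0, 1, 1, 0
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
for _ in range(10000):
    for x, t in data:
        net.train(x, t)
```

The extra layer and the bookkeeping of the deltas are exactly what makes this tougher than the perceptron.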

Simon’s nets run locally (on our home pc), but he will need more computational power for the more complex future projects, so he signed up for a wonderful online resource called FloydHub! FloydHub is a sort of Heroku for deep learning: a Platform-as-a-Service for training and deploying deep learning models in the cloud. Under the hood it runs on Amazon, which Simon could have used directly, but that would have been much more expensive and tedious to set up.

Simon’s next step will be another supervised learning project, a Recurrent Neural Net that will generate text. He has already started building it and fed it one book to read! In this video he explains how character-based text generators work:
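The recurrent net itself is too big to sketch here, but the character-based idea (learn which characters tend to follow which, then sample one character at a time) can be illustrated with a toy character-level model. This is a simple bigram frequency counter, not a recurrent net and not Simon’s code, with a made-up miniature “book”:

```python
import random
from collections import Counter, defaultdict

def train_char_model(text):
    # count, for every character, which characters tend to follow it
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, seed_char, length):
    random.seed(0)  # fixed seed so the example is reproducible
    out = [seed_char]
    for _ in range(length - 1):
        followers = counts.get(out[-1])
        if not followers:
            break  # this character was never followed by anything
        chars = list(followers.keys())
        weights = list(followers.values())
        out.append(random.choices(chars, weights=weights)[0])
    return "".join(out)

book = "the cat sat on the mat and the cat ran"
model = train_char_model(book)
sample = generate(model, "t", 20)
```

A real character-level RNN replaces the frequency table with a trained recurrent network, which lets it capture much longer-range patterns than a single previous character.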

Coding, Milestones, neural networks, Notes on everyday life, Python

DAE neural net and Keras

Simon told me today he was ready to start building his own DAE (Denoising Autoencoder) neural network. He said he would be using a documentation page about a machine learning library called Keras, at blog.keras.io/building-autoencoders-in-keras.html. He found this documentation page completely on his own, by searching the web and digging into Python forums. I just watch him google something like “How can I install Keras 1.0?” and find GitHub discussions on the subject that guide him along. Or I see him type “How to install Python on Windows?” and follow the instructions at How-To Geek. Eventually, he came up with a list of steps he needed to complete in order to install Keras (installing Python 3 (and looking up why it should be 3 and not 2), installing pip, installing TensorFlow, etc.). It didn’t work on a Mac laptop, so he tried everything on a PC and it worked! Two of the steps required using the terminal. It was amazing to see him plan ahead, then search for and implement (notoriously difficult) steps completely independently.
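What makes an autoencoder “denoising” is the training trick described in that Keras tutorial: corrupt the input, but keep the clean version as the target. A tiny sketch of the corruption step (the function name and noise scheme here are made up for illustration):

```python
import random

def add_noise(sample, noise_level=0.3):
    """Masking noise: randomly zero out a fraction of the input values."""
    random.seed(42)  # fixed seed so the example is reproducible
    return [0.0 if random.random() < noise_level else v for v in sample]

clean = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8]
noisy = add_noise(clean)
# a denoising autoencoder trains on pairs like (noisy input, clean target)
training_pair = (noisy, clean)
```

Because the input and the target differ, the network cannot simply copy its input through and is forced to learn features robust to the noise.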

He started working on the DAE tonight. He is also working on a Node package that makes `manifest.json` files (for Chrome extensions) at the same time, so I’m not sure he’ll finish soon. “Mom, I’ve got so many things to do! There are so many thoughts in my head!”

Biology, Coding, Java, live stream, Milestones, Simon makes gamez

Problem with a Genetic Algorithm Game

Simon has been working very hard at making his version of the Citius Invaders game that Siraj Raval presented in Python during Week 9 of his “Math of Intelligence” course. It’s a simplified version that still incorporates a genetic algorithm. Simon’s plan is to code the same game in P5 during his next live tutorial (at 17:00 CET on 30 November). Unfortunately, after many lines of code in several files, when the game was nearly ready, he ran into a problem he doesn’t know how to resolve. He tried getting some online help via the Coding Train Slack channel but didn’t get much feedback. The code is now on GitHub at: https://github.com/simon-tiger/citius-invaders/

Here is how Simon describes the problem:

Hi! I’m working on a “Citius Invaders” game, and I have a problem.
In the game, there are enemies, a spaceship, and bullets that it shoots.
The parameters of the game are:
* The game starts with 6 enemies.
* The minimal amount of enemies is 4.
* The maximal amount of enemies is 100.
The rules of the game are:
* When a bullet collides with an enemy, the bullet and the enemy get deleted.
* If there are 4 enemies on the screen, you’re not allowed to shoot any bullets any more.
* If there are 100 enemies on the screen, you’ve lost the game.
* The enemies reproduce every 5 seconds (this amount of time decreases every generation).
The goal of the game is to stay as close to 4 enemies as possible.
The problem is that after the 5 seconds, the enemies don’t stop crossing over! I think the order in which I’m crossing them over is wrong. They breed (cross over) forever, and that makes the game pause forever. Maybe the algorithm I use to let them cross over is wrong. I have a `while` loop somewhere; maybe it became a forever loop. The `while` loop is at line 52 in `Population.pde`:

 while (!selectedEnemies1.isEmpty() || !selectedEnemies2.isEmpty()) {
   int index1 = int(random(selectedEnemies1.size()));
   int index2 = int(random(selectedEnemies2.size()));
   if (index1 == index2) {
     index2 = (index2 + 1) % selectedEnemies2.size();
   }
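A common cause of this kind of forever-loop is that the loop condition never becomes false, because the selected-parent lists are never emptied inside the loop. Purely as an illustration (in Python, with made-up names, not a fix to Simon’s actual Processing code), here is a crossover loop that is guaranteed to terminate, because every iteration removes the parents it used:

```python
import random

def crossover(a, b):
    # hypothetical one-point crossover on two list "genomes"
    point = len(a) // 2
    return a[:point] + b[point:]

def breed(selected1, selected2):
    """Pair parents until one pool is empty."""
    random.seed(0)  # fixed seed so the example is reproducible
    children = []
    while selected1 and selected2:
        # pop() shrinks the pools, so the loop condition must
        # eventually become false
        p1 = selected1.pop(random.randrange(len(selected1)))
        p2 = selected2.pop(random.randrange(len(selected2)))
        children.append(crossover(p1, p2))
    return children

parents1 = [[0, 0, 1, 1], [1, 1, 0, 0], [1, 0, 1, 0]]
parents2 = [[0, 1, 0, 1], [1, 1, 1, 1]]
kids = breed(parents1, parents2)
```

Each pass through the `while` shrinks both pools by one, so the loop runs at most min(len(selected1), len(selected2)) times.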

Below are some videos showing Simon at different stages of the project. In the top video, he speaks about the difficulties he encountered with the frame rate and the genetic algorithm.

 

 

Coding, Milestones, neural networks, Python, Simon teaching

Simon working on a neural networks paper

Simon was working on a neural networks paper in Jupyter Notebook on Friday evening, but didn’t finish it because the Coding Train live stream started. He says he can no longer continue without too much copy-pasting from this version into a new one, as his in-browser session expired, so I’m posting some screenshots of the unfinished paper below. This is the way Simon teaches himself: he follows lectures and tutorials online and then goes on to write his own “textbook” or record his own “lecture”. Much of the knowledge he acquires on neural networks these days comes from Siraj Raval’s YouTube series “The Math of Intelligence”.

 

Neural Networks Paper Jupyter 2017-11-20 1

Neural Networks Paper Jupyter 2017-11-20 2

Neural Networks Paper Jupyter 2017-11-20 3

Neural Networks Paper Jupyter 2017-11-20 4

Coding, Geometry Joys, Milestones, Murderous Maths, neural networks, Notes on everyday life

Just another day in graphs

Simon loves looking at things geometrically. Even when solving word problems, he tends to see them as a graph. And naturally, since he started doing more math related to machine learning, graphs have occupied an even larger portion of his brain! Below are his notes in Microsoft Paint today (from memory):

Slope of Line:

Slope of Line 15 November 2017

Steepness of Curve:

Steepness of Curve 15 November 2017
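The two notes boil down to the same computation: rise over run for a straight line, and rise over run across a tiny step for a curve, which is the difference quotient from calculus. A quick sketch (the function names are made up for illustration):

```python
def slope_of_line(p1, p2):
    """Rise over run between two points on a line."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

def steepness_of_curve(f, x, h=1e-6):
    """Slope of a tiny secant line: a numerical derivative."""
    return (f(x + h) - f(x)) / h

line_slope = slope_of_line((0, 1), (2, 5))              # the line y = 2x + 1
curve_slope = steepness_of_curve(lambda x: x * x, 3.0)  # y = x^2 near x = 3
```

Shrinking the step `h` makes the secant slope approach the true derivative, which for y = x² at x = 3 is 2·3 = 6.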

An awesome calculator Simon discovered online at desmos.com/calculator that allows you to make moving (animated) and static graphs:

Desmos.com Polynomial 15 Nov 2017

Desmos.com Polynomial 15 Nov 2017 1

Yesterday’s notes on the chi function (something he learned through 3Blue1Brown’s videos on Taylor polynomials):

Simon following The Math of Intelligence course by Siraj Raval:


Coding, neural networks, Simon teaching

Simon explains the I, Robot project. How do synthetic literature neural nets work?

 

Today is a big day as – for the first time in human history – a short story has been published that was written by a robot together with a human. And the bot (called AsiBot, because it writes in the style of Isaac Asimov’s I, Robot) was developed in Dutch (!) in Amsterdam (at the Meertens Institute) and in Antwerp (at the Antwerp Centre for Digital Humanities and Literary Criticism), Simon’s two home cities.

The story written by AsiBot and Dutch bestselling author Ronald Giphart forms a new, 10th chapter in Isaac Asimov’s classic I, Robot (which originally contained only 9 chapters). AsiBot was fed ten thousand books in Dutch to master the literary language and can already produce a couple of paragraphs on its own, but a longer coherent story remains out of reach. This is where a human writer, Ronald Giphart, stepped in. It was he who decided which of the sentences written by AsiBot stayed and which were thrown out. The reader doesn’t know which sentences were written (or edited) by the human writer and which are pure robot literature. Starting from November 6, anyone (speaking Dutch) can try writing with AsiBot on www.asibot.nl.

Simon was very excited about this news and recorded a short video where he explains how such “synthetic literature” neural nets work (based on what he learned from Siraj Raval’s awesome YouTube classes):

My phone froze so we had to make the second part as a separate video:

Coding, neural networks, Python

Introducing Siraj Raval

Simon has been watching a lot of Siraj Raval’s videos on neural networks lately, brushing up on his Python syntax and his derivatives. He has even been trying the great Jupyter editor, where one can build one’s own neural network and install libraries with pretrained networks: https://try.jupyter.org/

Just like with Daniel Shiffman’s videos, the remarkable thing about Siraj’s (very challenging) courses is that they also touch upon so many subjects outside programming (like art, music and the stock exchange) and are compiled with a sublime sense of humour.
