Teaching an 8 year old to code with Ruby

Here's a very simple Ruby script I put together this evening for my daughter. She's just started a book called Computer Coding for Kids and the first little Python script in there is a game where you are being chased by a ghost and have to guess which door to open. If the door you open has no ghost behind it, you get to continue to the next room and go again. If there is a ghost, the game ends.

After she got the code working, we discussed how we could use the same idea, but have a different story line to the game. Because it's Halloween, she thought it would be great to pretend we were trick or treating and have to guess what kind of treat she would get each time she rang a doorbell.

I put this script together for her to copy. I think it's a good starting point for kids who are already confident with technology and who have perhaps played with things like Scratch before and maybe have some concept of syntax.

guess = nil
trick_or_treat = nil
puts "It's Halloween"
puts "."
puts "."
puts "."
puts "And you're out trick or treating."
3.times { puts "." }
until guess != trick_or_treat do
  puts "You walk toward a house with a pumpkin outside and ring the bell."
  puts "The door creaks open and you shout 'Trick Or Treat?'"
  puts "What type of treat will you get?"
  print "Enter 1 for lollipop, 2 for chocolate bar, 3 for toffee apple: "
  trick_or_treat = 1 + rand(3)  # rand(3) gives 0, 1 or 2, so this is 1, 2 or 3
  guess = gets.chomp.to_i
  if guess == trick_or_treat
    puts "Yay! You guessed correctly. That's another sweet in the bag."
  else
    puts "Bad luck, you guessed wrong, time to go home."
  end
end

There are quite a few concepts in this script that I threw in there to sort of lead my daughter's learning. She's done some basic programming in a few visual programming languages and a little Logo. She's also done some introductions at school where they explained what an algorithm is and they did some very simple logic like comparing two numbers and drawing out some if conditions.

Here are the interesting (at least to us) points of discussion. When I'm doing this sort of teaching, I generally try not to answer for her unless it's clear that she really has no idea what's going on. In that case, I'll usually take a step back and try to explain the concept in a different way and come back to the actual question I've posed another time to see if it "clicks". This usually makes the learning a little more fun. It's a game of discovery. Though, of course, it takes longer than just shoveling information in :)

Anyway, on to the questions (note she didn't necessarily get all of these - one or two I listened to her reasoning and then shelved until another day - especially the points on nil and variable initialisation):
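The nil and rand points, at least, are easy to poke at in plain Ruby. This little check is mine, not from the book:

```ruby
# Both variables start as nil, and nil == nil is true, so the
# until loop's exit test (guess != trick_or_treat) is false on
# the very first pass - the game always runs at least once.
guess = nil
trick_or_treat = nil
puts(guess == trick_or_treat)  # prints "true"

# rand(3) returns 0, 1 or 2; adding 1 shifts that to 1, 2 or 3 -
# one value for each treat on the menu. rand(2) would only ever
# produce 1 or 2, so the toffee apple could never come up.
rolls = Array.new(1000) { 1 + rand(3) }
puts rolls.all? { |r| (1..3).include?(r) }  # prints "true"
```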

Creating a Phoenix Framework Application with SQLite

Phoenix defaults to using Postgres, which is probably very sensible. Getting SQLite up and running using Sqlite.Ecto, though, is reasonably easy.

First, add the dependency for sqlite_ecto in mix.exs:

defp deps do
  [
    {:sqlite_ecto, "~> 1.0.0"},
    # ...plus your app's existing dependencies
  ]
end

Now update your config/dev.exs to reference Sqlite.Ecto:

config :widgetapp, Widgetapp.Repo,
  adapter: Sqlite.Ecto,
  database: "path/to/widgetapp.sqlite"

Drop into your console and run mix deps.get to pull in your new dependencies and then run mix ecto.create to get your database set up. If all goes well, you should get a new database file created at the path specified in your dev.exs above.

From here, just develop as normal.

Using highlight.js with WordPress' Twenty Fifteen theme

I've recently switched over to using highlight.js for my code highlighting on my blog mainly because it supports Elixir syntax highlighting.

Out of the box, running highlight.js with the WordPress Twenty Fifteen theme causes lines of code to wrap.

I wanted lines to run continuously with a scroll bar.

The following css tweaks did it for me:

pre {
  border: 0;
  padding: 0;
  white-space: pre;
  overflow-wrap: normal;
}

pre code {
  overflow-x: scroll;
}

Your mileage may vary, but some quick tests seem to show this working reasonably well.

Of course you might not want to override the pre tag completely and instead push the changes into a class to apply to any pre tags wrapping your code.

And you'll want to rig this up into a child theme so you don't jettison your code when you run an update.

Elixir, Phoenix, Ubuntu VMs and non-SMP VMAborted errors

If you're playing with Elixir and Phoenix in a VM and you find yourself getting peculiar errors when doing sqlite database inserts like this:

enif_send: env==NULL on non-SMP VMAborted

Make sure your VM has more than one CPU.

Rendering partials from other view modules in Phoenix

Playing with Elixir and Phoenix at the moment, so here is a small snippet.

When you want to render a partial template that sits in a different view module, here's the syntax:

<%= render ApplicationName.ViewModule, "template_name.html", conn: @conn %>

So, for example, if we have the following application structure:

- controllers
  - about_controller.ex
  - page_controller.ex
- templates
  - about
    - index.html.eex
  - page
    - index.html.eex
- views
  - about_view.ex
  - page_view.ex

And we want to include the about/index template in our page/index, we can do the following somewhere in our page/index.html.eex:

<%= render Myapp.AboutView, "index.html", conn: @conn %>

Django REST Framework - Could not resolve URL for hyperlinked relationship using view name

This post is from 2013, so it is probably out of date.

Here's an error that's all too easy to stumble on if you are just hacking your way into an API using Django REST Framework:

Could not resolve URL for hyperlinked relationship using view name "model-detail". You may have failed to include the related model in your API, or incorrectly configured the `lookup_field` attribute on this field

Recreating the error

I'll assume you have, at least, a basic Django site up and running. Perhaps you are a little impatient (like me) and you skim the Django REST Framework's homepage. You add the following in to settings.py:
REST_FRAMEWORK = {
    # Use hyperlinked styles by default.
    # Only used if the `serializer_class` attribute is not set on a view.
    'DEFAULT_MODEL_SERIALIZER_CLASS':
        'rest_framework.serializers.HyperlinkedModelSerializer',

    # Use Django's standard `django.contrib.auth` permissions,
    # or allow read-only access for unauthenticated users.
    'DEFAULT_PERMISSION_CLASSES': [
        'rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly'
    ]
}

And then, somehow, you skip the rest of the tutorial and you end up with something like this:

## models.py:
from django.db import models

class Pet(models.Model):
    name = models.CharField(max_length=250)
    date_of_birth = models.DateTimeField()

    def __unicode__(self):
        return self.name

## views.py:
from .models import Pet
from rest_framework.generics import (
    ListCreateAPIView
)

class PetAPIListCreateView(ListCreateAPIView):
    model = Pet

## urls.py:
from django.conf.urls import patterns, include, url
from .views import PetAPIListCreateView

urlpatterns = patterns('',
    url(r'^api/$', PetAPIListCreateView.as_view()),
)

Make sure you run your migration or sync your db.

Now you should be able to browse your api with the following url:


Now add some data into your Pet table and refresh your api endpoint.

"Could not resolve URL for hyperlinked relationship using view name "pet-detail". You may have failed to include the related model in your API, or incorrectly configured the `lookup_field` attribute on this field."


There are two methods to fix this. The first (possibly simplest) is to cut the following out of settings.py (commented out so you don't skim my post and add it in again):

#REST_FRAMEWORK = {
#    # Use hyperlinked styles by default.
#    # Only used if the `serializer_class` attribute is not set on a view.
#    'DEFAULT_MODEL_SERIALIZER_CLASS':
#        'rest_framework.serializers.HyperlinkedModelSerializer',
#
#    # Use Django's standard `django.contrib.auth` permissions,
#    # or allow read-only access for unauthenticated users.
#    'DEFAULT_PERMISSION_CLASSES': [
#        'rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly'
#    ]
#}

In actual fact, removing just the following two lines will fix it for you:

#'DEFAULT_MODEL_SERIALIZER_CLASS':
#    'rest_framework.serializers.HyperlinkedModelSerializer',

The second is to create a serializers.py like so:

from rest_framework import serializers
from .models import Pet

class PetSerializer(serializers.ModelSerializer):
    class Meta:
        model = Pet
        fields = ('id', 'name', 'date_of_birth')

and update your views.py as follows:

from .models import Pet
from .serializers import PetSerializer
from rest_framework.generics import (
    ListCreateAPIView
)

class PetAPIListCreateView(ListCreateAPIView):
    queryset = Pet.objects.all()
    serializer_class = PetSerializer

Why does this happen?

Well, the key lies in the default model serializer class. When it is set to HyperlinkedModelSerializer, the framework tries to build a hyperlink to each model's detail view, and that needs a URL pattern named along the lines of `pet-detail` to resolve. We never defined one, so the lookup fails - which is why dropping the hyperlinked default, or supplying an explicit serializer, fixes it.

Models - Rails and ASP.NET MVC, Properties and Constraints

Some rather vague thoughts on models in Rails and ASP.NET MVC. Mostly out of interest in the different approaches rather than as a critique of either framework, because, well, because that's a different blog post... Let's imagine, for this little ditty, that we are dealing with a blogging engine.


In Rails, properties don't have to be explicitly declared in the models, so you get something like this:
class Post < ActiveRecord::Base
end

It's nice and clean, nice and simple, but, looking at the model, you have no idea what's in there. Instead, if you're a new developer on the project, you can scan through the schema.rb or dive straight into the database to find out.
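A crude plain-Ruby sketch of the idea, with attributes coming from data rather than declarations. This is just an illustration of the principle, not how ActiveRecord is actually implemented:

```ruby
# A record whose accessors come from a schema-like hash, so the
# class body itself declares no properties at all.
class DynamicRecord
  def initialize(attributes)
    @attributes = attributes
  end

  # Unknown method names are looked up in the attribute hash -
  # roughly why an "empty" Rails model still responds to
  # post.title, post.body and friends.
  def method_missing(name, *args)
    key = name.to_s
    return @attributes[key] if @attributes.key?(key)
    super
  end

  def respond_to_missing?(name, include_private = false)
    @attributes.key?(name.to_s) || super
  end
end

post = DynamicRecord.new("title" => "Hello", "body" => "A post")
puts post.title  # prints "Hello"
```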

Contrast that with ASP.NET MVC which takes a more explicit approach:

using System;

namespace YetAnotherBlogEngine.Models
{
    public class Post
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public string Body { get; set; }
        public int Category_Id { get; set; }
        public DateTime Date_Published { get; set; }
    }
}

So a new developer bounced into the project knows what properties this model has by looking at the model rather than a schema.

I wonder, to myself mainly, if the differences are a function of the duck-typed vs strongly-typed nature of Ruby vs C#, and the fact that with ASP.NET MVC the toolset (Visual Studio with its autocomplete and continual compilation process) was there before the framework, whereas Rails was developed as a framework before the toolset (TextMate, Vim + plugins, a host of other 3rd party dev environments and, obviously, Sublime Text).


Let's continue and add some sort of constraint - perhaps a blog post must have a title. That makes sense. I'm looking at the code first method in ASP.NET MVC because I think it provides a little more of a like-with-like comparison to Rails.

Rails takes the approach that model level validations are the way to go:

Model-level validations are the best way to ensure that only valid data is saved into your database. They are database agnostic, cannot be bypassed by end users, and are convenient to test and maintain

So, we can enhance our model like so:

class Post < ActiveRecord::Base
  validates :title, :presence => true
end

This, by itself, will not persist your constraint to the database schema. You would need to do that in your migration in Rails.
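To make that concrete, here's a hand-rolled sketch of what a presence validation boils down to. The class and error message are my own invention for illustration, not ActiveRecord internals:

```ruby
# A toy model with a manual presence check, mimicking what
# validates :title, :presence => true gives you for free.
class ToyPost
  attr_accessor :title
  attr_reader :errors

  def initialize(title = nil)
    @title = title
    @errors = []
  end

  # Returns true only when every check passes, much like
  # ActiveRecord's valid? - and note it all happens in Ruby,
  # not in the database.
  def valid?
    @errors = []
    @errors << "Title can't be blank" if title.nil? || title.strip.empty?
    @errors.empty?
  end
end

puts ToyPost.new("Hello world").valid?  # prints "true"
puts ToyPost.new.valid?                 # prints "false"
```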

The Active Record way claims that intelligence belongs in your models, not in the database. As such, features such as triggers or foreign key constraints, which push some of that intelligence back into the database, are not heavily used.


Although Active Record does not provide any tools for working directly with such features, the execute method can be used to execute arbitrary SQL.

ASP.NET MVC, on the other hand, is pretty strongly tied to SQL Server - or at least, the implicit assumption is that you're going to be using SQL Server. So applying constraints to a model (called data annotations) and then running update-database (roughly the equivalent of a Rails migration) will apply your constraint at the database level as well as the model level. (Now, I haven't validated every data annotation, but I'm talking about things like foreign keys and non-null values; obviously this won't work with custom annotations.)

So, our ASP.NET MVC model would look something like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.ComponentModel.DataAnnotations;

namespace YetAnotherBlogEngine.Models
{
    public class Post
    {
        public int Id { get; set; }

        [Required]
        public string Title { get; set; }

        public string Body { get; set; }
        public int Category_Id { get; set; }
        public DateTime Date_Published { get; set; }
    }
}

I've added in the using statements, because you have to explicitly pop the DataAnnotations in there.

What I like about the Rails style is that, looking at your model, you get a quick sense of what validations are going on because they are all collected at the top. With ASP.NET MVC, getting an overview of the constraints on a model takes a bit longer, especially as your model grows in size, because you have to scan through all the properties to single out which ones have data annotations. Also, annotations in ASP.NET MVC are a bit of a mixed bag because, apart from constraints, you can also specify the front end display of a field. For example:

[Display(Name = "Post Title")]
public string Title {get; set;}

will cause your views to render "Post Title" as the label for the Title field by default. Anyway, let's not get into that too much because the data annotations are actually part of Entity Framework rather than the MVC framework per se.

A true like-for-like comparison between ASP.NET MVC and Rails is not really possible. ASP.NET MVC actually wants you to develop in a Model, View, View-Model, Controller paradigm. This extra layer in .NET makes more sense when you start to work with the strongly typed views and you realise that .NET doesn't actually want your views to use your models in anything but the most basic examples. Though often it feels like you are writing code for code's sake.

I'm not sure there is any punch line to this post other than maybe accepting that each framework has a philosophy and you should "render unto Caesar" in each framework and go with the overriding ideas, idioms and philosophies.

THE Micropad

I wonder if Microsoft can produce something as aesthetically pleasing as Apple's offering.

Setting up your own certificate authority on IIS7 using OpenSSL and securing your web api with client certificates

Creating self signed certificates isn't really all that complicated, but it can be a little intimidating the first time you do it.

What are we trying to achieve?

1) A web api that is protected by client certificates hosted on IIS7.
2) A way to test it out from our browser.


Well, the use case is a web api that is not open to the public. We want to secure it such that only clients with the relevant certificates can access the api.


You'll need:

- A web site that is protected by a valid SSL certificate
- OpenSSL - I installed this on a 64bit version of Win7. Be warned, you need the Visual C++ 2008 redistributable installed first. Be warned, yet again, that even though OpenSSL should install in 64bit mode, I couldn't get it working so I just took the 32 bit. Hasn't done me any harm..... :)


Testing your api from a browser like IE requires you to have a p12 client certificate to import into your personal certificate store. This one caught me out for a while.

Setting up your root certificate authority

First create a key pair that you will use to sign your certificate:
openssl genrsa -des3 -out root-ca.key 1024
Enter a strong pass phrase. This is the most vital pass phrase you will ever come up with - your root certificate is what you will use to sign client certificates and it is what will be installed on your IIS7. Basically, if someone gets your root cert and your passphrase for it, they can create their own client certificates and your web api will trust them. If you are super paranoid, disconnect the server that you are using to create this certificate from the network forever.

Now use your key pair to create and sign a root certificate:

openssl req -new -x509 -days 3650 -key root-ca.key -out root-ca.crt

We are generating a certificate that will be valid for 10 years. Make it shorter if you prefer. You'll be prompted for your root key pair pass phrase and a bunch of info - fill it in, forget about the email address.

You now have a root certificate - this is what you will install on your web server. You will also use the root certificate to sign client certificate requests. Once a client certificate request has been signed with the root certificate, any requests to your secured api with these client certificates will be implicitly trusted by your web server.

Make your webserver recognise your new certificate as a trusted Certificate Authority

Copy your root-ca.crt file to your webserver. Now, either double click the .crt file or, if you prefer, open up a command prompt, run "mmc" and follow these instructions:

- Click File->Add/Remove Snap-in
- Select "Certificates" and choose "Computer Account".
- Expand "Trusted Root Certification Authorities" and right-mouse button, "Import".
- Find your root-ca.crt and complete the import.

Create a website and require certificates

Create a new site or application in IIS, and then using the IIS manager, select the SSL Settings.

Make sure Require SSL is checked and that the Client Certificates option is set to Require.

If you try to browse your website now, you should get an access is denied message.

Create a client certificate request

You've got your root cert. You've installed it on your webserver. You've locked down your website. All that's left to do is create the client certificate, install it in your certificate store on the client machine and away you go.

Open up the command prompt on your client machine (which has openssl installed on it) and run:

openssl genrsa -out client-cert.key 1024

As above, we generate a keypair, and then create the certificate request:

openssl req -new -key client-cert.key -out client-cert.csr

Again, you'll be prompted for all sorts of information - fill it in. When you put the organisation name and common name, use something different from your root certificate above so you can keep tabs on things in your personal certificate store.

Now, we use the client certificate request and create a client certificate, signing the certificate with our root certificate:

openssl x509 -req -days 3650 -CA path/to/root-ca.crt -CAkey path/to/root-ca.key -CAcreateserial -in client-cert.csr -out client-cert.crt

You'll be asked for your root certificate pass phrase - you remember, the one I told you was super-important above.

Very cool, we now have a client certificate that IIS will trust because it has been signed by a root certificate that IIS trusts. Woohooo.

And here is the gotcha - If you simply install this .crt in your certificate store and try to browse your locked down website, it just will not work!

And here is the fix - you see, Internet Explorer needs the certificate to be in a specific format (pkcs12) for it to actually present the certificate to the webserver when you go a-browsing. Luckily, openssl allows us to fix this issue:

openssl pkcs12 -export -clcerts -in client-cert.crt -inkey client-cert.key -out client-cert.p12

Import your p12 certificate into your local personal certificate store

Again, double click the .p12 file, or go through mmc to import the p12 certificate and you should be away. Close IE, re-open and browse to your locked down website. You should be greeted with your website.

Aaaaah. That wasn't so bad was it?

Meteor - This is what Asp.Net webforms could have been

Meteor ticks the boxes for a realtime web application: "one language", "realtime as default". It looks like it eases the path of development, automatically updating your front end templates when your data changes by implementing a subscription model.

I'm playing with it. Enjoying the feeling of coding everything in one language. It's definitely a smoother process for a developer. And all of a sudden, I get a feeling of déjà vu. Someone else tried to do this before, didn't they? Someone named Microsoft. I remember back when ASP.NET came out and we all frantically switched from the old-and-inferior-scripting based technology to the all-new-and-improved-web-forms based technology. The promise was the same. No more hacking away in VBScript (erm, I mean ASP) for your backend and Javascript for your frontend. No, instead you could write C# in your magical code-behind pages and all but forget about frontend jiggery-pokery.

Web forms really tries. You define your HTML in an aspx file - these are basically your templates. Elements can be set to have a data source which will, for example, infill the data for the element from your database. Elements can also be set to post back to the server anytime they change, get clicked or what have you.

Postbacks are tied to events in the C# code behind pages which means that you can move all your logic server side. It makes developing for the web more like writing a traditional style windows client application.

Now the HTML controls, of course, responded to Javascript to make the page post back. But it wasn't Javascript that the developer had to write. It was auto-generated. Awesome. Awesome. Awesome. One language... C#.

Of course it isn't without its downsides. The ViewState - a massive chunk of data that kept... state between page loads. That's a nasty piece of work. Ingenious in its own way, but nasty when you bump up against it. And of course there is the enormous expense of re-rendering the entire page every time a select box that is wired up to post back to the server changes. Large Web forms applications can become sloooooow on the client side.

Meteor does something similar. Only now with baked in Ajax goodness, the applications actually feel usable. I wonder if Microsoft missed the boat? I wonder if they could have leveraged Web forms, or something like it, to be more like Meteor. Maybe they do now - Web forms is still alive and well, but I haven't (thank goodness) had to use it in a very long time so I'm somewhat out of touch.

Where Microsoft really shines, in my opinion, is with their IDE. Wiring up server side events to a button click really is easy - just open up your aspx file (the template), double click on the button and Visual Studio will wire up the event for you and pop you into the server side event so you can write your code. On a large page, this really does take some pain away. You don't make stupid typos wiring up your button because you don't have to type it in. I know Vim is awesome and all, but this part of the developer experience Microsoft really does well.

Somewhere in the future, I imagine the best of both worlds. Something like Meteor + Visual Studio all open sourced and ready to go.
