Google AIY…

Ok, before I get started, I wanted to offer a huge thank you to Jamie at The Pi Hut for sorting me out with a copy of the MagPi with the AIY kit. You are a star!

Back in May, the Internet exploded with the news that the MagPi magazine was offering a free Google AIY kit to use with Raspberry Pi. Copies sold out in minutes and people hunted across branches of WHSmith to try and get their hands on one. Shops got new stock that sold out instantly.

The concept? Google, working with Raspberry Pi, have created a simple voice-interaction module so that you can use your Pi like an Amazon Alexa (but powered by Google).

Naturally, the makersphere exploded with excitement and projects began surfacing across the Internet. Copies of the MagPi were selling on eBay for extortionate prices and people were chattering about what they would create.

I’ve had my copy of the MagPi for a few months, but I wanted to wait until my friend Kirk was visiting as I knew he’d enjoy himself helping us with the making.


We pulled the kit out of the box, and being the methodical cool kid that I am, I quickly checked the components against the kit I had. This is where we hit the first hurdle – according to the instructions, we had a ’40-pin header’ that was pre-soldered, but the picture showed the 40 single pins that most definitely weren’t pre-soldered. To further add to the confusion, all of the photos of the accessory board within the tutorial appeared to show the 40 single pins soldered on to the board, but the first line of instructions stated that no soldering was necessary – so which was it? Did we need to solder or not?

Further exploration and a brief skim of the instructions made us realise that we didn’t actually need to solder the extra pins on, but we were pretty confused for a while there. Why show them soldered on if we don’t need them!?

Kirk decided to boot up a Pi and start writing the code, and it was at this point that we realised that we needed a special version of the OS. Stuart got pretty grumpy at this point because he only knew how to flash SD cards using a command line and he claimed the image file could only be read using a Linux machine – fortunately, I explained about Etcher and we were back on track.

Before we got any further, we decided to stop for dinner – it had been an hour already and all we’d managed to do was flash an image and get confused about the components.


After dinner, Kirk checked the SD card – success!

We knew we had the right image when the AIY background popped up so Kirk made a start on setting up the AIY API (supposedly the hardest part of the task).

This is where we hit on yet another issue – the new OS image was convinced that we were time travellers and that it was actually January 2018 – Kirk spent quite a bit of time fighting with the date and even tried to just ignore it, but Chrome clearly had other ideas.

I’m not sure what Kirk did, but he managed to get it working eventually; however, the Pi was renamed ‘TimePi’ to represent its time-travelling capabilities.

While Kirk got on with the software, I was in charge of building the hardware, and Stuart caused problems by putting things together ahead of the instructions! (Kirk also decided that we needed to have a paintbrush in all pictures… each to their own.)


The next instruction that made Stuart cross involved using Scotch tape to hold the microphone in place – firstly, the microphone didn’t sit flush unless you really rammed it into the cardboard and secondly, the Scotch tape didn’t stick to the cardboard properly, so I had to go and find some masking tape instead.

Meanwhile, Kirk had successfully managed to set up the Google AIY API so that it worked with one of my Gmail accounts – both boys said it was surprisingly easy, considering this was the part that people had warned me would go wrong. Having said that, both Kirk and Stuart are hugely experienced with Google products, so they didn’t particularly use the instructions – it was something they already knew how to do.

Kirk wanted me to point out that he found the software set-up quite confusing, as there were instructions in the magazine as well as completely different instructions online. He managed to get everything working eventually, and when I came in from loading up the dishwasher, the two boys were arguing over whether they were on instruction 9 or 12, depending on whether they were using the online instructions or the magazine ones.

At this point I realised that there was absolutely no way that I would have been able to get the software side working without the boys and I became confident that this was not a beginners’ project.

I took this opportunity to make some cocktails, while Stuart got excited about there being a slot in the cardboard to easily add the SD card… simple things 😉

We got the light to turn on, but hit some problems.


Stuart dialled in using SSH and we realised that the speaker wasn’t working – on closer inspection, one of the cables that I had previously screwed in had come loose, so we had to pull it apart to refit the cable. Personally, I think Stuart knocked it out when he was putting things together in the wrong order, but he claims it was my shoddy building skills!

So, we got the AIY working, but were slightly disappointed.

Without another add-on, we weren’t able to ask the AIY to play music – in fact, it didn’t seem to do much, but the boys got excited and started trying to find out what to do next.

A bit of hunting and I found a great post on the Raspberry Pi forum which explained how to get the AIY to interact with YouTube. Thanks for posting, MikeRR!


It took a while, but Stuart and Kirk were successful and managed to get techno music playing…
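I don’t have the exact code from the forum post to hand, but the gist was to hand a YouTube search over to a command-line media player. As a hypothetical Python sketch of the idea (the function name and the mpv/youtube-dl plumbing are my assumptions, not MikeRR’s actual code):

```python
def build_play_command(voice_command):
    """Turn a spoken request like 'play techno music' into a
    command line for mpv, which can stream the top YouTube
    search result via its youtube-dl hook (hypothetical sketch)."""
    track = voice_command.replace('play', '', 1).strip()
    return ['mpv', '--no-video', 'ytdl://ytsearch:' + track]
```

In the real kit this would hang off the voice recognizer’s keyword handling; here it only builds the command to run.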

The next step was that Stuart wanted to get Radio 6 music playing through the AIY…Things did not quite go as planned, with the AIY mis-hearing Stuart’s instructions a little bit…


At this point, I was completely oblivious as to what the boys were doing, which once again affirmed that this wasn’t really a project for me, but rather it was better suited to makers and people with greater understanding of both software and hardware. At least they were having fun.

So, in conclusion, I’m not convinced this Raspberry Pi project is one for schools, but in defence of the MagPi, it was never advertised as an educational project – it’s definitely one that’s more for the makers than the educators. The instructions could have done with a bit of polishing, but I suspect most makers wouldn’t have got as confused as I did by something as simple as a pre-soldered header. Basically, this particular project was a bit beyond my ability, but the boys had fun, particularly once they got beyond the most basic stage – it was around 11pm when I eventually kicked Kirk out, so they must’ve been really enjoying themselves.

The thing is… now what? We have a built AIY unit, but what do we do with it now, other than get it to play techno music…


Shakey Sense Hat Cat

After Sunday’s coding session, I set the boys the task of making our Scratch Sense Hat Cat Shake, just like Carrie Anne’s Interactive Pixel Pet.

The first thing the boys did was to figure out how to use some of the sensors on the SenseHat – remember how in my other post, I said it was good practice to run the basic broadcast command before you do anything?


We had found some code from Albert’s GitHub page, however, when we tried to select the sensor value for accelerometer, we only had a few choices as shown below.



Luckily, after trying a few things, I suggested that we hit the green flag to check that the GPIO pins were on and that Scratch knew we had a SenseHat attached. When we next checked the sensing options, a whole heap of new options appeared, including the accelerometer (sorry, I forgot to screenshot it).

The boys had great fun playing with the sensors, but couldn’t quite figure out how to get the ‘shake’ function working so they went back to the original code for Interactive Pixel Pet.

from sense_hat import SenseHat

sense = SenseHat()

# Keep reading the raw accelerometer values until a shake pushes any axis past 2G
x, y, z = sense.get_accelerometer_raw().values()
while x < 2 and y < 2 and z < 2:
    x, y, z = sense.get_accelerometer_raw().values()

This is what they came up with:


A job well done, if I do say so!

Now, I’m sure some of you have spotted that I could neaten up my code by removing the ‘ledbackground’ line and that ‘clearleds’ would be better suited to the end of the repeat loop as that would leave me with a completely blank neopixel array at the end of the animation sequence, but otherwise I’m pleased with our work in recreating the pixel pet for Scratch.

I look forward to trying out some of the other sensors using Scratch in the future!


Sense Hat Cat using Scratch

So, I absolutely LOVE the Interactive Pixel Pet activity from the Raspberry Pi website, and while I was playing with the Sense Hat the other day, I realised it was possible to imitate it using Scratch. So far I’ve only got it running as an animation, so next step is to get the shake function working as we’ve just figured out how easy it is to use the other sensors on the hat using Scratch.

I had a play and managed to get a very cool dancing cat on my LED matrix – I’m not going to lie, I was super excited and may have run around showing everyone in a slightly excited manner. Fortunately, my colleagues were also excited, although their contributions of dancing ‘poo emojis’ weren’t quite what I had in mind.


Here’s a bit of background on the Sense Hat… for those of you who don’t already know, the Sense Hat was created by the Raspberry Pi Foundation and launched as Astro Pi – a competition to get your pupils’ code into space. It has an 8×8 neopixel array, a mini joystick and a load of amazing sensors like humidity, pressure, gyroscope and accelerometer.

So, the first thing you always need to remember when using Scratch GPIO is that you have to turn the GPIO server on and, if you’re using a hat, you’ll need to let it know which hat it is by using the command “set AddOn to”.


This is pretty important for anything using the SenseHat, and it’s good practice to run it before you go any further in your code: by running it, Scratch will realise you have access to all of the sensors on the hat and allow you to access them through the drop-down menu in the blue ‘sensor value’ block.

Firstly you will need to delete the Scratch Cat so that you can draw your own sprite.


In the paint editor, you need to zoom right in as far as you can and select the smallest brush size.


You have four squares in total to draw your image – I’ve shown this here by making the area black (you don’t need to do this, but it can help as ‘black’ represents the neopixel being turned off).


Now you can draw your image – you have exactly 64 pixels to draw with and, as you may have guessed, one pixel on the screen represents one neopixel on the sense hat. By the way, a neopixel is a very bright LED which can be any colour depending on the mix of red, green and blue. The lighter your colour, the brighter it will appear on your neopixels, so try to avoid dark browns and blues etc.
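The ‘lighter colours look brighter’ rule can be made concrete with a little arithmetic. As a rough sketch (the weights here are the standard Rec. 601 luma weights, nothing Sense-Hat-specific):

```python
def perceived_brightness(r, g, b):
    """Approximate perceived brightness (0-255) of an RGB colour
    using the standard Rec. 601 luma weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# White scores the full 255, while a dark brown like (139, 69, 19)
# comes out well under half that - which is why it looks dim on the LEDs.
```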


Next you need to create a second image – use the duplicate command to make a copy of your first one.


Then click on the costumes tab and edit the copy a little so that your second sprite is slightly different, giving the appearance of animation.


Finally, you need some simple Scratch code to get your image moving – I’ve put a couple of broadcast commands in here to clear the SenseHat before you start and to make sure that the background is black (so turned off).


You can experiment if you want by changing the background colour, although this will only make a difference if your sprite is ‘backgroundless’ (but you have to make sure it’s still only 8 pixels/2 squares wide).


I’ve had great fun recreating this project in Scratch and I’ve set Stuart and Kirk on a mission to figure out the ‘shake’ control too so hopefully I can add an update soon.

<edit> Kirk and Stuart have successfully managed to get shake working and are now celebrating with chocolate cake.



Part two of this project can be found here.

All thanks to Albert Hickey for his advice with this project – he is a Scratch and SenseHat guru!!

Watch this space for some more projects using Scratch soon!

A Strange Experience – Being on the Other Side

Since I’ve been working for pi-top, I’ve experienced being on the other side of the EdTech system and it’s certainly been a bit of an eye-opener.

I’ve tested various products over the years and found problems and complaints, bugs and surprises, delights and nightmares, but it has been a really interesting experience for me being a producer of content rather than just a consumer.

Firstly, I thought it would be really easy to implement all of the things on my ‘want’ list – it turns out that it’s nowhere near as easy to just ‘add a button that prints out all of the users’ or ‘add a widget that allows the teacher to find out the answer’. All of these things require thought, tweaking of the UI (user interface) and lots of code.

I’ve learnt that things that seem obvious to me are not necessarily useful or even acknowledged by other users.

I’ve learnt that a developer can spend two weeks working overtime to completely overhaul the interface and I’ve not even noticed a difference (sorry).

I’ve discovered that it’s really important to make it clear what the delete button does… and I definitely didn’t accidentally delete a huge chunk of a resource which, thankfully, had been backed up.

I’ve found out that it’s really, really important to get more than one opinion and that relying on mine alone is not enough.

I’ve learnt that developers can’t write resources for beginners even though they really, really mean the best and want to help.

I’ve learnt that even someone like me can make things too difficult for beginners and it’s important to have someone who is truly a novice to try things out.

I’ve found out that sometimes developers just want to sit and watch you use the interface whilst giving them a running commentary so they can figure out what needs to be done next.

I’ve learnt that creating good quality content takes time, creating interfaces takes time and editing information takes time.

I’ve discovered that ‘popping over to ask a quick question’ is akin to tossing an hour’s worth of work into the bin for a developer and it’s better to contact them over Slack.

I’ve found out that developers don’t read emails…

Above all, I’ve learnt that being this side of the interface is HARD WORK and although we sometimes get frustrated with developers bringing out software that doesn’t do exactly what we want it to do, it’s not through lack of trying. It’s pretty important to give developers constructive feedback explaining exactly what doesn’t work as you’d expect and what you’d like it to do instead rather than getting cross and frustrated with it. Communication is vital to ensuring that a product is the best it can be.

Finally, I’ve learnt that pi-topCODER is going to be an incredible resource when we’re done with it and I’m proud to have been part of the team working on it, even if I sometimes feel like I don’t really know anything compared to the people making it!

Swift Playgrounds

It’s about time Apple joined the Coding Revolution – with Raspberry Pi and Google running projects for years, it was only a matter of time before something was released. And boy, is it a good one… with glossy graphics and slick tutorials, Swift Playgrounds has certainly hit the ground running as a way to teach coding concepts to pupils on an iPad. It’s not without faults, but then nothing is, so let’s take a look.

A few weeks ago, I visited Apple HQ in London along with a few CAS Master Teachers and various CAS reps and teachers. As it turned out, the majority of attendees were primary school teachers, which brings us to the first flaw in the Swift Playgrounds roll-out. The first thing we were told about it was that it was primarily made for Year 7 pupils and older and this becomes clear when you work through activities as the vocabulary is very dense and would certainly lose many younger pupils. However, the look and feel of the app is very primary-friendly which is why most of the secondary school teachers hadn’t shown an interest, assuming it was ‘not for them’. Indeed, a primary ex-colleague of mine was recently shown Swift Playgrounds and after about ten minutes, decided it would be the perfect way to teach KS2 coding, unaware of its secondary-school target audience. When you spend some time playing through and looking at it, however, you begin to realise that it is indeed best suited to KS3, particularly because of the skills it is highlighting and teaching.

So, you can see very easily just how much effort Apple have put into Swift Playgrounds and how determined they are to make it a useful classroom tool. Not only is there a wealth of content that is easy to download, there are accompanying iBooks full of Keynote presentations, information, progress charts and comparisons to the CSTA standards. Information is made as clear as possible and it is quite fun to play the games. You can explore the current playground challenge by rotating, zooming, changing angle etc. Code is presented in text boxes with lines and phrases of Swift pre-written in them, and it is still drag and drop so that pupils become familiar with the language without having to write it by themselves. This makes coding and debugging easier when they are ready to move on to independent coding. You are also able to select a different character and alter the speed at which your code is run, which adds an element of personalisation.

You can see here the interface for downloading lessons and a selection of the different types of lesson, including one for Hour of Code. There are some interesting resources that are worth exploring, as all of them are slick and well made.


The first tutorial is called ‘Learn to Code 1’ and it talks you through using the interface from the beginning.

At the start of the game, you are limited to a few commands, but as you move through you are offered more commands and, in the second section, you are shown how to create your own commands, or functions.


Now, one thing that was picked up in the training that I attended was that the US curriculum for Computer Science places more emphasis on explaining functions than on the word algorithm, which is different to the UK curriculum, where algorithm is considered a core word for coding and function is a later skill to learn. It is worth bearing this in mind as Swift Playgrounds is geared towards the US curriculum. However, this isn’t necessarily a bad thing.

Let’s take a look at the code needed for ‘Four Stash Sweep’, which is approximately halfway through Learn to Code 1, with my solution to the problem included.

My solution is certainly not the most elegant, but it does demonstrate the complexity of some of the seemingly easy tasks. I can’t imagine doing this with primary-age pupils without a lot of support – I’ve had to write three functions to make my code more efficient, as well as understanding ‘for i in range’ as a loop. Don’t get me wrong, I’m not complaining about the content, but it does make it clear that Apple are right to pitch this as a KS3 resource in spite of it looking like something for the juniors. It is definitely teaching text-based coding concepts, even if you are dragging the blocks of code into place.
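To show the shape of that kind of solution, here is a rough Python analogue – helper functions plus a counted loop (the gem-collecting world and the function names are invented for illustration; the real puzzle is, of course, in Swift):

```python
def collect_gems(row):
    """Collect (here: count) every gem in a single row of the world."""
    return sum(row)

def sweep_rows(world):
    """Sweep each row in turn - the 'for i in range' idea,
    with the repeated work factored out into a helper function."""
    total = 0
    for i in range(len(world)):
        total += collect_gems(world[i])
    return total
```

Even this toy version needs a loop plus a helper function working together, which is roughly the level of abstraction the puzzle expects.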

A bigger problem lies in its running speed. When I tested it at Apple, it worked really smoothly, everything was simple and easy to use, but of course we were using brand new iPads. In contrast, when I used it at home on an iPad mini 2, it was slow and frustrating at times. The iPad mini 2 is the minimum specification device required to use Swift Playgrounds, along with the iPad Air, meaning that some early-adopting schools are already feeling excluded unless they upgrade their iPads. Perhaps those schools should consider upgrading them, but it is upsetting when there’s no budget to do so.

So, what next?

Apple are marketing Swift Playgrounds as a way to get to grips with Swift, their open-source language which allows users to create apps and content for iOS and macOS. This is very appealing to schools and young people because, let’s face it, who doesn’t want to be the next app-store millionaire? Making learning goal-orientated makes it instantly more fun, and showing pupils that they could eventually make a real-life app will certainly inspire them to get more interested in learning to code. The fact that when you use Xcode to write Swift, you can use a playground to test your code is deliberate, drawing a link between Swift Playgrounds and the more ‘real’ Xcode environment – a clever move by Apple, albeit one that confused existing users as to which playground was which. Swift works across multiple systems, including Linux and therefore Raspbian, and I look forward to hearing about some Apple/Raspberry Pi crossovers in the future – perhaps we’ll finally see an RPi physical computing project which is controlled from an iPad!

Where does it fit?

My gut instinct is that Swift Playgrounds would be a great tool for a flipped learning environment. Pupils could work through the game in their own time and come to school armed with questions. Teachers could discuss concepts and offer their class challenges based on the skills they’ve practised at home while using the app. I think it is a great tool for KS3 programming and a lovely way to introduce pupils to the world of programming. I would worry about a whole class just sitting and plodding through it in the classroom without the teacher bothering to be involved – it would be far too tempting to just sit back and let them get on with it – which is why I think it would be better suited to independent work outside the classroom, so that the teacher could focus on discussing and developing the skills in class.

My initial concern that it was a little too restrictive, like Discovery Coding, has been dispelled and I think there is plenty of opportunity for pupils to explore and create once they have learnt the most basic skills. There are some lovely, interesting resources already available (I recommend taking a look at ‘Drawing Sounds’ in the Swift Playgrounds ‘featured’ tab which you can download and play with) and I look forward to exploring and creating my own playgrounds once I’m more confident and perhaps after I’ve worked through all of the ‘Learn to Code’ modules.





Crumble Bot by Redfern Electronics

A few months ago, I donated ‘an afternoon of robot building’ to a charity auction and so this weekend I headed out to fulfil my promise.

Last month I tried out the CamJam Edukit, which I absolutely loved, but quickly realised wasn’t necessarily the right tool for using with 9 and 11 year olds so, on the recommendation of the wonderful Nic Hughes, I bought a Crumble Bot from Redfern Electronics.


The robot is based around the Crumble controller, which is a simple programmable board with simple inputs and outputs.


Photo credit

The kit comes with an easy to follow booklet which shows very clearly how to build the robot.


We started off by fitting the Crumble to the base unit – it required a bit of force to push the controller into the base, but with the help of big brother Rupert, Annabelle was able to push it into place.


The boys handled screwing bolts through the motors and it was only after they’d screwed both pieces in that we realised that they had fitted them to the motor the wrong way around! As you can see from the diagram, it’s quite important that you put the screws in the right way around, otherwise you can’t attach the wheels later!!

Alerted to the boys’ desire to do everything backwards, we kept a careful eye out and checked carefully to ensure that Alexi had put the screw through the base the right way around.


Luckily, we caught it before any further mistakes were made.

The children worked really hard and built their robot in around forty-five minutes – we were really impressed with how easy to follow the directions were and how quickly we had a working robot.

Our next job was to work out the Crumble interface. Redfern have created a ‘Scratch-like’ interface to make programming the bot easy and we were really pleased when, after a few seconds, we had a robot moving across the floor.


We’d had some great conversations so far about circuits and the need to ground the motors, and we discussed how the controller is used to control the robot; now we were ready to get the robot to draw some shapes on the floor. Once we added a loop in, we discovered there were only two ways to stop the programme running. Either we unclipped a crocodile clip from the battery pack, which wasn’t easy unless we’d removed the slippery protection from the clip and risked a short circuit, or we pushed the stop button on the computer interface, but this only worked while the bot was plugged into the computer. Perhaps a future step for us would be to put in a button which ‘stops’ the code, as it was quite a pain to stop it running otherwise (saying that, it was pretty funny watching Sia dive across the floor and grab the robot while frantically pulling out crocodile clips).
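Our shape-drawing code boiled down to repeating ‘drive forward, then turn’ – in hypothetical Python that structure looks like the sketch below (the Crumble itself is programmed in blocks, and the command names here are invented for illustration):

```python
def square_path(drive_secs, turn_secs, sides=4):
    """Mirror of the square-drawing loop: repeat
    (drive forward, then turn right) once per side."""
    commands = []
    for _ in range(sides):
        commands.append(('motors_forward', drive_secs))
        commands.append(('turn_right', turn_secs))
    return commands
```

The block version has the same shape: a repeat block around a forward block and a turn block, with the turn time tuned until the robot manages a right angle.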

The children worked hard to figure out how to make a square on the floor and had great fun making the bot dance, but there was one more piece of kit in the box: a single neopixel ‘Sparkle’. With no instructions for getting the Sparkle working, we searched the internet to find some. Our vain hope that we’d be able to just drag in a ‘set sparkle’ block didn’t work out.

We tried setting it up in various different ways and, from looking at some diagrams in blog posts, we realised that the arrows on the Sparkle were misleading – they didn’t tell us where to clip the crocodile clips, but were there to let users know how to set up multiple Sparkles in series.


I would’ve liked there to be clearer information about how to clip in the Sparkle and how to programme it, as it took us a long time to figure out – the information is there, but it wasn’t as easy for us to find as I would’ve liked.

We eventually found some code which included the block ‘let t = 0’, so we tried dropping that block in at the beginning of our code and this seemed to do the trick – now our robot both drew a square and had a red sparkly light.

I’ve now been told by Redfern that we shouldn’t have needed the variable block to get the sparkle working so who knows what we were doing wrong!

One other issue we struggled with was that the Sparkle tended to flicker between colours for no apparent reason – to be honest, the children actually quite liked this; it wasn’t intentional and is perhaps a hardware fault that could be improved on, but nothing to worry about.

I admit to a moment of indulgence at this point – I wanted to see if I could make the Sparkle phase through different colours and so I tried the following code:

I was pretty pleased when the bot went through blues, pinks and reds as it drew its square, and I’d love to have time to further develop this code!
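The colour-phasing idea is just linear interpolation between two RGB values. As a rough Python sketch of what the blocks were doing (the function name and step scheme are my own invention):

```python
def colour_fade(start, end, steps):
    """Linear fade between two RGB colours, returning one
    (r, g, b) tuple per step, endpoints included."""
    (r0, g0, b0), (r1, g1, b1) = start, end
    return [
        (round(r0 + (r1 - r0) * i / (steps - 1)),
         round(g0 + (g1 - g0) * i / (steps - 1)),
         round(b0 + (b1 - b0) * i / (steps - 1)))
        for i in range(steps)
    ]
```

Feeding each tuple in turn to the Sparkle would step it smoothly from, say, blue through purple to red.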

Anyway, we had a super afternoon building our Crumble Bot – the children were pleased with their robot and had just as much fun coming up with code for it as they did building it in the first place. I would thoroughly recommend this for primary school teachers as the robot could easily be dismantled and rebuilt and it was great fun to do – it really emphasised the importance of physical computing to make coding relevant and fun. The Scratch-like interface for programming the robot was mostly intuitive and easy to use (although the sparkle code needs work) and the children quickly managed to achieve their goals with very little input from me. I’ve seen some wonderful projects where teachers have used this robot as a base, but have built and designed their own chassis as a DT project. It is a great way to get started with robot building in the primary school and then, once the children are more confident, they can move on to more challenging projects such as the CamJam kit.

I’m really excited to see this robot in the classroom so I hope you give it a go!

Thank you to the parents of the children involved for allowing me to use photos of their children in this post.

Hour of Code 2015 – Minecraft

Hour of Code has become a global phenomenon, but with excellent resources and celebrity support, it comes as no surprise. Several websites are now running their own Hour of Code projects, but I want to look at the ones on the official site as I’m really impressed with their offerings this year.

Earlier in the year I talked about the Frozen resources and mentioned that while it is an excellent resource, it gets quite tricky near the end. I’ve noticed that in the meantime, they have addressed the issue of calculating angles being too tricky by adding information about the necessary angles to the description for each level. However, some of the children in my school completed both of the new resources and then tried the Frozen one and all agreed that Frozen was still the hardest of the lot.

So, Minecraft is exceptionally popular amongst the children in my school and, with the approaching launch of the new Star Wars movie, that activity too proved to be a popular choice.

In the Minecraft puzzle, the children are given the choice of playing as Steve or, his female counterpart, Alex. This is a nice start as it acknowledges that children of both genders will be giving this activity a go.

As with the previous activities, you are shown a video – this time from one of the developers of Minecraft, who explains the activity ahead.


The code is based on Blockly and introduces its concepts in a slow and simple manner. Once you’ve figured out how the system works, you are introduced to concepts such as shearing sheep, cutting down trees and mining resources by using ‘destroy’ blocks in the code.


I really love that when you get to level six, you can choose a difficulty level for building the foundations of your house.


We’ve just been introduced to the repeat block, and the activity begins by giving us some basic code which we are expected to modify to complete the design. The code we are given at the start won’t build our house, but with the addition of some further loops, we can complete it.
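The pay-off of the repeat block is easy to see if you sketch the foundation-building as nested loops. This is a hypothetical Python mirror of the idea, using the ‘place cobblestone’ action from the puzzle (the function and command names are my own):

```python
def foundation_commands(width, depth):
    """Build a rectangular foundation one row at a time using
    nested loops, instead of placing every block by hand."""
    commands = []
    for z in range(depth):           # one pass per row
        for x in range(width):       # one block per column in the row
            commands.append(('place_cobblestone', x, z))
    return commands
```

Two short loops replace width × depth individual placement blocks, which is exactly the saving the activity is nudging children towards.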


In the Frozen code, we are usually given a limit on the number of blocks we can use for each activity, which encourages us to use loops effectively. So far we haven’t been given a limit for the Minecraft code, but this changes in level 7:


The limit is not enforced, and if you exceed it, you are politely reminded that you could do it more efficiently.


If you don’t finish the activity in the required number of blocks, your status bar shows a slightly lighter green colour, which means that, as a teacher, I can clearly see who has carefully completed the activity and suggest that children look again at certain bits to try and make their code more elegant.


The activities involve a number of blocks unique to this activity such as ‘place cobblestone’ or ‘shear sheep’ and I think this is useful for children to see as they can recognise that code can be altered to suit the situation.


Slightly different to the other Hours of Code: if you complete the Minecraft activity, you get a special Minecraft-themed certificate, which the children really love!


So, what are you waiting for? Give the Minecraft hour of code a go!