I've been playing with JeeNodes recently, soldering stuff together, getting wireless connections working... and then tying it into the Socket.io implementation of the real-time web. The idea is to create super-cheap controllers which can be used as interactive voting tools, RFID readers, etc... but all of which can talk to each other and change things that happen on the web.
Here's a (slightly wobbly) video demonstration I gave to a colleague in the cafe in Bolton:
The beauty of this solution is that it will work anywhere with very little setup. The controller application can be 'widgetised' and stuck in an AppStore, which means it can be instantiated in any learning context with little difficulty. The only thing that needs to be set up is the teacher's machine (which can just be a laptop plugged into the network); alternatively, a little extra configuration on the 'whiteboard machine' would do the trick.
That machine also runs Socket.io and NodeJS, and bounces messages between the internet (where the controller widget is) and the local machine. That means the teacher can control the activity of all the learners: enabling and disabling controls, sequencing activities, and so on. What I've done here is design a simple sequence of activities which the teacher coordinates by clicking the wireless JeeNode device.
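For the curious, here's the rough shape of that relay in a few lines of Node. This is a sketch only, not my actual code: the event names ('advance', 'press', 'activity') and the port are invented for illustration.

```javascript
// A sketch of the local relay: event names and port are hypothetical.
const http = require('http');
const server = http.createServer().listen(8080);
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  // The teacher's JeeNode (via the serial bridge below) emits 'advance';
  // rebroadcast it so every controller widget moves to the next activity.
  socket.on('advance', () => {
    io.emit('activity', { step: 'next' });
  });

  // Learner widgets emit 'press'; forward to everyone else, including
  // the listener that drives PD.
  socket.on('press', (data) => {
    socket.broadcast.emit('press', data);
  });
});
```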
Because it's Socket.io, it works almost everywhere, although with varying latency: in Chrome it uses a WebSocket connection, which is near-instantaneous. On my mobile phone, the latency is no more than about a second (usually less).
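On the widget side, connecting is only a couple of lines. Again a sketch: the server address and event names are invented, and the io() global comes from the socket.io client script, which negotiates the best transport available (WebSocket in Chrome, slower fallbacks elsewhere).

```javascript
// Controller widget side: server address and event names are invented.
const socket = io('http://example-server:8080');

socket.on('activity', (activity) => {
  // Enable or disable this widget's controls for the current activity.
  console.log('Now on activity:', activity);
});

document.getElementById('bell').onclick = () => {
  socket.emit('press', { button: 'bell' });
};
```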
The JeeNode device is particularly responsive because it talks directly to the local machine (laptop, whiteboard machine) running Socket.io, and that has a direct WebSocket connection to the internet machine. So things happen very quickly.
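The bridge itself is just Node reading the serial port. Here's a rough sketch, assuming the 'serialport' npm package; the device path, baud rate and 'BTN' line are invented, not necessarily what my JeeNode sketch actually sends.

```javascript
// A sketch of the JeeNode-to-Socket.io bridge; all details hypothetical.
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');
const socket = require('socket.io-client')('http://localhost:8080');

const port = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 57600 });
const parser = port.pipe(new ReadlineParser());

// Each line from the JeeNode reports an event; a button press becomes
// an 'advance' message to the local relay, which fans it out to the
// controller widgets.
parser.on('data', (line) => {
  if (line.trim() === 'BTN') {
    socket.emit('advance');
  }
});
```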
The bell sounds are generated by PD (Pure Data), which responds to a UDP message that the local Socket.io machine sends out whenever a controller signal arrives from the web controller widget.
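That hop is almost nothing in Node. A sketch, assuming the PD patch has a [netreceive] object listening for UDP (the port 3000 and the 'bell 1' message are arbitrary examples); PD's FUDI protocol expects messages terminated with a semicolon.

```javascript
// A sketch of the UDP hop to PD; port and message are arbitrary examples.
const dgram = require('dgram');
const udp = dgram.createSocket('udp4');

function ringBell() {
  // FUDI-style message, semicolon-terminated, as [netreceive] expects.
  const msg = Buffer.from('bell 1;\n');
  udp.send(msg, 0, msg.length, 3000, '127.0.0.1');
}
```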
It's all a bit 'Blu-Tack and sticky tape', but it works! The most interesting thing is the look of surprise on people's faces when they click the buttons on their phones... particularly when they do it together.
What I'm most passionate about is that they don't look at their screens. They look at each other.
Of course, one other thought occurred to me while I was putting my activity sequence together: "I really could do with some sort of XML specification to define how the activities flow into one another..." Hmmmm :-)
4 comments:
XML spec of activity flow? I'm hearing IMS LD in there... :)
Of course! I groan... I'd thought we'd seen the back of it... But we always knew there was something in it...
Taken to its dystopian extreme, the school headmaster could have this gearbox in his office, where he can remotely control all classes simultaneously. "Start reading... answer questions... 5 minutes of co-writing... now let's do a quick exam and call it a day." No more teachers needed! Yay!
The idea is that this is much more subtle and human-oriented than LD. It is for the teacher on the ground to coordinate learners: someone who knows exactly what's happening, who can make quick decisions to change the course of the lesson, and so on. So no remote Big Brother.
But better classroom coordination means a more meaningful learning experience: learners have greater freedom to coordinate their individual learning, and the teacher is better able to coordinate the group. Both are required.