resonate.io

March 28th, 2012 alex

Firstly, big big big shouts to everyone who made Resonate awesome. Two days of mashing Art and Culture with Technology featuring some of the greatest artists, coders, designers and musicians around.

 

 

 

Karsten Schmidt
http://www.postspectacular.com/

Spent the whole flight home (from Serbia) in the same cabin as one of the gods of Processing, and if you ask my girlfriend, she’ll tell you I was acting like a 13-year-old girl sat behind Justin Bieber. I don’t care, the guy is a legend. Valued at around $450,000, toxiclibs is a go-to for coders, architects and designers wanting to get straight into utilising algorithms, colour theory, voxels etc. in their work. I spent a bit of time at a workshop he was running on utilising the various tools that he has written for me, for you, for ‘free’! Karsten is also a huge believer in open-source software and spoke on a panel alongside Niklas Roy, Jürg Lehni and Jonathan Puckey. Great topics were covered, with a little heat arising over what counts as using open source as intended versus chiefing other people’s hard work.

 

Niklas Roy
http://www.niklasroy.com/

The guy is nuts! He creates some of the craziest mechatronic installations ever. One device on show at Resonate was his ‘Electronic Instant Camera’, which is, essentially, a camera with an on-board thermal printer. The most interesting part of the device is in fact its near-complete lack of RAM. At the heart of the device is an ATmega with a mere 1kB of RAM, which makes storing the complete image impossible. Niklas overcomes this shortfall by processing the image data and printing it line by line, creating strange-looking monochrome images.
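The trick is that only one scanline ever lives in memory at a time. A minimal sketch of the idea (all function names here are mine, not Niklas’s code):

```javascript
// With ~1kB of RAM a full frame can never be buffered, so each scanline is
// grabbed, thresholded down to 1-bit printer dots, and sent to the thermal
// printer before the next line arrives.
function thresholdLine(pixels, cutoff) {
  // One scanline of 8-bit grey values -> printer dots (true = print black)
  return pixels.map(p => p < cutoff);
}

function* printFrame(readLine, height, cutoff) {
  // readLine(y) stands in for sampling scanline y from the video signal;
  // only a single line's worth of data is held at any moment.
  for (let y = 0; y < height; y++) {
    yield thresholdLine(readLine(y), cutoff);
  }
}
```

On the real hardware the “yield” step would be a write to the printer head, which is what makes the image appear line by line.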

 

Nicholas Felton
http://www.feltron.com/

Ok, maybe slightly off topic, but who cares. Nicholas Felton is a graphic designer from the US who is best known for his ‘Feltron Reports’. Each year he produces amazingly beautiful infographics documenting his past year. Calculating everything from distance travelled to hotdogs eaten, he meticulously gathers data on himself non-stop. He gave us all a little insight into how he uses iCal to document everything and, more recently, a custom-built iPhone app, as well as using SQL to query his data and Processing to help with drawing it.

More details here:

http://www.resonate.io/ 

Bob’s new brain

March 12th, 2012 alex

Brain surgery is fully underway at Digit as we begin porting Bob’s old ATmega-based brain to a sparkly new OMAP brain.

Those of you who remember Bob will know him as a desk lamp, reactive to human interaction, extending when he’s happy and slouching when he’s sad. Now he’s come off his shelf in the back room and lies naked and exposed, waiting for his brain capacity to be increased severalfold. One of the main reasons for porting Bob from his Arduino prototype board is to reduce the surrounding technology required to drive him and to give him additional functionality. The actual firmware on Bob merely provides a serial link and wrapper interface, allowing externally hosted applications to shunt data to him over a COM port. These applications have been based around Flash and Processing in the past; however, with the increasing availability of small single-board computer (SBC) devices, it made sense to wrap up all this functionality on board Bob himself, decoupling him from his COM-port crutch. It also gives us a lot more possibilities for what we can do with Bob, not only in terms of communication, but functionality as well.

* Not my hand…

For Bob’s brain, we’ve gone for a Beaglebone. Now armed to the teeth with 700MHz of processing power and 256MB of RAM, he is capable of performing as well as a low-end desktop PC. On board Bob (the Beaglebone) is a nicely integrated Node.JS distribution, which allows for integration with the system’s on-board GPIO pins, timers, UART, I2C and so on… and his USB COM port ;) I am a massive fan of Node.JS, and with a quick copy and paste, a legacy socket server I had previously written was up, listening and setting GPIOs high and low. Some of the linguistic recognition software that was previously written for Bob can easily be ported to JS to run on top of the server, allowing him to make his own requests to the interwebz/IRC and allowing new integrations with the classics… Twitter, FB, 4chan?… etc… to be added.

One idea we’ve had for Bob is integrating a C library that uses OpenCV and image-processing techniques to capture human facial expressions from a webcam (http://sourceforge.net/projects/mptbox/), allowing him to detect a user’s emotion and communicate in a more natural way.

 

CityPeaks – ‘Chariots of Fire’ addon

February 28th, 2012 alex

A new addition to our CityPeaks fitness project comes in the form of a Spotify app built for our shared studio music station. It brings with it… winners’ entrance music! This new feature allows users to specify a song (Spotify URI) that will be played when they make a final check-in to win a building, allowing them to walk to their desk as their fanfare plays, letting everyone know that YOU WON! The track can be completely customised in the profile section of the website (it defaults to Vangelis – Chariots of Fire). It was built while I was playing with the new Spotify app API (which is proving to be very awesome and powerful to use!) and gives users an extra level of encouragement to join in the game and get their song heard. (Can’t wait to walk in to the Karate Kid theme.)

The app itself is currently in testing, but uses a simple method of polling an API using XHR. When a user conquers a building, the API updates the polled data in a way that lets the event be picked up only once (so we don’t play the song every x seconds).
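The play-once logic boils down to remembering the last conquest the app has seen. A rough sketch (field and function names are my assumptions; in the real app, `playTrack` would call into the Spotify Apps API):

```javascript
// Poll handler: the API response carries a conquestId that only changes when
// someone wins a building, so each win triggers exactly one playback.
function makeConquestWatcher(playTrack) {
  let lastSeenId = null;
  return function onPoll(response) {
    if (response && response.conquestId && response.conquestId !== lastSeenId) {
      lastSeenId = response.conquestId;
      playTrack(response.trackUri);
      return true;  // fanfare started
    }
    return false;   // nothing new, stay quiet
  };
}
```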

 

Find out more here:

http://blog.digitlondon.com/2012/archiveposts/city-peaks
http://www.citypeaks.co.uk/ 

CityPeaks

February 2nd, 2012 alex

CityPeaks is finally live!

The main idea behind the project was to get Digit moving in the new year. By simply taking the stairs up to the office, Digit folk compete in a company-wide race to climb the stairs of actual buildings around London. Users can also track their progress on a more personal level, plotting the distance they have climbed against mountains across the world. “Scaled K2 at work today, like a boss”.

The project comprises two custom-built check-in stations, an API and a pretty website to display it all on. Each station is built around an ATmega328 (the initial prototypes were built on Arduino) and uses RFID to read Oyster cards. All communication with the API is done over HTTP, using an Ethernet module for physical connectivity. Users simply check in and out; the system records their time in, their time out and the ID of the station, which enables it to determine a direction of travel. The stations also provide feedback to the user through a 16×2-character LCD screen. Users can earn various badges and trophies, e.g. for climbing the stairs more than three times in a day, and these results are fed back to the user on completion of a journey. The feedback mechanism also allows users to play without fully registering themselves on the system. When a new Oyster card ID is read, a new user is created and assigned a token. This token is a combination of two words that can be entered into the website to fully register, including setting a screen name, password and various other details.
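The direction-of-travel part is simply a matter of comparing the station IDs on a pair of check-in/check-out events. A sketch of the idea (the station IDs and record shape are assumptions, not the production values):

```javascript
// Two stations: one at the bottom of the stairs, one at the office door.
const STATIONS = { downstairs: 1, upstairs: 2 };

function journeyDirection(checkIn, checkOut) {
  // checkIn/checkOut: { station, time } records for one Oyster card
  if (checkIn.station === STATIONS.downstairs && checkOut.station === STATIONS.upstairs)
    return 'up';     // climbed the stairs: counts towards the building
  if (checkIn.station === STATIONS.upstairs && checkOut.station === STATIONS.downstairs)
    return 'down';
  return 'unknown';  // same station twice, e.g. an abandoned journey
}
```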

The backend is LAMP and is responsible for handling all of the data processing and communication that happens in the game; it also powers the main frontend of the website. The homepage shows each user’s flag positioned against the current building. The height depicts how far up, in real distances, they have climbed the building. As they check in and check out, climbing the actual stairs to the office, their distance is recorded and added to their in-game climbing distance. When a user reaches the top, their flag is positioned at the top of the building to mark that they have conquered it. Each user’s distance is then reset and the race starts again on the next building. On their profile page, users can see the badges and trophies they have earned (badges can be collected by anyone, whereas trophies are unique, meaning only one person in the game can hold each one) as well as view their distance against some of the tallest mountains in the world!
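The flag-placement maths is essentially a capped percentage. A tiny sketch, with the formula assumed from the description above rather than taken from the real backend:

```javascript
// Map a user's climbed distance onto a position up the building graphic.
function flagPosition(climbedMetres, buildingMetres) {
  const fraction = Math.min(climbedMetres / buildingMetres, 1); // cap at the top
  return {
    conquered: fraction >= 1,                 // triggers the reset + next building
    percentUp: Math.round(fraction * 100),    // used to position the flag sprite
  };
}
```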

 

We did come across a number of problems whilst developing the system. One particular problem was the downstairs check-in station. Because it couldn’t connect directly to our LAN (as the upstairs station does), we had to route all of its check-ins through a 3G router. As you can imagine, this adds quite substantially to the latency of requests/responses.

There are a number of solutions that we are currently working on implementing, one of which is to change from HTTP to a custom protocol. The problem with HTTP is that it comes with a lot of overhead. While it provided a quick and established way of communicating, there is a lot of header information in the HTTP standard that we just didn’t need, which not only takes time to transport, but to read and write as well. By using a custom server, possibly built in Node.JS, we can build a ‘proxy’ that handles communication from the stations and then forwards these requests over HTTP to the API. This final hop from custom protocol to HTTP would only be across localhost, meaning that it’s fast, and it also means that we don’t have to rewrite any backend code.

A slightly different solution would be to use a document database with a socket interface that allows executing custom procedures. One option could be CouchDB (it’s REST-based, but it could be combined with the socket-to-HTTP proxy), which provides the ability to read and write data exceptionally fast, as well as to write custom ‘shows’ that let it respond in a custom format, similar to how the API functions already. Another feature of CouchDB is its revisioning functionality. While not practical on an Arduino, on a higher-end processor such as the Beaglebone a local DB could be used for making immediate writes; this in turn would make the whole check-in process a lot faster, as we no longer need to go through the network to the server. This local DB could then be synced with the master during quiet periods. It would also help with problems transmitting over 3G, as we could simply write to our local revision during link downtime, then sync it to the server once the link is restored. This database would only be responsible for holding journey details, although switching the whole system to a document store could be an interesting experiment in itself.
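The write-locally-then-sync idea reduces to a pending queue that is flushed whenever the link is up. A minimal sketch (names and shape are mine; a real version would sit on CouchDB replication rather than a plain array):

```javascript
// Journeys are appended locally so a check-in returns immediately, then
// pushed to the master whenever the 3G link allows.
function makeJourneyStore(pushToMaster) {
  const pending = [];
  return {
    record(journey) {
      pending.push(journey); // immediate local write; station is never blocked
    },
    sync() {
      // called periodically, or when the link comes back after downtime
      while (pending.length) pushToMaster(pending.shift());
    },
    pendingCount: () => pending.length,
  };
}
```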

Hardware:

  • SM130 (MiFARE compatible – 13.56MHz)
  • Wiznet-based Ethernet shield
  • Arduino Uno
  • 16×2 LCD (driven in 4-wire mode)
Software:
  • Website/API – PHP (CodeIgniter)
  • Spotify App – HTML/JS

 

Find out more:
http://blog.digitlondon.com/

Visit the project:
http://www.citypeaks.co.uk/

A Decade of Processing

November 28th, 2011 alex

Originally created as a tool to teach the fundamentals of development with quick visual feedback, Processing has come a very long way in the last 10 years. Based on the Java language, it is now widely used and recognised professionally by artists and programmers for experimenting, prototyping and production. It is mainly known for its powerful graphics and OpenGL integration (in particular, generative graphics), but can also be used for sound manipulation, networking, hardware integration and much more. Processing projects normally build as applications/applets and, as of recently, include an option to export to JavaScript, allowing sketches to be run in a browser on an HTML5 canvas element.

Celebrating its 10th anniversary, onedotzero’s “A Decade of Processing” was a test of endurance: 50 minutes of stomach-turning, rage-inducing code pr0n. I am jealous. The collection of videos on display was incredible; featuring work from famous artists/coders/hackers, it showed just how powerful Processing can be.

Interim Camp

Interim Camp by Field

I’m a big fan of generative landscapes, and Field’s ‘Interim Camp’ video was a brilliant example of it done in an amazing way. The video is like exploring a strange alien planet, flying over fields of fluid-like moving vertices, rising and falling to create sweeping mountain ranges and deep valleys comprised of thousands of triangles, correlating to create a huge mesh of colour. To add to the obscure journey we were on, a custom soundtrack was created, producing an even more ethereal journey through the vast, brightly coloured Class-Q planet. What’s even more amazing is that Field have used the technique to produce commercial work for the likes of AOL.

More info on the process and concept here.

Catalina

Catalina by Moullinex

Moullinex’s video for the song Catalina was also on show. This piece was a music video featuring Kinect data used to create amazing visuals. Using a depth point cloud received from a Kinect, they use Processing to manipulate and clean up the data and then log it, frame by frame, to a file. Having read up on how they created the video, I was a little disappointed to find out that this is where Processing stopped and Cinema 4D carried on. Obviously the level of rendering that was achieved would have been a lot more complex and time-consuming in Processing, however… I was at a Processing exhibit, I was expecting nothing but. (Maybe _some_ post editing)… Anyway, the data was then used by a custom Python script, which produced a huge network of interconnected nodes in 3D space, creating crazy, complex, wireframe-looking people playing instruments and standing under umbrellas. Some cool RGB channel separation and vector-based rain made for a visual feast of blobs and lines.

A short howto, Processing sketch and C4D Scripts are available on Moullinex’s Tumblr.

Audio and beat reaction No. 2 | Processing

Drawing API experiments with code have always been one of my favourite things to do in my spare time. As soon as I saw Diana Lange‘s Processing / audio experiment above, I was blown away. The spirographical lines are positioned and rotated using beat detection and FFT data loaded from the sound file. This is one of the prettiest sound visualisations I’ve seen since Robert Hodgin’s Magnetosphere. A real visual treat.