If you haven't already done so, make sure to check out this month's issue (April 2013) of Architectural Design (AD) titled Computation Works: The Building of Algorithmic Thought.  Edited by Xavier De Kestelier and Brady Peters, this issue focuses on emerging themes in computational design practice, showcasing built and soon-to-be-built projects and providing a state-of-the-art look at current computational design techniques.

In addition to some amazing articles written by Daniel Davis, David Rutten, Daniel Piker, Giulio Piacentino, Arthur van der Harten, Thomas Grabner and Ursula Frick, and many more... it also features an article that I co-authored with Jason K. Johnson titled Firefly: Interactive Prototypes for Architectural Design.  

In addition, make sure you also take a look at the book Prototype! edited by Julian Adenauer and Jorg Petruschat, which was published by Form+Zweck last summer (2012).  Written by leading individuals at world-renowned design labs and research centers, this book offers a unique compilation of articles centered on the topic of advanced forms of prototyping.  In my article, IDE vs. IPE: Toward an Interactive Prototyping Environment, I discuss the need to shift toward a more visually oriented Interactive Prototyping Environment (IPE), which addresses the limitations found in the existing IDE paradigm and opens up creative new opportunities for artists and designers.


I am excited to be teaching a one-day Interactive Surfaces workshop for the upcoming Facades+ Conference being held in New York City on April 11th-12th.  The event has an amazing lineup of speakers and workshops taught by some of the industry's leaders, including: Robert Aish (Autodesk), Nathan Miller (Case), Gil Akos & Ronnie Parsons (Studio Mode), Neil Meredith (Gehry Tech), and John Sargent (SOM).

The Interactive Surfaces workshop will concentrate on producing facade prototypes that are configurable, sensate, and active.  The facade of a building is the liminal surface across which information and environmental performance are frequently negotiated.  Given the dynamic context of our built environment, the facade must be capable of intelligent adaptation over time.

In this workshop, we'll explore new hardware and software prototyping techniques, focusing primarily on a wide range of sensing and actuation modalities in order to build novel interactive devices.  Using remote sensors, microcontrollers (Arduino), and actuators, we will build virtual and physical prototypes that can communicate with humans and the world around them.  Using both Grasshopper and the Firefly plug-in, you will learn how to create intelligent control strategies for interactive or responsive facades.

Click here to sign up!

The participants who sign up for this workshop will also be the first to get their hands on the new Firefly Interactive Prototyping Shield, which I have been developing. This shield provides access to a number of built-in, ready-to-use sensors and actuators, including: 3 linear sliders (potentiometers), a light sensor, a two-axis joystick, 3 push buttons, a red LED, a yellow LED, a green LED, a tri-color LED, 2 servo connections, and a high-voltage MOSFET circuit capable of driving lights, valves, DC motors, etc.  Each participant will not only walk away with a kick-ass new hardware kit, but also valuable knowledge of how to create new types of interactive prototypes!




Like many people, I've been anxiously awaiting the official release of the Microsoft SDK for the Kinect.  Now that it's officially out, I've spent some time over the last two weeks working on a set of Kinect-related components that I hope to include in the next release of Firefly (1.007).  The first component I tried to implement was the Skeleton Tracker... and I have to admit that the results are quite promising.  It's surprisingly fast, and as long as you stay within the specified range of the sensor, the results are quite good.  Using this component, I put together two very quick demo videos.

There has been a big push over the last decade to develop novel 3D technology for multimedia displays (whether it's new approaches to stereoscopic projection, refractive lenses, etc.).  One of the most successful and inventive implementations (in my opinion) was Johnny Chung Lee's reverse engineering of the Wii sensor bar.  Another recent (and equally impressive) example is this hack using the Kinect sensor and head tracking.

The video above is my first attempt to create a real-time 3D display system within Grasshopper using Firefly's new Skeleton Tracker component and some simple camera manipulation. The Skeleton Tracker component outputs a list of points (click here for further explanation).  From there, I simply use the Horster Camera Control component (another 3rd-party plug-in for Grasshopper) to position the camera at the viewer's head and the camera target at the point in space where the Kinect sensor is located.  It really is that easy.  Turn on some real-time shadows and you've got a real-time 3D display.  It still needs some tweaking, but it's pretty fun to play with. 
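For those curious about the underlying logic, here's a minimal sketch of the head-coupled camera idea in plain Python (outside of Grasshopper). The function name, the sample coordinates, and the assumption that the Kinect sits at the world origin are all my own illustrative choices; in the actual definition this work is done by the Skeleton Tracker and Horster Camera Control components.

```python
import math

def head_coupled_camera(head, kinect=(0.0, 0.0, 0.0)):
    """Return (camera_position, camera_target, view_distance) for a
    head-coupled display: the camera sits at the tracked head position
    and aims back at the physical location of the Kinect sensor."""
    dx, dy, dz = (k - h for k, h in zip(kinect, head))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return head, kinect, distance

# Example: viewer's head 1.5 m in front of the sensor, 0.3 m to the side.
position, target, distance = head_coupled_camera((0.3, 0.0, 1.5))
```

As the tracked head point updates each frame, re-running this mapping (or, in Grasshopper, re-feeding the point into the camera component) is what produces the parallax effect of a 3D display.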

This next demo shows how easy it is to turn gestural movements into physical actuation using an Arduino.  The setup is very simple.  The z-value of my right hand (basically the height of my hand) controls the brightness (via Pulse Width Modulation, or PWM) of the LED.  My left hand controls the servo.  When my hand is by my side, the servo goes to position 0, and if I raise my hand above my head, the servo moves to position 180.  So simple.  Of course, this could be expanded to control all sorts of things... perhaps that is next.
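The mapping behind both controls is just a linear remap of hand height onto an output range. Here's a hedged sketch of that idea in Python; the sensing range (0.5 m to 2.0 m), the hand-height values, and the variable names are my own illustrative assumptions, not measurements from the demo.

```python
def remap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap value from [in_lo, in_hi] to [out_lo, out_hi],
    clamping anything outside the input range."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# Hypothetical hand heights in meters (hand by the side ~0.5, overhead ~2.0).
right_hand_z = 1.25   # right hand -> LED brightness (PWM duty, 0-255)
left_hand_z = 2.0     # left hand  -> servo angle (0 at the side, 180 overhead)

brightness = int(remap(right_hand_z, 0.5, 2.0, 0, 255))
angle = int(remap(left_hand_z, 0.5, 2.0, 0, 180))
```

In the actual setup, Firefly streams the skeleton points into Grasshopper and sends the remapped values over serial to the Arduino, which writes them out as a PWM duty cycle and a servo position.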