What is DesignX?
At ICFF you can get in the digital fabrication game through DesignX, a series of hands-on workshops bringing together leading experts from digital design and fabrication. These workshops—on the topics of Digital Tools, Fab On-Demand, and Cloud-Based Apps—will show you how to harness cutting-edge technology to bring your ideas to reality.
Who should attend?
DesignX is geared toward innovative designers, manufacturers, representatives and anyone who is interested in learning about the digital tools and technology trends that are revolutionizing design today. Join in for an inspiring four days of educational programming and networking events at ICFF 2013.
About this workshop
I'm excited to be teaching two sessions on Sunday, May 19th as part of this year's DesignX event. The first is an Introduction to Physical Computing with Arduino, where we'll learn the fundamentals of physical computing, including basic circuitry, different types of input sources, and output options, and how to integrate all three to create a working prototype.
The second session is an Introduction to Responsive Design using Firefly. In this session we'll learn new Interactive Prototyping techniques using the Firefly plugin for Grasshopper which facilitates real-time communication between the digital and physical worlds – enabling the possibility for you to explore virtual and physical prototypes with unprecedented fluidity.
NEW YORK, NY | NEW YORK CITY COLLEGE OF TECHNOLOGY | INTERACTIVE SURFACES WORKSHOP | APRIL 12, 2013
I am excited to be teaching a one-day Interactive Surfaces workshop for the upcoming Facades+ Conference being held in New York City on April 11th-12th. The event has an amazing lineup of speakers and workshops taught by some of the industry's leaders, including Robert Aish (Autodesk), Nathan Miller (CASE), Skylar Tibbits (MIT + TED Fellow), Neil Meredith (Gehry Tech), and John Sargent (SOM).
The Interactive Surfaces workshop will concentrate on producing facade prototypes that are configurable, sensate, and active. The facade of a building is the liminal surface across which information and environmental performance are frequently negotiated. Given the dynamic context of our built environment, the facade must be capable of intelligent adaptation over time.
In this workshop, we'll focus on new hardware and software prototyping techniques, exploring a wide range of sensing and actuation modalities in order to build novel interactive devices. Using remote sensors, microcontrollers (Arduino), and actuators, we will build virtual and physical prototypes that can communicate with humans and the world around them. Using both Grasshopper and the Firefly plug-in, you will learn how to create intelligent control strategies for interactive or responsive facades.
The participants who sign up for this workshop will also be the first to get their hands on the new Firefly Interactive Prototyping Shield which I have been developing. This shield provides access to a number of built-in, ready-to-use sensors and actuators including: three linear sliders (potentiometers), a light sensor, a two-axis joystick, three push buttons, a red LED, a yellow LED, a green LED, a tri-color LED, two servo connections, and a high-voltage MOSFET circuit capable of driving lights, valves, DC motors, etc. Each participant will walk away not only with a kick-ass new hardware kit, but with valuable knowledge of how to create new types of interactive prototypes!
I'm excited to release more information about a paper I wrote titled A Five-Axis Robotic Motion Controller for Designers, which was selected for publication and presentation at this year's ACADIA 2011 conference, Integration Through Computation, held in Banff, Canada from October 13th-16th. Click here to download the full paper.
This project aims to bring physical input and output closer together through the design of purpose-built tools for fabrication, which will hopefully lead to many new creative opportunities for designers. Working from observations about the way architects design, this project explores the development of a novel 3D drawing tool: a customized 5-axis digitizing arm that takes real-time input and translates movement patterns directly into machine code for robotic fabrication. An improved workflow for robotic simulation was also developed as part of this project, using design tools that are already familiar to architects and designers, such as Rhino and Grasshopper. The purpose of this project was not to suggest that this new workflow is a ready-made solution to replace the existing robotic fabrication process; rather, I hope that this work is seen as a proof of concept that could enable wider use of digital fabrication tools by architects and designers.
The existing design-to-fabrication workflow for industrial robots (seen above) has traditionally been a slow and cumbersome process, especially for designers. Machine tooling, kinematic simulation, and robotic movement programming often require intimate knowledge of scripting and manufacturing processes, all of which limits the use of such tools by the 'typical' architect/designer.
In the traditional robotic fabrication workflow, there is often a discrepancy between the original design intent and the final output, primarily because there is an intermediate step where the designer has to hand off a digital model to a fabrication consultant who has more intimate knowledge of the specific robotic CAM software and the fabrication process in general. Typically, this consultant will use programs such as RobotStudio or Mastercam to create the necessary tool paths for the design; however, this process can take a great deal of time. And if, during this process, modeling irregularities are found or fabrication problems arise due to reachability or collision issues, the entire process must start anew.
Conceptually, this project started very simply. I began by looking at the joint and axis configuration of the ABB IRB 140 robot, one of the six-axis robots available in the Harvard robotics lab. The design challenge, then, was to build a tangible controller around these constraints. Because it shares the same joint and axis configuration, the digitizing arm has a one-to-one relationship with the larger industrial robot, making it very intuitive: a user immediately grasps the idea that when they move the digitizing arm in a certain way, the robot will respond in kind.
Outside of the development of a new robotic workflow, one of the primary goals of the project was to minimize costs. Given that all of the parts for this project were paid for out of pocket (a student's pocket), creating a low-cost solution was of utmost importance. But beyond my own personal economic restrictions, I wanted this project to be seen as a do-it-yourself solution: something that could be built in any garage or on any workbench using easily purchased hardware parts and sensors and a few custom fabricated pieces. The entire controller, shown here, was built for less than $200. The aluminum body was waterjet cut, and all of the hardware could be purchased from local hardware stores or online retailers. All of the sensors, including the five high-precision potentiometers (shown here as the small blue knobs sticking off of the aluminum body) and the two digital sensors on the tool tip, were also purchased from online retailers and were chosen for their affordability.
To create a real-time robotic simulation, data from each of the embedded sensors on the tangible controller are streamed into the computer using a plug-in for Grasshopper that I have also been developing called Firefly. Among other things, Firefly enables the use of real-world data, acquired from various types of sensors or other input devices, to explicitly define parametric relationships within a Grasshopper model. In this project, sensor information is used to create a forward kinematic robotic simulation. Forward kinematics describes one type of solution for determining robotic positioning: given the relative angle of each joint and the length of each leg, the position of the tool tip (also known as the end effector) can be found by performing a series of matrix transformations on each body in the robotic mechanism. In this case, each potentiometer returns a 10-bit number between 0 and 1023. These particular potentiometers rotate up to 340º, so the angle of each joint can be found by simply multiplying the current sensor value by the sensor step size (340º / 1024 ≈ 0.33º per step). These angle values are used to perform a matrix transformation on each of the robotic legs, ultimately giving you the precise position of the tool center point. And once you know the location of the end effector, you can record this data over time to create real-time robotic tool paths.
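To make the arithmetic above concrete, here is a minimal sketch of the path from raw potentiometer readings to an end-effector position. It is a simplified stand-in, not the actual Firefly/Grasshopper implementation: every joint here rotates about the same axis (a planar chain), whereas the real arm mixes joint axes, and the link model is invented for illustration.

```python
import math

ADC_MAX = 1024          # 10-bit reading: values 0-1023
POT_RANGE_DEG = 340.0   # mechanical range of these potentiometers
STEP = POT_RANGE_DEG / ADC_MAX  # degrees per ADC count (~0.33)

def angle_from_adc(raw):
    """Convert a raw potentiometer reading into a joint angle in degrees."""
    return raw * STEP

def rot_z(deg):
    """4x4 homogeneous rotation about the local z axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def trans_x(length):
    """4x4 homogeneous translation along the local x axis (one leg)."""
    return [[1, 0, 0, length], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def end_effector(adc_readings, leg_lengths):
    """Chain one rotation and one translation per joint; the tool center
    point is the translation column of the accumulated transform."""
    t = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
    for raw, leg in zip(adc_readings, leg_lengths):
        t = matmul(t, rot_z(angle_from_adc(raw)))
        t = matmul(t, trans_x(leg))
    return t[0][3], t[1][3], t[2][3]
```

With all readings at zero, the chain lies flat along x, so the tool tip simply sits at the sum of the leg lengths.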
In addition to the five high-precision potentiometers, the digitizing arm is equipped with a tool tip circuit board with two push button controls. These allow the user to easily record or reset the digitized information on the fly. I also designed and built a customized circuit board (on the left) which processes all of the sensor information and sends a formatted string of information over the serial port to the virtual parametric interface.
The Grasshopper definition is relatively straightforward. The Firefly Read component parses the sensor information being sent directly from the microcontroller circuit board. A custom component written in VB.NET (seen in item number 2) creates the necessary tool data. The data from both of these components is fed into another custom component which calculates the forward kinematic solution and outputs the position of each leg, creating a real-time preview of the robot moving in the Rhino viewport. In addition, the robotic simulator also returns all of the RAPID code (the robot programming language used by all of the ABB robots) needed to move the actual robot in the same manner as the forward kinematic preview.
The custom robotic simulation component written inside of Grasshopper outputs all of the necessary RAPID code to control the actual robot. There are two methods by which this can be done. First, all of the digitizing information can be recorded and formatted into a composite data type called a robtarget. Each robtarget is defined by its name, its absolute position as XYZ coordinates, the rotation and orientation of the robot as four quaternion values, and its joint configuration. Each robtarget is given a unique identifier each time the solution is recomputed. Movement commands are then created to tell the robot specifically how to get to each robtarget. Once the program has been written, it can be saved to a file on disk and uploaded to the robotic controller to be played back. An alternative method is to stream the angle information from the digitizing arm directly to the robot through a network cable. In this method, a program is uploaded to the robot which tells it to sit and wait for any information being sent directly from the Grasshopper definition (as can be seen in the video above).
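As a rough illustration of the first method, the sketch below formats recorded samples into robtarget declarations followed by MoveL commands. The exact syntax accepted by an ABB controller should be checked against the RAPID reference manual; the module name, speed, zone, and tool values here are placeholder assumptions.

```python
def format_robtarget(name, pos, quat, conf=(0, 0, 0, 0)):
    """One robtarget declaration: XYZ position, quaternion orientation,
    joint configuration, and unused external axes (9E9 by ABB convention)."""
    ext = ",".join(["9E9"] * 6)
    return ("CONST robtarget {n}:=[[{p[0]},{p[1]},{p[2]}],"
            "[{q[0]},{q[1]},{q[2]},{q[3]}],"
            "[{c[0]},{c[1]},{c[2]},{c[3]}],[{e}]];").format(
                n=name, p=pos, q=quat, c=conf, e=ext)

def write_program(targets, speed="v100", zone="fine", tool="tool0"):
    """Declare one uniquely named robtarget per recorded sample, then emit
    one MoveL command per target, wrapped in a module."""
    decls = [format_robtarget("p%d" % i, pos, quat)
             for i, (pos, quat) in enumerate(targets)]
    moves = ["    MoveL p%d,%s,%s,%s;" % (i, speed, zone, tool)
             for i in range(len(targets))]
    return "\n".join(["MODULE DigitizedPath"] + decls +
                     ["  PROC main()"] + moves + ["  ENDPROC", "ENDMODULE"])

# two digitized samples: (position, quaternion)
program = write_program([((800, 0, 600), (1, 0, 0, 0)),
                         ((800, 100, 600), (1, 0, 0, 0))])
```

The resulting text file is what would be saved to disk and uploaded to the robot controller for playback.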
As of today, there have only been a limited number of test runs using the five-axis robotic controller; however, the initial tests suggest that the proposed direct-to-fabrication process could prove to be a viable alternative to existing robotic workflows. One of the first tests I tried was attaching a custom-designed pen tool to the robot to see if the drawing movements of the digitizing arm would match those of the robot. And while spelling your name isn't the most exciting demo, it did show some of the potential of this process. Because virtually any end effector can be attached to the end of the robot, the design opportunities are endless. And because the tangible controller has a one-to-one relationship with the larger industrial robot, designers immediately understand that their drawing motions will be converted directly into robotic movements, creating a very intuitive interface.
Although there has been considerable progress in the digital tools used to control robots, I'd like to close by reiterating that there is an identifiable problem in the existing design-to-fabrication process, and by proposing an improved workflow for robotic fabrication. My hope for this project is that the physical articulation of embodied input and output through purpose-built tools for fabrication can allow for wider adoption by, and new creative opportunities for, architects and designers. In turn, I hope this will help re-establish the relationship between designers and the physical fabrication process.
I would like to thank Harvard professors Martin Bechthold and Panagiotis Michalatos as well as Neil Gershenfeld from MIT's Center for Bits and Atoms for their support during the development of this project.
I am very happy to announce that my full paper titled A Five-Axis Robotic Motion Controller for Designers has been accepted for presentation and publication in the conference proceedings of the ACADIA 2011 conference, to be held at the Banff Centre in Banff, Canada from Oct. 11th-16th, 2011. You can find out more about the entire lineup of speakers on the ACADIA website. I'll also be releasing more information about this project (and paper) very soon, so stay tuned.
I would also like to mention that I will be teaching a two-day workshop on physical computing (using Arduino, Grasshopper, and Firefly) as part of the ACADIA pre-conference workshop series. This fast-paced workshop will focus on hardware and software prototyping techniques. For more information, see the workshop description below.
In 1991, Mark Weiser published a paper in Scientific American titled The Computer for the 21st Century, in which he predicted that as technology advanced, becoming cheaper, smaller, and more powerful, it would begin to "recede into the background of our lives", taking a more camouflaged, lifestyle-integrated form. He called this Ubiquitous Computing (Ubicomp for short), or the age of calm technology. There have been numerous examples to support Weiser's claim, including Natalie Jeremijenko's "Live Wire" project (1995), the Ambient Orb (2002), and the Microsoft Surface table (2007), to name just a few.
In 1997, Hiroshi Ishii expanded Weiser's idea in a seminal paper titled Tangible Bits, in which he examined how architectural spaces could be transformed through the coupling of digital information (bits) with tangible objects. Where Weiser's research aimed to make the computer 'invisible' by embedding smaller and smaller computer terminals into everyday objects, Ishii looked to change the way people created and interacted with digitally augmented spaces.
Both Weiser and Ishii have had a significant impact on the development of physical computing, a term used to describe a field of research interested in the construction of physical systems that can sense and respond to their surroundings through the use of software and hardware systems. It overlaps with other forms of tangible computing (i.e. ubiquitous, wearable, invisible) and incorporates both material and computational media, employing mechanical and electronic systems.
Interest in physical computing has risen dramatically over the last fifteen years in the fields of architecture, engineering, industrial design, and art. Designers in the future will be called upon to create spaces that are computationally enhanced. Rather than simply designing traditional buildings and then adding a computational layer, it is better to conceive of this integration from the outset. A review of the literature reveals that there are no established methodologies for designing architectural spaces as smart or intelligent spatial systems. As such, it is clear that a new multidisciplinary approach is needed, bringing together research in the fields of interaction design (IxD), architectural design, product design, human-computer interaction (HCI), embedded systems, and engineering to create a holistic design strategy for more livable and productive spaces. Preparing architectural designers for these challenges demands a range of knowledge, skills, and experience well beyond the traditional domain of architectural education. This workshop in physical computing at the ACADIA 2011 conference is in line with the conference theme of Integration Through Computation.
2011.October.11 | Workshop Day 1 at University of Calgary
2011.October.12 | Workshop Day 2 at University of Calgary
All students will be required to bring their own laptops preloaded with the latest versions of Rhino, Grasshopper, and Arduino. The latest build of Firefly will be provided to all workshop participants. Trial software will also be made available.
Given the nature of the workshop, each student will be required to bring a small set of hardware components to begin their physical prototypes. There are many different packages to choose from, but the following are recommended:
Arduino Starter Pack or equal [includes the new Arduino Uno Atmega328, Protoboard, and a good selection of starter components]. 2 Standard Servo Motors similar to these: Adafruit or Hi-Tec from Servocity.
Arduino Experimentation Kit v1.0 or Sparkfun's Inventors Kit for Arduino [includes the new Arduino Uno Atmega328, Prototyping bundles, and a great selection of starter components]. 2 Standard Servo Motors similar to these: Adafruit or Hi-Tec from Servocity.
Students are encouraged to bring other components if they have them, but the packages should serve as a good starting point.
NEW YORK, NY | ARDUINO, GRASSHOPPER, & FIREFLY | SEPT 24TH-25TH, 2011
Studio Mode | modeLab is pleased to announce the next installment of the coLab workshop series: Hybrid Prototypes. As a follow-up workshop to the coLab workshop held in January 2011, Hybrid Prototypes is a two-day intensive design and prototyping workshop to be held in New York City during the weekend of September 24-25.
As architects and designers, we make things and build objects that interact with other objects, people, and networks. We're constantly seeking faster and less expensive methods to build prototypes, yet we are frequently hindered by practical and time-consuming factors that arise in the process of bringing our ideas to life. Firefly is a new paradigm for interactive hybrid prototyping, offering a comprehensive set of software tools dedicated to bridging the gap between Grasshopper (a free plug-in for Rhino) and the Arduino microcontroller. It allows near real-time data flow between the digital and physical worlds – enabling the possibility to explore virtual and physical prototypes with unprecedented fluidity.
This fast-paced workshop will focus on hardware and software prototyping techniques. Using remote sensors, microcontrollers (Arduino), and actuators, we will build virtual and physical prototypes that can communicate with humans and the world around them. Through a series of focused exercises and design tasks, each attendee will make prototypes that are configurable, sensate, and active. As part of a larger online infrastructure, modeLab, this workshop provides participants with continued support and knowledge to draw upon for future learning.
Attendance will be limited to provide each participant maximum dedicated time with instructors. All experience levels are welcome, though participants are encouraged to be familiar with the basic concepts of parametric design and the interfaces of Grasshopper and Arduino.
Registration Pricing (limited enrollment) : $550.
Workshop Location : Gansevoort Studio | Meatpacking District, Manhattan.
Workshop Hours : 10AM-6PM.
Examples of Previous Workshops.
coLab Workbook | Printed + PDF Documentation
coLab Primers | Annotated Primer GHX Files
coLab Exercises | Annotated Exercise GHX Files
modeLab Fabrication Equipment | CNC High Force Cutter
Arduino Micro-controller Hardware
Arduino Control Logic
Parametric Design Logics
Sensors + Actuators
All students will be required to bring their own laptops preloaded with the latest versions of Rhino, Grasshopper, and Arduino. The latest build of Firefly will be provided to all workshop participants.
Trial software will also be made available.
Given the nature of the workshop, each student will be required to bring a small set of hardware components to begin their physical prototypes. There are many different packages to choose from, but we recommend the following:
Arduino Starter Pack or equal [includes the new Arduino Uno Atmega328, Protoboard, and a good selection of starter components]. 2 Standard Servo Motors similar to these: Adafruit or Hi-Tec from Servocity.
Arduino Experimentation Kit v1.0 or equal [includes the new Arduino Uno Atmega328, Prototyping bundles, and a great selection of starter components]. 2 Standard Servo Motors similar to these: Adafruit or Hi-Tec from Servocity.
Students are encouraged to bring other components if they have them, but the packages should serve as a good starting point.
2011.August.24 | Workshop Announced + Registration Opens.
2011.September.24 | Workshop Begins.
2011.September.25 | Workshop Concludes.
Like many people, I've been anxiously awaiting the official release of the Microsoft SDK for the Kinect. Now that it's officially out, I've spent some time over the last two weeks working on a set of Kinect-related components that I hope to include in the next release of Firefly (1.007). The first component I tried to implement was the Skeleton Tracker... and I have to admit that the results are quite promising. It's surprisingly fast, and as long as you stay within the specified range of the sensor, tracking is quite reliable. Using this component, I put together two very quick demo videos.
There has been a big push over the last decade to develop novel 3D technology for multimedia displays (whether it's new methods for stereoscopic projection, refractive lenses, etc.). One of the most successful and inventive implementations (in my opinion) was Johnny Chung Lee's reverse engineering of the Wii sensor bar. Another recent (and equally impressive) example is this hack using the Kinect sensor and head tracking.
The video above is my first attempt to create a real-time 3D display system within Grasshopper using Firefly's new Skeleton Tracker component and some simple camera manipulation. The Skeleton Tracker component outputs a list of points (click here for further explanation). From there, I simply use the Horster Camera Control component (another 3rd-party plugin for Grasshopper) to position the camera at the viewer's head and the camera target at a point in space locating the Kinect sensor. It really is that easy. Turn on some real-time shadows and you've got a real-time 3D display. It still needs some tweaking, but it's pretty fun to play with.
This next demo shows how easy it is to turn gestural movements into physical actuation using an Arduino. The setup is very simple: the z-value of my right hand (basically the height of my hand) controls the brightness (via Pulse Width Modulation, or PWM) of the LED, while my left hand controls the servo. When my hand is by my side, the servo goes to position 0; if I raise my hand above my head, the servo moves to position 180. So simple. Of course, this could be expanded to control all sorts of things... perhaps that is next.
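Both mappings above boil down to a clamped linear remap of the tracked hand height, the same idea as Arduino's map() function. A toy sketch (the z_low/z_high tracking bounds are assumptions for illustration, not values from the demo):

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from one range to another, clamped to the
    input range -- the same idea as Arduino's map(), plus clamping."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / float(in_max - in_min)
    return out_min + t * (out_max - out_min)

def hand_to_outputs(hand_z, z_low=0.8, z_high=2.0):
    """Map a tracked hand height to an LED duty cycle (0-255) and a
    servo angle (0-180). Hand at the low bound -> LED off, servo at 0;
    hand at the high bound -> full brightness, servo at 180."""
    pwm = int(round(scale(hand_z, z_low, z_high, 0, 255)))
    servo = int(round(scale(hand_z, z_low, z_high, 0, 180)))
    return pwm, servo
```

In the demo, the two channels are driven by different hands, but the remap logic is identical for both.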
I was extremely excited to announce the official release of Firefly version 1.006 earlier this week. For those who aren't familiar with Firefly, allow me to provide a short introduction. Firefly is a set of software tools dedicated to bridging the gap between Grasshopper (a free plug-in for Rhino), the Arduino microcontroller, the internet and beyond. It allows real-time data flow between the digital and physical worlds and will read/write data to/from internet feeds, remote sensors and actuators, mobile phone devices, the Kinect, and more. There are so many new components in this release (including the Arduino Code Generator, Upload to I/O Board, UDP and OSC Listeners and Transmitters, XML Search, and State Detection) that I thought it would be a good idea to put together a few videos showing some of the latest features. So without further ado...
This first video shows the potential of the new Arduino Code Generator and Upload to I/O Board components. In my opinion, one of the greatest limitations of previous versions of Firefly was that your Arduino board always had to be tethered to your computer via the USB cable. This was because Firefly communicates back and forth with Grasshopper through serial communication. However, sometimes you just want to use Grasshopper (and its visual programming interface) to prototype your design and then unplug the board from your computer to run off external power. Now, you can!
The Arduino Code Generator attempts to convert any Grasshopper definition into Arduino-compatible code (C++) on the fly. It works by detecting components that are 'upstream' from the Uno/Mega Write component. The Code Generator checks each component ID against a library of custom C++ functions, which then get added to the code if there is a match. The code can simultaneously be saved as a .pde (Arduino sketch) file to be opened in the Arduino IDE.
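To give a flavor of how a definition-to-code translation like this can work, here is a toy generator that walks 'upstream' from an output node and emits one C-style statement per component, in dependency order. The node names and emitted statements are invented for illustration; Firefly's real generator matches Grasshopper component IDs against a library of C++ functions, which is a much richer process.

```python
# statement templates keyed by a (hypothetical) component type
OPS = {"add": "{} = {} + {};", "mul": "{} = {} * {};"}

def emit(graph, node, done, lines):
    """Post-order walk: emit a node's inputs before the node itself, so
    every variable is defined before it is used."""
    if node in done:
        return
    op, inputs = graph[node]
    for i in inputs:
        if i in graph:          # names absent from the graph are input pins
            emit(graph, i, done, lines)
    lines.append("float " + OPS[op].format(node, *inputs))
    done.add(node)

def generate(graph, output):
    """Collect statements for everything upstream of the output node."""
    lines = []
    emit(graph, output, set(), lines)
    return "\n".join(lines)

# a two-component 'definition': out = (a0 + a1) * gain
g = {"sum": ("add", ("a0", "a1")), "out": ("mul", ("sum", "gain"))}
```

Calling generate(g, "out") yields the two statements in dependency order, ready to be dropped into a sketch's loop() body.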
In addition, there is also a new Upload to I/O Board component which allows you to upload any sketch (from the Code Generator or anywhere else) directly to your Arduino board from within the Grasshopper environment. A lot happens behind the scenes with this component. Essentially, it creates a dynamic makefile and calls a shell application to convert the .pde file into a .cpp (C++) file and then into .hex code (machine-readable code) to be uploaded to the microcontroller. Note: WinAVR must be installed on your machine in order to upload sketches to your board. You can download the latest version here.
There are also a lot of great network tools included in this release, including the UDP and OSC Listener and Transmitter components. OSC (Open Sound Control) messages are essentially specially formatted UDP messages, which can be particularly handy when you want to send information across a network (either wirelessly or over a LAN). OSC messages are useful because each message contains some metadata along with a value, giving you more information about what type of data the message contains. These new components open up a whole new world of possibilities by allowing you to send/receive data from smart phones (iPhone or Android) or to share documents among friends or colleagues over a network.
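For the curious, an OSC message really is just a few padded fields inside a UDP datagram: a null-padded address string, a type-tag string such as ',f', and big-endian argument bytes. A minimal encoder/decoder for a single-float message (a sketch of the wire format per the OSC 1.0 spec, not Firefly's implementation):

```python
import struct

def _pad(raw):
    """OSC strings are null-terminated, then padded to a 4-byte boundary."""
    raw += b"\x00"
    return raw + b"\x00" * (-len(raw) % 4)

def build_osc(address, value):
    """Encode an OSC message carrying one 32-bit float argument."""
    return _pad(address.encode()) + _pad(b",f") + struct.pack(">f", value)

def parse_osc(data):
    """Decode a single-float OSC message into (address, typetag, value)."""
    n = data.index(b"\x00")
    address = data[:n].decode()
    off = (n // 4 + 1) * 4              # skip address + its padding
    m = data.index(b"\x00", off)
    typetag = data[off:m].decode()
    off += ((m - off) // 4 + 1) * 4     # skip typetag + its padding
    value, = struct.unpack(">f", data[off:off + 4])
    return address, typetag, value
```

Sending one of these is then a single socket.sendto() call to the listener's IP address and UDP port.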
The video above uses the BreathOSC application (free from the iPhone App Store), developed by Thomas Edwards, to simulate wind effects in Grasshopper. Simply breathe into the microphone and an OSC message is sent to a specified IP address on a UDP port. I then use the OSC Listener to decode the message and use its value to create a wind vector that drives the Kangaroo (another 3rd-party plugin for Grasshopper) wind simulation. Daniel Piker, the developer of Kangaroo, helped set up this demo... and I have to say... it's quite fun.
Another useful networking application for smart phones is TouchOSC (available for both iPhone and Android). It supports sending and receiving Open Sound Control messages over a Wi-Fi network using the UDP protocol. You can also create your own interfaces using the TouchOSC Editor and sync them directly to your phone. In this example, I've created a simple layout to control a few LEDs, a tri-color LED, and a standard servo using the new OSC Listener in Firefly. This is just a simple test, but the sky is the limit with this type of control over mobile phone interface design.
If you are interested in learning more about Firefly, check out our website at: http://www.fireflyexperiments.com/
The website has a lot of good tutorials and examples to get you up and running in no time. As always, if you have a suggestion or want to send us a comment, you can reach us at email@example.com
This release simply would not have been possible without the tremendous support of Prof. Panagiotis Michalatos at Harvard's GSD. His guidance over the last six months strongly influenced the development of the Firefly_X toolset, and I owe him a great debt of gratitude for his assistance. Firefly is built upon the Grasshopper plug-in for Rhino, both developed by Robert McNeel and Associates. The Arduino language syntax is based on Wiring by Hernando Barragan. The Arduino environment is based on Processing by Ben Fry and Casey Reas, and is now supported by an amazing team of software and hardware developers who continue to refine and expand its capabilities.
SAN FRANCISCO, CA | GRASSHOPPER & FIREFLY | JUL 11TH-22ND, 2011
Hosted by the California College of the Arts & the Architectural Association
Sponsored By McNeel Associates
I am excited to be an invited tutor for this year's Biodynamic Structures Workshop in San Francisco, CA. Biodynamics is the study of the force and energy of dynamic processes on living organisms. Through simple mechanisms embedded within the material logic of natural systems, specific stimuli can activate a particular response. This response occurs in carnivorous plants such as the Venus fly-trap, which uses turgor pressure to trap small insects in order to feed, and worms, which by contracting differently oriented muscles, achieve movement. This ten-day intensive workshop, co-taught by the faculty of the Emergent Technologies and Design Programme at the AA and the faculty of Architecture and MEDIAlab at California College of the Arts, will explore active systems in nature, investigating biomimetic principles in order to analyze, design and fabricate prototypes that respond to electronic and environmental stimuli.
Students will work in teams to research specific biological systems, extracting logics of organization, geometry, structure and mathematics. Advanced analysis, simulation, modeling and fabrication tools will be introduced in order to apply this information to the design of both passive and active responsive architectural systems. Investigation and application of robotics, sensors and actuators will be employed for the activation of the material system investigation through the construction of working responsive prototypes.
I am proud to announce that I will be participating in this year's SmartGeometry conference in Copenhagen, Denmark. I would like to thank Daniel Piker and Robert Cervellione for their invitation to be an assistant cluster champion for the four-day workshop cluster titled 'Use the Force'. I'd also like to express my sincere appreciation to Robert McNeel and Associates for sponsoring this event.
This cluster will explore the use of Kangaroo as a form-finding tool, and linking it to real-time sensor input (through Firefly). I think this is a unique and very exciting opportunity to come together to develop, test and really push the boundaries of what is possible with these design tools. Here's a little more information about the cluster:
Through Kangaroo, the live physics plug-in for Generative Components and Grasshopper, this cluster will explore ways of using the simulated interaction of physical forces and real-time spatial inputs to develop novel form-finding processes. The physics engine will constantly react to the streaming data - when one element in the system changes, the entire system adjusts itself accordingly in order to find and maintain equilibrium. Treating geometric and construction constraints, material behaviour, and user interaction all within this common language of physical forces unifies and allows complex real-time interaction between them.
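The 'adjust until equilibrium' behavior described above can be illustrated with a deliberately tiny example: a single value pulled toward a target (which, in the cluster, would be driven by live sensor data) until the per-step movement becomes negligible. This is a toy relaxation loop, not Kangaroo's actual solver.

```python
def relax(position, target, stiffness=0.2, tol=1e-6, max_iter=10000):
    """Step a value toward a target with a spring-like pull, stopping once
    the movement per iteration drops below tol (i.e. equilibrium)."""
    for _ in range(max_iter):
        step = stiffness * (target - position)
        position += step
        if abs(step) < tol:
            break
    return position
```

Each time the streaming input changes the target, the loop runs again and the system settles to a new equilibrium, which is the essence of the real-time interaction the cluster explores.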
The deadline for applications to the workshops has been extended until this Sunday, February 6th. Read more about it and apply here.
The Biodynamic Structures workshop (hosted by the AA and CCA) is well underway, and it has been amazing to see the intensity of both the instructors and the students. I will be posting more information about the final exhibition for this event, which will be happening next Wednesday (July 22nd), but definitely make sure you take a look at the Flickr site, which is updated daily with new photos from the event; it can be found here.
I am excited to announce that I will be working as an Associated Faculty member at the Biodynamic Structures workshop being hosted by the Architectural Association and the California College of the Arts from July 12th-21st, 2010. Both institutions have assembled a truly first-class set of instructors, and it promises to be a groundbreaking event. I would like to thank Andrew Kudless and Jason Kelly Johnson for this invitation. See workshop details below.
AA Visiting School @ CCA California College of the Arts
Monday 12 to Wednesday 21 July, 2010
Biodynamics is the study of the force and energy of dynamic processes on living organisms. Through simple mechanisms embedded within the material logic of natural systems, specific stimuli can activate a particular response. This response occurs in carnivorous plants such as the Venus fly-trap, which uses turgor pressure to trap small insects in order to feed, and worms, which by contracting differently oriented muscles, achieve movement. This ten-day intensive workshop, co-taught by the faculty of the Emergent Technologies and Design Programme at the AA and the faculty of Architecture and MEDIAlab at California College of the Arts, will explore active systems in nature, investigating biomimetic principles in order to analyze, design and fabricate prototypes that respond to electronic and environmental stimuli. Students will work in teams to research specific biological systems, extracting logics of organization, geometry, structure and mathematics. Advanced analysis, simulation, modeling and fabrication tools will be introduced in order to apply this information to the design of both passive and active responsive architectural systems. Investigation and application of robotics, sensors and actuators will be employed for the activation of the material system investigation through the construction of working responsive prototypes.
+ CONTENT TAGS: Biodynamic, Parametric, Scripted, Mimetic, Responsive, Interactive, Digitally Fabricated
+ SOFTWARE: Rhino, Grasshopper, Firefly, RhinoScript, Arduino, Processing
Michael Weinstock (Academic Head, Director of Emergent Technologies Programme, AA London UK)
Christina Doumpioti, Evan Greenberg, Konstantinos Karatzas (Tutors, AA EmTech Programme, London UK)
Jason Kelly Johnson [Future Cities Lab], Andrew Kudless [Matsys] (CCA MediaLab Coordinators, SF CA)
George Jeronimidis (Director of Center for Biomimetics, University of Reading UK); Andrew Payne (LIFT Architects, Grasshopper Primer); Daniel Segraves (ASGG Adrian Smith + Gordon Gill Architecture); Ronnie Parsons + Gil Akos (Studio Mode, NY); Daniel Piker (Kangaroo Project Live Physics)
http://sanfrancisco.aaschool.ac.uk; or visit the CCA MEDIAlab website: http://mlab.cca.edu
(Workshops are non-credit. Enrollment is processed by the AA. Workshop will run the full 10 days.)
firstname.lastname@example.org or email@example.com