news and updates
New videos of the clusters have been posted on Vimeo; they're worth watching!
SmartGeometry 2011 is officially over! The Interacting with the City workshop cluster has generated a series of multidimensional Tangible Table prototypes as well as an interactive augmented environment using Kinect and online data sources (Google Maps, Yahoo, Twitter, real-time weather data) for collaborative design and modeling.
The cluster, led by Przemek Jaworski, Flora Salim, and Martin Kaftan, was joined by brilliant participants from academia and practice: David Gillespie, Davide Madeddu, Eva Friedrich, Jacob Østergaard, Jakob Bak, Joao Albuquerque, John Fihn, Jose Luis Perez Galaso, Rafael Roa, Rafael Urquiza, Raul Kalvo, Stefan DiLeo, and Suhee Oh, who spent four intensive days developing these prototypes from scratch and exhibiting them at the SmartGeometry 2011 conference, as shown in this video:
Hands on Ofelia Beach uses real-time wind and weather data to generate and visualise a wind-flow simulation on the map of Ofelia Beach shown on the table. Architects and designers can use 3D building blocks to model new constructions on the table and experience the impact of the wind flow around the new buildings on the site. Kinect is used to scan the 3D building blocks.
Project 1: Hands on Ofelia Beach
iUbi uses Kinect to scan a 3D freeform model on the table and sends the point cloud for analysis and normalisation in Processing and GC (GenerativeComponents). Human and thermal comfort analysis is projected back onto the model, and the digital model reverse-engineered from the physical one is geotagged and sent periodically to an iPhone server, so the model can be viewed in augmented reality on an iPhone.
Project 2: iUbi
Social.Construct (agents of mass construction) is an agent simulation system based on real-world data from Google Maps and Twitter, simulating agents' movement across the city driven by attractors (hot spots) and visibility analysis. Over time the agents build a public structure over the city space that reflects the most traversed paths.
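As a rough sketch of that attractor idea (a hypothetical illustration in plain Java, not the cluster's actual code), each agent can step toward a pull vector accumulated from the hot spots, weighted by inverse-square distance:

```java
import java.util.List;

public class AttractorAgents {

    // One simulation step: pull the agent toward the hot spots,
    // each attractor weighted by 1/d^2, then move a fixed distance.
    static double[] step(double[] pos, List<double[]> attractors, double speed) {
        double vx = 0, vy = 0;
        for (double[] a : attractors) {
            double dx = a[0] - pos[0], dy = a[1] - pos[1];
            double d2 = dx * dx + dy * dy + 1e-9; // avoid division by zero
            vx += dx / d2;
            vy += dy / d2;
        }
        double norm = Math.sqrt(vx * vx + vy * vy);
        if (norm < 1e-12) return pos; // no net pull, stay put
        return new double[]{pos[0] + speed * vx / norm,
                            pos[1] + speed * vy / norm};
    }

    public static void main(String[] args) {
        // A single hot spot at (10, 0); the agent starts at the origin.
        List<double[]> hotspots = List.of(new double[]{10.0, 0.0});
        double[] agent = {0.0, 0.0};
        for (int i = 0; i < 5; i++) {
            agent = step(agent, hotspots, 1.0);
        }
        System.out.printf("agent at (%.1f, %.1f)%n", agent[0], agent[1]);
    }
}
```

Paths that many agents traverse would then accumulate weight, which is the kind of trace the built structure reflects; the real system adds visibility analysis on top of this.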
Project 3: Social.Construct
uPlanSim (an interactive table as a tool for planning and simulation) is a multi-touch table with an agent simulation system running on a map of Copenhagen. The agents rebuild the city and the pathways across it based on visibility: the more visible a street, the higher the buildings along its street lines. New attractor points can be dropped with a touch on the table, and multi-touch gestures allow panning and zooming of the 3D map.
Project 4: uPlanSim
There is also an augmented environment project in which a kinetic physical canopy is installed over a walkway, with Kinect scanning the space and Google Street View of Copenhagen Central Station projected on the wall. Whenever someone walks through the space and is recognised by Kinect, the software sends the user's position from Kinect to the physical installation, triggering movement in the canopy and updating the Street View at the same time.
Project 5: Augmented Environment
We will continue our collaboration, and updates on our work will be posted here and on the Interacting with the City workshop cluster website.
Photos of the tangible-table and augmented-environment exhibits from the Conference Day can be found here.
Another day in the cluster proved to be a bit challenging, as calibrating most of the tangible tables and Kinect sensors took longer than expected. By late afternoon, though, we witnessed some small breakthroughs, thanks to everybody's hard work and help from skilled people in the community.
More information, photos and videos, as usual, here:
Day one completed! We managed to achieve the day's main goal: writing/assembling simple program skeletons for object scanning and calibrating the feedback visualisation. Tomorrow, goal no. 2: generating a simple 3D structure or object (which will be sent to the 3D printer)!
More photos and information here.
The latest developments have shown that it's quite trivial to use the Google Maps API in Processing with static maps. All it takes is a single command that loads an image whose name is a specially prepared link. The text of the link is in fact a kind of code containing longitude, latitude, zoom level and even the colours of different sub-layers!

The simplest example is:

and a more advanced one:
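The original snippets aren't preserved in this copy of the post, but as a rough illustration (parameter names follow the public Static Maps API of the time; the coordinates and style values below are placeholders), the specially prepared link could be built like this:

```java
// Illustrative sketch only, not the cluster's original code. It builds
// 2011-era Google Static Maps URLs; in a Processing sketch the resulting
// string would be passed straight to loadImage(url, "png").
public class StaticMapUrl {

    // Simplest case: just a centre point, zoom level and image size.
    static String simple(double lat, double lon, int zoom, int w, int h) {
        return "http://maps.google.com/maps/api/staticmap"
             + "?center=" + lat + "," + lon
             + "&zoom=" + zoom
             + "&size=" + w + "x" + h
             + "&sensor=false";
    }

    // More advanced: the same request with a style rule appended,
    // recolouring one of the map's sub-layers (roads, water, ...).
    static String styled(double lat, double lon, int zoom, int w, int h,
                         String feature, String hue) {
        return simple(lat, lon, zoom, w, h)
             + "&style=feature:" + feature + "|hue:" + hue;
    }

    public static void main(String[] args) {
        // Coordinates are placeholders (roughly central Copenhagen).
        System.out.println(simple(55.6761, 12.5683, 13, 640, 480));
        System.out.println(styled(55.6761, 12.5683, 13, 640, 480,
                                  "road", "0xff0000"));
    }
}
```

In Processing itself the whole thing collapses to one line, e.g. `PImage map = loadImage(url, "png");`, which is exactly the "command that loads an image" mentioned above.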
We're planning to use Google Maps as our 'base layer', or 'canvas' for simulations of different kinds + displaying geo-tagged data on top of it.
The workshop will consist of 3 sub-groups:
- Environmental / climatic responsive design and 3D modelling
- Designing adaptive public spaces informed by human movements and social interactions
- Using environmental data in an Augmented Reality collaborative environment
6 tangible tables will be used, plus physical and augmented models.
Hello, and welcome to Interacting with Cities cluster!
In this news bulletin we will keep you informed about developments and preparations for the IwC cluster, part of SmartGeometry 2011 in Copenhagen.
Most information is kept on the http://ubimash.com/InteractingWithCities/ website; however, we will also use the main SG website to share as much information as possible!
Currently the website contains a list of all participants, a blog featuring recent research and preparations, links to inspirations on the web, and some videos. Pretty soon we'll add the full workshop schedule, and we're waiting for your input for your personal home pages!
Currently we're researching the possibility of using the Google Maps API via Processing to inform pedestrian and traffic simulations. More info soon!
Post some videos also.
Since we are talking about harnessing Google Maps, check out my very-simple-to-use map library (http://googlemapper.pt.vu); I think it fits like a glove here.
I think I might have to come hang out in your group and "play" a bit tomorrow!