11.21.2010

Coding for Interactive Behavior

Here is an idea I had that could generate the behavior we have been looking for:

Light Sensors:  We program these to work the opposite of a solar tracker, meaning they face toward the darkest point they can sense.  This keeps them from simply facing the windows during the day (when the sun is brightest) and also makes them face a person, since that person's shadow reads as the darkest point when they are near a group.  Since these are the only sensors that give us direction, this will be their sole job.
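Here's a minimal sketch of what I mean, in Arduino-style C++. I'm assuming two photoresistors on either side of a cluster, a pan servo, and that a higher analogRead value means brighter; the pin names, pin numbers, and deadband value are all placeholders until we know the real wiring.

#include <Servo.h>

// Hypothetical pin assignments; swap in whatever our actual wiring uses.
const int LEFT_LDR_PIN  = A0;   // photoresistor on the left side of the cluster
const int RIGHT_LDR_PIN = A1;   // photoresistor on the right side
const int PAN_SERVO_PIN = 9;

const int DEADBAND = 20;        // ignore small differences so the servo doesn't jitter

Servo panServo;
int panAngle = 90;              // start facing straight ahead

void setup() {
  panServo.attach(PAN_SERVO_PIN);
  panServo.write(panAngle);
}

void loop() {
  int left  = analogRead(LEFT_LDR_PIN);   // assuming higher reading = brighter
  int right = analogRead(RIGHT_LDR_PIN);

  // Opposite of a solar tracker: step toward the DARKER side.
  if (left < right - DEADBAND) {
    panAngle = constrain(panAngle - 1, 0, 180);   // left side is darker, turn that way
  } else if (right < left - DEADBAND) {
    panAngle = constrain(panAngle + 1, 0, 180);   // right side is darker, turn that way
  }

  panServo.write(panAngle);
  delay(20);
}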

Sound Sensors:  I'm assuming the noise sensors can give us an amplitude measurement rather than just a yes/no output (please correct me if I'm wrong).  If that's the case, the noise level in the room can set the speed the servos move at: the louder it is, the faster they move, so as to attract more attention.
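If the sensor really does give us an analog amplitude, the mapping could look something like this. The pin number, the back-and-forth sweep, and the 2 to 40 ms step-delay range are all guesses; the point is just that louder readings shrink the pause between servo steps.

#include <Servo.h>

const int MIC_PIN   = A2;   // hypothetical: sound sensor's analog amplitude output
const int SERVO_PIN = 10;

Servo sweepServo;
int angle = 0;
int dir = 1;

void setup() {
  sweepServo.attach(SERVO_PIN);
}

void loop() {
  int level = analogRead(MIC_PIN);                // 0-1023, assuming louder = higher

  // Louder room -> shorter pause between steps -> faster motion.
  int stepDelay = map(level, 0, 1023, 40, 2);     // milliseconds per 1-degree step
  stepDelay = constrain(stepDelay, 2, 40);

  angle += dir;
  if (angle >= 180 || angle <= 0) dir = -dir;     // sweep back and forth
  sweepServo.write(angle);

  delay(stepDelay);
}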

Distance Sensors:  If the light sensors can point each tracker toward the nearest person, then the distance sensor should be able to read how far away that person is, and this could control the extension length of each of the faces.  The farther away a person is, the farther they extend out, and as the person approaches they retreat, which gives the shy behavior.
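A rough sketch of that mapping, assuming an analog range finder (something like a Sharp IR sensor, where closer objects read higher) driving a single extension servo. The pins and the extended/retracted angles are placeholders.

#include <Servo.h>

const int RANGE_PIN        = A3;   // hypothetical: analog distance sensor
const int EXTEND_SERVO_PIN = 11;   // servo that extends/retracts the face

Servo extendServo;

void setup() {
  extendServo.attach(EXTEND_SERVO_PIN);
}

void loop() {
  int raw = analogRead(RANGE_PIN);             // assuming closer objects give HIGHER readings

  // Shy behavior: far away -> fully extended, up close -> retracted.
  int extension = map(raw, 0, 1023, 180, 0);   // 180 = extended, 0 = retracted (placeholder angles)
  extension = constrain(extension, 0, 180);

  extendServo.write(extension);
  delay(50);
}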

IR Sensors:  These could control a couple of things (a rough sketch of both is below).
1. They could act as a simple on/off switch so that we can preserve battery power when no one is around.
2. If each cluster has a designated LED color, then whichever cluster is being played with sends its color information to the other clusters and they all match it.  This would be a simple way to show off the wireless feature.  I'm not sure what would happen when multiple clusters are activated at once, though.
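Here's a rough sketch of both ideas together. I'm using Serial as a stand-in for whatever wireless module we end up with, and treating the IR sensor as a simple digital presence signal; the pins, the 3-byte color message, and MY_COLOR are all made up for illustration.

// Hypothetical pins and values; the real radio link would replace Serial below.
const int IR_PIN    = 2;     // IR presence sensor, assumed HIGH when someone is detected
const int RED_PIN   = 3;
const int GREEN_PIN = 5;
const int BLUE_PIN  = 6;

const byte MY_COLOR[3] = {255, 0, 0};   // this cluster's designated color (red here)

void showColor(byte r, byte g, byte b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void setup() {
  pinMode(IR_PIN, INPUT);
  Serial.begin(9600);          // stand-in for the wireless module
  showColor(0, 0, 0);          // start dark to save power
}

void loop() {
  if (digitalRead(IR_PIN) == HIGH) {
    // 1. Someone is here: wake up and show our own color.
    showColor(MY_COLOR[0], MY_COLOR[1], MY_COLOR[2]);

    // 2. Tell the other clusters which color to match.
    Serial.write(MY_COLOR, 3);
  } else if (Serial.available() >= 3) {
    // No one here, but another cluster is being played with: match its color.
    byte r = Serial.read();
    byte g = Serial.read();
    byte b = Serial.read();
    showColor(r, g, b);
  } else {
    showColor(0, 0, 0);        // idle and dark to preserve battery
  }

  delay(100);
}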
