These papers document a small project I've been working on for the last few years,
completely independently of my
In short, it is a wearable, head-mounted spectrometer that can image up to 256
wavelengths simultaneously and translate that information into sound. Extended wear can
lead to a sort of artificial synesthesia and, given the differences between visual and
acoustic perception, can function as a superset of normal human color vision. The device
is currently in its third hardware prototype, and development is ongoing.
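The core idea, turning a many-channel spectrum into sound, can be sketched as simple additive synthesis: each wavelength bin drives one sine partial, with louder partials for brighter bins. This is only a hedged illustration of the general technique; the function name `sonify_spectrum`, the parameters `f_lo`/`f_hi`, and the log-frequency mapping are all assumptions, not the device's actual encoding.

```python
import math

def sonify_spectrum(intensities, sample_rate=8000, duration=0.25,
                    f_lo=200.0, f_hi=4000.0):
    """Map a spectrum (one intensity per wavelength bin) to an audio buffer.

    Bin i drives a sine partial whose frequency is log-spaced between
    f_lo and f_hi and whose amplitude is that bin's normalized intensity.
    The mapping is illustrative, not the device's real scheme.
    """
    n_bins = len(intensities)
    peak = max(intensities) or 1.0          # avoid division by zero
    n_samples = int(sample_rate * duration)
    # One log-spaced partial frequency per wavelength bin.
    freqs = [f_lo * (f_hi / f_lo) ** (i / (n_bins - 1))
             for i in range(n_bins)]
    audio = []
    for n in range(n_samples):
        t = n / sample_rate
        s = sum((a / peak) * math.sin(2 * math.pi * f * t)
                for a, f in zip(intensities, freqs) if a)
        audio.append(s / n_bins)            # crude normalization
    return audio

# Toy 256-bin spectrum with a single bright spectral line:
spectrum = [0.0] * 256
spectrum[200] = 1.0
buf = sonify_spectrum(spectrum)
```

With a single active bin the output is one pure tone; a full spectrum produces a dense timbre whose character shifts as the scene's colors change, which is what makes prolonged wear perceptually learnable.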
For more details, see the links below; the most recent are listed first.
- A full description of the system, including some personal observations of what it is
like to wear it, as either Postscript or PDF. This has appeared in the Journal on Special
Topics in Mobile Networking and Applications (MONET), as part of their Special
Issues on Wearable Computers ["Artificial Synesthesia via Sonification: A
Wearable Augmented Sensory System", Volume 4, Issue 1, Jan 1999, pp. 75-81]
(Baltzer/Kluwer Science Publishers and the ACM).
- A two-page poster, in Postscript or PDF,
from when the device was demonstrated live at the
First International Symposium on Wearable Computers (ISWC '97).
- A two-page extended abstract,
as presented, with live demo, at the
International Conference on Auditory Display (ICAD).
- A much longer description of the system, as either
Postscript or PDF, as
supplementary material to the above ICAD presentation.
Last modified: Mon Apr 5 21:38:42 EDT 2004