Experiential lighting: development and validation of perception-based lighting controls

by Matthew Aldrich

Institution: MIT
Department: Department of Architecture. Program in Media Arts and Sciences
Year: 2014
Keywords: Architecture. Program in Media Arts and Sciences.
Record ID: 2024575
Full text PDF: http://hdl.handle.net/1721.1/95866


Lighting, in its emergence as a digital and networked medium, represents an ideal platform for conducting research on both sensor- and human-derived methods of control. Notably, solid-state lighting makes possible real-time control of the intensity, spatial, and color attributes of illumination. This technology provides an excellent opportunity to conduct new experiments designed to study how we perceive, judge, and subsequently control illumination. For example, given the near-infinite variation of possible lighting attributes, how might one design an intuitive control system? Moreover, how can one reconcile the objective nature of sensor-based controls with the subjective impressions of humans? How might this approach guide the design of lighting controls, and ultimately the design of lighting itself? These questions are asked with the benefit of hindsight: the simple control schemes currently in use, built on sliders, knobs, dials, and motion sensors, fail to anticipate how people understand the controls and the effects that changes in illumination have upon us.

In this work, the problem of how humans interact with this new lighting medium is cast as a human-computer interaction problem. I describe the design and validation of a natural interface for lighting, achieved by abstracting the manifold lighting parameters into a simpler set of controls. Conceptually, this "simpler set" is predicated on the theory that we are capable of discerning the similarities and differences between lighting arrangements (scenes). I hypothesize that this natural ordering (a metric space in a latent multidimensional basis) can be quantitatively extracted and analyzed. First, in a series of controlled experiments, I show how one can derive this mapping and demonstrate, using empirical evidence, how future sensor networks will eventually emulate our subjective impressions of lighting.
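The abstract does not specify the embedding technique, but one standard way to recover a metric space from pairwise similarity judgments of scenes is classical multidimensional scaling (Torgerson scaling). The sketch below is purely illustrative; the dissimilarity matrix `D` is invented, not data from the thesis.

```python
import numpy as np

def classical_mds(dissim, n_dims=2):
    """Embed items in a low-dimensional metric space from a symmetric
    matrix of pairwise dissimilarities (classical MDS)."""
    d2 = np.asarray(dissim, dtype=float) ** 2
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    b = -0.5 * j @ d2 @ j                     # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)            # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_dims]   # keep the largest n_dims
    pos = np.clip(vals[order], 0.0, None)     # guard against tiny negatives
    return vecs[:, order] * np.sqrt(pos)

# Hypothetical averaged dissimilarity judgments among four lighting
# scenes (0 = identical, larger = more different).
D = np.array([[0.0, 1.0, 3.0, 3.2],
              [1.0, 0.0, 2.8, 3.0],
              [3.0, 2.8, 0.0, 0.9],
              [3.2, 3.0, 0.9, 0.0]])

coords = classical_mds(D, n_dims=2)
print(coords.shape)  # one 2-D coordinate per scene
```

Distances between the recovered coordinates approximately reproduce the judged dissimilarities, so scenes that observers rate as similar land close together in the latent space.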
Second, using data obtained in a user study, I quantitatively derive performance estimates for my proposed lighting user interface and statistically contrast these results with those obtained using a traditional interface composed of sliders and buttons. I demonstrate that my approach enables users to attain their illumination goals while substantially reducing task time and fatigue.
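The abstract does not name the statistical test used for the contrast, but a common choice for comparing task times between two interface conditions is Welch's unequal-variance t-test. The sketch below uses invented timing data and a hand-rolled Welch statistic, purely to illustrate the kind of comparison described.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom for two
    independent samples with possibly unequal variances."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

# Hypothetical task-completion times in seconds (not thesis data).
slider_times = [48.2, 52.1, 61.0, 55.4, 49.9, 58.3]   # traditional interface
proposed_times = [31.5, 36.2, 29.8, 34.1, 33.0, 30.7]  # proposed interface

t, df = welch_t(proposed_times, slider_times)
print(t, df)  # large negative t => proposed interface is faster
```

A large-magnitude negative t here would indicate that the proposed interface's mean task time is well below the slider interface's, which is the shape of the result the abstract reports.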