Biometrically Augmented Self-Collaborating, Computer Aided Drawing Environment
This work is framed as a critique of the disembodied tools and static environments of traditional architectural modeling and design software. My research proposes augmenting these tools with data collected from the designer's body to reveal and reinterpret complex emotional and physical responses while sketching. This thesis aims to identify methods for integrating biometric measurements into computer-aided design environments in order to reveal a designer's proclivities during the design process. It exposes various subconscious behaviors that occur during architectural sketching by measuring and communicating a designer's biometric patterns in real time as they draw. In doing so, the research develops a series of inventive virtual design tools that capitalize on data mined from the body. Through various working prototypes, augmented tools were tested collaboratively to find new conceptual design methods and new ways of thinking with drawing and data.

Computer-aided design environments are generally devoid of representations of their users, and augmenting them with data that enhances the immersive or live perception of those users remains a challenge. This is especially true of collaborative design environments, where physical presence contributes to decision-making and mutual understanding. By inserting biometric data into custom software in real time, users may become more intuitively aware of themselves, their natural design intent, and their reactions to visual feedback. This thesis defines collaboration as a feedback loop between designers and their design tools. In the context of this work, the relationship between user and tool takes up questions that were previously asked only of user-user relationships in design environments and software.
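The designer-tool feedback loop described above can be sketched minimally in code. Everything here is a hypothetical illustration, not the thesis's actual software: a simulated heart-rate signal stands in for a real sensor, and the mapping to a brush parameter is an arbitrary example of how a bodily measure might steer a drawing tool.

```python
import math

def simulated_heart_rate(t):
    """Hypothetical biometric signal: a resting rate with a slow oscillation.

    A real system would read from a sensor; this stands in for illustration.
    """
    return 70.0 + 10.0 * math.sin(t / 5.0)

def stroke_weight(bpm, lo=60.0, hi=100.0):
    """Map a heart-rate reading onto a normalized brush-stroke weight.

    Higher arousal (bpm) yields a heavier stroke; the result is clamped
    to [0, 1]. The linear mapping is an assumption, not the thesis's method.
    """
    w = (bpm - lo) / (hi - lo)
    return max(0.0, min(1.0, w))

# Feedback loop: each drawing step samples the body and adjusts the tool,
# so the designer sees their own physiological state reflected in the line.
weights = [stroke_weight(simulated_heart_rate(t)) for t in range(10)]
```

In this sketch the loop closes perceptually rather than in code: the tool's output changes with the body, and the designer's reaction to that change feeds back into the next reading.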
The major questions surrounding the thesis begin with the reasoning for including biometrics as a catalyzing link between the designer and creativity within a virtual design tool. How can biometrics link the body to a virtual space in a way that removes obstructions to fluid expression? Relatedly, what capacity do biometrics inserted into a design space have for producing customized design tools for distinct users? What are effective methods by which a design tool can suggest design trajectories to a user? How are the value judgments underlying these suggestions made, and what does a drawn suggestion imply when it is made by a software tool? Directly related, how is the relationship between machine feedback and user drawing represented, and how are the two to be valued against each other? Within a digital drawing/design environment where the software itself can draw and respond, what is the correct balance of control between user and tool? What new ways of designing can be discovered with these interactions in mind?