Project Overview
This project develops a computational tool for approximating non-human vision in landscape design and analysis. While rigorous mathematical frameworks exist for human visual representation (perspective projections, oblique projections, isometric views), equivalent computational conventions do not exist for ecologically significant species that co-inhabit designed landscapes. This project proposes a standard computational method for constructing drawings that approximate the vision of non-human species whose mechanics of perception differ from our own.
The tool translates three-dimensional spatial data into drawings that approximate non-human perception, accounting for both geometric distortion and color perception that differ fundamentally from human vision. Birds, for example, possess laterally positioned eyes providing nearly 300-degree fields of view compared to human frontal binocular vision of approximately 120 degrees; tetrachromatic color vision including ultraviolet wavelengths compared to human trichromatic vision; and motion-sensitive peripheral vision adapted for predator detection. By developing rigorous computational methods for approximating these perceptual differences, this project enables evidence-based habitat assessment through design representation. Designers will be able to visualize and evaluate whether spatial configurations provide adequate perceptual conditions for target species under current and projected climate scenarios.
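The perceptual differences above can be organized as a species-specific parameter record that downstream projection and color routines consume. The sketch below is illustrative only: the field-of-view figures come from the ranges cited above, while the binocular-overlap and cone-peak values are placeholder assumptions, not measured Pine Warbler data.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class SpeciesVisionParams:
    """Illustrative container for species-specific visual parameters."""
    name: str
    horizontal_fov_deg: float      # total horizontal field of view
    binocular_overlap_deg: float   # frontal region seen by both eyes
    cone_peaks_nm: Dict[str, float]  # peak spectral sensitivity per channel

# Approximate avian parameters; cone peaks and overlap are assumed values.
generic_songbird = SpeciesVisionParams(
    name="generic songbird",
    horizontal_fov_deg=300.0,
    binocular_overlap_deg=30.0,  # assumed narrow frontal overlap
    cone_peaks_nm={"UVS": 370.0, "SWS": 445.0, "MWS": 508.0, "LWS": 565.0},
)

human = SpeciesVisionParams(
    name="human",
    horizontal_fov_deg=120.0,
    binocular_overlap_deg=114.0,
    cone_peaks_nm={"S": 420.0, "M": 534.0, "L": 564.0},
)
```

Keeping the parameters in one record makes it straightforward to swap in literature-derived values for a target species without touching the projection code.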
Method
The primary input will be point-cloud data captured using Light Detection and Ranging (LiDAR) technology via the Eagle LiDAR Scanner, a smartphone-compatible device that makes high-resolution spatial capture accessible and portable. Working with this point-cloud model in Grasshopper for Rhinoceros 3D (industry-standard computational design software), I will develop algorithms that: (1) reconstruct three-dimensional geometry from point-cloud data of landscape conditions; (2) apply species-specific visual parameters derived from established vision science research, accounting for field of view geometry, color spectrum sensitivity, visual acuity distribution, and motion-detection capabilities; (3) generate two-dimensional projections that translate perceptual differences into analyzable representations; (4) translate color palettes to approximate non-human color perception for human viewing, including wavelengths outside human visual range.
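Step (3) above can be sketched as a wide-field projection that maps 3D points onto a 2D image spanning a species-specific field of view. The minimal numpy sketch below uses an equirectangular mapping with an assumed 300-degree horizontal and 120-degree vertical field; the function name and image dimensions are illustrative, not part of the proposed plugin's API.

```python
import numpy as np

def panoramic_project(points, fov_h_deg=300.0, fov_v_deg=120.0,
                      width=720, height=240):
    """Project 3D points (N, 3), viewer at the origin looking along +Y,
    into an equirectangular image spanning the given fields of view.
    Returns pixel coordinates for points inside the FOV and a mask."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    azimuth = np.degrees(np.arctan2(x, y))        # 0 deg = straight ahead
    elevation = np.degrees(np.arctan2(z, np.hypot(x, y)))
    half_h, half_v = fov_h_deg / 2, fov_v_deg / 2
    inside = (np.abs(azimuth) <= half_h) & (np.abs(elevation) <= half_v)
    u = ((azimuth[inside] + half_h) / fov_h_deg * (width - 1)).astype(int)
    v = ((half_v - elevation[inside]) / fov_v_deg * (height - 1)).astype(int)
    return u, v, inside
```

A point directly ahead lands at the image center, while a point directly behind the viewer falls outside the 300-degree field and is masked out, reproducing the narrow rear blind spot of laterally eyed birds.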
The computational plugin, provisionally titled Vision, will integrate these workflows alongside existing tools for perspective, oblique, and spherical projections, treating non-human vision as an equally valid representational mode for landscape design analysis.
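Step (4), translating tetrachromatic color for human viewing, can be sketched as a linear false-color mapping from four avian channels to RGB. The mixing matrix below is a hypothetical illustration, not a calibrated transform: it folds the ultraviolet channel into blue and red so that UV-reflective surfaces remain distinguishable to a human viewer.

```python
import numpy as np

# Hypothetical 3x4 mixing matrix: rows are output R, G, B; columns are
# avian UVS, SWS, MWS, LWS channel responses. UV is folded into a
# magenta-leaning mix so ultraviolet reflectance stays visible to humans.
FALSE_COLOR_MATRIX = np.array([
    [0.3, 0.0, 0.0, 0.7],  # R: mostly long-wave, a little UV
    [0.0, 0.0, 1.0, 0.0],  # G: medium-wave
    [0.5, 0.5, 0.0, 0.0],  # B: UV plus short-wave
])

def tetra_to_rgb(channels):
    """Map (N, 4) avian channel responses in [0, 1] to (N, 3) RGB."""
    rgb = channels @ FALSE_COLOR_MATRIX.T
    return np.clip(rgb, 0.0, 1.0)
```

Under this mapping a pure-UV stimulus renders as a dim magenta, a deliberate design choice so that information outside the human visible range is not silently discarded.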
Site and Test Case: Pine Warbler at Harvard Forest
Harvard Forest near Petersham, Massachusetts provides ideal conditions for tool development and validation. As an established Long-term Ecological Research site and National Ecological Observatory Network site with extensive carbon monitoring programs, Harvard Forest offers well-characterized habitat conditions, ongoing biodiversity monitoring, and diverse forest structure including mixed deciduous-coniferous forest, regenerating experimental clearings, mature forest, and natural wetlands.
I will use the Pine Warbler (Setophaga pinus) as the proof-of-concept species for computational tool development. Pine Warblers are well-documented at Harvard Forest with consistent presence across 80+ years of bird surveys, making them an ideal test case for validating the computational tool against known habitat occupation patterns. As a forest-interior species showing population increases despite climate-driven habitat changes, Pine Warblers exemplify climate-indicator species whose habitats are shifting under a changing climate. The species exhibits typical avian visual characteristics including laterally placed eyes with a narrow binocular overlap, tetrachromatic color perception with ultraviolet sensitivity, and motion-sensitive peripheral vision, making it representative of broader avian visual systems while being locally abundant enough for field observation. The research team will conduct LiDAR scanning at locations where Pine Warblers have been documented, compile species-specific visual parameters from the scientific literature, and develop the computational algorithms.