
Dissertation and Thesis Presentations

Candidates preparing to defend a doctoral dissertation or master's thesis may submit their information to the Office of Graduate Studies for posting to this page. Submissions should be sent at least two weeks before the date of the defense.


Student Name: Wenyan Bi
Graduate Level: Ph.D.
Field of Study/Major: BCaN
Committee Chairs: Prof. Bei Xiao, Prof. Arthur Shapiro, Prof. Nathalie Japkowicz, Prof. Yotam Gingold
Date of Presentation: October 2nd, 2019
Presentation Location: MGC 328
Time of Presentation: 10:00 a.m.
Title of Dissertation: Visual Inference of Material Properties of Deformable Objects in Dynamic Environments

Abstract
Every object is made of certain materials. Visually estimating material properties is a central component of human intelligence because it helps us decide how to interact with an object (e.g., determining the gripping force). Deformable objects such as cloth exhibit their mechanical properties through their interaction with external forces. This thesis aims to understand how dynamic information contributes to the visual estimation of mechanical properties of deformable objects in dynamic scenes.

Using multidimensional scaling (MDS), we first identify two orthogonal perceptual dimensions of the physical properties of cloth in dynamic scenes: one is significantly correlated with bending stiffness, and the other correlates best with mass (Chapter 2). Based on this finding, the remaining studies in this thesis investigate the estimation of stiffness and mass separately. In Chapter 3, we find that humans can achieve an invariant estimation of stiffness but not of mass, and that different sets of motion statistics contribute to the perception of these two properties. Next, we search specifically for robust motion features for stiffness estimation. In Chapter 4, we identify an idiosyncratic deformation pattern (i.e., movement uniformity) that differentiates stiffness across multiple optical materials and scenes and can be reliably measured by six optical flow features. We further provide evidence that movement uniformity is a robust cue for stiffness estimation: directly manipulating this cue changes the perceived stiffness of the object. Chapter 5 takes a closer look at the role of multi-frame motion information. We suggest that multi-frame motion information, represented by dense motion trajectories, is important for humans to visually estimate the stiffness of cloth from videos and is critical for a machine learning model to predict the human perceptual scale. Chapter 6 reveals that in VR/AR environments, visual information and haptic feedback interact differently than they do in the real world. Together, this work shows that dynamic information is essential for visual inference of the mechanical properties of deformable objects. We also identify several robust motion cues for estimating the stiffness of cloth.
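The abstract refers to multidimensional scaling (MDS) and to optical-flow-based motion features. As an illustration only, and not the dissertation's actual analysis code, the following Python sketch shows how such an analysis is commonly set up: scikit-learn's MDS recovers a two-dimensional perceptual space from a pairwise dissimilarity matrix, and OpenCV's dense Farneback optical flow yields per-frame flow fields from which a simple movement-uniformity proxy can be computed. The dissimilarity values, the file name cloth_video.mp4, and the choice of flow-magnitude variance as the uniformity measure are hypothetical placeholders.

import cv2
import numpy as np
from sklearn.manifold import MDS

# --- Perceptual space from pairwise dissimilarities (cf. Chapter 2) ---
# Hypothetical 4x4 matrix of perceptual dissimilarities between cloth animations.
dissimilarity = np.array([
    [0.0, 0.3, 0.7, 0.9],
    [0.3, 0.0, 0.5, 0.8],
    [0.7, 0.5, 0.0, 0.4],
    [0.9, 0.8, 0.4, 0.0],
])
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)  # one 2-D point per stimulus

# --- Movement-uniformity proxy from dense optical flow (cf. Chapter 4) ---
cap = cv2.VideoCapture("cloth_video.mp4")  # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

uniformity = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    uniformity.append(np.var(mag))  # lower variance = more uniform motion
    prev_gray = gray

print(coords)             # 2-D perceptual coordinates of the stimuli
print(np.mean(uniformity))  # average uniformity proxy over the video

In a setup like this, the recovered MDS axes would be correlated with ground-truth bending stiffness and mass, and the flow-based statistic would be compared against observers' stiffness judgments; both steps here are simplified assumptions rather than the procedure reported in the dissertation.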