Faithfully reproducing real-world objects on a 3D printer is a challenging endeavor. The large number of available materials and the freedom in material deposition make efficient exploration of the printing space difficult. Furthermore, current 3D printers can perfectly capture only a small fraction of real-world objects, which makes high-quality reproductions challenging. Interestingly, many 3D-printed objects are ultimately experienced by humans through touch, sight, or hearing. These senses have inherent limitations imposed by biological constraints: the eye has limited capability to distinguish high-frequency information, and the fingers perceive applied forces in a non-linear fashion.
In this talk, I will introduce the concept of perception-aware fabrication, which aims to exploit these limitations to create more efficient fabrication techniques that deliver equal or higher quality as perceived by a human observer. A core element of perception-aware fabrication is a perceptual space, which models the response of the human sensory system to external stimuli. I will show how to derive such spaces and how to apply them in the context of computational fabrication. I will also show how perceptual insights can be leveraged to design more efficient numerical simulations. I will demonstrate this general concept in two applications: manufacturing objects with prescribed compliance properties, and designing customized digital styli that mimic the behavior of traditional drawing tools. Last but not least, I will present a technique for the efficient design of surfaces with prescribed reflectance behavior.