"Efficient Material Authoring by Inverse Material Modeling" by Yiwei Hu

Date of Award

Spring 2023

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science

First Advisor

Rushmeier, Holly

Abstract

A real-world material can have a complex and unique visual appearance, making material modeling a challenging problem in computer graphics. From accurate measurement to lightweight capture, and from tabulated representations to parametric models, current research in material modeling aims at the faithful reproduction of material appearance and at flexible systems for material design. Among the various proposed modeling frameworks, by-example inverse modeling methods provide an efficient way to create and edit digital materials. Users need only provide a material sample, e.g., a photo, and an algorithm automatically creates a material with a similar appearance, significantly reducing the effort required in traditional forward material modeling frameworks. In this dissertation, we propose a series of methods to create or edit materials from image exemplars. In the first work, we propose a novel inverse material modeling framework that automatically generates procedural material maps, i.e., diffuse maps and normal maps, from a single input image. Results show that our inverse modeling method can produce high-quality procedural materials that match both stochastic and structured material appearance. The generated procedural materials offer parametric control and interactive editability, and enable synthesis at arbitrary resolution. However, as an initial effort, this framework has limitations; we therefore propose three major improvements: 1) a differentiable optimization framework that improves appearance-matching quality for physically based materials; 2) a semi-automatic inverse modeling pipeline that removes the dependency on a pre-existing procedural model database; and 3) a multi-modal generative model that automatically generates a variety of procedural materials not only from images but also from text prompts. All of these methods create high-quality procedural materials and have fewer limitations than the initial work.
Finally, we note that materials represented as pixel maps still prevail in practice because procedural material models are often limited in their expressiveness. In our last work, we present a by-example material transfer approach to intuitively control the appearance of material maps. We show that our methods are capable of generating new, visually appealing materials.