
Image synthesis with procedural textures for large digital worlds

Subject Area Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Funding From 2019 to 2023
Project Identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 431478017
 
The ever-increasing demands on realism and detail in virtual 3D scenes lead to a tremendous amount of data. One major driving force is textures, which are widely used to represent fine visual details such as variations of material parameters across surfaces or displacements. Textures are commonly obtained from procedural models, which are well suited to creating stochastic textures, e.g. to mimic natural phenomena. Procedural texturing is a generative approach in which textures are compactly represented by a set of functions that are evaluated to produce the final texture. In this project we build on Procedural Texture Graphs (PTGs), which represent the generative process as a graph whose source nodes are mathematical functions, whose inner nodes are pixel-processing operations, and whose sink nodes are the final output textures.

In a typical production pipeline, textures are either computed up front, which demands an enormous amount of storage, or evaluated on the fly during texture accesses, which results in many redundant calculations. Our project is concerned with this quandary. We plan to treat procedural texture synthesis and photo-realistic rendering as one tightly coupled entity, making the rendering of highly detailed scenes feasible through texture synthesis on demand, and to reduce the redundant calculations with novel caching schemes that account for all aspects of the pipeline, from texture evaluation to the needs of high-quality rendering. The latter requires texture filtering, i.e. computing the weighted average of the texels in a subregion of the texture determined by ray differentials. We plan to address the multitude of challenges that come with this approach by first deriving a novel texture filtering theory for prefiltering and antialiasing of textures created from a PTG (storing color as well as normals/displacements). The key will be newly developed caching algorithms that exploit the additional knowledge a PTG provides, namely how a texture evolves from individual basis functions and operations.

The second challenge with procedural textures, powerful and flexible as they are, is the creation of textures with a desired look: constructing an appropriate graph is tedious for artists and requires in-depth knowledge of the underlying operations. Our new approach requires adding technical metadata to the PTG, which would make the construction task even more difficult. To overcome this, we want to facilitate the construction of the graph with a semi-automatic approach based on input exemplars, which are often given for production rendering. We will develop new algorithms that extract a set of elementary functions and combination operators from the exemplars to obtain images with a similar appearance. A fully automatic tool would be neither realistic nor relevant, because artists need to control the result. To this end, we aim at a feedback-loop approach: artists fix constraints while the algorithms solve sub-problems.
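As an illustration of the PTG structure described above, the following minimal Python sketch shows source nodes that wrap mathematical functions, inner nodes that apply pixel-processing operations to their children, and a sink node that evaluates texels on demand, memoizes them, and answers box-filtered lookups over a footprint spanned by ray differentials. All names are hypothetical, and the per-texel memoization and box filter merely stand in for the more sophisticated caching and prefiltering schemes the project aims to develop.

# Minimal, illustrative sketch of a Procedural Texture Graph (PTG) with
# on-demand evaluation, per-texel memoization, and a simple box-filtered
# lookup over a footprint derived from ray differentials.
# All names are hypothetical; this is not the project's actual algorithm.

import math
from functools import lru_cache


class SourceNode:
    """Source node: a mathematical function evaluated at texture coordinates."""

    def __init__(self, fn):
        self.fn = fn

    def evaluate(self, u, v):
        return self.fn(u, v)


class OpNode:
    """Inner node: a pixel-processing operation combining its children's values."""

    def __init__(self, op, *children):
        self.op = op
        self.children = children

    def evaluate(self, u, v):
        return self.op(*(c.evaluate(u, v) for c in self.children))


class SinkNode:
    """Sink node: the final output texture, evaluated lazily and cached."""

    def __init__(self, root, resolution=1024):
        self.root = root
        self.resolution = resolution
        # Per-texel cache: repeated accesses to the same texel are not recomputed.
        self._texel = lru_cache(maxsize=None)(self._texel_uncached)

    def _texel_uncached(self, i, j):
        u = (i + 0.5) / self.resolution
        v = (j + 0.5) / self.resolution
        return self.root.evaluate(u, v)

    def filtered_lookup(self, u, v, du, dv):
        """Box-filtered lookup: average the texels inside the footprint
        spanned by the ray differentials (du, dv) around (u, v)."""
        r = self.resolution
        i0, i1 = int((u - du) * r), int((u + du) * r)
        j0, j1 = int((v - dv) * r), int((v + dv) * r)
        total, count = 0.0, 0
        for i in range(max(i0, 0), min(i1, r - 1) + 1):
            for j in range(max(j0, 0), min(j1, r - 1) + 1):
                total += self._texel(i, j)
                count += 1
        return total / max(count, 1)


# Example graph: two sine-based sources combined by a product operation.
src_a = SourceNode(lambda u, v: 0.5 + 0.5 * math.sin(40.0 * u))
src_b = SourceNode(lambda u, v: 0.5 + 0.5 * math.sin(40.0 * v))
texture = SinkNode(OpNode(lambda a, b: a * b, src_a, src_b))

# A shading point with a small screen-space footprint (from ray differentials).
print(texture.filtered_lookup(0.25, 0.75, du=0.01, dv=0.01))

Evaluating texels only on demand and caching them at the sink node mirrors the trade-off described above between computing textures up front and re-evaluating them redundantly during rendering.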
DFG Programme Research Grants
International Connection France
Co-Investigator Dr. Johannes Schudeiske