WP2. Rendering of augmented scene
Friday 21 July 2006

Different rendering modes exist, ranging from simple texture mapping to full radiosity solutions. We will address the following approaches.

2.1 360 degrees rendering based on texture mapping

Texture maps coming from different cameras have to be combined in order to visualize the scene from an arbitrary viewpoint. So-called view-dependent texture mapping approaches produce good results, but ideally require many, densely distributed input images. In our case, the input images come from a few cameras that are placed far apart. Due to imperfect geometric modelling, texture maps from different images may not be well aligned spatially; we want to provide solutions to this problem. Another issue is potentially poor photometric alignment. A full photometric model of the observed objects would allow coherent texture maps to be created, but it may also be interesting to study intermediate approaches, e.g. based on simpler models of the change in appearance caused by changes in viewing position.
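The core of view-dependent texture mapping is a per-camera blending weight that favors cameras whose viewing direction is closest to the novel viewpoint. The following is a minimal sketch of such angle-based blending, not the project's actual pipeline; all function names are illustrative.

```python
import numpy as np

def blend_weights(view_dir, cam_dirs):
    """Per-camera blending weights: larger when a camera's viewing
    direction is closer to the novel view direction (angle-based
    view-dependent blending); back-facing cameras get weight 0."""
    view_dir = view_dir / np.linalg.norm(view_dir)
    cams = cam_dirs / np.linalg.norm(cam_dirs, axis=1, keepdims=True)
    cos = np.clip(cams @ view_dir, 0.0, None)  # cosine of angle, clipped
    if cos.sum() == 0.0:
        # no camera faces the viewpoint: fall back to uniform weights
        return np.full(len(cams), 1.0 / len(cams))
    return cos / cos.sum()

def blend_colors(colors, weights):
    """Blend per-camera texture samples with the computed weights."""
    return (weights[:, None] * colors).sum(axis=0)
```

In practice the weights are evaluated per surface point (or per fragment), which is where the spatial and photometric misalignments discussed above become visible as ghosting or color seams.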

2.2 Smooth shadows

Shadows are very important for the realism of rendered scenes, both in their geometric placement and in their graphical appearance.
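One standard way to obtain smooth rather than hard-edged shadows is percentage-closer filtering (PCF) over a shadow map: instead of a single binary depth test, several tests in a small neighborhood are averaged. The sketch below illustrates the idea on a CPU array; it is an assumption for illustration, not the method chosen in this workpackage.

```python
import numpy as np

def pcf_shadow(shadow_map, uv, depth, radius=1, bias=1e-3):
    """Percentage-closer filtering: average binary shadow tests over a
    (2*radius+1)^2 neighborhood of the shadow map, giving a soft shadow
    factor in [0, 1] (0 = fully lit, 1 = fully shadowed)."""
    h, w = shadow_map.shape
    x, y = int(uv[0] * (w - 1)), int(uv[1] * (h - 1))
    hits, taps = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            xi = min(max(x + dx, 0), w - 1)  # clamp to map borders
            yi = min(max(y + dy, 0), h - 1)
            if depth - bias > shadow_map[yi, xi]:
                hits += 1  # this tap is occluded
            taps += 1
    return hits / taps
```

Points near a shadow boundary receive fractional values, which rendered as attenuation produce the smooth penumbra-like transition.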

2.3 Computation of light exchanges

Light exchanges can be computed based on the established geometric and photometric models. They are of two types: local effects are directly due to light sources (direct illumination, cast shadows), while global effects are due to light reflected by objects in the scene onto other objects or onto themselves. The associated computations are very complex, and trade-offs between resolution, the amount of pre-computation, and computation time have to be made.
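For diffuse surfaces, the global exchange can be formulated as the classical radiosity system B = E + ρ F B, where B are patch radiosities, E emissions, ρ reflectances, and F the form factors between patches. A minimal iterative (Jacobi-style gathering) solver, shown here only as an illustration of the trade-off between iteration count and accuracy:

```python
import numpy as np

def solve_radiosity(emission, reflectance, form_factors, iters=100):
    """Iteratively solve B = E + rho * (F @ B): each patch's radiosity
    is its own emission plus the reflected fraction of the light it
    gathers from all other patches."""
    B = emission.copy()
    for _ in range(iters):
        B = emission + reflectance * (form_factors @ B)
    return B
```

Pre-computing and storing F is where most of the cost lies; coarser patch resolutions shrink F (and the per-iteration work) at the expense of shadow and interreflection detail, which is exactly the trade-off mentioned above.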
