In this study, an appearance reconstruction method is presented that extracts the material reflectance properties of a three-dimensional (3D) object from its two-dimensional (2D) images. A main advantage of this system is that the reconstructed object can be rendered in real time with photorealistic quality under varying illumination conditions. The reflectance of the object is decomposed into diffuse and specular components: the diffuse component is stored in a global texture, while the specular component is represented with a Bidirectional Reflectance Distribution Function (BRDF). To estimate the diffuse component, illumination-invariant images of the object are computed from the input images, and a global texture of the object is extracted from these images using surface particles. The specular reflectance data are collected from the residual images, obtained by taking the difference between the input images and the corresponding illumination-invariant images, and a BRDF model is fitted to these data. In the rendering phase, the diffuse and specular components are blended to achieve a photorealistic appearance of the reconstructed object.
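The diffuse/specular separation and BRDF fitting described above can be sketched numerically. The following is a minimal illustration, not the paper's actual method: the residual is taken as the clipped difference between an input image and its illumination-invariant counterpart, and a simple Phong-style lobe k_s·cos(α)^n (an assumed stand-in for whatever BRDF model the system uses) is fitted to synthetic residual samples by linear least squares in log space. All function names and parameters here are hypothetical.

```python
import numpy as np

def specular_residual(input_img, diffuse_img):
    """Residual image: input minus its illumination-invariant (diffuse) part.

    Negative values are clipped, since specular reflection only adds light.
    """
    return np.clip(input_img - diffuse_img, 0.0, None)

def fit_phong_lobe(cos_alpha, residual):
    """Fit k_s * cos(alpha)**n to residual samples.

    Taking logarithms turns the model into a line,
    log(residual) = log(k_s) + n * log(cos_alpha),
    which a degree-1 polynomial fit recovers.
    """
    mask = (residual > 1e-6) & (cos_alpha > 1e-6)  # keep well-defined logs
    x = np.log(cos_alpha[mask])
    y = np.log(residual[mask])
    n, log_ks = np.polyfit(x, y, 1)
    return np.exp(log_ks), n

# Synthetic check: samples generated from a known lobe should be recovered.
cos_a = np.linspace(0.1, 1.0, 50)
observed = 0.8 * cos_a ** 32          # ground truth: k_s = 0.8, n = 32
ks, n = fit_phong_lobe(cos_a, observed)
```

In a real pipeline, `cos_alpha` would come from the surface normals, light direction, and camera pose at each surface particle, and the fit would pool residual samples across all input views of the object.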