Authors: 
Jungsik Park, Byung-Kuk Seo, Jong-Il Park
Abstract: 
This paper presents a method for interactive deformation of a real object. Our method uses a predefined 3D model of the target object for both tracking and deformation. The camera pose relative to the target object is estimated using 3D model-based tracking. The object region in the camera image is obtained by projecting the 3D model onto the image plane with the estimated pose, and a texture map extracted from this region is mapped onto the 3D model. The texture-mapped model is then rendered with a mesh that the user deforms via a Laplacian operation. Experimental results demonstrate that our method enables user interaction with real 3D objects in real scenes, rather than with augmented virtual content.
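The abstract's deformation step relies on Laplacian mesh editing: vertex positions are encoded as differential (Laplacian) coordinates that capture local shape, and a deformed mesh is recovered by solving a least-squares system that preserves those coordinates while satisfying user-moved handle vertices. The following is a minimal sketch of that idea on a toy 2D polyline with uniform weights; the mesh, constraint weights, and handle positions are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Toy "mesh": a polyline of 5 vertices in 2D (hypothetical example,
# not the paper's actual 3D model).
V = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]])
n = len(V)

# Uniform Laplacian for a chain: L = I - D^{-1} A
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
D_inv = np.diag(1.0 / A.sum(axis=1))
L = np.eye(n) - D_inv @ A

# Differential (Laplacian) coordinates encode local surface detail.
delta = L @ V

# Handle constraints: anchor the endpoints, move the middle vertex up
# (standing in for the user's interactive edit).
constraints = {0: V[0], 4: V[4], 2: np.array([2.0, 1.0])}
w = 10.0  # soft-constraint weight (assumed value)

rows, rhs = [L], [delta]
for idx, pos in constraints.items():
    r = np.zeros((1, n))
    r[0, idx] = w
    rows.append(r)
    rhs.append(w * pos[None, :])

# Least-squares solve reproduces the Laplacian coordinates while
# satisfying the handles -> deformed vertex positions.
V_def, *_ = np.linalg.lstsq(np.vstack(rows), np.vstack(rhs), rcond=None)
print(np.round(V_def, 2))
```

The anchored endpoints stay (approximately) in place while the interior vertices bend smoothly toward the displaced handle, which is the behavior that makes Laplacian editing suitable for interactive deformation.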