A dataset of Kinect-based 3D scans

Alexandros Doumanoglou, Stylianos Asteriadis, Dimitrios S. Alexiadis, Dimitrios Zarpalas, Petros Daras

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic


Herein, a new publicly available 3D-reconstruction-oriented dataset is presented. It consists of multi-view range scans of small objects captured on a turntable. The scans were acquired with a Microsoft Kinect sensor, as well as with an accurate laser scanner (Vivid VI-700 Non-contact 3D Digitizer), whose reconstructions serve as ground-truth data. The construction of this dataset was motivated by the lack of a relevant Kinect dataset, despite the fact that the Kinect has attracted the attention of many researchers and home enthusiasts. The core idea behind the dataset is thus to enable the validation of 3D surface reconstruction methodologies on point sets extracted with Kinect sensors. The dataset comprises multi-view range scans of 59 objects, along with the calibration information needed for experimentation in the field of 3D reconstruction from Kinect depth data. Two well-known 3D reconstruction methods were selected and applied to the dataset in order to demonstrate its applicability to 3D reconstruction, as well as the challenges that arise. Additionally, an appropriate 3D reconstruction evaluation methodology is presented. Finally, since the dataset is organized into classes of similar objects, it can also be used for classification purposes, using the provided 2.5D/3D features.
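The abstract mentions evaluating reconstructions against laser-scanner ground truth but does not spell out the metric. A common baseline for this kind of comparison (a sketch only, not necessarily the methodology used in the paper) is the symmetric RMS nearest-neighbour distance between the reconstructed point set and the ground-truth point set:

```python
import numpy as np

def nn_rms_distance(src: np.ndarray, ref: np.ndarray) -> float:
    """RMS of nearest-neighbour distances from each point in `src`
    to the point set `ref` (brute force; fine for small clouds)."""
    # Pairwise Euclidean distances, shape (len(src), len(ref)).
    d = np.linalg.norm(src[:, None, :] - ref[None, :, :], axis=2)
    return float(np.sqrt(np.mean(d.min(axis=1) ** 2)))

def symmetric_rms(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetrised error: mean of the two one-directional RMS values,
    so neither missing regions nor outliers are ignored."""
    return 0.5 * (nn_rms_distance(a, b) + nn_rms_distance(b, a))

# Toy example: a "reconstruction" that is the ground truth shifted by 0.1
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
rec = gt + np.array([0.1, 0.0, 0.0])
print(round(symmetric_rms(rec, gt), 3))  # → 0.1
```

For realistically sized Kinect clouds the brute-force distance matrix would be replaced by a k-d tree query, but the metric itself is unchanged.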
Original language: English
Title of host publication: 2013 IEEE 11th IVMSP Workshop: 3D Image/Video Technologies and Applications, IVMSP 2013 - Proceedings
Publication status: Published - 2013
Externally published: Yes


Keywords:
  • 3D reconstruction dataset
  • Fourier-based 3D reconstruction
  • Kinect Sensor
  • Poisson surface reconstruction
  • Vivid VI-700 Non-contact 3D Digitizer


