SHREC2022 - 3D Shape Retrieval Challenge 2022

At Eurographics 2022 Symposium on 3D Object Retrieval, 1-2 September 2022

New since 2020: Full paper submissions will follow a two-stage review process and will be published in the international journal Computers & Graphics upon acceptance.

We strongly encourage authors to consider the Graphics Replicability Stamp Initiative, and to apply for this additional sign of recognition.


The general objective of the 3D Shape Retrieval Challenge is to evaluate the effectiveness of 3D shape retrieval algorithms. SHREC2022 is the sixteenth edition of the challenge. As in previous years, it is organized in conjunction with the Eurographics Symposium on 3D Object Retrieval, where the results will be reviewed and presented.

Thanks to the efforts of previous track organizers, SHREC already provides many resources to compare and evaluate 3D retrieval methods. For this year's contest, we aim to explore new and updated tracks. Therefore, the participants are invited to have an active role in the organization of the event. This includes proposing track themes, building or acquiring a test collection, and deciding upon the queries, relevance assessment, and performance measures.

The participants of each track will collectively write a paper, which will be peer reviewed and published in Computers & Graphics upon acceptance. At least one author per track must register for the symposium and present the results. We also cordially invite all participants of a track to register and attend the workshop.


The following tracks are organized. For a description of the tasks, the collections, the queries, the evaluation procedure, and the time schedule, follow the links or contact the track organizer.

  1. Online detection of heterogeneous gestures
    Organizers: Marco Emporio, Anton Pirtac, Ariel Caputo, Marco Cristani, Andrea Giachetti (VIPS lab, University of Verona)
    Contact: Andrea Giachetti
    Web page
  2. Fitting and recognition of simple geometric primitives on point clouds
    Organizers: Chiara Romanengo, Andrea Raffo (CNR-IMATI, Italy)
    Contact: Chiara Romanengo
    Web page
  3. Open-Set 3D Object Retrieval using Multi-Modal Representation
    Organizers: Yue Gao, Yifan Feng, Xibin Zhao (Tsinghua University, China), Yandong Guo (OPPO Inc.)
    Contact: Yifan Feng
    Web page
  4. Pothole and crack detection on road pavement using RGB-D images
    Organizers: A. Ranieri, E. Moscoso Thompson, S. Biasotti (CNR-IMATI, Italy)
    Contact: Elia Moscoso Thompson
    Web page
  5. Sketch-Based 3D Shape Retrieval in the Wild
    Organizers: Jie Qin (Nanjing University of Aeronautics and Astronautics, China), Shuaihang Yuan (New York University, USA), Jiaxin Chen (Beihang University, China), Boulbaba Ben Amor (IMT Nord Europe, France), Yi Fang (NYU Abu Dhabi, UAE)
    Contact: Jie Qin
    Web page
  6. Protein-ligand binding site recognition
    Organizers: L. Gagliardi, W. Rocchia (IIT, Italy), A. Raffo, U. Fugacci, S. Biasotti (CNR-IMATI, Italy)
    Contact: Andrea Raffo
    Web page


The following list is a step-by-step description of the activities:

  1. Track organizers have sent their proposal describing the envisioned task, collection, queries, ground truth, evaluation method, and expected number of participants.
  2. The tracks are selected and listed on the SHREC web page. The track organizers start up their tracks.
  3. Participants register for the tracks they want to participate in.
  4. Each track is performed according to its own time schedule.
  5. The track organizers collect the results.
  6. The track results are combined into a joint paper, to be submitted for review to Computers & Graphics.
  7. The description of the tracks and their results are presented at the Eurographics Symposium on 3D Object Retrieval, 1-2 September 2022.

SHREC Time Schedule

Important dates

For information about the contest, the results, etc. of previous years, see past events:
SHREC 2021, SHREC 2020, SHREC 2019, SHREC 2018, SHREC 2017, SHREC 2016, SHREC 2015, SHREC 2014, SHREC 2013

For more information, please contact