Software & Data Downloads — InSeGAN-ICCV2021

Instance Segmentation GAN for segmenting (nearly) identical instances of rigid objects in depth images.

This package implements InSeGAN, an unsupervised 3D generative adversarial network (GAN) for segmenting (nearly) identical instances of rigid objects in depth images. For this task, we design a novel GAN architecture to synthesize a multiple-instance depth image with independent control over each instance.

InSeGAN takes in a set of code vectors (e.g., random noise vectors), each encoding the 3D pose of an object that is represented by a learned implicit object template. The generator has two distinct modules. The first module, the instance feature generator, uses each encoded pose to transform the implicit template into a feature map representation of that object instance. The second module, the depth image renderer, aggregates all of the single-instance feature maps output by the first module and generates a multiple-instance depth image. A discriminator distinguishes the generated multiple-instance depth images from the distribution of true depth images.

The associated software implements all components of our algorithm. We are also releasing the Insta-10 dataset that was used to evaluate the algorithm.
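To make the two-module generator concrete, here is a minimal, heavily simplified sketch in PyTorch of the data flow described above: per-instance pose codes modulate a shared learned template, and a renderer aggregates the resulting feature maps into one depth image. All class names, dimensions, and the modulation/aggregation choices are illustrative assumptions, not the released implementation (see the GitHub repository for the actual code).

```python
import torch
import torch.nn as nn

# NOTE: hypothetical sketch of the architecture described in the text;
# names, sizes, and layer choices are assumptions, not the released API.

class InstanceFeatureGenerator(nn.Module):
    """Transforms a learned implicit template by each pose code into a
    single-instance feature map (first generator module)."""
    def __init__(self, pose_dim=64, feat_ch=16, feat_hw=8):
        super().__init__()
        # Learned implicit object template, shared across all instances.
        self.template = nn.Parameter(torch.randn(feat_ch, feat_hw, feat_hw))
        self.pose_mlp = nn.Sequential(
            nn.Linear(pose_dim, feat_ch), nn.ReLU(),
            nn.Linear(feat_ch, feat_ch))

    def forward(self, z):
        # z: (batch, n_instances, pose_dim) -> per-instance channel gains.
        gain = self.pose_mlp(z)                       # (B, n, feat_ch)
        # Broadcast the gains over the template: (B, n, C, H, W).
        return gain[..., None, None] * self.template

class DepthRenderer(nn.Module):
    """Aggregates the single-instance feature maps and decodes them into
    one multiple-instance depth image (second generator module)."""
    def __init__(self, feat_ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, 1, 3, padding=1))

    def forward(self, inst_feats):
        # Sum over the instance axis so rendering is order-invariant,
        # then decode the aggregate feature map to a 1-channel depth image.
        return self.net(inst_feats.sum(dim=1))

class InSeGANGenerator(nn.Module):
    def __init__(self, pose_dim=64):
        super().__init__()
        self.inst_gen = InstanceFeatureGenerator(pose_dim)
        self.renderer = DepthRenderer()

    def forward(self, z):
        return self.renderer(self.inst_gen(z))

gen = InSeGANGenerator()
z = torch.randn(2, 5, 64)   # batch of 2 scenes, 5 object instances each
depth = gen(z)
print(depth.shape)          # torch.Size([2, 1, 8, 8])
```

In training, such a depth image would be scored by a discriminator against real depth images; at inference, the per-instance feature maps give independent handles on each object, which is what enables instance segmentation.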

  •  Cherian, A., Pais, G., Jain, S., Marks, T.K., Sullivan, A., "InSeGAN: A Generative Approach to Segmenting Identical Instances in Depth Images", IEEE International Conference on Computer Vision (ICCV), October 2021, pp. 10023-10032.
    @inproceedings{Cherian2021oct,
      author    = {Cherian, Anoop and Pais, Goncalo and Jain, Siddarth and Marks, Tim K. and Sullivan, Alan},
      title     = {InSeGAN: A Generative Approach to Segmenting Identical Instances in Depth Images},
      booktitle = {IEEE International Conference on Computer Vision (ICCV)},
      year      = {2021},
      month     = oct,
      pages     = {10023--10032},
      publisher = {CVF},
      url       = {https://www.merl.com/publications/TR2021-097}
    }

Access software at https://github.com/merlresearch/InSeGAN-ICCV2021.

Access data at https://www.merl.com/pub/cherian/InSeGAN/.