Image and video matting with membership propagation

Weiwei Du, Kiichi Urahama

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    2 Citations (Scopus)


    Two techniques are devised for a natural image matting method based on semi-supervised object extraction. One is a guiding scheme for the placement of user strokes specifying object or background regions; the other is a scheme for adjusting object colors so that they conform to the colors of the composited background. Strokes are drawn at inhomogeneous color regions disclosed by an unsupervised cluster extraction method, from which the semi-supervised algorithm is derived. Objects are composited onto a new background after their colors are adjusted by a color transfer method with eigencolor mapping. This image matting method is then extended to videos: strokes are drawn only in the first frame, from which memberships are propagated to successive frames to extract the object in every frame. The performance of the proposed method is evaluated on images and videos and compared with existing matting methods.
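    The frame-to-frame propagation step described above can be sketched as follows. This is an illustrative stand-in, not the paper's exact formulation: the function name, the Gaussian color-affinity weighting, and the `sigma` parameter are all assumptions introduced here for exposition.

    ```python
    import numpy as np

    def propagate_memberships(frame_prev, frame_next, membership_prev, sigma=0.1):
        """Propagate per-pixel object memberships from one frame to the next.

        For each pixel of the next frame, the memberships of color-similar
        pixels in the previous frame are averaged with Gaussian color-affinity
        weights. (Hypothetical sketch of membership propagation; the paper's
        scheme may use different features and weights.)
        """
        h, w, _ = frame_next.shape
        prev_colors = frame_prev.reshape(-1, 3)           # (N, 3) RGB values
        prev_members = membership_prev.reshape(-1)        # (N,) in [0, 1]
        out = np.empty(h * w)
        for i, c in enumerate(frame_next.reshape(-1, 3)):
            d2 = np.sum((prev_colors - c) ** 2, axis=1)   # squared color distance
            wgt = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian affinity
            out[i] = np.dot(wgt, prev_members) / wgt.sum()
        return out.reshape(h, w)
    ```

    With memberships initialized from user strokes in the first frame, applying this step frame by frame yields a soft object mask for every frame without further user input.
    
    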

    Original language: English
    Title of host publication: Computer Vision - ACCV 2007 - 8th Asian Conference on Computer Vision, Proceedings
    Publisher: Springer Verlag
    Number of pages: 11
    Edition: PART 2
    ISBN (Print): 9783540763895
    Publication status: Published - 2007
    Event: 8th Asian Conference on Computer Vision, ACCV 2007 - Tokyo, Japan
    Duration: Nov 18 2007 - Nov 22 2007

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 2
    Volume: 4844 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349



    All Science Journal Classification (ASJC) codes

    • Theoretical Computer Science
    • Computer Science (all)


