Gender classification via lips

Static and dynamic features

D. Stewart, A. Pass, Jianguo Zhang

    Research output: Contribution to journal › Article

    6 Citations (Scopus)

    Abstract

    Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification, with face-based classification being the most popular. In some real-world scenarios the face may be partially occluded; in these circumstances a classification based on individual parts of the face, known as local features, must be adopted. The authors investigate gender classification using lip movements. They show for the first time that important gender-specific information can be obtained from the way in which a person moves their lips during speech. Furthermore, this study indicates that lip dynamics during speech provide greater gender-discriminative information than lip appearance alone. They also show that lip dynamics and appearance contain complementary gender information, such that a model which captures both traits gives the highest overall classification result. They use discrete cosine transform-based features and Gaussian mixture modelling to model lip appearance and dynamics, and employ the XM2VTS database for their experiments. These experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by between 16 and 21% compared with models of lip appearance only.
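
    The pipeline the abstract describes — DCT-based appearance features per frame, a dynamics representation, and per-class Gaussian mixture models compared by log-likelihood — can be sketched roughly as follows. This is a toy illustration on synthetic data, not the authors' implementation: the feature dimensions, GMM settings, and the use of first-order deltas for dynamics are all assumptions.

    ```python
    import numpy as np
    from scipy.fftpack import dct
    from sklearn.mixture import GaussianMixture

    def dct_features(frame, k=6):
        # 2-D DCT of a lip-region image; keep the k x k low-frequency
        # corner as a compact static appearance feature (k is a guess).
        coeffs = dct(dct(frame, norm="ortho", axis=0), norm="ortho", axis=1)
        return coeffs[:k, :k].ravel()

    def seq_features(frames):
        # Static features per frame, plus first-order temporal
        # differences as a crude stand-in for lip dynamics.
        static = np.array([dct_features(f) for f in frames])
        delta = np.diff(static, axis=0)
        return np.hstack([static[1:], delta])

    rng = np.random.default_rng(0)

    def make_sequences(n, offset):
        # Synthetic 20-frame, 16x16 "lip" sequences; the intensity
        # offset stands in for a class difference.
        return [rng.normal(offset, 1.0, size=(20, 16, 16)) for _ in range(n)]

    def train_gmm(sequences):
        feats = np.vstack([seq_features(s) for s in sequences])
        return GaussianMixture(n_components=4, covariance_type="diag",
                               random_state=0).fit(feats)

    gmm_a = train_gmm(make_sequences(5, 0.0))   # model for class A
    gmm_b = train_gmm(make_sequences(5, 2.0))   # model for class B

    def classify(frames):
        # score() is the mean per-frame log-likelihood; pick the
        # class whose GMM explains the sequence better.
        f = seq_features(frames)
        return "A" if gmm_a.score(f) > gmm_b.score(f) else "B"
    ```

    Concatenating the static and delta features, as above, is one simple way to let a single model capture both appearance and dynamics; the paper's reported gains come from combining the two traits, whatever the exact fusion used.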
    Original language: English
    Pages (from-to): 28-34
    Number of pages: 7
    Journal: IET Biometrics
    Volume: 2
    Issue number: 1
    DOI: 10.1049/iet-bmt.2012.0021
    Publication status: Published - 2013


    Cite this

    Stewart, D.; Pass, A.; Zhang, Jianguo. Gender classification via lips: Static and dynamic features. In: IET Biometrics. 2013; Vol. 2, No. 1, pp. 28-34.
    @article{9b4d33f26ac543b8a81a3d08d03770d3,
        title = "Gender classification via lips: Static and dynamic features",
        author = "D. Stewart and A. Pass and Jianguo Zhang",
        year = "2013",
        doi = "10.1049/iet-bmt.2012.0021",
        language = "English",
        volume = "2",
        pages = "28--34",
        journal = "IET Biometrics",
        issn = "2047-4938",
        publisher = "Institution of Engineering and Technology",
        number = "1",
    }


    TY  - JOUR
    T1  - Gender classification via lips
    T2  - Static and dynamic features
    AU  - Stewart, D.
    AU  - Pass, A.
    AU  - Zhang, Jianguo
    PY  - 2013
    Y1  - 2013
    UR  - http://www.scopus.com/inward/record.url?scp=84879343363&partnerID=8YFLogxK
    U2  - 10.1049/iet-bmt.2012.0021
    DO  - 10.1049/iet-bmt.2012.0021
    M3  - Article
    VL  - 2
    SP  - 28
    EP  - 34
    JO  - IET Biometrics
    JF  - IET Biometrics
    SN  - 2047-4938
    IS  - 1
    ER  -