VC Dimensions of Principal Component Analysis

Yohji Akama, Kei Irie, Akitoshi Kawamura, Yasutaka Uwano

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)


Motivated by statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝd that are within a certain distance from a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d-k+1), and then discuss the distribution of eigenvalues of a data covariance matrix by using our bounds of the VC dimensions and Vapnik's statistical learning theory. In the course of the upper bound proof, we provide a simple proof of Warren's bound of the number of sign sequences of real polynomials.
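The concept class studied in the abstract consists of sets of points in ℝ^d lying within a given distance of a k-dimensional affine subspace. As a minimal illustration (not from the paper), membership in one such set can be tested by orthogonally projecting onto the subspace; the function name, the orthonormal-basis representation, and the example values below are assumptions of this sketch.

```python
import numpy as np

def within_distance(x, basis, offset, r):
    """Indicator for one set in the concept class: is point x within
    distance r of the affine subspace offset + span(basis)?
    basis: (d, k) matrix with orthonormal columns (an assumption of
    this sketch); offset: any point on the subspace."""
    v = x - offset
    proj = basis @ (basis.T @ v)  # orthogonal projection onto span(basis)
    return np.linalg.norm(v - proj) <= r

# Example: a 1-dimensional affine subspace (the x-axis) in R^3
basis = np.array([[1.0], [0.0], [0.0]])
offset = np.zeros(3)
print(within_distance(np.array([5.0, 0.5, 0.0]), basis, offset, 1.0))  # True
print(within_distance(np.array([5.0, 2.0, 0.0]), basis, offset, 1.0))  # False
```

The paper's result says that, as the parameters (the subspace and the radius) range over all choices, the VC dimension of this family of indicator sets is within a constant factor of (k+1)(d-k+1).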

Original language: English
Pages (from-to): 589-598
Number of pages: 10
Journal: Discrete and Computational Geometry
Issue number: 3
Publication status: Published - 2010
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics
  • Computational Theory and Mathematics

