A simpler condition for consistency of a kernel independence test

Arthur Gretton
Abstract

A statistical test of independence may be constructed using the Hilbert-Schmidt Independence Criterion (HSIC) as a test statistic. The HSIC is defined as the distance between the embedding of the joint distribution, and the embedding of the product of the marginals, in a Reproducing Kernel Hilbert Space (RKHS). It has previously been shown that when the kernel used in defining the joint embedding is characteristic (that is, the embedding of the joint distribution to the feature space is injective), then the HSIC-based test is consistent. In particular, it is sufficient for the product of kernels on the individual domains to be characteristic on the joint domain. In this note, it is established via a result of Lyons (2013) that HSIC-based independence tests are consistent when kernels on the marginals are characteristic on their respective domains, even when the product of kernels is not characteristic on the joint domain.

1 Introduction

The Hilbert-Schmidt Independence Criterion (HSIC) [4] provides a measure of dependence between a random variable $x$ on a domain $\mathcal{X}$ and a random variable $y$ on a domain $\mathcal{Y}$, with joint probability measure $P_{xy}$ on $\mathcal{X} \times \mathcal{Y}$. This dependence measure may be used in statistical tests of independence [5, 6]. The simplest way to understand HSIC is as the distance, in an appropriate feature space [9, 3] (in our case a reproducing kernel Hilbert space, RKHS), between an embedding of the joint distribution and an embedding of the product of the marginals. The distance covariance of [12] is a special case, corresponding to a particular choice of kernel [8]. We say the kernel (and its feature space) is characteristic when the embedding is injective, and hence uniquely identifies probability measures [11, 1, 10]. A test based on HSIC is consistent when the product of the kernels on the domains being compared is characteristic on the joint domain [1, Theorem 3]; this is shown to be the case, for example, when Gaussian kernels are used on each of the domains.
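For concreteness, the dependence measure has a simple empirical form. The sketch below is a minimal illustration rather than part of this note: the Gaussian kernels, the bandwidths sigma_x and sigma_y, and the toy data are choices made here. It computes the standard biased empirical HSIC of [4, 5], proportional to $\mathrm{tr}(KHLH)$ with centering matrix $H = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top}$, from $n$ paired samples.

```python
import numpy as np

def gaussian_gram(Z, sigma=1.0):
    """Gaussian (RBF) kernel Gram matrix; rows of Z are samples."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC, (1/n^2) trace(K H L H).

    Some references scale by 1/(n-1)^2 instead; the constant is
    irrelevant for a permutation test.
    """
    n = X.shape[0]
    K = gaussian_gram(X, sigma_x)
    L = gaussian_gram(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1))
    Y = X + 0.5 * rng.normal(size=(200, 1))     # dependent pair
    Y_indep = rng.normal(size=(200, 1))         # independent pair
    print(hsic_biased(X, Y), hsic_biased(X, Y_indep))
```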

We propose a simpler condition: namely, that the kernels on each of the individual domains $\mathcal{X}$ and $\mathcal{Y}$ should be characteristic on those domains. The result is a direct consequence of [7, Lemma 3.8]. It is of particular interest since it may be easier to define characteristic kernels on the individual domains than on the joint domain. For example, characteristic kernels may be defined on the group of orthogonal matrices [2, Section 4] and on the semigroup of vectors of non-negative reals [2, Section 5]; however, a kernel that is jointly characteristic on both domains (i.e., on orthogonal matrix/non-negative vector pairs) is harder to define.

2 Results

We begin with a result from [10] that characteristic, translation invariant kernels provide injective embeddings of finite signed measures.

Proposition 1 (Injective embeddings of finite signed measures).

Let $\mathcal{X}$ be a Polish, locally compact Hausdorff space. Let $k$ be a $c_0$-kernel, i.e. a bounded kernel for which $k(\cdot, x) \in C_0(\mathcal{X})$ for all $x \in \mathcal{X}$, where $C_0(\mathcal{X})$ is the class of continuous functions on $\mathcal{X}$ that vanish at infinity (that is, members $f$ of $C(\mathcal{X})$ such that for all $\varepsilon > 0$ the set $\{x \in \mathcal{X} : |f(x)| \geq \varepsilon\}$ is compact). Assume further that $k(x, x')$ depends only on the difference $x - x'$, i.e. the kernel is translation invariant. Define $\mathcal{F}$ as the RKHS induced by $k$. The following statements are equivalent:

  1. $k$ is characteristic

  2. The embedding of a finite signed Borel measure $\mu$, defined as

     $$\mu \mapsto \int_{\mathcal{X}} k(\cdot, x)\, d\mu(x), \qquad (1)$$

     is injective.

This result may be obtained by combining [10, Proposition 2], which states that an RKHS is $c_0$-universal iff the embedding in (1) is injective, with the result in [10, Section 3.2] that translation invariant kernels are $c_0$-universal iff they are characteristic.
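As an aside (a standard consequence of the reproducing property, stated here for orientation rather than taken from the note), the RKHS norm of the embedding in (1) expands as

$$\left\| \int_{\mathcal{X}} k(\cdot, x)\, d\mu(x) \right\|_{\mathcal{F}}^{2} = \int_{\mathcal{X}} \int_{\mathcal{X}} k(x, x')\, d\mu(x)\, d\mu(x'),$$

so injectivity of (1) says exactly that this double integral vanishes only when $\mu = 0$. Taking $\mu = P - Q$ for two probability measures recovers the familiar statement that a characteristic kernel assigns zero maximum mean discrepancy only to identical distributions [11].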

This being the case, a minor adaptation of the proof of [7, Lemma 3.8] leads to the following result.

Theorem 2 (Characteristic kernels and independence measures).

Let $k$ and $l$ be kernels for the respective RKHSs $\mathcal{F}$ on $\mathcal{X}$ and $\mathcal{G}$ on $\mathcal{Y}$, with respective feature maps $\phi(x) := k(x, \cdot)$ and $\psi(y) := l(y, \cdot)$. Assume both $k$ and $l$ are characteristic, translation invariant $c_0$-kernels, satisfying the conditions of Proposition 1. Define the finite signed measure

$$\theta := P_{xy} - P_x P_y.$$

Define the covariance operator $\Sigma$ as the embedding of this signed measure into the tensor space $\mathcal{F} \otimes \mathcal{G}$ (the tensor product is defined such that $(a \otimes b)\, c = a\, \langle b, c \rangle_{\mathcal{G}}$ for all $a \in \mathcal{F}$ and $b, c \in \mathcal{G}$),

$$\Sigma := \int_{\mathcal{X} \times \mathcal{Y}} \phi(x) \otimes \psi(y)\, d\theta(x, y) = \mathbb{E}_{xy}\left[\phi(x) \otimes \psi(y)\right] - \mathbb{E}_{x}\left[\phi(x)\right] \otimes \mathbb{E}_{y}\left[\psi(y)\right].$$

Then $\Sigma = 0$ iff $P_{xy} = P_x P_y$, i.e. iff $x$ and $y$ are independent.

Proof.

The result that independence implies $\Sigma = 0$ is straightforward. We now prove the other direction: assume $\Sigma = 0$. For every $y' \in \mathcal{Y}$ and every Borel set $A \subseteq \mathcal{X}$, we define the finite signed Borel measure

$$\theta_{y'}(A) := \int_{\mathcal{X} \times \mathcal{Y}} 1_A(x)\, l(y, y')\, d\theta(x, y),$$

where $1_A(x)$ is the indicator of the set $A$. The embedding of this measure to $\mathcal{F}$ is injective by Proposition 1, and is written

$$\int_{\mathcal{X}} \phi(x)\, d\theta_{y'}(x) = \int_{\mathcal{X} \times \mathcal{Y}} \phi(x)\, l(y, y')\, d\theta(x, y) = \left( \int_{\mathcal{X} \times \mathcal{Y}} \phi(x) \otimes \psi(y)\, d\theta(x, y) \right) \psi(y') = \Sigma\, \psi(y'),$$

where we have used the linearity of the tensor product and the reproducing property,

$$\big( \phi(x) \otimes \psi(y) \big)\, \psi(y') = \phi(x)\, \langle \psi(y), \psi(y') \rangle_{\mathcal{G}} = \phi(x)\, l(y, y').$$

Since $\Sigma = 0$, the embedding of $\theta_{y'}$ vanishes; since the embedding is injective, we have that $\theta_{y'} = 0$. Since this is true for all $y' \in \mathcal{Y}$, we have that

$$\int_{\mathcal{X} \times \mathcal{Y}} 1_A(x)\, l(y, y')\, d\theta(x, y) = 0 \qquad \text{for all } y' \in \mathcal{Y} \text{ and all Borel } A \subseteq \mathcal{X}.$$

Define the finite signed measure on $\mathcal{Y}$, $\theta_A(B) := \theta(A \times B)$. The above equation can be interpreted as the embedding of this measure to $\mathcal{G}$, evaluated at an arbitrary point $y'$,

$$\left( \int_{\mathcal{Y}} \psi(y)\, d\theta_A(y) \right)(y') = \int_{\mathcal{X} \times \mathcal{Y}} 1_A(x)\, l(y, y')\, d\theta(x, y) = 0,$$

hence $\theta_A = 0$, given that the embedding is injective (an RKHS element that vanishes at every point is the zero element). We conclude that $\theta(A \times B) = 0$ for all Borel sets $A \subseteq \mathcal{X}$ and $B \subseteq \mathcal{Y}$, and hence $\theta = 0$, i.e. $P_{xy} = P_x P_y$. ∎
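To relate Theorem 2 back to the test statistic, recall (a standard identity from [4], restated here for convenience rather than taken from this note) that HSIC is the squared Hilbert-Schmidt norm of $\Sigma$, which expands in terms of the kernels as

$$\|\Sigma\|_{\mathrm{HS}}^{2} = \mathbb{E}_{xy}\mathbb{E}_{x'y'}\big[k(x, x')\, l(y, y')\big] + \mathbb{E}_{x}\mathbb{E}_{x'}\big[k(x, x')\big]\, \mathbb{E}_{y}\mathbb{E}_{y'}\big[l(y, y')\big] - 2\, \mathbb{E}_{xy}\Big[ \mathbb{E}_{x'}\big[k(x, x')\big]\, \mathbb{E}_{y'}\big[l(y, y')\big] \Big],$$

where $(x', y')$ is an independent copy of $(x, y) \sim P_{xy}$ in the first term, and the primed variables are drawn independently from the marginals $P_x$ and $P_y$ in the remaining terms. Theorem 2 thus states that this population quantity is zero iff $x$ and $y$ are independent, whenever $k$ and $l$ are each characteristic.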

An important point to note is that the product kernel on the joint domain $\mathcal{X} \times \mathcal{Y}$ need not be characteristic to all probability measures on that domain: only the kernels on each of the individual domains $\mathcal{X}$ and $\mathcal{Y}$ need be characteristic. A second point is that a consistent test still requires characteristic kernels on both domains; it is not sufficient for one domain alone to have a characteristic kernel. A simple example illustrates the resulting failure mode: take $x \in \mathbb{R}$ with a characteristic kernel $k$, take $y \in \mathbb{R}$ with the linear kernel $l(y, y') = y\, y'$, and let the points $(x, y)$ be distributed uniformly on a circular ring centered at the origin. The data are dependent, but HSIC with these kernels will not detect this dependence: $\mathbb{E}[y \mid x] = 0$ for every $x$, so $y$ is uncorrelated with every function of $x$, and HSIC with a linear kernel on $\mathcal{Y}$ is sensitive only to such correlations.
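As a quick numerical illustration of this failure mode (a sketch, not from the note: the unit radius, kernel bandwidth, sample size, and 200-permutation test are all choices made here), one can compare a permutation test that pairs a Gaussian kernel on $x$ with a linear kernel on $y$ against one that uses Gaussian kernels on both coordinates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Points distributed uniformly on the unit circle centred at the origin:
# the coordinates x and y are dependent, but E[y | x] = 0 for every x.
t = rng.uniform(0.0, 2.0 * np.pi, size=n)
x, y = np.cos(t), np.sin(t)

def gauss_gram(z, sigma=0.5):
    """Gaussian kernel Gram matrix for a 1-d sample z."""
    d2 = (z[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def linear_gram(z):
    """Linear kernel Gram matrix, l(y, y') = y * y'."""
    return np.outer(z, z)

def hsic_from_grams(K, L):
    """Biased HSIC estimate, (1/n^2) trace(K H L H)."""
    m = K.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m
    return np.trace(K @ H @ L @ H) / m ** 2

def perm_pvalue(K, L, n_perm=200):
    """Permutation p-value: permuting L's rows/columns breaks the pairing."""
    stat = hsic_from_grams(K, L)
    null = []
    for _ in range(n_perm):
        p = rng.permutation(K.shape[0])
        null.append(hsic_from_grams(K, L[np.ix_(p, p)]))
    return float(np.mean(np.array(null) >= stat))

K = gauss_gram(x)
print("Gaussian on x, linear on y:   p =", perm_pvalue(K, linear_gram(y)))  # typically large: no rejection
print("Gaussian on x, Gaussian on y: p =", perm_pvalue(K, gauss_gram(y)))   # very small: dependence detected
```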

Acknowledgements: Thanks to Joris Mooij, Jonas Peters, Dino Sejdinovic, and Bharath Sriperumbudur for helpful discussions.

References

  • [1] K. Fukumizu, A. Gretton, X. Sun, and B. Schölkopf. Kernel measures of conditional dependence. In Advances in Neural Information Processing Systems 20, pages 489–496, Cambridge, MA, 2008. MIT Press.
  • [2] K. Fukumizu, B. Sriperumbudur, A. Gretton, and B. Schölkopf. Characteristic kernels on groups and semigroups. In Advances in Neural Information Processing Systems 21, pages 473–480, Red Hook, NY, 2009. Curran Associates Inc.
  • [3] A. Gretton, K. Borgwardt, M. Rasch, B. Schölkopf, and A. Smola. A kernel two-sample test. Journal of Machine Learning Research, 13:723–773, 2012.
  • [4] A. Gretton, O. Bousquet, A. J. Smola, and B. Schölkopf. Measuring statistical dependence with Hilbert-Schmidt norms. In S. Jain, H. U. Simon, and E. Tomita, editors, Proceedings of the International Conference on Algorithmic Learning Theory, pages 63–77. Springer-Verlag, 2005.
  • [5] A. Gretton, K. Fukumizu, C.-H. Teo, L. Song, B. Schölkopf, and A. J. Smola. A kernel statistical test of independence. In Advances in Neural Information Processing Systems 20, pages 585–592, Cambridge, MA, 2008. MIT Press.
  • [6] A. Gretton and L. Györfi. Consistent nonparametric tests of independence. Journal of Machine Learning Research, 11:1391–1423, 2010.
  • [7] R. Lyons. Distance covariance in metric spaces. The Annals of Probability, 41(5):3284–3305, 2013.
  • [8] D. Sejdinovic, B. Sriperumbudur, A. Gretton, and K. Fukumizu. Equivalence of distance-based and RKHS-based statistics in hypothesis testing. The Annals of Statistics, 41(5):2263–2291, 2013.
  • [9] A. J. Smola, A. Gretton, L. Song, and B. Schölkopf. A Hilbert space embedding for distributions. In Proceedings of the International Conference on Algorithmic Learning Theory, volume 4754, pages 13–31. Springer, 2007.
  • [10] B. Sriperumbudur, K. Fukumizu, and G. Lanckriet. Universality, characteristic kernels and RKHS embedding of measures. Journal of Machine Learning Research, 12:2389–2410, 2011.
  • [11] B. Sriperumbudur, A. Gretton, K. Fukumizu, G. Lanckriet, and B. Schölkopf. Hilbert space embeddings and metrics on probability measures. Journal of Machine Learning Research, 11:1517–1561, 2010.
  • [12] G. Székely, M. Rizzo, and N. Bakirov. Measuring and testing dependence by correlation of distances. The Annals of Statistics, 35(6):2769–2794, 2007.