The non-central chi-squared distribution is a generalisation of the regular chi-squared distribution. The chi-squared distribution turns up in many statistical tests as the (approximate) distribution of a test statistic under the null hypothesis. Under alternative hypotheses, those same statistics often have approximate non-central chi-squared distributions.

This means that the non-central chi-squared distribution is often used to study the power of said statistical tests. In this post I give the definition of the non-central chi-squared distribution, discuss an important invariance property and show how to efficiently sample from this distribution.

### Definition

Let $X = (X_1, \dots, X_k)$ be a normally distributed random vector with mean $0$ and covariance $I_k$. Given a vector $\mu \in \mathbb{R}^k$, the non-central chi-squared distribution with $k$ degrees of freedom and non-centrality parameter $\lambda = \|\mu\|^2$ is the distribution of the quantity

$$\|X + \mu\|^2 = \sum_{i=1}^k (X_i + \mu_i)^2.$$

This distribution is denoted by $\chi^2_k(\lambda)$. As this notation suggests, the distribution of $\|X + \mu\|^2$ depends only on $\|\mu\|$, the norm of $\mu$. The first few times I heard this fact, I had no idea why it would be true (and even found it a little spooky). But, as we will see below, the result is actually a simple consequence of the fact that standard normal vectors are invariant under rotations.

### Rotational invariance

Suppose that we have two vectors $\mu, \nu \in \mathbb{R}^k$ such that $\|\mu\| = \|\nu\|$. We wish to show that if $X \sim \mathcal{N}(0, I_k)$, then

$$\|X + \mu\|^2$$

has the same distribution as $\|X + \nu\|^2$.

Since $\mu$ and $\nu$ have the same norm, there exists an orthogonal matrix $U$ such that $U\mu = \nu$. Since $U$ is orthogonal and $X \sim \mathcal{N}(0, I_k)$, we have $UX \sim \mathcal{N}(0, I_k)$. Furthermore, since $U$ is orthogonal, $U$ preserves the norm $\|\cdot\|$. This is because, for all $x \in \mathbb{R}^k$,

$$\|Ux\|^2 = (Ux)^T (Ux) = x^T U^T U x = x^T x = \|x\|^2.$$

Putting all these pieces together we have

$$\|X + \mu\|^2 = \|U(X + \mu)\|^2 = \|UX + U\mu\|^2 = \|UX + \nu\|^2.$$

Since $X$ and $UX$ have the same distribution, we can conclude that $\|UX + \nu\|^2$ has the same distribution as $\|X + \nu\|^2$. Since $\|X + \mu\|^2 = \|UX + \nu\|^2$, we are done.
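This argument can be checked numerically. Below is a pure-Python sketch (the function name `householder_apply` is my own) that builds one concrete orthogonal map sending $\mu$ to $\nu$, namely the Householder reflection $H = I - 2vv^T / \|v\|^2$ with $v = \mu - \nu$, and verifies that it preserves norms:

```python
import math

def householder_apply(v, x):
    # Apply the reflection H = I - 2 v v^T / ||v||^2 to x.
    # H is orthogonal, and with v = mu - nu (where ||mu|| = ||nu||) it maps mu to nu.
    c = 2 * sum(vi * xi for vi, xi in zip(v, x)) / sum(vi * vi for vi in v)
    return [xi - c * vi for vi, xi in zip(v, x)]

mu = [3.0, 4.0, 0.0]
nu = [0.0, 0.0, 5.0]                     # same norm as mu
v = [m - n for m, n in zip(mu, nu)]

# H maps mu to nu ...
assert all(math.isclose(a, b) for a, b in zip(householder_apply(v, mu), nu))

# ... and preserves the squared norm of an arbitrary vector x.
x = [0.7, -1.2, 2.5]
assert math.isclose(sum(t * t for t in x),
                    sum(t * t for t in householder_apply(v, x)))
```

Any orthogonal $U$ with $U\mu = \nu$ would do; the Householder reflection is just one that is easy to write down explicitly.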

### Sampling

Above we showed that the non-central chi-squared distribution $\chi^2_k(\lambda)$ depends only on the norm of the vector $\mu$. We will now use this to provide an algorithm that can efficiently generate samples from $\chi^2_k(\lambda)$.

A naive way to sample from $\chi^2_k(\lambda)$ would be to sample $k$ independent standard normal random variables $X_1, \dots, X_k$ and then return $\sum_{i=1}^k (X_i + \mu_i)^2$. But for large values of $k$ this would be very slow, as we have to simulate $k$ auxiliary random variables for each sample from $\chi^2_k(\lambda)$. This approach would not scale well if we needed many samples.
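As a concrete sketch in pure Python (`naive_noncentral_chisq` is a name I made up), the naive sampler looks like this:

```python
import random

def naive_noncentral_chisq(mu):
    # Return one sample of ||X + mu||^2 with X a standard normal vector.
    # This needs len(mu) = k normal draws per sample.
    return sum((random.gauss(0.0, 1.0) + m) ** 2 for m in mu)
```

The mean of $\chi^2_k(\lambda)$ is $k + \lambda$, which gives a quick sanity check on any sampler.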

An alternative approach uses the rotation invariance described above. The distribution $\chi^2_k(\lambda)$ depends only on $\lambda = \|\mu\|^2$ and not directly on $\mu$. Thus, given $\lambda$, we could instead work with $\mu' = (\sqrt{\lambda}, 0, \dots, 0)$, the vector with $\sqrt{\lambda}$ in the first coordinate and $0$s in all other coordinates. If we use $\mu'$ instead of $\mu$, we have

$$\|X + \mu'\|^2 = (X_1 + \sqrt{\lambda})^2 + \sum_{i=2}^k X_i^2.$$

The sum $\sum_{i=2}^k X_i^2$ follows the regular chi-squared distribution with $k-1$ degrees of freedom and is independent of $(X_1 + \sqrt{\lambda})^2$. The regular chi-squared distribution is a special case of the gamma distribution and can be efficiently sampled with rejection sampling for large shape parameters (see here).

The shape parameter for $\chi^2_{k-1}$ is $(k-1)/2$, so for large values of $k$ we can efficiently sample a value $Y$ that follows the same distribution as $\sum_{i=2}^k X_i^2$. Finally, to get a sample from $\chi^2_k(\lambda)$, we independently sample $X_1 \sim \mathcal{N}(0, 1)$ and then return the sum $(X_1 + \sqrt{\lambda})^2 + Y$.
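Putting the pieces together, here is a sketch of the procedure in pure Python (`noncentral_chisq` is my own name; `random.gammavariate(alpha, beta)` samples a gamma distribution with shape `alpha` and scale `beta`, so a $\chi^2_{k-1}$ sample is `gammavariate((k - 1) / 2, 2)`):

```python
import math
import random

def noncentral_chisq(k, lam):
    # Y ~ chi-squared with k - 1 degrees of freedom,
    # i.e. Gamma(shape = (k - 1)/2, scale = 2). For k = 1 this term is absent.
    y = random.gammavariate((k - 1) / 2.0, 2.0) if k > 1 else 0.0
    # One extra standard normal handles the non-central coordinate.
    x1 = random.gauss(0.0, 1.0)
    return (x1 + math.sqrt(lam)) ** 2 + y
```

Each sample now requires only one normal draw and one gamma draw, regardless of $k$.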

### Conclusion

In this post, we saw that the rotational invariance of the standard normal distribution gives a similar invariance for the non-central chi-squared distribution.

This invariance allowed us to efficiently sample from the non-central chi-squared distribution. The sampling procedure worked by reducing the problem to sampling from the regular chi-squared distribution.

The same invariance property is also used to calculate the cumulative distribution function and density of the non-central chi-squared distribution, although the resulting formulas are not for the faint of heart.