Computes an upper bound on the TV distance between two Poisson distributions, \(\text{Poisson}(\lambda_J(\alpha))\) and \(\text{Poisson}(\alpha c_J)\), using the Poisson-Poisson KL divergence together with Pinsker's inequality.
Usage
compute_linearization_bound(J, alpha, cJ = log(J))

Details
Let \(\lambda = \lambda_J(\alpha)\) (exact shifted mean) and \(\lambda' = \alpha c_J\) (A1 approximate mean).
The KL divergence is: $$KL(\text{Poisson}(\lambda) || \text{Poisson}(\lambda')) = \lambda \log(\lambda/\lambda') + \lambda' - \lambda$$
By Pinsker's inequality: $$d_{TV}(\text{Poisson}(\lambda), \text{Poisson}(\lambda')) \le \sqrt{KL/2}$$
Numerical safeguards handle the edge cases where \(\lambda\) or \(c_J\) is zero: when \(\lambda = 0\) the term \(\lambda \log(\lambda/\lambda')\) is taken as 0 by the usual convention, and since \(d_{TV} \le 1\) always holds, the returned bound never exceeds 1 even when the KL divergence is infinite.
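The computation described above can be sketched as follows. This is an illustrative Python sketch, not the package's actual implementation; the function name and the exact edge-case handling are assumptions.

```python
import math

def linearization_bound(lam, lam_prime):
    """Pinsker bound on d_TV(Poisson(lam), Poisson(lam_prime)).

    Uses KL(Poisson(lam) || Poisson(lam_prime))
        = lam * log(lam / lam_prime) + lam_prime - lam
    and d_TV <= sqrt(KL / 2), capped at 1 (the maximum TV distance).
    """
    if lam == 0.0:
        # Convention 0 * log(0 / x) = 0, so the KL reduces to lam_prime.
        kl = lam_prime
    elif lam_prime == 0.0:
        # KL is infinite when lam > 0 and lam_prime = 0;
        # fall back to the trivial bound d_TV <= 1.
        return 1.0
    else:
        kl = lam * math.log(lam / lam_prime) + lam_prime - lam
    return min(1.0, math.sqrt(kl / 2.0))
```

For the setting in Details one would call this with `lam` set to the exact shifted mean \(\lambda_J(\alpha)\) and `lam_prime` set to \(\alpha c_J\) with \(c_J = \log J\); identical means give a bound of exactly 0.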