Package: kldest 1.0.0.9001

kldest: Sample-Based Estimation of Kullback-Leibler Divergence

Estimation algorithms for Kullback-Leibler divergence between two probability distributions, based on one or two samples, and including uncertainty quantification. Distributions can be uni- or multivariate and continuous, discrete or mixed.
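
A minimal usage sketch, assuming the two-sample interface kld_est(X, Y) and the analytical helper kld_gaussian() behave as their help topics suggest (the argument names mu1/sigma1/mu2/sigma2 are assumptions; consult the package manual for the exact signatures):

# Estimate KL(P || Q) from samples of P = N(0,1) and Q = N(1,1);
# the true value, from the analytical Gaussian formula, is 0.5.
library(kldest)
set.seed(1)
X <- rnorm(100)              # sample from P
Y <- rnorm(100, mean = 1)    # sample from Q
kld_est(X, Y)                # sample-based estimate of KL(P || Q)
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 1)   # analytical reference: 0.5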

Authors: Niklas Hartung [aut, cre, cph]


# Install 'kldest' in R:
install.packages('kldest', repos = c('https://niklhart.r-universe.dev', 'https://cloud.r-project.org'))

Bug tracker: https://github.com/niklhart/kldest/issues

3.90 score, 1 star, 20 scripts, 237 downloads, 20 exports, 1 dependency

Last updated 2 months ago from c017f05a9d. Checks: OK: 7. Indexed: yes.

Target           Result  Date
Doc / Vignettes  OK      Nov 01 2024
R-4.5-win        OK      Nov 01 2024
R-4.5-linux      OK      Nov 01 2024
R-4.4-win        OK      Nov 01 2024
R-4.4-mac        OK      Nov 01 2024
R-4.3-win        OK      Nov 01 2024
R-4.3-mac        OK      Nov 01 2024

Exports: combinations, constDiagMatrix, convergence_rate, kld_ci_bootstrap, kld_ci_subsampling, kld_discrete, kld_est, kld_est_brnn, kld_est_discrete, kld_est_kde1, kld_est_kde2, kld_est_neural, kld_est_nn, kld_exponential, kld_gaussian, kld_uniform, kld_uniform_gaussian, mvdnorm, to_uniform_scale, trapz
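
Besides the two-sample mode shown above, the package description also covers one-sample estimation. A hedged sketch, assuming kld_est() accepts the density of Q via a q argument (an assumption based on the help topics; see the manual for the exact interface):

# One-sample variant: sample from P, explicit density function for Q
# (the argument name q is an assumption).
set.seed(2)
X <- rnorm(1000)
kld_est(X, q = function(x) dnorm(x, mean = 1))   # estimate of KL(P || Q)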

Dependencies: RANN

Readme and manuals

Help Manual

Topic                   Description
combinations            Combinations of input arguments
constDiagMatrix         Constant plus diagonal matrix
convergence_rate        Empirical convergence rate of a KL divergence estimator
is_two_sample           Detect if a one- or two-sample problem is specified
kld_ci_bootstrap        Uncertainty of KL divergence estimate using Efron's bootstrap
kld_ci_subsampling      Uncertainty of KL divergence estimate using Politis/Romano's subsampling bootstrap
kld_discrete            Analytical KL divergence for two discrete distributions
kld_est                 Kullback-Leibler divergence estimator for discrete, continuous or mixed data
kld_est_brnn            Bias-reduced generalized k-nearest-neighbour KL divergence estimation
kld_est_discrete        Plug-in KL divergence estimator for samples from discrete distributions
kld_est_kde             Kernel density-based Kullback-Leibler divergence estimation in any dimension
kld_est_kde1            1-D kernel density-based estimation of Kullback-Leibler divergence
kld_est_kde2            2-D kernel density-based estimation of Kullback-Leibler divergence
kld_est_neural          Neural KL divergence estimation (Donsker-Varadhan representation) using 'torch'
kld_est_nn              k-nearest neighbour KL divergence estimator
kld_exponential         Analytical KL divergence for two univariate exponential distributions
kld_gaussian            Analytical KL divergence for two uni- or multivariate Gaussian distributions
kld_uniform             Analytical KL divergence for two uniform distributions
kld_uniform_gaussian    Analytical KL divergence between a uniform and a Gaussian distribution
mvdnorm                 Probability density function of the multivariate Gaussian distribution
to_uniform_scale        Transform samples to uniform scale
tr                      Matrix trace operator
trapz                   Trapezoidal integration in 1 or 2 dimensions
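
To illustrate the uncertainty quantification listed above, a sketch using subsampling confidence intervals, assuming kld_ci_subsampling() takes two samples and a B argument controlling the number of subsampling replicates (both assumptions; check the help page):

# Confidence interval for a two-sample KL divergence estimate
# via Politis/Romano subsampling (B is assumed to set the replicate count).
set.seed(3)
X <- rnorm(500)
Y <- rnorm(500, mean = 1)
kld_ci_subsampling(X, Y, B = 500)   # point estimate plus confidence interval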