Information and Divergence Measures
| Published/distributed/printed | MDPI - Multidisciplinary Digital Publishing Institute, 2023 |
| Physical description | 1 online resource (282 p.) |
| Topical subject | Physics; Research & information: general |
| Uncontrolled subject | academic evaluation; Awad entropy; bi-logistic growth; bootstrap discrepancy comparison probability (BDCP); clustering; concomitants; conditional independence; COVID-19 data; cross tabulations; differential geometry; discrepancy comparison probability (DCP); divergence-based tests; divergences; double index divergence test statistic; empirical survival Jensen-Shannon divergence; epidemic waves; exponential family; extremal combinatorics; FGM family; Fisher information; Fisher-Tsallis information number; geodesic; GOS; graphs; Han's inequality; information inequalities; joint entropy; Kaniadakis logarithm; Kolmogorov-Smirnov two-sample test; Kullback-Leibler divergence; Kullback-Leibler divergence (KLD); Lauricella D-hypergeometric series; likelihood ratio test (LRT); LPI; minimum Rényi's pseudodistance estimators; model selection; moment condition models; multiple power series; multivariate Cauchy distribution (MCD); multivariate data analysis; multivariate Gaussian; n/a; p-value; passive interception systems; past entropy; polymatroid; radar waveform; rank function; Rao-type tests; Rényi's pseudodistance; residual entropy; restricted minimum Rényi's pseudodistance estimators; robustness; set function; Shannon entropy; Shearer's lemma; skew logistic distribution; statistical divergence; statistical K-means; statistical manifold; submodularity; transversality; truncated exponential family; truncated normal distributions; Tsallis divergence; Tsallis entropy; Tsallis logarithm; weighted Kaniadakis divergence; weighted Tsallis divergence |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Record Nr. | UNINA-9911053044603321 |
| Available at: Univ. Federico II |
Information Theory in Neuroscience
| Author | Piasini Eugenio |
| Published/distributed/printed | MDPI - Multidisciplinary Digital Publishing Institute, 2019 |
| Physical description | 1 online resource (280 p.) |
| Uncontrolled subject | brain network; categorical perception; channel capacity; complex networks; connectome; consciousness; decoding; discrete Markov chains; discrimination; eigenvector centrality; entorhinal cortex; feedforward networks; free-energy principle; functional connectome; Gibbs measures; goodness; graph theoretical analysis; graph theory; higher-order correlations; hippocampus; independent component analysis; infomax principle; information entropy production; information theory; integrated information; integrated information theory; internal model hypothesis; Ising model; latching; maximum entropy; maximum entropy principle; minimum information partition; mismatched decoding; mutual information; mutual information decomposition; navigation; network eigen-entropy; neural code; neural coding; neural information propagation; neural network; neural population coding; neuroscience; noise correlations; orderness; perceived similarity; perceptual magnet; Potts model; principal component analysis; pulse-gating; Queyranne's algorithm; recursion; redundancy; representation; spike train statistics; spike-time precision; submodularity; synergy; unconscious inference |
| ISBN | 3-03897-665-2 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Publication language | eng |
| Record Nr. | UNINA-9910346856603321 |
| Available at: Univ. Federico II |