Benefit/Cost-Driven Software Development [electronic resource] : With Benefit Points and Size Points
Author Hannay Jo Erskine
Publication/distribution/printing Cham : Springer International Publishing AG, 2021
Physical description 1 online resource (114 p.)
Series Simula SpringerBriefs on Computing
Topical subject Mathematical & statistical software
Software Engineering
Software development
Cost-benefit analysis
Genre/form subject Electronic books
Uncontrolled subject Mathematical Software
Software Engineering
open access
benefits management
benefit points
earned business value management
benefit/costs index
uncertainty assessment
periodization
Mathematical & statistical software
ISBN 3-030-74218-0
Format Printed material
Bibliographic level Monograph
Publication language eng
Record no. UNISA-996466404003316
Available at: Univ. di Salerno
OPAC: check availability here
Benefit/Cost-Driven Software Development : With Benefit Points and Size Points
Author Hannay Jo Erskine
Publication/distribution/printing Cham : Springer International Publishing AG, 2021
Physical description 1 online resource (114 p.)
Series Simula SpringerBriefs on Computing
Topical subject Mathematical & statistical software
Software Engineering
Software development
Cost-benefit analysis
Genre/form subject Electronic books
Uncontrolled subject Mathematical Software
Software Engineering
open access
benefits management
benefit points
earned business value management
benefit/costs index
uncertainty assessment
periodization
Mathematical & statistical software
ISBN 3-030-74218-0
Classification COM051230, COM077000
Format Printed material
Bibliographic level Monograph
Publication language eng
Record no. UNINA-9910488721003321
Available at: Univ. Federico II
OPAC: check availability here
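The uncontrolled subject terms in the two records above (benefit points, size points, benefit/costs index) point to the book's central idea of weighing estimated benefit against estimated size when prioritizing work. The following is only a rough Python sketch of that general idea, with hypothetical item names and numbers; it is not taken from the book, and the book's actual definitions of benefit points and the benefit/cost index may differ.

    # Illustrative only: rank hypothetical work items by a naive
    # benefit/cost index = benefit points / size points.
    items = {
        "feature A": {"benefit_points": 30, "size_points": 5},
        "feature B": {"benefit_points": 20, "size_points": 8},
        "feature C": {"benefit_points": 45, "size_points": 13},
    }

    # Sort by descending index so the highest value-per-size items come first.
    ranked = sorted(
        items.items(),
        key=lambda kv: kv[1]["benefit_points"] / kv[1]["size_points"],
        reverse=True,
    )

    for name, est in ranked:
        index = est["benefit_points"] / est["size_points"]
        print(f"{name}: benefit/cost index = {index:.2f}")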
Innovative Learning Environments in STEM Higher Education [electronic resource] : Opportunities, Challenges, and Looking Forward / edited by Jungwoo Ryoo, Kurt Winkelmann
Author Ryoo Jungwoo
Edition [1st ed. 2021.]
Publication/distribution/printing Springer Nature, 2021
Physical description 1 online resource (XV, 137 p. 8 illus., 7 illus. in color.)
Discipline 519.5
Series SpringerBriefs in Statistics
Topical subject Statistics
Machine learning
Learning
Instruction
Knowledge representation (Information theory) 
Statistics for Social Sciences, Humanities, Law
Machine Learning
Statistics and Computing/Statistics Programs
Learning & Instruction
Knowledge based Systems
STEM education
Higher education
Genre/form subject Electronic books
Uncontrolled subject Statistics for Social Sciences, Humanities, Law
Machine Learning
Statistics and Computing/Statistics Programs
Learning & Instruction
Knowledge based Systems
Statistics in Social Sciences, Humanities, Law, Education, Behavioral Sciences, Public Policy
Statistics and Computing
Education
Innovative Learning Environments
ILEs
Science, Technology, Engineering, and Math
STEM
virtual reality
VR
augmented reality
mixed reality
cross reality
extended reality
artificial intelligence
AI
adaptive learning
personalized learning
higher education
multimodal learning
mobile learning
Open Access
Social research & statistics
Mathematical & statistical software
Teaching skills & techniques
Cognition & cognitive psychology
Expert systems / knowledge-based systems
ISBN 3-030-58948-X
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note 1. Introduction -- 2. X-FILEs Vision for Personalized and Adaptive Learning -- 3. X-FILEs Vision for Multi-modal Learning Formats -- 4. X-FILEs Vision for Extended/Cross Reality (XR) -- 5. X-FILEs Vision for Artificial Intelligence (AI) and Machine Learning (ML) -- 6. Cross-Cutting Concerns -- 7. Epilogue.
Record no. UNISA-996466564503316
Available at: Univ. di Salerno
OPAC: check availability here
Probability in Electrical Engineering and Computer Science [electronic resource] : An Application-Driven Course
Author Walrand Jean
Publication/distribution/printing Cham : Springer International Publishing AG, 2021
Physical description 1 online resource (390 p.)
Topical subject Maths for computer scientists
Communications engineering / telecommunications
Maths for engineers
Probability & statistics
Uncontrolled subject Probability and Statistics in Computer Science
Communications Engineering, Networks
Mathematical and Computational Engineering
Probability Theory and Stochastic Processes
Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences
Mathematical and Computational Engineering Applications
Probability Theory
Statistics in Engineering, Physics, Computer Science, Chemistry and Earth Sciences
Applied probability
Hypothesis testing
Detection theory
Expectation maximization
Stochastic dynamic programming
Machine learning
Stochastic gradient descent
Deep neural networks
Matrix completion
Linear and polynomial regression
Open Access
Maths for computer scientists
Mathematical & statistical software
Communications engineering / telecommunications
Maths for engineers
Probability & statistics
Stochastics
ISBN 3-030-49995-2
Format Printed material
Bibliographic level Monograph
Publication language eng
Record no. UNISA-996464521903316
Available at: Univ. di Salerno
OPAC: check availability here
Probability in Electrical Engineering and Computer Science : An Application-Driven Course
Author Walrand Jean
Edition [1st ed.]
Publication/distribution/printing Cham : Springer International Publishing AG, 2021
Physical description 1 online resource (390 p.)
Topical subject Maths for computer scientists
Communications engineering / telecommunications
Maths for engineers
Probability & statistics
Uncontrolled subject Probability and Statistics in Computer Science
Communications Engineering, Networks
Mathematical and Computational Engineering
Probability Theory and Stochastic Processes
Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences
Mathematical and Computational Engineering Applications
Probability Theory
Statistics in Engineering, Physics, Computer Science, Chemistry and Earth Sciences
Applied probability
Hypothesis testing
Detection theory
Expectation maximization
Stochastic dynamic programming
Machine learning
Stochastic gradient descent
Deep neural networks
Matrix completion
Linear and polynomial regression
Open Access
Maths for computer scientists
Mathematical & statistical software
Communications engineering / telecommunications
Maths for engineers
Probability & statistics
Stochastics
ISBN 3-030-49995-2
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Intro -- Preface -- Acknowledgements -- Introduction -- About This Second Edition -- Contents -- 1 PageRank: A -- 1.1 Model -- 1.2 Markov Chain -- 1.2.1 General Definition -- 1.2.2 Distribution After n Steps and Invariant Distribution -- 1.3 Analysis -- 1.3.1 Irreducibility and Aperiodicity -- 1.3.2 Big Theorem -- 1.3.3 Long-Term Fraction of Time -- 1.4 Illustrations -- 1.5 Hitting Time -- 1.5.1 Mean Hitting Time -- 1.5.2 Probability of Hitting a State Before Another -- 1.5.3 FSE for Markov Chain -- 1.6 Summary -- 1.6.1 Key Equations and Formulas -- 1.7 References -- 1.8 Problems -- 2 PageRank: B -- 2.1 Sample Space -- 2.2 Laws of Large Numbers for Coin Flips -- 2.2.1 Convergence in Probability -- 2.2.2 Almost Sure Convergence -- 2.3 Laws of Large Numbers for i.i.d. RVs -- 2.3.1 Weak Law of Large Numbers -- 2.3.2 Strong Law of Large Numbers -- 2.4 Law of Large Numbers for Markov Chains -- 2.5 Proof of Big Theorem -- 2.5.1 Proof of Theorem 1.1 (a) -- 2.5.2 Proof of Theorem 1.1 (b) -- 2.5.3 Periodicity -- 2.6 Summary -- 2.6.1 Key Equations and Formulas -- 2.7 References -- 2.8 Problems -- 3 Multiplexing: A -- 3.1 Sharing Links -- 3.2 Gaussian Random Variable and CLT -- 3.2.1 Binomial and Gaussian -- 3.2.2 Multiplexing and Gaussian -- 3.2.3 Confidence Intervals -- 3.3 Buffers -- 3.3.1 Markov Chain Model of Buffer -- 3.3.2 Invariant Distribution -- 3.3.3 Average Delay -- 3.3.4 A Note About Arrivals -- 3.3.5 Little's Law -- 3.4 Multiple Access -- 3.5 Summary -- 3.5.1 Key Equations and Formulas -- 3.6 References -- 3.7 Problems -- 4 Multiplexing: B -- 4.1 Characteristic Functions -- 4.2 Proof of CLT (Sketch) -- 4.3 Moments of N(0, 1) -- 4.4 Sum of Squares of 2 i.i.d. N(0, 1) -- 4.5 Two Applications of Characteristic Functions -- 4.5.1 Poisson as a Limit of Binomial -- 4.5.2 Exponential as Limit of Geometric -- 4.6 Error Function.
4.7 Adaptive Multiple Access -- 4.8 Summary -- 4.8.1 Key Equations and Formulas -- 4.9 References -- 4.10 Problems -- 5 Networks: A -- 5.1 Spreading Rumors -- 5.2 Cascades -- 5.3 Seeding the Market -- 5.4 Manufacturing of Consent -- 5.5 Polarization -- 5.6 M/M/1 Queue -- 5.7 Network of Queues -- 5.8 Optimizing Capacity -- 5.9 Internet and Network of Queues -- 5.10 Product-Form Networks -- 5.10.1 Example -- 5.11 References -- 5.12 Problems -- 6 Networks-B -- 6.1 Social Networks -- 6.2 Continuous-Time Markov Chains -- 6.2.1 Two-State Markov Chain -- 6.2.2 Three-State Markov Chain -- 6.2.3 General Case -- 6.2.4 Uniformization -- 6.2.5 Time Reversal -- 6.3 Product-Form Networks -- 6.4 Proof of Theorem 5.7 -- 6.5 References -- 7 Digital Link-A -- 7.1 Digital Link -- 7.2 Detection and Bayes' Rule -- 7.2.1 Bayes' Rule -- 7.2.2 Circumstances vs. Causes -- 7.2.3 MAP and MLE -- Example: Ice Cream and Sunburn -- 7.2.4 Binary Symmetric Channel -- 7.3 Huffman Codes -- 7.4 Gaussian Channel -- Simulation -- 7.4.1 BPSK -- 7.5 Multidimensional Gaussian Channel -- 7.5.1 MLE in Multidimensional Case -- 7.6 Hypothesis Testing -- 7.6.1 Formulation -- 7.6.2 Solution -- 7.6.3 Examples -- Gaussian Channel -- Mean of Exponential RVs -- Bias of a Coin -- Discrete Observations -- 7.7 Summary -- 7.7.1 Key Equations and Formulas -- 7.8 References -- 7.9 Problems -- 8 Digital Link-B -- 8.1 Proof of Optimality of the Huffman Code -- 8.2 Proof of Neyman-Pearson Theorem 7.4 -- 8.3 Jointly Gaussian Random Variables -- 8.3.1 Density of Jointly Gaussian Random Variables -- 8.4 Elementary Statistics -- 8.4.1 Zero-Mean? -- 8.4.2 Unknown Variance -- 8.4.3 Difference of Means -- 8.4.4 Mean in Hyperplane? -- 8.4.5 ANOVA -- 8.5 LDPC Codes -- 8.6 Summary -- 8.6.1 Key Equations and Formulas -- 8.7 References -- 8.8 Problems -- 9 Tracking-A -- 9.1 Examples -- 9.2 Estimation Problem.
9.3 Linear Least Squares Estimates -- 9.3.1 Projection -- 9.4 Linear Regression -- 9.5 A Note on Overfitting -- 9.6 MMSE -- 9.6.1 MMSE for Jointly Gaussian -- 9.7 Vector Case -- 9.8 Kalman Filter -- 9.8.1 The Filter -- 9.8.2 Examples -- Random Walk -- Random Walk with Unknown Drift -- Random Walk with Changing Drift -- Falling Object -- 9.9 Summary -- 9.9.1 Key Equations and Formulas -- 9.10 References -- 9.11 Problems -- 10 Tracking: B -- 10.1 Updating LLSE -- 10.2 Derivation of Kalman Filter -- 10.3 Properties of Kalman Filter -- 10.3.1 Observability -- 10.3.2 Reachability -- 10.4 Extended Kalman Filter -- 10.4.1 Examples -- 10.5 Summary -- 10.5.1 Key Equations and Formulas -- 10.6 References -- 11 Speech Recognition: A -- 11.1 Learning: Concepts and Examples -- 11.2 Hidden Markov Chain -- 11.3 Expectation Maximization and Clustering -- 11.3.1 A Simple Clustering Problem -- 11.3.2 A Second Look -- 11.4 Learning: Hidden Markov Chain -- 11.4.1 HEM -- 11.4.2 Training the Viterbi Algorithm -- 11.5 Summary -- 11.5.1 Key Equations and Formulas -- 11.6 References -- 11.7 Problems -- 12 Speech Recognition: B -- 12.1 Online Linear Regression -- 12.2 Theory of Stochastic Gradient Projection -- 12.2.1 Gradient Projection -- 12.2.2 Stochastic Gradient Projection -- 12.2.3 Martingale Convergence -- 12.3 Big Data -- 12.3.1 Relevant Data -- 12.3.2 Compressed Sensing -- 12.3.3 Recommendation Systems -- 12.4 Deep Neural Networks -- 12.4.1 Calculating Derivatives -- 12.5 Summary -- 12.5.1 Key Equations and Formulas -- 12.6 References -- 12.7 Problems -- 13 Route Planning: A -- 13.1 Model -- 13.2 Formulation 1: Pre-planning -- 13.3 Formulation 2: Adapting -- 13.4 Markov Decision Problem -- 13.4.1 Examples -- 13.5 Infinite Horizon -- 13.6 Summary -- 13.6.1 Key Equations and Formulas -- 13.7 References -- 13.8 Problems -- 14 Route Planning: B -- 14.1 LQG Control.
14.1.1 Letting N →∞ -- 14.2 LQG with Noisy Observations -- 14.2.1 Letting N →∞ -- 14.3 Partially Observed MDP -- 14.3.1 Example: Searching for Your Keys -- 14.4 Summary -- 14.4.1 Key Equations and Formulas -- 14.5 References -- 14.6 Problems -- 15 Perspective and Complements -- 15.1 Inference -- 15.2 Sufficient Statistic -- 15.2.1 Interpretation -- 15.3 Infinite Markov Chains -- 15.3.1 Lyapunov-Foster Criterion -- 15.4 Poisson Process -- 15.4.1 Definition -- 15.4.2 Independent Increments -- 15.4.3 Number of Jumps -- 15.5 Boosting -- 15.6 Multi-Armed Bandits -- 15.7 Capacity of BSC -- 15.8 Bounds on Probabilities -- 15.8.1 Applying the Bounds to Multiplexing -- 15.9 Martingales -- 15.9.1 Definitions -- 15.9.2 Examples -- 15.9.3 Law of Large Numbers -- 15.9.4 Wald's Equality -- 15.10 Summary -- 15.10.1 Key Equations and Formulas -- 15.11 References -- 15.12 Problems -- Correction to: Probability in Electrical Engineering and Computer Science -- Correction to: Probability in Electrical Engineering and Computer Science (Funding Information) -- A Elementary Probability -- A.1 Symmetry -- A.2 Conditioning -- A.3 Common Confusion -- A.4 Independence -- A.5 Expectation -- A.6 Variance -- A.7 Inequalities -- A.8 Law of Large Numbers -- A.9 Covariance and Regression -- A.10 Why Do We Need a More Sophisticated Formalism? -- A.11 References -- A.12 Solved Problems -- B Basic Probability -- B.1 General Framework -- B.1.1 Probability Space -- B.1.2 Borel-Cantelli Theorem -- B.1.3 Independence -- B.1.4 Converse of Borel-Cantelli Theorem -- B.1.5 Conditional Probability -- B.1.6 Random Variable -- B.2 Discrete Random Variable -- B.2.1 Definition -- B.2.2 Expectation -- B.2.3 Function of a RV -- B.2.4 Nonnegative RV -- B.2.5 Linearity of Expectation -- B.2.6 Monotonicity of Expectation -- B.2.7 Variance, Standard Deviation.
B.2.8 Important Discrete Random Variables -- B.3 Multiple Discrete Random Variables -- B.3.1 Joint Distribution -- B.3.2 Independence -- B.3.3 Expectation of Function of Multiple RVs -- B.3.4 Covariance -- B.3.5 Conditional Expectation -- B.3.6 Conditional Expectation of a Function -- B.4 General Random Variables -- B.4.1 Definitions -- B.4.2 Examples -- B.4.3 Expectation -- B.4.4 Continuity of Expectation -- B.5 Multiple Random Variables -- B.5.1 Random Vector -- B.5.2 Minimum and Maximum of Independent RVs -- B.5.3 Sum of Independent Random Variables -- B.6 Random Vectors -- B.6.1 Orthogonality and Projection -- B.7 Density of a Function of Random Variables -- B.7.1 Linear Transformations -- B.7.2 Nonlinear Transformations -- B.8 References -- B.9 Problems -- References -- Index.
Record no. UNINA-9910488709003321
Available at: Univ. Federico II
OPAC: check availability here
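As a small illustration of the material listed in the contents note above (Chapter 1, "PageRank: A", covering Markov chains and invariant distributions), here is a minimal Python/NumPy sketch of power iteration on a made-up three-page transition matrix. It is not code from the book; the matrix P and the iteration count are arbitrary choices for illustration.

    import numpy as np

    # Hypothetical 3-page web graph: P[i, j] = probability of moving from page i to page j.
    P = np.array([
        [0.0, 0.5, 0.5],
        [1.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
    ])

    # Power iteration: pi <- pi P converges to the invariant distribution
    # when the chain is irreducible and aperiodic (this example is both).
    pi = np.full(3, 1.0 / 3.0)   # start from the uniform distribution
    for _ in range(1000):
        pi = pi @ P
    pi /= pi.sum()               # guard against rounding drift

    print("PageRank / invariant distribution:", pi)   # satisfies pi = pi P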
Solving PDEs in Python [electronic resource] : The FEniCS Tutorial I / by Hans Petter Langtangen, Anders Logg
Author Langtangen Hans Petter
Edition [1st ed. 2016.]
Publication/distribution/printing Springer Nature, 2016
Physical description 1 online resource (XI, 146 p. 17 illus., 16 illus. in color.)
Discipline 004
Series Simula SpringerBriefs on Computing
Topical subject Computer mathematics
Algorithms
Mathematics
Visualization
Computer software
Numerical analysis
Software engineering
Computational Science and Engineering
Mathematical Software
Numerical Analysis
Software Engineering/Programming and Operating Systems
Uncontrolled subject Computational Science and Engineering
Algorithms
Visualization
Mathematical Software
Numerical Analysis
Software Engineering/Programming and Operating Systems
Data and Information Visualization
Software Engineering
Finite element
FEniCS
Partial Differential Equations
Python
Simulation
Open access
Maths for scientists
Combinatorics & graph theory
Mathematical & statistical software
Operating systems
ISBN 3-319-52462-3
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note 1 Preliminaries -- 2 Fundamentals: Solving the Poisson Equation -- 3 A Gallery of Finite Element Solvers -- 4 Subdomains and Boundary Conditions -- 5 Extensions: Improving the Poisson Solver -- References.
Record no. UNINA-9910169179003321
Available at: Univ. Federico II
OPAC: check availability here
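To give a flavor of what the contents note above lists as Chapter 2 ("Fundamentals: Solving the Poisson Equation"), here is a minimal sketch in the style of the FEniCS tutorial, using the legacy fenics (DOLFIN) Python API. It assumes a working FEniCS installation and the classic test problem with exact solution u = 1 + x^2 + 2y^2; it is an illustration, not a listing reproduced from the book.

    import numpy as np
    from fenics import *

    # Mesh and piecewise-linear Lagrange space on the unit square
    mesh = UnitSquareMesh(8, 8)
    V = FunctionSpace(mesh, "P", 1)

    # Dirichlet boundary data matching the exact solution u = 1 + x^2 + 2y^2
    u_D = Expression("1 + x[0]*x[0] + 2*x[1]*x[1]", degree=2)
    bc = DirichletBC(V, u_D, "on_boundary")

    # Variational problem: find u in V with a(u, v) = L(v) for all test functions v
    u = TrialFunction(V)
    v = TestFunction(V)
    f = Constant(-6.0)                  # -Laplacian of the exact solution
    a = dot(grad(u), grad(v)) * dx
    L = f * v * dx

    # Solve and report the maximum error at the mesh vertices
    u = Function(V)
    solve(a == L, u, bc)
    err = np.abs(u_D.compute_vertex_values(mesh) - u.compute_vertex_values(mesh)).max()
    print("max vertex error:", err)

For this particular test problem the computed vertex values agree with the exact solution up to rounding, which is what makes it a convenient verification case.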