LEADER 02090nam 2200457 450
001 9910704720203321
005 20210520112444.0
035 $a(CKB)5470000002444526
035 $a(OCoLC)869305420
035 $a(EXLCZ)995470000002444526
100 $a20140128d2008 ua 0
101 0 $aeng
135 $aurcn|||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 00$aFinal environmental impact statement $esite-specific invasive plant treatments for Mt. Hood National Forest and Columbia River Gorge National Scenic Area in Oregon, including Forest Plan Amendment #16 : Mt. Hood National Forest and Columbia River Gorge National Scenic Area, Clackamas, Hood River, Multnomah, and Wasco Counties /$fUnited States Department of Agriculture, Forest Service
210 1$a[Sandy, OR] :$cUnited States Department of Agriculture, Forest Service,$d[2008]
215 $a1 online resource (906 unnumbered pages) $ccolor illustrations, maps
300 $aTitle from title screen (viewed on Jan. 28, 2014).
300 $a"March 2008."
320 $aIncludes bibliographical references.
517 $aFinal environmental impact statement
606 $aInvasive plants$xControl$zOregon$zMount Hood National Forest
606 $aInvasive plants$xControl$zColumbia River Gorge National Scenic Area (Or. and Wash.)
606 $aPlant invasions$zOregon$zMount Hood National Forest$xPrevention
606 $aPlant invasions$zColumbia River Gorge National Scenic Area (Or. and Wash.)
607 $aMount Hood National Forest (Or.)
607 $aColumbia River Gorge National Scenic Area (Or. and Wash.)
608 $aEnvironmental impact statements.$2lcgft
615 0$aInvasive plants$xControl
615 0$aInvasive plants$xControl
615 0$aPlant invasions$xPrevention.
615 0$aPlant invasions
712 02$aUnited States.$bForest Service,
801 0$bGPO
801 1$bGPO
906 $aBOOK
912 $a9910704720203321
996 $aFinal environmental impact statement$93145385
997 $aUNINA
LEADER 02824nam 22005172 450
001 9910160959203321
005 20151005020624.0
010 $a0-511-65903-2
035 $a(CKB)3460000000080916
035 $a(SSID)ssj0000699709
035 $a(PQKBManifestationID)11426018
035 $a(PQKBTitleCode)TC0000699709
035 $a(PQKBWorkID)10657727
035 $a(PQKB)10543733
035 $a(UkCbUP)CR9780511659034
035 $a(BIP)035624480
035 $a(PPN)261370340
035 $a(Exl-AI)993460000000080916
035 $a(EXLCZ)993460000000080916
100 $a20091209d1893|||| uy| 0
101 0 $aeng
135 $aur|||||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aNotes on Recent Researches in Electricity and Magnetism $eIntended as a Sequel to Professor Clerk-Maxwell's Treatise on Electricity and Magnetism /$fJoseph John Thomson
210 1$aPlace of publication not identified :$cpublisher not identified,$d1893.
210 1$aCambridge :$cCambridge University Press
215 $a1 online resource (600 pages) $cdigital, PDF file(s)
225 1 $aCambridge library collection. Physical Sciences
300 $aTitle from publisher's bibliographic system (viewed on 05 Oct 2015).
311 08$a1-108-01520-4
330 $aThis 1893 publication is a central text in the work of the Nobel prize winning physicist Sir Joseph John Thomson (1858-1940). Intended as an extension of James Clerk Maxwell's Treatise on Electricity and Magnetism, it documents the important shift in Thomson's thinking towards the model of the atomic electric field, a theory that would eventually lead to his discovery of the electron. In Chapter 1, Thomson documents his experiments with Faraday tubes, using them to physically demonstrate a 'molecular theory of electricity'. Chapter 2 considers the discharge of electricity through gases, Chapter 3 theories of electrostatics, and Chapters 4-6 are primarily concerned with alternating currents. In addition to providing crucial insight into Thomson's evolving theory of the atom, Recent Researches underscores his commitment to experimental physics, which offers 'all the advantages in vividness which arise from concrete qualities rather than abstract symbols'.
410 0$aCambridge library collection.$pPhysical Sciences.
517 3 $aNotes on Recent Researches in Electricity & Magnetism
606 $aAtomic theory$7Generated by AI
606 $aElectrons$7Generated by AI
615 0$aAtomic theory
615 0$aElectrons
700 $aThomson$b Joseph John$0289256
801 0$bUkCbUP
801 1$bUkCbUP
906 $aBOOK
912 $a9910160959203321
996 $aNotes on Recent Researches in Electricity and Magnetism$92576915
997 $aUNINA
LEADER 12545nam 22007455 450
001 9910586578803321
005 20251113181653.0
010 $a981-19-5194-2
024 7 $a10.1007/978-981-19-5194-7
035 $a(MiAaPQ)EBC7072348
035 $a(Au-PeEL)EBL7072348
035 $a(CKB)24360909600041
035 $a(PPN)264192206
035 $a(OCoLC)1340956697
035 $a(DE-He213)978-981-19-5194-7
035 $a(EXLCZ)9924360909600041
100 $a20220729d2022 u| 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aData Science $e8th International Conference of Pioneering Computer Scientists, Engineers and Educators, ICPCSEE 2022, Chengdu, China, August 19-22, 2022, Proceedings, Part I /$fedited by Yang Wang, Guobin Zhu, Qilong Han, Hongzhi Wang, Xianhua Song, Zeguang Lu
205 $a1st ed. 2022.
210 1$aSingapore :$cSpringer Nature Singapore :$cImprint: Springer,$d2022.
215 $a1 online resource (455 pages)
225 1 $aCommunications in Computer and Information Science,$x1865-0937 ;$v1628
311 08$aPrint version: Wang, Yang Data Science Singapore : Springer,c2022 9789811951930
327 $aIntro -- Preface -- Organization -- Contents - Part I -- Contents - Part II -- Big Data Mining and Knowledge Management -- Self-attention Based Multimodule Fusion Graph Convolution Network for Traffic Flow Prediction -- 1 Introduction -- 2 Spatiotemporal Prediction in Deep Learning -- 2.1 Time Correlation Research -- 2.2 Time Correlation Research -- 3 Prediction Model of Traffic Flow Based on Multi-module Fusion -- 3.1 Model Frame Diagram -- 3.2 Space-Time Decoupling -- 3.3 Spatial Convolution -- 3.4 Spatial Self-attention -- 3.5 Temporal Convolution -- 3.6 Time Self-attention -- 3.7 Information Fusion and GRU -- 4 Experimental Analysis -- 4.1 Dataset -- 4.2 Analysis of Results -- 5 Conclusion -- References -- Data Analyses and Parallel Optimization of the Tropical-Cyclone Coupled Numerical Model -- 1 Introduction -- 1.1 A Subsection Sample -- 2 Model Setup -- 2.1 Atmospheric Model Setup -- 2.2 Hydrodynamic Model Setup -- 2.3 Ocean Wave Model Setup -- 2.4 HPC Facilities -- 2.5 Coupled Variables -- 3 Scaling Experiments -- 3.1 Parallel Tests Analysis -- 3.2 SWAN Model Parallel Algorithm Optimization -- 3.3 Ocean Model Grid Optimization -- 4 Parallel Test Results -- 5 Model Results Discussion -- 6 Conclusion -- References -- Factorization Machine Based on Bitwise Feature Importance for CTR Prediction -- 1 Introduction -- 2 Related Work -- 3 Our Approach -- 3.1 Embedding Layer -- 3.2 Learning -- 4 Experiments -- 4.1 Experimental Settings -- 4.2 Hyperparameter Study -- 4.3 Ablation Study -- 4.4 Performance Comparison -- 5 Conclusion -- References -- Focusing on the Importance of Features for CTR Prediction -- 1 Introduction -- 2 ECABiNet Model -- 2.1 Sparse Input and Embedding Layer -- 2.2 Layer Norm -- 2.3 ECANET Layer -- 2.4 Feature Cross Layer -- 2.5 DNN Layer -- 2.6 Output -- 3 Experiment -- 3.1 Experimental Setup.
327 $a3.2 LayerNorm Effect Comparison -- 3.3 Comparison of the Effects of Different Attention Modules -- 3.4 Comparison of the Classic Model -- 3.5 Study HyperParameter -- 4 Related Work -- 5 Conclusions -- References -- Active Anomaly Detection Technology Based on Ensemble Learning -- 1 Introduction -- 2 Problem Statement -- 3 Proposed Model -- 3.1 Supervised Ensemble Learning Model -- 3.2 Human Participation -- 3.3 Model Self-training -- 3.4 Experiment -- 3.5 Conclusion -- References -- Automatic Generation of Graduation Thesis Comments Based on Multilevel Analysis -- 1 Introduction -- 2 Technical Principle -- 2.1 BERT Model Introduced -- 2.2 Basic Structure of the BERT Model -- 2.3 Comparison with Other Algorithms -- 3 Project Analysis -- 3.1 Technical Route -- 3.2 Technical Analysis -- 4 Project Implementation -- 4.1 Database Established Modification -- 4.2 Student Information Input -- 4.3 Neural Network Training -- 4.4 Automatically Generate Comments -- 5 Conclusion -- References -- A Survey of Malware Classification Methods Based on Data Flow Graph -- 1 Introduction -- 2 Data Flow Graph -- 2.1 Basic Concepts of the Data Flow Graph -- 2.2 Data Flow Graph Corresponding to Common APIs -- 2.3 Extension of Data Flow Graphs -- 3 Malware Classification Based on Data Flow Graph -- 3.1 User-Defined Data Flow Graph Feature-based Malware Classification -- 3.2 Data Flow Graph Similarity-Based Malware Classification -- 3.3 Graph Neural Network-Based Malware Classification -- 4 Discussion -- 5 Conclusion -- References -- Anomaly Detection of Multivariate Time Series Based on Metric Learning -- 1 Introduction -- 2 Preliminaries -- 3 Proposed Model -- 3.1 Preprocessing -- 3.2 Encoder for High-Dimensional Time Series Data -- 3.3 Attentional Center Learning -- 3.4 Loss Function -- 3.5 Semisupervised Learning -- 4 Experiments -- 4.1 Dataset -- 4.2 Setup -- 4.3 Result. 
327 $a5 Conclusion -- References -- Social Network Analysis of Coauthor Networks in Inclusive Finance in China -- 1 Introduction and Motivation -- 2 Data Collection and Preprocessing -- 3 Results -- 3.1 General Characteristics of the Coauthor Network -- 3.2 Ego Characteristics of the Coauthor Network -- 3.3 The Evolution of Cohesive Subgroups in the Coauthor Network -- 4 Conclusions -- References -- Multirelationship Aware Personalized Recommendation Model -- 1 Introduction -- 2 Preliminary Preparation -- 2.1 Problem Definition -- 2.2 Data Preprocessing -- 2.3 User Relationship Graphs -- 3 Modeling and Training -- 3.1 MrAPR Model -- 3.2 Model Training -- 4 Experiment -- 4.1 Dataset -- 4.2 Baselines and Evaluation Metrics -- 4.3 Parameter Settings -- 4.4 Ablation Experiments -- 5 Conclusion -- References -- Machine Learning for Data Science -- Preliminary Study on Adapting ProtoPNet to Few-Shot Learning Using MAML -- 1 Introduction -- 2 Related Work -- 2.1 Few-Shot Learning -- 2.2 Interpretability -- 3 Proposed Methods -- 3.1 Adapting ProtoPNet to MAML -- 3.2 Evaluating Models -- 4 Experiments -- 4.1 Datasets -- 4.2 Experiment 1: Omniglot Few-Shot Classification -- 4.3 Experiment 2: MiniImagenet Few-Shot Classification -- 4.4 Experiment 3: Interpretability Analysis on Omniglot -- 4.5 Experiment 4: Preliminary Interpretability Analysis on MiniImagenet -- 5 Conclusion and Future Work -- References -- A Preliminary Study of Interpreting CNNs Using Soft Decision Trees -- 1 Introduction -- 2 Related Work -- 3 Proposed Methods -- 3.1 Model Foundations -- 3.2 Using Normal/Soft Decision Trees to Interpret CNNs -- 3.3 Evaluating Interpretability -- 4 Experiments -- 4.1 Dataset and Experimental Setup -- 4.2 Experiment 1: Classification Performance -- 4.3 Experiment 2: Visualization of Normal/Soft Decision Trees' Top Features.
327 $a4.4 Experiment 3: Interpretability Performance -- 4.5 Experiment 4: Scores of Human Experts on Tag Clarity -- 5 Conclusion and Future Work -- References -- Deep Reinforcement Learning with Fuse Adaptive Weighted Demonstration Data -- 1 Introduction -- 2 Related Work -- 2.1 Deep Reinforcement Learning -- 2.2 Multiagent Reinforcement Learning -- 3 Methods -- 4 Experimental Results and Analysis -- 4.1 Experimental Environment and Data -- 4.2 Results and Analysis -- 5 Discussion -- References -- DRIB: Interpreting DNN with Dynamic Reasoning and Information Bottleneck -- 1 Introduction -- 2 Related Works -- 2.1 Explain the Existing Deep Learning Models -- 2.2 Construction of Interpretable Deep Learning Models -- 3 Method -- 3.1 Dynamic Reasoning Decision Module -- 3.2 Information Bottleneck Verification Module -- 4 Experiments -- 4.1 Experimental Settings -- 4.2 Interpretability of Calculation in Dynamic Reasoning Decision -- 4.3 Explainability of Attribution in the Information Bottleneck -- 4.4 Visualization of Understandability -- 5 Conclusion -- References -- Multimedia Data Management and Analysis -- Advanced Generative Adversarial Network for Image Superresolution -- 1 Introduction -- 2 Related Work -- 3 GAN and SRGAN -- 4 Proposed Method -- 4.1 Generator Network Structure -- 4.2 Discriminator Network Structure -- 4.3 Loss Function -- 5 Experiment Results and Analysis -- 5.1 Implementation Details -- 5.2 Datasets and Evaluation Metrics -- 5.3 Experimental Results and Analysis -- 5.4 Ablation Study -- 6 Conclusions -- References -- Real-World Superresolution by Using Deep Degradation Learning -- 1 Introduction -- 2 Related Work -- 2.1 Real-World Superresolution -- 2.2 Contrastive Learning -- 3 Proposed Method -- 3.1 Overview of the Unsupervised Framework -- 3.2 Degradation Model -- 3.3 Reconstruction Model -- 4 Experiments -- 4.1 Training Data.
327 $a4.2 Training Details -- 4.3 Training Details -- 5 Conclusion -- References -- Probability Loop Closure Detection with Fisher Kernel Framework for Visual SLAM -- 1 Introduction -- 2 Related Works -- 3 Methodology -- 3.1 Fisher Vector Generation -- 3.2 Probability Visual Vocabulary -- 3.3 Loop Closure Detection -- 4 Results and Discussion -- 4.1 Dataset and Preprocessing -- 4.2 Evaluation Metrics -- 4.3 2D Motion -- 4.4 3D Motion -- 4.5 Bidirectional Loops -- 4.6 Ablation Study -- 5 Conclusions -- References -- A Complex Background Image Registration Method Based on the Optical Flow Field Algorithm -- 1 Introduction -- 2 The Proposed Method -- 3 Evaluation Functions -- 4 Experimental Results -- 5 Conclusion -- References -- Collaborative Learning Method for Natural Image Captioning -- 1 Introduction -- 2 Related Work -- 3 Method -- 3.1 P2PM: Pix2Pix Inverting Module -- 3.2 NLGM: Natural Language Generation Module -- 3.3 Collaborative Learning Framework -- 4 Experiments -- 4.1 Experimental Setup -- 4.2 Main Results -- 5 Conclusion -- References -- Visual Analysis of the National Characteristics of the COVID-19 Vaccine Based on Knowledge Graph -- 1 Introduction -- 2 Related Studies -- 3 Construction of the COVID-19 Vaccine Knowledge Graph -- 3.1 Data Acquisition -- 3.2 Entity Extraction to Construct a Relational Model -- 3.3 Knowledge Graph Establishment -- 4 Visual Analysis of the National Characteristics of the COVID-19 Vaccine -- 5 Conclusions and Recommendations -- References -- Speech Recognition for Parkinson's Disease Based on Improved Genetic Algorithm and Data Enhancement Technology -- 1 Introduction -- 2 The Proposed Methods -- 2.1 Method -- 3 Speech Recognition and Diagnosis -- 3.1 Data Preprocessing -- 3.2 Improved GA-SVM Model -- 3.3 Speech Recognition Algorithm -- 4 Experiment and Evaluation -- 4.1 Experiment Setup.
327 $a4.2 Comparison and Analysis of Results.
330 $aThis two-volume set (CCIS 1628 and 1629) constitutes the refereed proceedings of the 8th International Conference of Pioneering Computer Scientists, Engineers and Educators, ICPCSEE 2022 held in Chengdu, China, in August 2022. The 65 full papers and 26 short papers presented in these two volumes were carefully reviewed and selected from 261 submissions. The papers are organized in topical sections on: Big Data Mining and Knowledge Management; Machine Learning for Data Science; Multimedia Data Management and Analysis.
410 0$aCommunications in Computer and Information Science,$x1865-0937 ;$v1628
606 $aData mining
606 $aApplication software
606 $aMachine learning
606 $aEducation$xData processing
606 $aSocial sciences$xData processing
606 $aData Mining and Knowledge Discovery
606 $aComputer and Information Systems Applications
606 $aMachine Learning
606 $aComputers and Education
606 $aComputer Application in Social and Behavioral Sciences
615 0$aData mining.
615 0$aApplication software.
615 0$aMachine learning.
615 0$aEducation$xData processing.
615 0$aSocial sciences$xData processing.
615 14$aData Mining and Knowledge Discovery.
615 24$aComputer and Information Systems Applications.
615 24$aMachine Learning.
615 24$aComputers and Education.
615 24$aComputer Application in Social and Behavioral Sciences.
676 $a005.7
702 $aWang$b Yang
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910586578803321
996 $aData Science$91562261
997 $aUNINA