2008 6th International Symposium on Chinese Spoken Language Processing
| Title | 2008 6th International Symposium on Chinese Spoken Language Processing |
| Publication/distribution | [Place of publication not identified] : IEEE, 2008 |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese |
| ISBN | 1-5090-8072-4; 1-4244-2943-9 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNISA-996203035203316 |
| Available at | Univ. di Salerno |

2008 6th International Symposium on Chinese Spoken Language Processing
| Title | 2008 6th International Symposium on Chinese Spoken Language Processing |
| Publication/distribution | [Place of publication not identified] : IEEE, 2008 |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese |
| ISBN | 9781509080724; 1509080724; 9781424429431; 1424429439 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNINA-9910145403803321 |
| Available at | Univ. Federico II |

2012 8th International Symposium on Chinese Spoken Language Processing
| Title | 2012 8th International Symposium on Chinese Spoken Language Processing |
| Publication/distribution | [Place of publication not identified] : IEEE, 2012 |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese |
| ISBN | 1-4673-2507-4; 1-4673-2505-8 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNISA-996279719603316 |
| Available at | Univ. di Salerno |

2012 8th International Symposium on Chinese Spoken Language Processing
| Title | 2012 8th International Symposium on Chinese Spoken Language Processing |
| Publication/distribution | [Place of publication not identified] : IEEE, 2012 |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese |
| ISBN | 9781467325073; 1467325074; 9781467325059; 1467325058 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNINA-9910133593603321 |
| Available at | Univ. Federico II |

2014 9th International Symposium on Chinese Spoken Language Processing (ISCSLP 2014) : 12-14 September 2014, Singapore / edited by Minghui Dong, Jianhua Tao, Haizhou Li, Thomas Fang Zheng and Yanfeng Lu

| Title | 2014 9th International Symposium on Chinese Spoken Language Processing (ISCSLP 2014) : 12-14 September 2014, Singapore / edited by Minghui Dong, Jianhua Tao, Haizhou Li, Thomas Fang Zheng and Yanfeng Lu |
| Publication/distribution | IEEE |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese; Speech processing systems; Automatic speech recognition |
| ISBN | 1-4799-4219-7 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Variant titles | The 9th International Symposium on Chinese Spoken Language Processing; Chinese Spoken Language Processing |
| Record no. | UNISA-996279324203316 |
| Available at | Univ. di Salerno |

2014 9th International Symposium on Chinese Spoken Language Processing (ISCSLP 2014) : 12-14 September 2014, Singapore / edited by Minghui Dong, Jianhua Tao, Haizhou Li, Thomas Fang Zheng and Yanfeng Lu

| Title | 2014 9th International Symposium on Chinese Spoken Language Processing (ISCSLP 2014) : 12-14 September 2014, Singapore / edited by Minghui Dong, Jianhua Tao, Haizhou Li, Thomas Fang Zheng and Yanfeng Lu |
| Publication/distribution | IEEE |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese; Chinese language - Technical Chinese; Speech processing systems; Automatic speech recognition |
| ISBN | 9781479942190; 1479942197 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Variant titles | The 9th International Symposium on Chinese Spoken Language Processing; Chinese Spoken Language Processing |
| Record no. | UNINA-9910135091403321 |
| Available at | Univ. Federico II |

ISCSLP : 2018 11th International Symposium on Chinese Spoken Language Processing : November 26-29, 2018, Academia Sinica, Taiwan / Institute of Electrical and Electronics Engineers

| Title | ISCSLP : 2018 11th International Symposium on Chinese Spoken Language Processing : November 26-29, 2018, Academia Sinica, Taiwan / Institute of Electrical and Electronics Engineers |
| Publication/distribution | Piscataway, New Jersey : Institute of Electrical and Electronics Engineers, 2018 |
| Physical description | 1 online resource (xxiii, 504 pages) |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese |
| ISBN | 1-5386-5627-2 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNINA-9910332519003321 |
| Available at | Univ. Federico II |

ISCSLP : 2018 11th International Symposium on Chinese Spoken Language Processing : November 26-29, 2018, Academia Sinica, Taiwan / Institute of Electrical and Electronics Engineers

| Title | ISCSLP : 2018 11th International Symposium on Chinese Spoken Language Processing : November 26-29, 2018, Academia Sinica, Taiwan / Institute of Electrical and Electronics Engineers |
| Publication/distribution | Piscataway, New Jersey : Institute of Electrical and Electronics Engineers, 2018 |
| Physical description | 1 online resource (xxiii, 504 pages) |
| Discipline | 495.10285 |
| Subject headings | Chinese language - Machine translating; Chinese language - Spoken Chinese |
| ISBN | 1-5386-5627-2 |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Record no. | UNISA-996577953903316 |
| Available at | Univ. di Salerno |

Machine translation : 18th China conference, CCMT 2022, Lhasa, China, August 6-10, 2022 : revised selected papers / Tong Xiao and Juan Pino

| Title | Machine translation : 18th China conference, CCMT 2022, Lhasa, China, August 6-10, 2022 : revised selected papers / Tong Xiao and Juan Pino |
| Author | Xiao Tong |
| Publication/distribution | Singapore : Springer, [2022] |
| Physical description | 1 online resource (175 pages) |
| Discipline | 495.10285 |
| Series | Communications in Computer and Information Science |
| Subject headings | Chinese language - Machine translating; Machine translating |
| ISBN | 981-19-7960-X |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Contents note |
Intro -- Preface -- Organization -- Contents -- PEACook: Post-editing Advancement Cookbook -- 1 Introduction -- 2 Related Work -- 2.1 APE Problem and APE Metrics -- 2.2 APE Baselines -- 3 PEACook Corpus -- 3.1 PEACook Corpus Details -- 4 Baseline Model Experiments -- 4.1 Pre-training AR-APE Model -- 4.2 Fine-Tuning AR-APE Model -- 4.3 Pre-training NAR-APE Model -- 4.4 Fine-Tuning NAR-APE Model -- 5 Conclusion -- References -- Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 NMT -- 2.2 Transfer Learning -- 2.3 Pre-train Techniques -- 3 Methods -- 3.1 Word Alignment Under Hot-Start -- 3.2 Approximate Distillation -- 4 Experiment -- 4.1 Settings -- 4.2 Results and Analysis -- 4.3 Ablation Test -- 4.4 Case Analysis -- 5 Conclusion -- References -- Review-Based Curriculum Learning for Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Review-Based Curriculum Learning -- 3.1 Time-Based Review Method -- 3.2 Master-Based Review Method -- 3.3 General Domain Enhanced Training -- 4 Experiment -- 4.1 Data and Setup -- 4.2 Main Results -- 5 Analysis -- 5.1 Effect of Mixed Fine Tuning -- 5.2 Low-Resource Scenario -- 5.3 Data Sharding -- 5.4 Training Efficiency -- 6 Conclusion -- References -- Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages -- 1 Introduction -- 2 Dataset -- 3 System Overview -- 3.1 Back-Translation -- 3.2 Alternated Training -- 3.3 Ensemble -- 4 Experiments -- 4.1 Mongolian-Chinese -- 4.2 Tibetan-Chinese -- 4.3 Uyghur-Chinese -- 5 Analysis -- 5.1 The Effect of Different Back-Translation Methods -- 5.2 The Impact of Sentence Segmentation on the Translation Quality of Machine Translation -- 5.3 Analysis of BLEU Scores of Mongolian-Chinese Machine Translation on the Development Set.
6 Conclusion -- References -- Target-Side Language Model for Reference-Free Machine Translation Evaluation -- 1 Introduction -- 2 Target-Side Language Model Metrics -- 3 Experiments -- 3.1 Datasets and Baselines -- 3.2 Results -- 3.3 Discussion -- 4 Conclusion -- References -- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement -- 1 Introduction -- 2 Prerequisite -- 2.1 Neural Machine Translation with mRASP -- 2.2 Diversification Method -- 2.3 Curvature -- 3 Methodology -- 3.1 Overall Structure -- 3.2 Curvature Based Checkpoint Hijack -- 4 Experiments -- 4.1 Dataset Description and Finetune Parameters -- 4.2 Experiment Result -- 5 Conclusion -- References -- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples -- 1 Introduction -- 2 Background and Related Work -- 2.1 Neural Machine Translation -- 2.2 Adversarial Example, Adversarial Attack and Adversarial Training in NLP -- 2.3 Genetic Algorithm-Based Adversarial Attack -- 2.4 Gradient-Based Adversarial Attack -- 3 Adversarial Examples Based on Reinforcement Learning -- 3.1 Reinforcement Learning -- 3.2 Environment -- 3.3 Agent -- 4 Experiment -- 4.1 Data Preprocessing -- 4.2 NMT Model -- 4.3 Evaluating Indicator -- 4.4 Adversarial Attack Results and Analysis -- 4.5 Adversarial Training Results and Analysis -- 4.6 Ablation Study -- 5 Conclusion -- References -- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 Non-autoregressive Neural Machine Translation -- 2.2 Curriculum Learning -- 3 Method -- 3.1 Model -- 3.2 Dynamic Mask Curriculum Learning -- 3.3 Train and Inference -- 4 Experiment -- 4.1 Data Preparation -- 4.2 Configuration -- 4.3 Baseline -- 4.4 Results -- 5 Analysis -- 5.1 Mask Strategy -- 5.2 Method Generality. 
6 Conclusion -- References -- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory -- 1 Introduction -- 2 Background -- 3 Method -- 3.1 Dempster-Shafer Theory -- 3.2 Label Smoothing -- 4 Experiment -- 4.1 Experimental Setup -- 4.2 Result and Analysis -- 4.3 Robustness -- 4.4 Case Study -- 5 Conclusion -- References -- A Multi-tasking and Multi-stage Chinese Minority Pre-trained Language Model -- 1 Introduction -- 2 Related Work -- 2.1 Pre-trained Language Model -- 2.2 Multilingual Model -- 2.3 Chinese Minority Languages -- 3 Main Methods -- 3.1 Model Architecture -- 3.2 Multi-tasking Multi-stage Pre-training -- 3.3 Model Parameter Details -- 3.4 Model Setting Details -- 4 Experiments -- 4.1 Main Results -- 4.2 Case Study -- 5 Conclusion -- References -- An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation -- 1 Introduction -- 2 Related Works -- 3 PE Based Multi-task Learning for Sentence Level QE -- 3.1 Multi-task Learning Framework for QE -- 3.2 PE Based Multi-task Learning QE -- 3.3 Multi-model Ensemble -- 4 Experiments -- 4.1 Dataset -- 4.2 Model Training and Evaluation Metric -- 4.3 Experimental Results and Analysis -- 4.4 Ablation Study -- 5 Conclusion -- References -- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation -- 1 Introduction -- 2 Background -- 2.1 Transformer -- 2.2 Low-Resource NMT -- 2.3 Parameter Initialization for Deep Transformers -- 2.4 Deep Transformers for Low-Resource Tasks -- 3 Our Work -- 3.1 Data Processing -- 3.2 Exploration of Training Settings -- 3.3 Deep Transformers for Low-Resource Machine Translation -- 4 Related Work -- 5 Conclusion -- References -- CCMT 2022 Translation Quality Estimation Task -- 1 Introduction -- 2 Estimation System -- 3 Data -- 4 Method -- 4.1 System Training -- 4.2 System Test -- 5 Experiment -- 5.1 System Environment. 5.2 Experiment Settings -- 5.3 Experiment Result -- 6 Conclusion -- References -- Effective Data Augmentation Methods for CCMT 2022 -- 1 Introduction -- 2 System Architecture -- 3 Methods -- 3.1 Data Augmentation -- 3.2 CE Task and EC Task -- 3.3 CThai Task and ThaiC Task -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Pre-processing -- 4.3 Experimental Results -- 5 Conclusion -- References -- NJUNLP's Submission for CCMT 2022 Quality Estimation Task -- 1 Introduction -- 2 Methods -- 2.1 Existing Methods -- 2.2 Proposed Methods -- 3 Experiments -- 3.1 Dataset -- 3.2 Settings -- 3.3 Single Model Results -- 3.4 Ensemble -- 3.5 Analysis -- 4 Conclusion -- References -- ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT' 2022 -- 1 Introduction -- 2 System Architecture -- 2.1 Baseline System -- 2.2 Our System -- 3 Methods -- 3.1 Back Translation -- 3.2 Add External Data -- 3.3 Model Averaging -- 3.4 Model Ensemble Strategy -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Preprocessing -- 4.3 Experimental Results -- 4.4 Conclusion -- References -- Author Index. |
| Record no. | UNISA-996503566103316 |
| Available at | Univ. di Salerno |

Machine Translation : 18th China Conference, CCMT 2022, Lhasa, China, August 6–10, 2022, Revised Selected Papers / edited by Tong Xiao, Juan Pino

| Title | Machine Translation : 18th China Conference, CCMT 2022, Lhasa, China, August 6–10, 2022, Revised Selected Papers / edited by Tong Xiao, Juan Pino |
| Author | Xiao Tong |
| Edition | [1st ed. 2022.] |
| Publication/distribution | Singapore : Springer Nature Singapore : Imprint: Springer, 2022 |
| Physical description | 1 online resource (175 pages) |
| Discipline | 495.10285 |
| Series | Communications in Computer and Information Science |
| Subject headings | Natural language processing (Computer science); Database management; Computer science; Coding theory; Information theory; Social sciences - Data processing; Natural Language Processing (NLP); Database Management; Computer Science Logic and Foundations of Programming; Coding and Information Theory; Computer Application in Social and Behavioral Sciences |
| ISBN | 9789811979606; 981197960X |
| Format | Printed material |
| Bibliographic level | Monograph |
| Language of publication | eng |
| Contents note | PEACook: Post-Editing Advancement Cookbook -- Hot-start Transfer Learning combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- Review-based Curriculum Learning for Neural Machine Translation -- Multi-Strategy Enhanced Neural Machine Translation for Chinese Minority Language -- Target-side Language Model for Reference-free Machine Translation Evaluation -- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement -- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples -- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation -- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster–Shafer Theory -- A Multi-tasking and Multi-stage Chinese Minority Pre-Trained Language Model -- An improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation -- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation -- HW-TSC Submission for CCMT 2022 Translation Quality Estimation Task -- Effective Data Augmentation Methods for CCMT 2022 -- NJUNLP’s Submission for CCMT 2022 Quality Estimation Task -- ISTIC’s Thai-to-Chinese Neural Machine Translation System for CCMT’ 2022. |
| Record no. | UNINA-9910634040703321 |
| Available at | Univ. Federico II |