Machine translation [electronic resource] / Bożena Henisz-Dostert, R. Ross Macdonald, Michael Zarechnak
Author Henisz-Dostert, Bożena
Edition [Reprint 2011]
Publication/distribution The Hague ; New York : Mouton, c1979
Physical description 1 online resource (280 p.)
Discipline 418.02
Other authors (persons) Macdonald, R. Ross <1922-> (Roderick Ross)
Zarechnak, Michael
Series Trends in Linguistics. Studies and Monographs [TiLSM]
Trends in linguistics.
Topical subject Machine translating
ISBN 3-11-081667-9
Classification ES 960
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note pt. 1. The history of machine translation / Michael Zarechnak -- pt. 2. The problem of machine translation / R. Ross Macdonald -- pt. 3. Users' evaluation of machine translation / Bożena Henisz-Dostert.
Record no. UNINA-9910785608003321
Available at: Univ. Federico II
Machine translation / Bożena Henisz-Dostert, R. Ross Macdonald, Michael Zarechnak
Author Henisz-Dostert, Bożena
Edition [Reprint 2011]
Publication/distribution The Hague ; New York : Mouton, c1979
Physical description 1 online resource (280 p.)
Discipline 418.02
Other authors (persons) Macdonald, R. Ross <1922-> (Roderick Ross)
Zarechnak, Michael
Series Trends in Linguistics. Studies and Monographs [TiLSM]
Trends in linguistics.
Topical subject Machine translating
ISBN 3-11-081667-9
Classification ES 960
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note pt. 1. The history of machine translation / Michael Zarechnak -- pt. 2. The problem of machine translation / R. Ross Macdonald -- pt. 3. Users' evaluation of machine translation / Bożena Henisz-Dostert.
Record no. UNINA-9910823354503321
Available at: Univ. Federico II
Machine translation : 18th China conference, CCMT 2022, Lhasa, China, August 6-10, 2022 : revised selected papers / Tong Xiao and Juan Pino
Author Xiao, Tong
Publication/distribution Singapore : Springer, [2022]
Physical description 1 online resource (175 pages)
Discipline 495.10285
Series Communications in Computer and Information Science
Topical subject Chinese language - Machine translating
Machine translating
ISBN 981-19-7960-X
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Intro -- Preface -- Organization -- Contents -- PEACook: Post-editing Advancement Cookbook -- 1 Introduction -- 2 Related Work -- 2.1 APE Problem and APE Metrics -- 2.2 APE Baselines -- 3 PEACook Corpus -- 3.1 PEACook Corpus Details -- 4 Baseline Model Experiments -- 4.1 Pre-training AR-APE Model -- 4.2 Fine-Tuning AR-APE Model -- 4.3 Pre-training NAR-APE Model -- 4.4 Fine-Tuning NAR-APE Model -- 5 Conclusion -- References -- Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 NMT -- 2.2 Transfer Learning -- 2.3 Pre-train Techniques -- 3 Methods -- 3.1 Word Alignment Under Hot-Start -- 3.2 Approximate Distillation -- 4 Experiment -- 4.1 Settings -- 4.2 Results and Analysis -- 4.3 Ablation Test -- 4.4 Case Analysis -- 5 Conclusion -- References -- Review-Based Curriculum Learning for Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Review-Based Curriculum Learning -- 3.1 Time-Based Review Method -- 3.2 Master-Based Review Method -- 3.3 General Domain Enhanced Training -- 4 Experiment -- 4.1 Data and Setup -- 4.2 Main Results -- 5 Analysis -- 5.1 Effect of Mixed Fine Tuning -- 5.2 Low-Resource Scenario -- 5.3 Data Sharding -- 5.4 Training Efficiency -- 6 Conclusion -- References -- Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages -- 1 Introduction -- 2 Dataset -- 3 System Overview -- 3.1 Back-Translation -- 3.2 Alternated Training -- 3.3 Ensemble -- 4 Experiments -- 4.1 Mongolian → Chinese -- 4.2 Tibetan → Chinese -- 4.3 Uyghur → Chinese -- 5 Analysis -- 5.1 The Effect of Different Back-Translation Methods -- 5.2 The Impact of Sentence Segmentation on the Translation Quality of Machine Translation -- 5.3 Analysis of BLEU Scores of Mongolian → Chinese Machine Translation on the Development Set.
6 Conclusion -- References -- Target-Side Language Model for Reference-Free Machine Translation Evaluation -- 1 Introduction -- 2 Target-Side Language Model Metrics -- 3 Experiments -- 3.1 Datasets and Baselines -- 3.2 Results -- 3.3 Discussion -- 4 Conclusion -- References -- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement -- 1 Introduction -- 2 Prerequisite -- 2.1 Neural Machine Translation with mRASP -- 2.2 Diversification Method -- 2.3 Curvature -- 3 Methodology -- 3.1 Overall Structure -- 3.2 Curvature Based Checkpoint Hijack -- 4 Experiments -- 4.1 Dataset Description and Finetune Parameters -- 4.2 Experiment Result -- 5 Conclusion -- References -- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples -- 1 Introduction -- 2 Background and Related Work -- 2.1 Neural Machine Translation -- 2.2 Adversarial Example, Adversarial Attack and Adversarial Training in NLP -- 2.3 Genetic Algorithm-Based Adversarial Attack -- 2.4 Gradient-Based Adversarial Attack -- 3 Adversarial Examples Based on Reinforcement Learning -- 3.1 Reinforcement Learning -- 3.2 Environment -- 3.3 Agent -- 4 Experiment -- 4.1 Data Preprocessing -- 4.2 NMT Model -- 4.3 Evaluating Indicator -- 4.4 Adversarial Attack Results and Analysis -- 4.5 Adversarial Training Results and Analysis -- 4.6 Ablation Study -- 5 Conclusion -- References -- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 Non-autoregressive Neural Machine Translation -- 2.2 Curriculum Learning -- 3 Method -- 3.1 Model -- 3.2 Dynamic Mask Curriculum Learning -- 3.3 Train and Inference -- 4 Experiment -- 4.1 Data Preparation -- 4.2 Configuration -- 4.3 Baseline -- 4.4 Results -- 5 Analysis -- 5.1 Mask Strategy -- 5.2 Method Generality.
6 Conclusion -- References -- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory -- 1 Introduction -- 2 Background -- 3 Method -- 3.1 Dempster-Shafer Theory -- 3.2 Label Smoothing -- 4 Experiment -- 4.1 Experimental Setup -- 4.2 Result and Analysis -- 4.3 Robustness -- 4.4 Case Study -- 5 Conclusion -- References -- A Multi-tasking and Multi-stage Chinese Minority Pre-trained Language Model -- 1 Introduction -- 2 Related Work -- 2.1 Pre-trained Language Model -- 2.2 Multilingual Model -- 2.3 Chinese Minority Languages -- 3 Main Methods -- 3.1 Model Architecture -- 3.2 Multi-tasking Multi-stage Pre-training -- 3.3 Model Parameter Details -- 3.4 Model Setting Details -- 4 Experiments -- 4.1 Main Results -- 4.2 Case Study -- 5 Conclusion -- References -- An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation -- 1 Introduction -- 2 Related Works -- 3 PE Based Multi-task Learning for Sentence Level QE -- 3.1 Multi-task Learning Framework for QE -- 3.2 PE Based Multi-task Learning QE -- 3.3 Multi-model Ensemble -- 4 Experiments -- 4.1 Dataset -- 4.2 Model Training and Evaluation Metric -- 4.3 Experimental Results and Analysis -- 4.4 Ablation Study -- 5 Conclusion -- References -- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation -- 1 Introduction -- 2 Background -- 2.1 Transformer -- 2.2 Low-Resource NMT -- 2.3 Parameter Initialization for Deep Transformers -- 2.4 Deep Transformers for Low-Resource Tasks -- 3 Our Work -- 3.1 Data Processing -- 3.2 Exploration of Training Settings -- 3.3 Deep Transformers for Low-Resource Machine Translation -- 4 Related Work -- 5 Conclusion -- References -- CCMT 2022 Translation Quality Estimation Task -- 1 Introduction -- 2 Estimation System -- 3 Data -- 4 Method -- 4.1 System Training -- 4.2 System Test -- 5 Experiment -- 5.1 System Environment.
5.2 Experiment Settings -- 5.3 Experiment Result -- 6 Conclusion -- References -- Effective Data Augmentation Methods for CCMT 2022 -- 1 Introduction -- 2 System Architecture -- 3 Methods -- 3.1 Data Augmentation -- 3.2 C → E Task and E → C Task -- 3.3 C → Thai Task and Thai → C Task -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Pre-processing -- 4.3 Experimental Results -- 5 Conclusion -- References -- NJUNLP's Submission for CCMT 2022 Quality Estimation Task -- 1 Introduction -- 2 Methods -- 2.1 Existing Methods -- 2.2 Proposed Methods -- 3 Experiments -- 3.1 Dataset -- 3.2 Settings -- 3.3 Single Model Results -- 3.4 Ensemble -- 3.5 Analysis -- 4 Conclusion -- References -- ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT'2022 -- 1 Introduction -- 2 System Architecture -- 2.1 Baseline System -- 2.2 Our System -- 3 Methods -- 3.1 Back Translation -- 3.2 Add External Data -- 3.3 Model Averaging -- 3.4 Model Ensemble Strategy -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Preprocessing -- 4.3 Experimental Results -- 4.4 Conclusion -- References -- Author Index.
Record no. UNISA-996503566103316
Available at: Univ. di Salerno
Machine translation : 18th China conference, CCMT 2022, Lhasa, China, August 6-10, 2022 : revised selected papers / Tong Xiao and Juan Pino
Author Xiao, Tong
Publication/distribution Singapore : Springer, [2022]
Physical description 1 online resource (175 pages)
Discipline 495.10285
Series Communications in Computer and Information Science
Topical subject Chinese language - Machine translating
Machine translating
ISBN 981-19-7960-X
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Intro -- Preface -- Organization -- Contents -- PEACook: Post-editing Advancement Cookbook -- 1 Introduction -- 2 Related Work -- 2.1 APE Problem and APE Metrics -- 2.2 APE Baselines -- 3 PEACook Corpus -- 3.1 PEACook Corpus Details -- 4 Baseline Model Experiments -- 4.1 Pre-training AR-APE Model -- 4.2 Fine-Tuning AR-APE Model -- 4.3 Pre-training NAR-APE Model -- 4.4 Fine-Tuning NAR-APE Model -- 5 Conclusion -- References -- Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 NMT -- 2.2 Transfer Learning -- 2.3 Pre-train Techniques -- 3 Methods -- 3.1 Word Alignment Under Hot-Start -- 3.2 Approximate Distillation -- 4 Experiment -- 4.1 Settings -- 4.2 Results and Analysis -- 4.3 Ablation Test -- 4.4 Case Analysis -- 5 Conclusion -- References -- Review-Based Curriculum Learning for Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Review-Based Curriculum Learning -- 3.1 Time-Based Review Method -- 3.2 Master-Based Review Method -- 3.3 General Domain Enhanced Training -- 4 Experiment -- 4.1 Data and Setup -- 4.2 Main Results -- 5 Analysis -- 5.1 Effect of Mixed Fine Tuning -- 5.2 Low-Resource Scenario -- 5.3 Data Sharding -- 5.4 Training Efficiency -- 6 Conclusion -- References -- Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages -- 1 Introduction -- 2 Dataset -- 3 System Overview -- 3.1 Back-Translation -- 3.2 Alternated Training -- 3.3 Ensemble -- 4 Experiments -- 4.1 Mongolian → Chinese -- 4.2 Tibetan → Chinese -- 4.3 Uyghur → Chinese -- 5 Analysis -- 5.1 The Effect of Different Back-Translation Methods -- 5.2 The Impact of Sentence Segmentation on the Translation Quality of Machine Translation -- 5.3 Analysis of BLEU Scores of Mongolian → Chinese Machine Translation on the Development Set.
6 Conclusion -- References -- Target-Side Language Model for Reference-Free Machine Translation Evaluation -- 1 Introduction -- 2 Target-Side Language Model Metrics -- 3 Experiments -- 3.1 Datasets and Baselines -- 3.2 Results -- 3.3 Discussion -- 4 Conclusion -- References -- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement -- 1 Introduction -- 2 Prerequisite -- 2.1 Neural Machine Translation with mRASP -- 2.2 Diversification Method -- 2.3 Curvature -- 3 Methodology -- 3.1 Overall Structure -- 3.2 Curvature Based Checkpoint Hijack -- 4 Experiments -- 4.1 Dataset Description and Finetune Parameters -- 4.2 Experiment Result -- 5 Conclusion -- References -- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples -- 1 Introduction -- 2 Background and Related Work -- 2.1 Neural Machine Translation -- 2.2 Adversarial Example, Adversarial Attack and Adversarial Training in NLP -- 2.3 Genetic Algorithm-Based Adversarial Attack -- 2.4 Gradient-Based Adversarial Attack -- 3 Adversarial Examples Based on Reinforcement Learning -- 3.1 Reinforcement Learning -- 3.2 Environment -- 3.3 Agent -- 4 Experiment -- 4.1 Data Preprocessing -- 4.2 NMT Model -- 4.3 Evaluating Indicator -- 4.4 Adversarial Attack Results and Analysis -- 4.5 Adversarial Training Results and Analysis -- 4.6 Ablation Study -- 5 Conclusion -- References -- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 Non-autoregressive Neural Machine Translation -- 2.2 Curriculum Learning -- 3 Method -- 3.1 Model -- 3.2 Dynamic Mask Curriculum Learning -- 3.3 Train and Inference -- 4 Experiment -- 4.1 Data Preparation -- 4.2 Configuration -- 4.3 Baseline -- 4.4 Results -- 5 Analysis -- 5.1 Mask Strategy -- 5.2 Method Generality.
6 Conclusion -- References -- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory -- 1 Introduction -- 2 Background -- 3 Method -- 3.1 Dempster-Shafer Theory -- 3.2 Label Smoothing -- 4 Experiment -- 4.1 Experimental Setup -- 4.2 Result and Analysis -- 4.3 Robustness -- 4.4 Case Study -- 5 Conclusion -- References -- A Multi-tasking and Multi-stage Chinese Minority Pre-trained Language Model -- 1 Introduction -- 2 Related Work -- 2.1 Pre-trained Language Model -- 2.2 Multilingual Model -- 2.3 Chinese Minority Languages -- 3 Main Methods -- 3.1 Model Architecture -- 3.2 Multi-tasking Multi-stage Pre-training -- 3.3 Model Parameter Details -- 3.4 Model Setting Details -- 4 Experiments -- 4.1 Main Results -- 4.2 Case Study -- 5 Conclusion -- References -- An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation -- 1 Introduction -- 2 Related Works -- 3 PE Based Multi-task Learning for Sentence Level QE -- 3.1 Multi-task Learning Framework for QE -- 3.2 PE Based Multi-task Learning QE -- 3.3 Multi-model Ensemble -- 4 Experiments -- 4.1 Dataset -- 4.2 Model Training and Evaluation Metric -- 4.3 Experimental Results and Analysis -- 4.4 Ablation Study -- 5 Conclusion -- References -- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation -- 1 Introduction -- 2 Background -- 2.1 Transformer -- 2.2 Low-Resource NMT -- 2.3 Parameter Initialization for Deep Transformers -- 2.4 Deep Transformers for Low-Resource Tasks -- 3 Our Work -- 3.1 Data Processing -- 3.2 Exploration of Training Settings -- 3.3 Deep Transformers for Low-Resource Machine Translation -- 4 Related Work -- 5 Conclusion -- References -- CCMT 2022 Translation Quality Estimation Task -- 1 Introduction -- 2 Estimation System -- 3 Data -- 4 Method -- 4.1 System Training -- 4.2 System Test -- 5 Experiment -- 5.1 System Environment.
5.2 Experiment Settings -- 5.3 Experiment Result -- 6 Conclusion -- References -- Effective Data Augmentation Methods for CCMT 2022 -- 1 Introduction -- 2 System Architecture -- 3 Methods -- 3.1 Data Augmentation -- 3.2 C → E Task and E → C Task -- 3.3 C → Thai Task and Thai → C Task -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Pre-processing -- 4.3 Experimental Results -- 5 Conclusion -- References -- NJUNLP's Submission for CCMT 2022 Quality Estimation Task -- 1 Introduction -- 2 Methods -- 2.1 Existing Methods -- 2.2 Proposed Methods -- 3 Experiments -- 3.1 Dataset -- 3.2 Settings -- 3.3 Single Model Results -- 3.4 Ensemble -- 3.5 Analysis -- 4 Conclusion -- References -- ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT'2022 -- 1 Introduction -- 2 System Architecture -- 2.1 Baseline System -- 2.2 Our System -- 3 Methods -- 3.1 Back Translation -- 3.2 Add External Data -- 3.3 Model Averaging -- 3.4 Model Ensemble Strategy -- 4 Experiments -- 4.1 System Settings -- 4.2 Data Preprocessing -- 4.3 Experimental Results -- 4.4 Conclusion -- References -- Author Index.
Record no. UNINA-9910634040703321
Available at: Univ. Federico II
Machine translation : 17th China Conference, CCMT 2021, Xining, China, October 8-10, 2021 : revised selected papers / Jingsong Su, Rico Sennrich (editors)
Publication/distribution Singapore : Springer, [2021]
Physical description 1 online resource (137 pages)
Discipline 418.020285
Series Communications in Computer and Information Science
Topical subject Machine translating
Computational linguistics
ISBN 981-16-7512-0
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record no. UNINA-9910508476103321
Available at: Univ. Federico II
Machine translation : 17th China Conference, CCMT 2021, Xining, China, October 8-10, 2021 : revised selected papers / Jingsong Su, Rico Sennrich (editors)
Publication/distribution Singapore : Springer, [2021]
Physical description 1 online resource (137 pages)
Discipline 418.020285
Series Communications in Computer and Information Science
Topical subject Machine translating
Computational linguistics
ISBN 981-16-7512-0
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record no. UNISA-996464403403316
Available at: Univ. di Salerno
Machine translation : 16th China Conference, CCMT 2020, Hohhot, China, October 10-12, 2020, revised selected papers / Junhui Li, Andy Way (editors)
Edition [1st ed. 2020.]
Publication/distribution Gateway East, Singapore : Springer, [2020]
Physical description 1 online resource (XII, 143 p. 14 illus.)
Discipline 418.020285
Series Communications in computer and information science
Topical subject Machine translating
ISBN 981-336-162-X
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Transfer Learning for Chinese-Lao Neural Machine Translation with Linguistic Similarity -- MTNER: A Corpus for Mongolian Tourism Named Entity Recognition -- Unsupervised Machine Translation Quality Estimation in Black-box Setting -- YuQ: A Chinese-Uyghur Medical-domain Neural Machine Translation Dataset Towards Knowledge-driven -- Quality Estimation for Machine Translation with Multi-granularity Interaction -- Transformer-based unified neural network for quality estimation and Transformer-based re-decoding model for machine translation -- NJUNLP's Machine Translation System for CCMT-2020 Uighur → Chinese Translation Task -- Description and Findings of OPPO's Machine Translation Systems for CCMT 2020 -- Tsinghua University Neural Machine Translation Systems for CCMT 2020 -- BJTU's Submission to CCMT 2020 Quality Estimation Task -- NJUNLP's Submission for CCMT20 Quality Estimation Task -- Tencent Submissions for the CCMT 2020 Quality Estimation Task -- Neural Machine Translation based on Back-Translation for Multilingual Translation Evaluation Task.
Record no. UNISA-996465345603316
Available at: Univ. di Salerno
Machine translation : 16th China Conference, CCMT 2020, Hohhot, China, October 10-12, 2020, revised selected papers / Junhui Li, Andy Way (editors)
Edition [1st ed. 2020.]
Publication/distribution Gateway East, Singapore : Springer, [2020]
Physical description 1 online resource (XII, 143 p. 14 illus.)
Discipline 418.020285
Series Communications in computer and information science
Topical subject Machine translating
ISBN 981-336-162-X
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Transfer Learning for Chinese-Lao Neural Machine Translation with Linguistic Similarity -- MTNER: A Corpus for Mongolian Tourism Named Entity Recognition -- Unsupervised Machine Translation Quality Estimation in Black-box Setting -- YuQ: A Chinese-Uyghur Medical-domain Neural Machine Translation Dataset Towards Knowledge-driven -- Quality Estimation for Machine Translation with Multi-granularity Interaction -- Transformer-based unified neural network for quality estimation and Transformer-based re-decoding model for machine translation -- NJUNLP's Machine Translation System for CCMT-2020 Uighur → Chinese Translation Task -- Description and Findings of OPPO's Machine Translation Systems for CCMT 2020 -- Tsinghua University Neural Machine Translation Systems for CCMT 2020 -- BJTU's Submission to CCMT 2020 Quality Estimation Task -- NJUNLP's Submission for CCMT20 Quality Estimation Task -- Tencent Submissions for the CCMT 2020 Quality Estimation Task -- Neural Machine Translation based on Back-Translation for Multilingual Translation Evaluation Task.
Record no. UNINA-9910447240503321
Available at: Univ. Federico II
Machine translation
Publication/distribution Dordrecht : Kluwer Academic Publishers, 1989-
Discipline 418.020285
Topical subject Machine translating
Computational linguistics
Traduction automatique
Linguistique informatique
Genre/form subject Czasopismo językoznawcze
Periodicals.
ISSN 1573-0573
Format Printed material
Bibliographic level Periodical
Language of publication eng
Variant titles MT
Record no. UNINA-9910142667403321
Available at: Univ. Federico II
Machine translation
Publication/distribution Dordrecht : Kluwer Academic Publishers, 1989-
Discipline 418.020285
Topical subject Machine translating
Computational linguistics
Traduction automatique
Linguistique informatique
Genre/form subject Czasopismo językoznawcze
Periodicals.
ISSN 1573-0573
Format Printed material
Bibliographic level Periodical
Language of publication eng
Variant titles MT
Record no. UNISA-996216678703316
Available at: Univ. di Salerno