| Title: | Human-computer interaction : interaction techniques and novel applications : thematic area, HCI 2021, held as part of the 23rd HCI International Conference, HCII 2021, virtual event, July 24-29, 2021, Proceedings. Part II / edited by Masaaki Kurosu |
| Publication: | Cham, Switzerland : Springer, [2021] |
| | ©2021 |
| Physical description: | 1 online resource (672 pages) |
| Discipline (Dewey): | 004.019 |
| Topical subject: | Human-computer interaction |
| Person (secondary responsibility): | Kurosu, Masaaki <1948-> |
| General notes: | Includes index. |
| Contents note: | Intro -- Foreword -- HCI International 2021 Thematic Areas and Affiliated Conferences -- Contents - Part II -- Novel Interaction Techniques -- Performance Evaluation and Efficiency of Laser Holographic Peripherals -- 1 Introduction -- 2 Related Work -- 2.1 The Development of Keyboard -- 2.2 Mechanical Keyboard Types -- 2.3 Introduction of Virtual Keyboard -- 2.4 How Virtual Keyboard and Mouse Function -- 2.5 How Virtual Keyboard and Mouse Process the Information -- 3 Experimental Setup -- 4 Experimental Result -- 4.1 Keyboard Experiment Results -- 4.2 Mouse Experiment Results -- 4.3 Possible Improvements in Hardware -- 4.4 Possible Improvements Through AutoCorrect and RNN-LM -- 5 Conclusion -- References -- Using Real-Pen Specific Features of Active Stylus to Cope with Input Latency -- 1 Introduction -- 2 Background and Related Work -- 2.1 Nature of Touch Screen Input Latency -- 2.2 Focus of Input Latency Studies -- 2.3 Latency Compensation Approaches -- 3 Orientation, Tilt, Pressure for Latency Compensation -- 3.1 Real-Pen Specific Features of Active Stylus -- 3.2 Dataset and Preprocessing -- 3.3 Implementation Details -- 4 Study 1: Latency Compensation Accuracy -- 4.1 Apparatus -- 4.2 Participants -- 4.3 Task and Procedure -- 4.4 Measurements -- 4.5 Results and Discussion -- 5 Study 2: Users' Perception of Latency -- 5.1 Apparatus -- 5.2 Participants -- 5.3 Task and Procedure -- 5.4 Results and Discussion -- 6 Conclusion and Future Work -- References -- Comparing Eye Tracking and Head Tracking During a Visual Attention Task in Immersive Virtual Reality -- 1 Introduction -- 2 Materials and Methods -- 2.1 Participants -- 2.2 Virtual Environment and Data Collection -- 2.3 Comparison Between Head-Tracking and Eye-Tracking -- 3 Results -- 4 Discussion -- 5 Conclusions -- References. |
| | Investigation of Motion Video Enhancement for Image-Based Avatars on Small Displays -- 1 Introduction -- 2 Importance of an Upright Posture for Avatars -- 3 Method for Enhancing Body Sway Motions -- 4 Subjective Assessment -- 4.1 Conditions of the Subjective Assessment -- 4.2 Result of Subjective Assessment -- 5 Conclusion -- References -- Sound Symbolic Words as a Game Controller -- 1 Introduction -- 2 Method -- 2.1 Game Design -- 2.2 Game System -- 3 Experiment -- 3.1 Experiment Design -- 4 Result and Discussion -- 4.1 Result -- 4.2 Discussion -- References -- Towards Improved Vibro-Tactile P300 BCIs -- 1 Introduction -- 2 Methods -- 2.1 Subjects -- 2.2 Materials and Methods -- 2.3 Stimulation Sequence -- 2.4 One Session -- 2.5 Feature Expression and Classification -- 2.6 Indecisive Answers -- 2.7 ERP Plots -- 2.8 ITR -- 3 Results -- 4 Discussion -- References -- Talking Through the Eyes: User Experience Design for Eye Gaze Redirection in Live Video Conferencing -- 1 Introduction -- 2 Related Work -- 2.1 No Eye Contact in VC -- 2.2 ER Function Using Hardware -- 2.3 ER Research Using Software -- 2.4 ER Method Using Avatar -- 2.5 Summary -- 3 Methods -- 3.1 User Research Design -- 3.2 Interview with VC Users -- 4 Survey -- 4.1 The Degree to Which You Care About Yourself Photographed on the Camera -- 4.2 Camera Face Angle Desired by the User -- 4.3 Camera Gaze Experience for Intentional Eye Contact -- 5 Result -- 5.1 Degree of Awkwardness About the Self-shot on the Camera -- 5.2 Face Angle Preference -- 5.3 Experience of Intentional Camera Gaze -- 6 User Experience-Oriented ER Guideline Proposal -- 6.1 3D Source for 3D Face Angle -- 6.2 3D Face Selection with Morphing Image Step 4 -- 6.3 Teleprompter -- 7 Conclusion and Future Studies -- References -- Evaluating the Accuracy and User Experience of a Gesture-Based Infrared Remote Control in Smart Homes. |
| | 1 Introduction -- 2 Related Work -- 2.1 Gestures for Entertainment and Work -- 2.2 Gestures in the Home Environment -- 3 Hardware Concept and Realization -- 4 Experimental Design of User Studies -- 5 Results of User Tests -- 5.1 User Experience -- 5.2 Accuracy -- 6 Discussion -- 7 Conclusion and Next Steps -- References -- Detection of Finger Contact with Skin Based on Shadows and Texture Around Fingertips -- 1 Introduction -- 2 Related Work -- 2.1 Operations and Inputs in Virtual Reality Environments -- 2.2 Input Methods Using Finger Contact with Body -- 3 Proposed Method -- 4 Prototype Implementation -- 4.1 Phase 1: Fingertip Extraction -- 4.2 Phase 2: Image Enhancement -- 4.3 Phase 3: Contact Detection -- 5 Dataset -- 5.1 Recording Input Motion Videos -- 5.2 Data Augmentation and Datasets -- 6 Experiments -- 6.1 Experiment 1: Performance Evaluation of Contact Detection Model -- 6.2 Experiment 2: Performance Evaluation for New Users -- 6.3 Experiment 3: Performance Evaluation for New Lighting Environments -- 6.4 Experiment 4: Performance Evaluation for New Users and Lighting Environments -- 6.5 Discussion -- 7 Conclusion -- References -- Character Input Method Working on 1-in. Round Screen for Tiny Smartwatches -- 1 Introduction -- 2 Previous Research -- 3 Proposed Method -- 3.1 Key Layout at Standby Status -- 3.2 Selection of a Row of Hiragana -- 3.3 Selection of a Character -- 4 Input Speed and Error Rate of Beginners -- 4.1 Experimental Condition -- 4.2 Experimental Procedure -- 4.3 Experimental Result -- 5 A 30-Day Experiment -- 5.1 Experimental Condition -- 5.2 Experimental Result -- 6 Conclusion -- References -- One Stroke Alphanumeric Input Method by Sliding-in and Sliding-out on the Smartwatch Screen -- 1 Introduction -- 2 Related Researches -- 3 Proposed Method -- 4 Experiment -- 4.1 Experiment Preparation. |
| | 4.2 Input Speed and Error Rate of Beginners -- 4.3 Long-Term Experiment -- 4.4 Input Speed and Error Rate of the Expert -- 4.5 Comparison to SliT -- 4.6 Comparison to the Other Related Method -- 5 Comparison -- References -- Research on Hand Detection in Complex Scenes Based on RGB-D Sensor -- 1 Introduction -- 1.1 Research Background -- 1.2 Related Research Statuses -- 1.3 Research Content -- 2 Method of Hand Segmentation and Contour Extraction -- 2.1 Data Acquisition Equipment -- 2.2 Preprocessing -- 2.3 Background Modeling -- 2.4 Hand Segmentation Method -- 2.5 Contour Extraction Method -- 3 Experimental Verification and Result Analysis -- 3.1 Results in Complex Scenario -- 3.2 Evaluation Parameters -- 4 Conclusion -- References -- It's a Joint Effort: Understanding Speech and Gesture in Collaborative Tasks -- 1 Introduction -- 2 Related Work -- 2.1 Gesture in Psychology -- 2.2 Gesture in HCI -- 2.3 Multimodal Communication and Collaboration -- 3 Method -- 3.1 Participants -- 3.2 Procedure -- 3.3 Apparatus -- 3.4 Block Layouts -- 3.5 Conditions -- 3.6 Data Analysis -- 4 Results -- 4.1 Differences in Gesture Use -- 4.2 Differences in Speech Use -- 4.3 Co-expression of Speech and Gesture -- 4.4 Task Performance -- 5 Discussion -- 5.1 Different Strategies for Different Modalities -- 5.2 Speech and Gesture for Resolving Orientation -- 5.3 Improved Performance with Gesture -- 6 Implications for Design -- 7 Conclusion -- References -- Human-Robot Interaction -- Analysing Action and Intention Recognition in Human-Robot Interaction with ANEMONE -- 1 Introduction -- 2 ANEMONE -- 3 Practical Application of ANEMONE -- 3.1 Phase 1: Preparation -- 3.2 Phase 2: Selection of UX Evaluation Type -- 3.3 Phase 3: Plan and Conduct the UX Evaluation -- 3.4 Phase 4: Analysis of Collected Data and Identifying UX Problems. |
| | 3.5 Phase 5: Organising the Identified UX Problems in Scope and Severity -- 4 Lessons Learned and Recommendations -- 5 Concluding Remarks -- References -- A Robot that Tells You It is Watching You with Its Eyes -- 1 Introduction -- 2 Related Studies -- 2.1 Effects of Eyes on Communication -- 2.2 Eyes of Social Robots -- 2.3 Eyes and Human-Robot Interaction -- 2.4 Social Robots in Public Spaces -- 3 Proposal of an Eye-Based Interaction -- 3.1 KiroPi V2 -- 3.2 Concept -- 4 Development -- 4.1 User Position Estimation -- 4.2 Sizes and Positions of Eyes -- 4.3 Telling Distance -- 4.4 Telling Angle -- 5 Experiment 1: Eyes Expression Evaluation -- 5.1 Convergence Angle and User's Shadow -- 5.2 Eyes Movement Bias -- 6 Experiment 2: Shopping Scenario -- 6.1 Scenario -- 6.2 Method -- 6.3 Results -- 7 Discussion -- 8 Conclusion -- References -- Am I Conquering the Robot? The Impact of Personality on the Style of Cooperation with an Automatic System -- 1 Introduction -- 2 Humans' Trust in Robot: Review of Related Studies -- 3 Experiment -- 3.1 Robot System -- 3.2 Measurement Tools and Interview -- 3.3 Task and Procedure -- 4 Results and Analysis -- 4.1 Evaluation of the Performance and TR -- 4.2 Evaluation of DC and TR -- 4.3 Evaluation of EQ, DC, and TR -- 4.4 "Happiness" of the Robot -- 5 Discussion and Conclusions -- References -- Kansei Evaluation of Robots in Virtual Space Considering Their Physical Attributes -- 1 Introduction -- 2 Robot Pairs -- 2.1 Talking Robot (Robot Pair 1) -- 2.2 Floating Guide Robot (Robot Pair 2) -- 2.3 Vending Machine Robot (Robot Pair 3) -- 3 Experiment Method -- 3.1 Questionnaire -- 3.2 Experimental Procedure -- 4 Experiment Results -- 5 Discussion -- 6 Conclusion and Future Works -- References -- The Use of a Sex Doll as Proxy Technology to Study Human-Robot Interaction -- 1 Introduction -- 2 Method. |
| | 2.1 Procedure and Measures. |
| Authorized title: | Human-computer interaction |
| ISBN: | 3-030-78465-7 |
| Format: | Printed material |
| Bibliographic level: | Monograph |
| Language of publication: | English |
| Record no.: | 996464515103316 |
| Find it here: | Univ. di Salerno |
| OPAC: | Check availability here |