
Record No.

UNINA9910495202103321

Title

Performance Evaluation and Benchmarking : 12th TPC Technology Conference, TPCTC 2020, Tokyo, Japan, August 31, 2020, Revised Selected Papers / edited by Raghunath Nambiar, Meikel Poess

Publication/distribution

Cham : Springer International Publishing : Imprint: Springer, 2021

ISBN

3-030-84924-4

Edition

[1st ed. 2021.]

Physical description

1 online resource (XIII, 113 p. 34 illus., 13 illus. in color.)

Series

Programming and Software Engineering, 2945-9168 ; 12752

Discipline

005.74

Subjects

Electronic digital computers - Evaluation

Database management

Application software

Computer systems

Expert systems (Computer science)

Information technology - Management

System Performance and Evaluation

Database Management

Computer and Information Systems Applications

Computer System Implementation

Knowledge Based Systems

Computer Application in Administrative Data Processing

Publication language

English

Format

Printed material

Bibliographic level

Monograph

Contents note

Towards Testing ACID Compliance in the LDBC Social Network Benchmark -- EXPOSE: Experimental Performance Evaluation of Stream Processing Engines Made Easy -- Revisiting Issues in Benchmarking Metric Selection -- Performance Evaluation for Digital Transformation -- Experimental Comparison of Relational and NoSQL Document Systems: the Case of Decision Support -- A Framework for Supporting Repetition and Evaluation in the Process of Cloud-based DBMS Performance Benchmarking -- Benchmarking AI Inference: Where we are in 2020 -- A Domain Independent Benchmark Evolution Model for the Transaction Processing Performance Council.

Summary/abstract

This book constitutes the refereed post-conference proceedings of the 12th TPC Technology Conference on Performance Evaluation and Benchmarking, TPCTC 2020, held in August 2020. The eight papers presented were carefully reviewed and cover the following topics: testing ACID compliance in the LDBC Social Network Benchmark; experimental performance evaluation of stream processing engines made easy; revisiting issues in benchmarking metric selection; performance evaluation for digital transformation; experimental comparison of relational and NoSQL document systems; a framework for supporting repetition and evaluation in the process of cloud-based DBMS performance benchmarking; benchmarking AI inference; and a domain-independent benchmark evolution model for the Transaction Processing Performance Council.