
Record no.

UNINA9911011818803321

Author

Wang, Qingyun

Title

AI for Research and Scalable, Efficient Systems : Second International Workshop, AI4Research 2025, and First International Workshop, SEAS 2025, Held in Conjunction with AAAI 2025, Philadelphia, PA, USA, February 25–March 4, 2025, Proceedings / edited by Qingyun Wang, Wenpeng Yin, Abhishek Aich, Yumin Suh, Kuan-Chuan Peng

Publication/distribution/printing

Singapore : Springer Nature Singapore : Imprint: Springer, 2025

ISBN

9789819689125

Edition

[1st ed. 2025.]

Physical description

1 online resource (412 pages)

Series

Communications in Computer and Information Science, 1865-0937 ; 2533

Other authors (persons)

Yin, Wenpeng

Aich, Abhishek

Suh, Yumin

Peng, Kuan-Chuan

Discipline (Dewey)

006.31

Subjects

Machine learning

Natural language processing (Computer science)

Expert systems (Computer science)

Data mining

Artificial intelligence - Data processing

Artificial intelligence

Machine Learning

Natural Language Processing (NLP)

Knowledge Based Systems

Data Mining and Knowledge Discovery

Data Science

Artificial Intelligence

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Contents note

AI4Research 2025:
ResearchCodeAgent: An LLM Multi-Agent System for Automated Codification of Research Methodologies.
LLMs Tackle Meta-Analysis: Automating Scientific Hypothesis Generation with Statistical Rigor.
AuditBench: A Benchmark for Large Language Models in Financial Statement Auditing.
Clustering Time Series Data with Gaussian Mixture Embeddings in a Graph Autoencoder Framework.
Empowering AI as Autonomous Researchers: Evaluating LLMs in Generating Novel Research Ideas through Automated Metrics.
Multi-LLM Collaborative Caption Generation in Scientific Documents.
CypEGAT: A Deep Learning Framework Integrating Protein Language Model and Graph Attention Networks for Enhanced CYP450s Substrate Prediction.
Understanding How Paper Writers Use AI-Generated Captions in Figure Caption Writing.

SEAS 2025:
ssProp: Energy-Efficient Training for Convolutional Neural Networks with Scheduled Sparse Back Propagation.
Knowledge Distillation with Training Wheels.
PickLLM: Context-Aware RL-Assisted Large Language Model Routing.
ZNorm: Z-Score Gradient Normalization Accelerating Skip-Connected Network Training without Architectural Modification.
The Impact of Multilingual Model Scaling on Seen and Unseen Language Performance.
Information Consistent Pruning: How to Efficiently Search for Sparse Networks?
Efficient Image Similarity Search with Quadtrees.

Summary/abstract

This book constitutes the proceedings of the Second International Workshop, AI4Research 2025, and the First International Workshop, SEAS 2025, held in conjunction with AAAI 2025 in Philadelphia, PA, USA, during February 25–March 4, 2025. AI4Research 2025 accepted 8 full papers out of 35 submissions; they cover areas such as agent debate evaluation, taxonomy expansion, hypothesis generation, AI4Research benchmarks, caption generation, drug discovery, and financial auditing. SEAS 2025 accepted 7 full papers out of 17 submissions; these papers explore the efficiency and scalability of AI models.