
Book, English, 448 pages, format (W × H): 155 mm × 235 mm, weight: 727 g

Series: Lecture Notes in Computer Science

Mahmud / Doborjeh / Wong

Neural Information Processing

31st International Conference, ICONIP 2024, Auckland, New Zealand, December 2-6, 2024, Proceedings, Part VII
Publication year: 2025
ISBN: 978-981-966593-8
Publisher: Springer



The eleven-volume set LNCS 15286-15296 constitutes the refereed proceedings of the 31st International Conference on Neural Information Processing, ICONIP 2024, held in Auckland, New Zealand, in December 2024.
The 318 regular papers presented in the proceedings set were carefully reviewed and selected from 1301 submissions. They focus on four main areas, namely: theory and algorithms; cognitive neurosciences; human-centered computing; and applications.


Target audience


Research

Further information & material


LoTraNet: Locality-guided Transformer Network for Image Manipulation Localization
Progressive EMD-based Trajectory Prediction: A Multistage Approach for Enhanced Human Trajectory Forecasting
Dual-Level Contrastive Learning Framework
DLAFormer: A Novel Approach to Image Super-Resolution with Comprehensive Attention Mechanisms
Audio-Infused Automatic Image Colorization by Exploiting Audio Scene Semantics
CoMISI: Multimodal Speaker Identification in Diverse Audio-Visual Conditions through Cross-Modal Interaction
Multi-scale Spatial Feature Aggregation For Efficient Super Resolution
SCANet: Split Coordinate Attention Network for Building Footprint Extraction
XFusion: Cross-Attention Transformer for Multi-Focus Image Fusion
Guided DiffusionDet: Guided Diffusion Model for Object Detection with Resample Mechanism
Mutual Information-based Mixed Precision Quantization
MLLM-Driven Semantic Enhancement and Alignment for Text-Based Person Search
TFCM: Tuning-Free Facial Concept-Erasure in Text-to-Image Models through Attention and Sample Modulation
Selecting the Best Sequential Transfer Path for Medical Image Segmentation with Limited Labeled Data
Knowledge Distillation with Differentiable Optimal Transport on Graph Neural Networks
Test-Time Intensity Consistency Adaptation for Shadow Detection
Learning from Noisy Labels for Long-tailed Data via Optimal Transport
LCRPS: Large-Capacity Residual Plane Steganography Based on Multiple Adversarial Networks
Aesthetics-Guided Multi-scale Feature Fusion for Style Transfer
BEVRoad: A Cross-Modal and Temporary-Recurrent 3D Object Detector for Infrastructure Perception
Dilated Pyramid Attention in Hierarchical Vision Transformer for Texture Recognition
Attention-based Domain Adaptive YOLO For Cross-domain Object Detection
In-WSOD: Integrality Weakly Supervised Object Detection with Classification and Localization Consistency
GLEGNet: Infrared and Visible Image Fusion Via Global-Local Feature Extraction and Edge-Gradient Preservation
Mending of Spatio-Temporal Dependencies in Block Adjacency Matrix
CaDT-Net: A Cascaded Deformable Transformer Network for Multiclass Breast Cancer Histopathological Image Classification
DIFA: Deformable Implicit Feature Alignment for Roadside Cooperative Perception
Transferring Teacher's Invariance to Student Through Data Augmentation Optimization
AARR-Net: An Attention Assistance Feature Fusion and Model Recursive Recovery Network for Category-level 6D Object Pose Estimation
BRS-YOLO: A Balanced Optical Remote Sensing Object Detection Method
HDKI: A Hierarchical Deep Koopman Framework for Spatio-Temporal Prediction with Image Observations


