Publications / 2025 / Journal Articles

Method of Adaptive Knowledge Distillation from Multi-Teacher to Student Deep Learning Models

Pavlo Radiuk, Oleksandr Chaban, Eduard Manziuk

Journal of Edge Computing, Vol. 4, No. 2, pp. 159-178

Keywords: Knowledge Distillation, Multi-Teacher, Deep Learning, Model Compression, Edge Computing

Abstract

This paper proposes a method for adaptive knowledge distillation from multiple teacher models to a single student model, enabling efficient deployment on edge devices. The approach dynamically weights each teacher's contribution according to its expertise on individual input samples.
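
The per-sample teacher weighting described in the abstract can be illustrated with a short PyTorch sketch. This is an illustration under stated assumptions, not the paper's exact formulation: here the "expertise" of each teacher on a sample is scored as a softmax over the teachers' negative cross-entropy on that sample, and the temperature and mixing coefficient alpha are conventional knowledge-distillation choices.

import torch
import torch.nn.functional as F

def adaptive_multi_teacher_kd_loss(
    student_logits,       # (B, C) student outputs
    teacher_logits_list,  # list of (B, C) tensors, one per teacher
    labels,               # (B,) ground-truth class indices
    temperature=4.0,      # softening temperature (assumed value)
    alpha=0.7,            # distillation vs. hard-label mix (assumed value)
):
    B = student_logits.size(0)
    teachers = torch.stack(teacher_logits_list, dim=0)  # (T, B, C)

    # Per-sample expertise: negative cross-entropy of each teacher on the
    # true label; a softmax over teachers turns it into per-sample weights.
    with torch.no_grad():
        log_probs = F.log_softmax(teachers, dim=-1)      # (T, B, C)
        teacher_ce = -log_probs.gather(
            -1, labels.view(1, B, 1).expand(teachers.size(0), B, 1)
        ).squeeze(-1)                                    # (T, B)
        weights = F.softmax(-teacher_ce, dim=0)          # (T, B), sums to 1 over teachers

    # Soft targets: per-sample weighted mixture of tempered teacher distributions.
    soft_targets = (
        weights.unsqueeze(-1) * F.softmax(teachers / temperature, dim=-1)
    ).sum(dim=0)                                         # (B, C)

    # Standard KD loss: KL to the mixed teacher distribution plus hard-label CE.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

With this scheme, a teacher that is confident and correct on a given input dominates the soft target for that input, while teachers that misclassify it are down-weighted, which is the adaptive behavior the abstract describes.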

Citation

Pavlo Radiuk, Oleksandr Chaban, Eduard Manziuk. "Method of Adaptive Knowledge Distillation from Multi-Teacher to Student Deep Learning Models". Journal of Edge Computing, Vol. 4, No. 2, pp. 159-178, 2025.

Related Publications

2025 · Journal Articles

Towards Transparent AI in Medicine: ECG-Based Arrhythmia Detection with Explainable Deep Learning

Pavlo Radiuk, Liliana Klymenko, Iurii Krak

Technologies
2024 · Journal Articles

Explainable Deep Learning: A Visual Analytics Approach with Transition Matrices

Pavlo Radiuk, Oleksander Barmak, Eduard Manziuk et al.

Mathematics