SS2023 Project - Deep Learning Model Compression Techniques for High-Speed Object Detection

Posted on Apr 3, 2023 by Saif Khan

High-Speed Object Detection

Supervisor: Talha Uddin Sheikh

Study and implementation of different pruning and quantization techniques to compress state-of-the-art object detection models.


  • Study of different pruning methods for deep learning models
  • Study of different quantization methods for deep learning models
  • Implementation of different pruning and quantization methods on a state-of-the-art object detection model
  • Experimentation and benchmarking
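To make the first two items concrete, here is a minimal NumPy sketch of two common compression techniques from this family: unstructured magnitude pruning and symmetric 8-bit post-training quantization. The function names and the specific symmetric quantization scheme are illustrative assumptions, not the project's prescribed implementation; a real object detector would apply these per-layer via a framework such as PyTorch.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest magnitude.

    Illustrative sketch: unstructured pruning on a single weight tensor.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

def quantize_int8(weights):
    """Symmetric 8-bit uniform quantization: returns int8 values and a scale.

    Assumes the tensor is not all-zero (scale would be 0 otherwise).
    """
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 values back to float32 for accuracy evaluation."""
    return q.astype(np.float32) * scale
```

In practice the two techniques compose: prune first, fine-tune to recover accuracy, then quantize the surviving weights (or combine both during training, as in quantization-aware pruning [2]).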

References

[1] Helms, D., Amende, K., Bukhari, S., de Graaff, T., Frickenstein, A., Hafner, F., … & Vemparala, M. R. (2021, July). Optimizing Neural Networks for Embedded Hardware. In SMACD/PRIME 2021; International Conference on SMACD and 16th Conference on PRIME (pp. 1-6). VDE.
[2] Hawks, B., Duarte, J., Fraser, N. J., Pappalardo, A., Tran, N., & Umuroglu, Y. (2021). Ps and Qs: Quantization-aware pruning for efficient low latency neural network inference. Frontiers in Artificial Intelligence, 4, 676564.
[3] Hubara, I., Courbariaux, M., Soudry, D., El-Yaniv, R., & Bengio, Y. (2017). Quantized neural networks: Training neural networks with low precision weights and activations. The Journal of Machine Learning Research, 18(1), 6869-6898.


At MindGarage, we believe that creativity and innovation are essential for advancing the field of Artificial Intelligence. That's why we provide an open and unconstrained environment for highly motivated students to explore the possibilities of Deep Learning. We encourage freedom of thought and creativity in tackling challenging problems, and we're always on the lookout for talented individuals to join our team. If you're passionate about AI and want to contribute to groundbreaking research in Deep Learning, we invite you to learn more about our lab and our projects.


Gottlieb-Daimler-Str. 48 (48-462),
67663 Kaiserslautern

Copyright © 2023 RPTU. All rights reserved.
