Unveiling Energy Efficiency in Deep Learning

1Georgia State University, 2The University of North Carolina at Charlotte, 3Toyota InfoTech Labs

News

[Jan 4, 2024] Code and datasets released on GitHub.
[Oct 25, 2023] Paper released on arXiv.
[Sep 22, 2023] Paper accepted to ACM/IEEE Symposium on Edge Computing (SEC 2023).

Abstract

Today, deep learning optimization is primarily driven by research focused on achieving high inference accuracy and reducing latency. However, the energy efficiency aspect is often overlooked, possibly due to a lack of sustainability mindset in the field and the absence of a holistic energy dataset. In this paper, we conduct a threefold study, covering energy measurement, prediction, and efficiency scoring, with the objective of fostering transparency in power and energy consumption within deep learning across various edge devices. First, we present a detailed, first-of-its-kind measurement study that uncovers the energy consumption characteristics of on-device deep learning. This study results in three extensive energy datasets for edge devices, covering a wide range of kernels, state-of-the-art DNN models, and popular AI applications. Second, we design and implement the first kernel-level energy predictors for edge devices, built on our kernel-level energy dataset. Evaluation results demonstrate that our predictors provide consistent and accurate energy estimations on unseen DNN models. Lastly, we introduce two scoring metrics, PCS and IECS, which convert the complex power and energy consumption data of an edge device into a form that is easy for end-users to understand. We hope our work helps shift the mindset of both end-users and the research community toward sustainability in edge computing, the principle that drives our research.
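
As a rough illustration of how kernel-level energy prediction works in principle (a hypothetical sketch, not the predictor released with the paper), a model's inference energy can be estimated by decomposing the model into kernels, estimating each kernel's energy from its configuration, and summing the per-kernel estimates:

from typing import Callable, Dict, List

# Hypothetical sketch of kernel-level energy prediction: decompose a DNN into
# kernels, estimate each kernel's energy from its configuration, and sum the
# per-kernel estimates to predict whole-model inference energy. All names and
# coefficients below are illustrative, not taken from the released code.

def predict_model_energy(kernels: List[Dict],
                         kernel_predictors: Dict[str, Callable[[Dict], float]]) -> float:
    """Return the predicted inference energy (in mJ) as a sum over kernels."""
    total_mj = 0.0
    for kernel in kernels:
        predictor = kernel_predictors[kernel["type"]]  # e.g., "conv2d", "dense"
        total_mj += predictor(kernel["config"])        # config: shapes, MAC count, ...
    return total_mj

# Toy per-kernel predictors: energy modeled as proportional to MAC count.
predictors = {
    "conv2d": lambda cfg: 0.002 * cfg["macs"],
    "dense":  lambda cfg: 0.001 * cfg["macs"],
}
model_kernels = [
    {"type": "conv2d", "config": {"macs": 1_000_000}},
    {"type": "dense",  "config": {"macs": 200_000}},
]
print(predict_model_energy(model_kernels, predictors))  # estimated energy in mJ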


Energy Efficiency Benchmark and Scoring

Inference Energy Consumption Score (IECS)

The IECS is designed to assess the energy efficiency of an edge device. It is calculated as the sum of the inference energy consumption (IEC) for all six edge AI applications under the CPU, GPU, and NNAPI delegates.

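Based on the description above, the IECS can be sketched as the following sum; the exact normalization used in the paper (e.g., how raw energy values map to a score) may differ:

\[
\mathrm{IECS} \;=\; \sum_{d \,\in\, \{\mathrm{CPU},\,\mathrm{GPU},\,\mathrm{NNAPI}\}} \; \sum_{a=1}^{6} \mathrm{IEC}_{a,d},
\]

where \(\mathrm{IEC}_{a,d}\) denotes the inference energy consumption measured for edge AI application \(a\) running under delegate \(d\).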

Power Consumption Score (PCS)

The PCS is designed to capture the aggregated power efficiency (PE) of running all six edge AI applications with 12 reference DNN models using the CPU, GPU, and NNAPI delegates.

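Analogously, a sketch of the PCS, assuming "aggregated" means a sum over the reference models and delegates (the paper may instead average or rescale the terms):

\[
\mathrm{PCS} \;=\; \sum_{d \,\in\, \{\mathrm{CPU},\,\mathrm{GPU},\,\mathrm{NNAPI}\}} \; \sum_{m=1}^{12} \mathrm{PE}_{m,d},
\]

where \(\mathrm{PE}_{m,d}\) denotes the power efficiency measured for reference DNN model \(m\) under delegate \(d\).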

Energy Efficiency vs. AI Performance

The figure illustrates the tradeoff among AI performance, PCS, IECS, and device selling price. A larger ball represents a higher selling price for the device. An edge device with superior power efficiency (higher PCS) and AI inference performance (higher AI performance score) is positioned toward the top-right corner of the figure. We observe that the choice of scoring metric has a significant impact on benchmarking results for edge devices, and that the proposed IECS effectively balances power efficiency against AI inference performance.


Presentation


BibTeX

@inproceedings{tu2023energy,
  author    = {Tu, Xiaolong and Mallik, Anik and Chen, Dawei and Han, Kyungtae and Altintas, Onur and Wang, Haoxin and Xie, Jiang},
  title     = {Unveiling Energy Efficiency in Deep Learning: Measurement, Prediction, and Scoring across Edge Devices},
  booktitle = {Proceedings of the Eighth ACM/IEEE Symposium on Edge Computing (SEC)},
  pages     = {1--14},
  year      = {2023},
}

Acknowledgements

This work was supported by funds from Toyota Motor North America and by the US National Science Foundation (NSF) under Grant Nos. 1910667, 1910891, and 2025284.