A Computational Offloading Framework for Object Detection in Mobile Devices

AbdelAty, Maged
2017-01-01

Abstract

Recently, the role of mobile devices has shifted from calling and entertainment to tools that make everyday life easier. However, this growing role brings extensive computing requirements for tasks such as object detection, and executing such heavy tasks on a mobile device with limited resources takes a long processing time and consumes considerable energy. This paper presents a computational offloading framework that improves the performance of object detection tasks. The framework uses a simple decision model to choose between local processing and offloading based on the network context. A demo application was developed to evaluate the framework's performance, with experiments covering different image sizes over both 3G and Wi-Fi networks. The results show a response-time speedup of up to five times for small images and 1.5 times for large images, with energy savings ranging from 50% to 80%. Furthermore, offloading over a Wi-Fi network shows more stable performance than over 3G. Finally, the results demonstrate that offloading the object detection computation reduces memory allocation to less than 1% of that required by local processing.
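The abstract does not specify the decision model itself; the sketch below only illustrates the common formulation used in the offloading literature, where a task is offloaded when the estimated remote response time (upload time plus server processing) is expected to beat local processing. All class names, method names, and constants (NetworkContext, shouldOffload, the per-byte cost figures) are hypothetical and not taken from the paper.

```java
/**
 * Minimal sketch of a local-vs-offload decision based on network context.
 * This is an assumption-driven illustration, not the paper's actual model:
 * the cost constants and names below are invented for the example.
 */
public class OffloadDecision {

    /** Hypothetical snapshot of the current network context. */
    public static class NetworkContext {
        final double bandwidthBytesPerSec; // measured uplink bandwidth
        final double rttSec;               // measured round-trip time

        NetworkContext(double bandwidthBytesPerSec, double rttSec) {
            this.bandwidthBytesPerSec = bandwidthBytesPerSec;
            this.rttSec = rttSec;
        }
    }

    // Illustrative per-byte processing costs (device vs. server); purely assumed values.
    private static final double LOCAL_SEC_PER_BYTE  = 2.0e-6;
    private static final double REMOTE_SEC_PER_BYTE = 2.0e-7;

    /** Returns true if offloading is expected to respond faster than local processing. */
    public static boolean shouldOffload(long imageSizeBytes, NetworkContext net) {
        double localTime  = imageSizeBytes * LOCAL_SEC_PER_BYTE;
        double uploadTime = imageSizeBytes / net.bandwidthBytesPerSec + net.rttSec;
        double remoteTime = uploadTime + imageSizeBytes * REMOTE_SEC_PER_BYTE;
        return remoteTime < localTime;
    }

    public static void main(String[] args) {
        // Example: a 500 KB image over a Wi-Fi-like link (2 MB/s uplink, 30 ms RTT).
        NetworkContext wifi = new NetworkContext(2_000_000, 0.03);
        System.out.println("Offload? " + shouldOffload(500_000, wifi));
    }
}
```

Under this formulation, larger images pay a higher transfer cost relative to their local processing time, which is consistent with the reported trend of higher speedups for small images (up to 5x) than for large ones (1.5x).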


Use this identifier to cite or link to this document: https://hdl.handle.net/11582/317872