
Painting the Starry Night using XiNets

Alberto Ancilotto; Elisabetta Farella
2024-01-01

Abstract

Neural style transfer (NST) is a powerful technique that merges the content of one image with the artistic style of another. However, existing NST models often suffer from computational inefficiency, limiting their practical applicability. In this paper, we propose a lightweight NST approach based on XiNets, neural networks developed specifically for microcontrollers and embedded applications. Our XiNet-based pipeline achieves an extremely lightweight style transfer mechanism and high-quality stylized outputs, significantly reducing computational complexity. Furthermore, we explore various optimizations and loss functions to improve the generative performance of the network. The combination of our novel loss factors and an extremely lightweight generative network yields performance comparable to that of networks orders of magnitude larger. Experimental results demonstrate style transfer quality comparable to state-of-the-art approaches while running on embedded devices such as a Raspberry Pi 4 single-board computer and an STM32H7 microcontroller. Our lightweight NST framework enables real-time style transfer applications on resource-constrained devices without compromising visual fidelity.
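The abstract does not detail the novel loss factors, but lightweight NST pipelines of this kind typically build on the standard perceptual formulation: a content loss comparing feature maps and a style loss comparing channel-wise Gram matrices of those features. A minimal NumPy sketch of that standard formulation (illustrative only; not the authors' implementation, and the feature shapes are hypothetical):

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Normalized channel-wise Gram matrix of a (C, H, W) feature map.

    The Gram matrix captures second-order feature statistics (texture),
    discarding spatial layout -- the core idea behind NST style losses.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)

def content_loss(gen_feat: np.ndarray, content_feat: np.ndarray) -> float:
    """Mean squared error between generated and content feature maps."""
    return float(np.mean((gen_feat - content_feat) ** 2))

def style_loss(gen_feat: np.ndarray, style_feat: np.ndarray) -> float:
    """MSE between Gram matrices of generated and style feature maps."""
    g_gen = gram_matrix(gen_feat)
    g_sty = gram_matrix(style_feat)
    return float(np.mean((g_gen - g_sty) ** 2))

def total_loss(gen_feat, content_feat, style_feat,
               alpha: float = 1.0, beta: float = 1e3) -> float:
    """Weighted sum of content and style terms (alpha, beta are tunable)."""
    return (alpha * content_loss(gen_feat, content_feat)
            + beta * style_loss(gen_feat, style_feat))
```

In a full training pipeline these losses are evaluated on feature maps extracted by a fixed perceptual network and backpropagated into the generative network, which at inference time runs alone -- the property that makes deployment on devices like an STM32H7 feasible.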


Use this identifier to cite or link to this document: https://hdl.handle.net/11582/345408