Date: 2024-12-29
Year: 2022
ISBN: 978-1-6654-6297-6
DOI: 10.1109/DASC/PiCom/CBDCom/Cy55231.2022.9928021
Scopus ID: 2-s2.0-85145354572
DOI URL: https://doi.org/10.1109/DASC/PiCom/CBDCom/Cy55231.2022.9928021
Handle: https://hdl.handle.net/20.500.14288/21892

Title: Physical activity recognition using deep transfer learning with convolutional neural networks
Type: Conference proceeding
Subjects: Automation and control systems; Computer science; Artificial intelligence; Information systems; Theory and methods; Electrical engineering; Electronic engineering

Abstract: Current wearable devices can monitor various health indicators as well as fitness and physical activity. However, even on the latest models of many wearables, users must manually enter the type of workout or physical activity they are performing. To automate real-time physical activity recognition, in this study we develop a deep transfer learning-based physical activity recognition framework using acceleration data acquired through inertial measurement units (IMUs). Toward this goal, we modify a pre-trained version of the GoogLeNet convolutional neural network and fine-tune it with data from IMUs. To make IMU data compatible with GoogLeNet, we propose three novel data transforms based on the continuous wavelet transform: Horizontal Concatenation (HC), Acceleration-Magnitude (AM), and Pixelwise Axes-Averaging (PA). We evaluate our approaches on the real-world PAMAP2 dataset. The three approaches achieve validation accuracies of 0.93, 0.95, and 0.98 and test accuracies of 0.75, 0.85, and 0.91, respectively. The PA approach yields the highest weighted F1 score (0.91) and the highest activity-specific true positive ratios. Overall, our methods and results show that accurate real-time physical activity recognition can be achieved using transfer learning and convolutional neural networks.
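The abstract does not give implementation details, so the following is a minimal sketch of how the three wavelet-based transforms could map a tri-axial acceleration window to GoogLeNet-sized images, using NumPy and PyWavelets. The Morlet mother wavelet, the 100 Hz sampling rate, the 64-scale range, the 224x224 image size, and all function names here are assumptions for illustration, not details taken from the paper.

    import numpy as np
    import pywt

    # Assumptions (not specified in the abstract): 100 Hz sampling,
    # a Morlet mother wavelet, 64 scales, 224x224 GoogLeNet inputs.
    FS = 100.0
    SCALES = np.arange(1, 65)

    def scalogram(x, wavelet="morl"):
        """CWT scalogram of a 1-D signal: |coefficients|, shape (scales, time)."""
        coeffs, _ = pywt.cwt(x, SCALES, wavelet, sampling_period=1.0 / FS)
        return np.abs(coeffs)

    def to_image(s, size=224):
        """Min-max normalize to [0, 1], then nearest-neighbor resize to size x size."""
        s = (s - s.min()) / (s.max() - s.min() + 1e-12)
        rows = np.linspace(0, s.shape[0] - 1, size).astype(int)
        cols = np.linspace(0, s.shape[1] - 1, size).astype(int)
        return s[np.ix_(rows, cols)]

    def transform_hc(ax, ay, az):
        """Horizontal Concatenation: per-axis scalograms placed side by side in time."""
        return to_image(np.hstack([scalogram(a) for a in (ax, ay, az)]))

    def transform_am(ax, ay, az):
        """Acceleration-Magnitude: one scalogram of the Euclidean magnitude signal."""
        return to_image(scalogram(np.sqrt(ax**2 + ay**2 + az**2)))

    def transform_pa(ax, ay, az):
        """Pixelwise Axes-Averaging: average the three per-axis scalograms."""
        sx, sy, sz = (scalogram(a) for a in (ax, ay, az))
        return to_image((sx + sy + sz) / 3.0)

    # Example on a synthetic 5-second tri-axial window.
    t = np.arange(0, 5, 1.0 / FS)
    ax = np.sin(2 * np.pi * 2 * t)
    ay = np.cos(2 * np.pi * 3 * t)
    az = 0.1 * np.random.randn(t.size)
    img = transform_pa(ax, ay, az)         # (224, 224), values in [0, 1]
    rgb = np.repeat(img[None, ...], 3, 0)  # replicate to 3 channels for GoogLeNet

Note the trade-off the three transforms suggest: HC preserves per-axis detail at the cost of horizontal resolution per axis, AM collapses orientation into a single magnitude, and PA keeps the full time axis while blending the axes pixel by pixel.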
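For the transfer-learning step, a hedged sketch follows, assuming torchvision's ImageNet-pretrained GoogLeNet stands in for the pre-trained network described in the abstract; the 12-class output head (roughly matching PAMAP2's protocol activities) and the random stand-in batch are assumptions, and the actual training setup in the paper may differ.

    import torch
    import torch.nn as nn
    from torchvision import models

    # Assumption: torchvision's ImageNet-pretrained GoogLeNet stands in for
    # the pre-trained network; 12 classes approximate PAMAP2's activity set.
    model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, 12)  # replace the 1000-way head

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # One fine-tuning step on a stand-in batch of scalogram images
    # (in practice these would come from the transforms sketched above).
    images = torch.rand(8, 3, 224, 224)
    labels = torch.randint(0, 12, (8,))
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

A common design choice when fine-tuning a network this way is to freeze the early convolutional layers, which capture generic image features, and train only the later blocks and the new classification head on the scalogram images.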