Researcher: Madani, Alireza
Name Variants
Madani, Alireza
Search Results
Showing 3 of 3 results
An adaptive admittance controller for collaborative drilling with a robot based on subtask classification via deep learning (Elsevier, 2022)
Authors: Aydin, Yusuf; Güler, Berk; Niaz, Pouya Pourakbarian; Madani, Alireza; Başdoğan, Çağatay
Affiliations: Department of Mechanical Engineering; Graduate School of Sciences and Engineering; College of Engineering
Abstract: In this paper, we propose a supervised learning approach based on an Artificial Neural Network (ANN) model for real-time classification of subtasks in a physical human-robot interaction (pHRI) task involving contact with a stiff environment. In this regard, we consider three subtasks for a given pHRI task: Idle, Driving, and Contact. Based on this classification, the parameters of an admittance controller that regulates the interaction between human and robot are adjusted adaptively in real time to make the robot more transparent to the operator (i.e., less resistant) during the Driving phase and more stable during the Contact phase. The Idle phase is primarily used to detect the initiation of the task. Experimental results have shown that the ANN model can learn to detect the subtasks under different admittance controller conditions with an accuracy of 98% for 12 participants. Finally, we show that the admittance adaptation based on the proposed subtask classifier leads to 20% lower human effort (i.e., higher transparency) in the Driving phase and 25% lower oscillation amplitude (i.e., higher stability) during drilling in the Contact phase compared to an admittance controller with fixed parameters.
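As an illustration of the classify-then-adapt idea described in the abstract above, the following Python sketch pairs a small neural-network subtask classifier with a lookup of admittance parameters. The feature set, network size, and the mass/damping values are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): a small MLP classifies a
# window of interaction features into Idle / Driving / Contact, and a helper
# switches admittance parameters accordingly.
import torch
import torch.nn as nn

SUBTASKS = ["Idle", "Driving", "Contact"]

class SubtaskClassifier(nn.Module):
    def __init__(self, n_features: int = 6, n_classes: int = len(SUBTASKS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # raw logits; take argmax for the subtask label

# Hypothetical admittance settings: lower damping for transparency while
# driving, higher damping for stability in contact (values are placeholders).
ADMITTANCE_PARAMS = {
    "Idle":    {"mass": 10.0, "damping": 80.0},
    "Driving": {"mass": 10.0, "damping": 20.0},   # more transparent
    "Contact": {"mass": 10.0, "damping": 200.0},  # more stable
}

def select_admittance(model: SubtaskClassifier, features: torch.Tensor) -> dict:
    """Classify the current feature window and return matching admittance parameters."""
    with torch.no_grad():
        label = SUBTASKS[int(model(features.unsqueeze(0)).argmax(dim=1))]
    return ADMITTANCE_PARAMS[label]
```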
Physical activity recognition using deep transfer learning with convolutional neural networks (Institute of Electrical and Electronics Engineers Inc., 2022)
Authors: Gürsoy, Beren Semiz; Gürsoy, Mehmet Emre; Ataseven, Berke; Madani, Alireza
Affiliations: Department of Electrical and Electronics Engineering; Department of Computer Engineering; College of Engineering; Graduate School of Sciences and Engineering
Abstract: Current wearable devices are capable of monitoring various health indicators as well as fitness and/or physical activity types. However, even on the latest models of many wearable devices, users need to manually enter the type of workout or physical activity they are performing. In order to automate real-time physical activity recognition, in this study, we develop a deep transfer learning-based physical activity recognition framework using acceleration data acquired through inertial measurement units (IMUs). Towards this goal, we modify a pre-trained version of the GoogLeNet convolutional neural network and fine-tune it with data from IMUs. To make IMU data compatible with GoogLeNet, we propose three novel data transform approaches based on the continuous wavelet transform: Horizontal Concatenation (HC), Acceleration-Magnitude (AM), and Pixelwise Axes-Averaging (PA). We evaluate the performance of our approaches using the real-world PAMAP2 dataset. The three approaches result in 0.93, 0.95, and 0.98 validation accuracy and 0.75, 0.85, and 0.91 test accuracy, respectively. The PA approach yields the highest weighted F1 score (0.91) and activity-specific true positive ratios. Overall, our methods and results show that accurate real-time physical activity recognition can be achieved using transfer learning and convolutional neural networks.

Robot-assisted drilling on curved surfaces with haptic guidance under adaptive admittance control (Institute of Electrical and Electronics Engineers (IEEE), 2022)
Authors: Aydın, Yusuf; Madani, Alireza; Niaz, Pouya Pourakbarian; Güler, Berk; Başdoğan, Çağatay
Affiliations: Department of Mechanical Engineering; Koç Üniversitesi İş Bankası Yapay Zeka Uygulama ve Araştırma Merkezi (KUIS AI) / Koç University İş Bank Artificial Intelligence Center (KUIS AI); Graduate School of Sciences and Engineering; College of Engineering
Abstract: Drilling a hole on a curved surface at a desired angle is prone to failure when done manually, due to the difficulties in drill alignment and the inherent instabilities of the task, potentially causing injury and fatigue to the workers. On the other hand, it can be impractical to fully automate such a task in real manufacturing environments because the parts arriving at an assembly line can have various complex shapes where drill point locations are not easily accessible, making automated path planning difficult. In this work, an adaptive admittance controller with 6 degrees of freedom is developed and deployed on a KUKA LBR iiwa 7 cobot such that the operator is able to manipulate a drill mounted on the robot comfortably with one hand and open holes on a curved surface with the haptic guidance of the cobot and visual guidance provided through an AR interface. Real-time adaptation of the admittance damping provides more transparency when driving the robot in free space while ensuring stability during drilling. After the user brings the drill sufficiently close to the drill target and roughly aligns it to the desired drilling angle, the haptic guidance module fine-tunes the alignment first and then constrains the user's movement to the drilling axis only, after which the operator simply pushes the drill into the workpiece with minimal effort. Two sets of experiments were conducted to investigate the potential benefits of the haptic guidance module quantitatively (Experiment I) and the practical value of the proposed pHRI system for real manufacturing settings based on the subjective opinions of the participants (Experiment II). The results of Experiment I, conducted with 3 naive participants, show that the haptic guidance shortens task completion time by 26% while decreasing human effort by 16% and muscle activation levels by 27% compared to the no-guidance condition. The results of Experiment II, conducted with 3 experienced industrial workers, show that the proposed system is perceived to be easy to use, safe, and helpful in carrying out the drilling task.
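To make the transform-then-fine-tune pipeline in the physical activity recognition entry above more concrete, here is a hedged Python sketch that converts a tri-axial accelerometer window into a scalogram by averaging per-axis continuous wavelet transforms (in the spirit of the PA approach) and replaces the final layer of a pretrained GoogLeNet. The wavelet, scale range, window handling, and class count are assumptions, not the paper's exact settings.

```python
# Sketch under stated assumptions: CWT-based scalogram images from IMU
# acceleration windows, plus a pretrained GoogLeNet prepared for fine-tuning.
import numpy as np
import pywt
import torch.nn as nn
from torchvision import models

def window_to_scalogram(acc_xyz: np.ndarray, scales=np.arange(1, 65),
                        wavelet: str = "morl") -> np.ndarray:
    """acc_xyz: (3, T) accelerometer window -> (len(scales), T) axis-averaged scalogram."""
    scalograms = []
    for axis in acc_xyz:                       # CWT per acceleration axis
        coeffs, _ = pywt.cwt(axis, scales, wavelet)
        scalograms.append(np.abs(coeffs))
    return np.mean(scalograms, axis=0)         # pixel-wise average over the axes

def build_model(n_classes: int = 12) -> nn.Module:
    """Pretrained GoogLeNet with its classifier head replaced for activity classes."""
    net = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
    net.fc = nn.Linear(net.fc.in_features, n_classes)
    return net

# Usage note: replicate the single-channel scalogram to 3 channels and resize it
# to GoogLeNet's expected 224x224 input before fine-tuning with a standard
# cross-entropy training loop (omitted here).
```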
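Similarly, the robot-assisted drilling entry above describes adaptive admittance damping and guidance that constrains motion to the drilling axis; the minimal sketch below illustrates those two pieces with a first-order translational admittance update and a velocity projection. The mass and damping values, time step, and alignment logic are placeholders rather than the deployed controller.

```python
# Minimal numpy sketch (illustrative, not the deployed 6-DOF controller):
# adaptive admittance damping plus a projection onto the drilling axis.
import numpy as np

def admittance_step(v: np.ndarray, f_human: np.ndarray, mass: float,
                    damping: float, dt: float) -> np.ndarray:
    """Discrete update of M*dv/dt + D*v = f_human for the commanded velocity v."""
    return v + (dt / mass) * (f_human - damping * v)

def constrain_to_axis(v: np.ndarray, axis: np.ndarray) -> np.ndarray:
    """Project the commanded velocity onto the (unit-norm) drilling axis."""
    axis = axis / np.linalg.norm(axis)
    return np.dot(v, axis) * axis

# Example: lower damping in free space for transparency, higher damping near
# contact for stability; after alignment, motion is restricted to the drill axis.
v_cmd = np.zeros(3)
f = np.array([5.0, 1.0, -12.0])            # measured human force (N), illustrative
drill_axis = np.array([0.0, 0.0, -1.0])    # assumed tool axis in the base frame
in_free_space, aligned = False, True
damping = 20.0 if in_free_space else 200.0
v_cmd = admittance_step(v_cmd, f, mass=10.0, damping=damping, dt=0.001)
if aligned:
    v_cmd = constrain_to_axis(v_cmd, drill_axis)
```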