Research Outputs
Permanent URI for this communityhttps://hdl.handle.net/20.500.14288/2
Publication (metadata only): SynergyChain: Blockchain-Assisted Adaptive Cyber-Physical P2P Energy Trading (IEEE, 2021)
Authors: Ouns Bouachir; Moayad Aloqaily; Faizan Safdar Ali; Öznur Özkasap (Department of Computer Engineering, College of Engineering; Graduate School of Sciences and Engineering)

Abstract: Industrial investments in distributed energy resource technologies are increasing and playing a pivotal role in global transactive energy, as part of a wider drive to provide clean and stable sources of energy. Managing prosumers, which both consume and generate energy from heterogeneous sources, is critical for sustainable and efficient energy trading. This article proposes a blockchain-assisted adaptive model, SynergyChain, for improving the scalability and decentralization of the prosumer grouping mechanism in peer-to-peer energy trading. Smart contracts store the transaction information and create the prosumer groups. SynergyChain integrates a reinforcement learning module that further improves overall system performance and profitability through a self-adaptive grouping technique. SynergyChain is implemented in Python and Solidity and has been tested on Ethereum test nets. A comprehensive analysis using an hourly energy consumption dataset shows a 39.7% improvement in the performance and scalability of the system compared to centralized systems.
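The abstract does not detail the reinforcement learning module, but the idea of a self-adaptive grouping technique can be illustrated with a minimal sketch. The following is not the paper's implementation: the candidate group sizes, the epsilon-greedy strategy, and the `simulated_profit` reward model are all illustrative assumptions standing in for the real trading profitability signal.

```python
import random

GROUP_SIZES = [2, 4, 8, 16]  # candidate prosumer group sizes (assumed)

def simulated_profit(size: int) -> float:
    """Toy reward model: profit peaks at a mid-sized group (assumption)."""
    return -abs(size - 8) + random.gauss(0, 0.5)

def adaptive_grouping(rounds: int = 2000, epsilon: float = 0.1, seed: int = 7) -> int:
    """Epsilon-greedy bandit that learns which group size pays off best."""
    random.seed(seed)
    totals = {s: 0.0 for s in GROUP_SIZES}  # cumulative reward per group size
    counts = {s: 0 for s in GROUP_SIZES}    # times each group size was tried
    mean = lambda s: totals[s] / counts[s] if counts[s] else 0.0
    for _ in range(rounds):
        if random.random() < epsilon:
            size = random.choice(GROUP_SIZES)      # explore
        else:
            size = max(GROUP_SIZES, key=mean)      # exploit best average so far
        totals[size] += simulated_profit(size)
        counts[size] += 1
    return max(GROUP_SIZES, key=mean)

best = adaptive_grouping()
```

Under this toy reward, the agent converges on the group size with the highest average profit; in the paper, the reward would instead come from the profitability of completed P2P trades recorded on chain.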
The evaluation results confirm that SynergyChain reduces the request completion time and improves the overall profitability of the system by 18.3% compared to its counterparts.

Publication (metadata only): Use of Non-Verbal Vocalizations for Continuous Emotion Recognition from Speech and Head Motion (IEEE, 2019)
Authors: Syeda Narjis Fatima; Engin Erzin (Department of Computer Engineering, College of Engineering; Graduate School of Sciences and Engineering)

Abstract: Dyadic interactions reflect mutual engagement between their participants through different verbal and non-verbal voicing cues. This study investigates the effect of these cues on continuous emotion recognition (CER) using speech and head motion data. We exploit non-verbal vocalizations extracted from speech as a complementary source of information and investigate their effect on the CER problem using Gaussian mixture and convolutional neural network based regression frameworks. Our methods are evaluated on the CreativeIT database, which consists of speech and full-body motion capture recorded in dyadic interaction settings. Head motion, acoustic features of speech, and histograms of non-verbal vocalizations are employed to estimate the activation, valence, and dominance attributes in the CER problem. Our experimental evaluations indicate a strong improvement in CER performance, especially for the activation attribute, with the use of non-verbal vocalization cues of speech.
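The second abstract mentions histograms of non-verbal vocalizations as one of the CER features. A minimal sketch of that feature, assuming a made-up label set and a simple per-window normalization (the paper's actual vocalization classes and windowing are not specified in the abstract):

```python
from collections import Counter

# Hypothetical vocalization classes; the real label set may differ.
VOCAL_CLASSES = ["laughter", "breathing", "filler", "speech", "silence"]

def vocalization_histogram(frames: list[str]) -> list[float]:
    """Normalized count of each vocalization class within an analysis window."""
    counts = Counter(frames)
    total = len(frames) or 1  # avoid division by zero on an empty window
    return [counts.get(c, 0) / total for c in VOCAL_CLASSES]

# Example window of frame-level labels (illustrative data).
window = ["speech", "speech", "laughter", "speech", "filler", "speech"]
feat = vocalization_histogram(window)
```

The resulting fixed-length vector sums to 1 and could be concatenated with acoustic and head-motion features before feeding a regression model for the activation, valence, and dominance attributes.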