Publication:
Probing human-soundscape interaction using observational user experience methods

dc.contributor.department: Department of Media and Visual Arts
dc.contributor.department: Department of Mechanical Engineering
dc.contributor.department: Graduate School of Social Sciences and Humanities
dc.contributor.kuauthor: Obaid, Mohammad
dc.contributor.kuauthor: Yantaç, Asım Evren
dc.contributor.kuauthor: Yücetürk, Selman
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: College of Social Sciences and Humanities
dc.contributor.schoolcollegeinstitute: GRADUATE SCHOOL OF SOCIAL SCIENCES AND HUMANITIES
dc.date.accessioned: 2024-11-10T00:06:40Z
dc.date.issued: 2016
dc.description.abstract: Sound, whose perception depends on spatial, temporal, and cognitive factors, is an intangible element of interaction design. It is difficult for interaction designers to explore, understand, or ideate on this intangible and complex phenomenon, as they rely mostly on visual language, sketches, or physical prototypes. In this paper, we present initial insights into the design of an interactive mediated sound reality system, which refines users' interaction with a soundscape. The main contribution of this study is the set of insights gathered through three observational user experience (UX) methods: (1) note-taking during soundwalks; (2) soundscape visualization; and (3) auditory journey maps. These methods address the above-mentioned difficulty in rationalizing the intangibility of human-soundscape interaction by focusing on, recording, and reflecting the spatial, temporal, and interactive aspects of the soundscape.
dc.description.indexedby: WOS
dc.description.indexedby: Scopus
dc.description.openaccess: NO
dc.description.sponsoredbyTubitakEu: N/A
dc.identifier.doi: 10.1145/2971485.2971495
dc.identifier.isbn: 978-1-4503-4763-1
dc.identifier.scopus: 2-s2.0-84997343110
dc.identifier.uri: https://doi.org/10.1145/2971485.2971495
dc.identifier.uri: https://hdl.handle.net/20.500.14288/16653
dc.identifier.wos: 390298600033
dc.keywords: Human sound interaction
dc.keywords: Soundscape visualization
dc.keywords: Mediated sound reality
dc.keywords: Attentive user interfaces
dc.language.iso: eng
dc.publisher: Assoc Computing Machinery
dc.relation.ispartof: Proceedings of NordiCHI '16: The 9th Nordic Conference on Human-Computer Interaction - Game Changing Design
dc.subject: Computer science
dc.subject: Cybernetics
dc.subject: Human engineering
dc.subject: Social sciences
dc.title: Probing human-soundscape interaction using observational user experience methods
dc.type: Conference Proceeding
dspace.entity.type: Publication
local.contributor.kuauthor: Yücetürk, Selman
local.contributor.kuauthor: Obaid, Mohammad
local.contributor.kuauthor: Yantaç, Asım Evren
local.publication.orgunit1: GRADUATE SCHOOL OF SOCIAL SCIENCES AND HUMANITIES
local.publication.orgunit1: College of Engineering
local.publication.orgunit1: College of Social Sciences and Humanities
local.publication.orgunit2: Department of Mechanical Engineering
local.publication.orgunit2: Department of Media and Visual Arts
local.publication.orgunit2: Graduate School of Social Sciences and Humanities
relation.isOrgUnitOfPublication: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isOrgUnitOfPublication: ba2836f3-206d-4724-918c-f598f0086a36
relation.isOrgUnitOfPublication: e192fff1-4efe-45a7-ab71-30233fc185a9
relation.isOrgUnitOfPublication.latestForDiscovery: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication: 3f7621e3-0d26-42c2-af64-58a329522794
relation.isParentOrgUnitOfPublication: c5c9bf5f-4655-411c-a602-0d68f2e2ad88
relation.isParentOrgUnitOfPublication.latestForDiscovery: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164