Publication: Probing human-soundscape interaction using observational user experience methods
dc.contributor.department | Department of Media and Visual Arts | |
dc.contributor.department | Department of Mechanical Engineering | |
dc.contributor.department | Graduate School of Social Sciences and Humanities | |
dc.contributor.kuauthor | Obaid, Mohammad | |
dc.contributor.kuauthor | Yantaç, Asım Evren | |
dc.contributor.kuauthor | Yücetürk, Selman | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | GRADUATE SCHOOL OF SOCIAL SCIENCES AND HUMANITIES | |
dc.date.accessioned | 2024-11-10T00:06:40Z | |
dc.date.issued | 2016 | |
dc.description.abstract | Sound, whose perception depends on spatial, temporal, and cognitive factors, is an intangible phenomenon within interaction design. It is difficult for interaction designers to explore, understand, or ideate on this intangible and complex phenomenon, as they mostly rely on visual language, sketches, or physical prototypes. In this paper, we present initial insights into the design of an interactive mediated sound reality system, which refines the users' interaction with a soundscape. The main contribution of this study is the insights gathered through the use of three observational user experience (UX) methods: (1) note-taking in soundwalks; (2) soundscape visualization; and (3) auditory journey maps. These methods address the above-mentioned difficulty in rationalizing the intangibility of human-soundscape interaction by focusing on, recording, and reflecting the spatial, temporal, and interactive aspects of the soundscape. | |
dc.description.indexedby | WOS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | NO | |
dc.description.sponsoredbyTubitakEu | N/A | |
dc.identifier.doi | 10.1145/2971485.2971495 | |
dc.identifier.isbn | 978-1-4503-4763-1 | |
dc.identifier.scopus | 2-s2.0-84997343110 | |
dc.identifier.uri | https://doi.org/10.1145/2971485.2971495 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/16653 | |
dc.identifier.wos | 390298600033 | |
dc.keywords | Human sound interaction | |
dc.keywords | Soundscape visualization | |
dc.keywords | Mediated sound reality | |
dc.keywords | Attentive user interfaces | |
dc.language.iso | eng | |
dc.publisher | Association for Computing Machinery (ACM) | |
dc.relation.ispartof | Proceedings of NordiCHI '16: The 9th Nordic Conference on Human-Computer Interaction - Game-Changing Design | |
dc.subject | Computer science | |
dc.subject | Cybernetics | |
dc.subject | Human engineering | |
dc.subject | Social sciences | |
dc.title | Probing human-soundscape interaction using observational user experience methods | |
dc.type | Conference Proceeding | |
dspace.entity.type | Publication | |
local.contributor.kuauthor | Yücetürk, Selman | |
local.contributor.kuauthor | Obaid, Mohammad | |
local.contributor.kuauthor | Yantaç, Asım Evren | |
local.publication.orgunit1 | GRADUATE SCHOOL OF SOCIAL SCIENCES AND HUMANITIES | |
local.publication.orgunit1 | College of Engineering | |
local.publication.orgunit1 | College of Social Sciences and Humanities | |
local.publication.orgunit2 | Department of Mechanical Engineering | |
local.publication.orgunit2 | Department of Media and Visual Arts | |
local.publication.orgunit2 | Graduate School of Social Sciences and Humanities | |
relation.isOrgUnitOfPublication | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isOrgUnitOfPublication | ba2836f3-206d-4724-918c-f598f0086a36 | |
relation.isOrgUnitOfPublication | e192fff1-4efe-45a7-ab71-30233fc185a9 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isParentOrgUnitOfPublication | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 | |
relation.isParentOrgUnitOfPublication | 3f7621e3-0d26-42c2-af64-58a329522794 | |
relation.isParentOrgUnitOfPublication | c5c9bf5f-4655-411c-a602-0d68f2e2ad88 | |
relation.isParentOrgUnitOfPublication.latestForDiscovery | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 |