Publications (5 of 5)
Tobisková, N., Gull, E. S., Janardhanan, S., Pederson, T. & Malmsköld, L. (2023). Augmented Reality for AI-driven Inspection?: A Comparative Usability Study. Paper presented at the 33rd CIRP Design Conference, Sydney, 17–19 May 2023. Procedia CIRP, 119, 734-739
Augmented Reality for AI-driven Inspection?: A Comparative Usability Study
2023 (English). In: Procedia CIRP, ISSN 2212-8271, E-ISSN 2212-8271, Vol. 119, p. 734-739. Article in journal (Refereed). Published
Abstract [en]

Inspection in the aerospace industry, like many other industrial applications, can benefit from Augmented Reality (AR) thanks to its ability to superimpose helpful digital information in 3D, leading to fewer errors and decreased mental demand. However, each AR device has advantages and disadvantages, and not all AR devices are suitable for industrial settings. We compare a tablet-based AR solution mounted on a tripod with an adjustable arm (Apple iPad Pro) to head-mounted AR (Microsoft HoloLens 2) and a traditional, computer screen-based human-machine interface (HMI), all three designed to guide operators based on previously performed AI-based image analysis. Following an iterative design process with three formative evaluations, a final field test on a real industrial shop floor engaging 6 professional inspectors revealed an overall preference for the tripod-fitted iPad variant, which received the best scores in most dimensions covered by both a usability-focused SUS questionnaire (score 71) and a NASA-RTLX form focused on perceived workload. More specifically, the tripod-fitted iPad was considered more usable (SUS) than the classic computer display HMI (M=5.83, SD=4.92, p=0.034, N=6), and the temporal demand (NASA-RTLX) was considered lower using the iPad than with both HoloLens 2 (M=6.67, SD=4.08, p=0.010, N=6) and the HMI (M=10.83, SD=9.70, p=0.040, N=6).
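The abstract does not state which statistical test produced the reported values; purely as an illustrative sketch (with invented per-participant SUS scores, not the study's data), a paired within-subject comparison for N=6 could be computed along these lines:

```python
# Hypothetical sketch of a paired within-subject comparison (not the authors' analysis code).
# The scores below are invented placeholders for N=6 participants.
import numpy as np
from scipy import stats

sus_ipad = np.array([72.5, 70.0, 75.0, 67.5, 72.5, 68.5])  # hypothetical SUS scores, tripod iPad
sus_hmi = np.array([65.0, 66.5, 68.0, 62.5, 67.5, 64.0])   # hypothetical SUS scores, screen HMI

diff = sus_ipad - sus_hmi
t, p = stats.ttest_rel(sus_ipad, sus_hmi)                   # paired t-test across participants
print(f"mean diff M={diff.mean():.2f}, SD={diff.std(ddof=1):.2f}, t={t:.2f}, p={p:.3f}")
```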

Place, publisher, year, edition, pages
Elsevier, 2023
Keywords
Aerospace industry; Hand held computers; Human computer interaction; Inspection; NASA; Aerospace; Automated visual inspection; Collaborative automation; Digital information; Human Machine Interface; Industrial settings; Microsoft; Traditional computers; Usability evaluation; Usability studies; Augmented reality
National Category
Human Aspects of ICT
Research subject
Production Technology; Work Integrated Learning
Identifiers
urn:nbn:se:hv:diva-21175 (URN) 10.1016/j.procir.2023.03.122 (DOI) 2-s2.0-85169900193 (Scopus ID)
Conference
33rd CIRP Design Conference, Sydney, 17–19 May 2023
Note

CC BY 4.0

Available from: 2024-01-16 Created: 2024-01-16 Last updated: 2024-01-16
Tobisková, N., Malmsköld, L. & Pederson, T. (2023). Head-Mounted Augmented Reality Support for Assemblers of Wooden Trusses. Paper presented at the 33rd CIRP Design Conference, Sydney, 17–19 May 2023 (Code 191393). Procedia CIRP, 119, 134-139
Head-Mounted Augmented Reality Support for Assemblers of Wooden Trusses
2023 (English). In: Procedia CIRP, ISSN 2212-8271, E-ISSN 2212-8271, Vol. 119, p. 134-139. Article in journal (Refereed). Published
Abstract [en]

Wooden-house assembly is an area where a large part of the work is still done manually. In this case study, pairs of operators assemble large wooden pieces based on paper-print instructions complemented by visual guidance in the form of laser marks projected from ceiling-mounted lasers driven by Computer-aided design (CAD) data. Augmented Reality (AR) head-mounted displays (HMD) offer a unique platform for providing instructions and additional information superimposed on the work environment and can thus provide guidance in a cognitively ergonomic way. A particular advantage compared to other computing platforms is that operators keep their hands free and can perform the manual work while following the guidance. We present an evaluation of a prototype that dynamically transforms a CAD data file, containing the design and measurements of the wooden trusses to be manufactured, into an AR-based guidance system developed in Unity for Microsoft HoloLens 2 devices. We used an iterative participatory design process for prototyping, and a think-aloud protocol combined with observations for evaluation, involving professional assemblers in different stages of the process. Participants found that the solution could potentially save time in their everyday work and simplify the task by making the marks more visible than the existing laser projection. Large-scale deployment of the system still faces design challenges, some of which are also discussed in the paper.
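The paper describes the prototype only at this level of detail; as a hypothetical sketch (not the authors' Unity/HoloLens 2 implementation, and with an invented CAD export format and field names), the core transformation from CAD truss measurements to positions in a tracked work-area coordinate frame could look roughly like this:

```python
# Hypothetical sketch: map 2D truss joint coordinates from a CAD export into a tracked
# work-table frame so an AR device could place guidance markers there.
# File format, field names, and the calibration parameters are invented placeholders.
import csv
import numpy as np

def load_truss_joints(path):
    """Read hypothetical CAD export rows of the form: joint_id,x_mm,y_mm."""
    joints = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            joints[row["joint_id"]] = np.array([float(row["x_mm"]), float(row["y_mm"]), 0.0])
    return joints

def cad_to_table(points_mm, table_origin_m, rotation_deg):
    """Apply a rigid transform from CAD millimetres to the table frame in metres."""
    theta = np.radians(rotation_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0,            0.0,           1.0]])
    return {jid: table_origin_m + rot @ (p / 1000.0) for jid, p in points_mm.items()}

# Usage (with a hypothetical calibration): anchor positions for AR markers on the table.
# anchors = cad_to_table(load_truss_joints("truss.csv"), np.array([0.2, 0.1, 0.0]), 90.0)
```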

Place, publisher, year, edition, pages
Elsevier, 2023
Keywords
Computer aided design; Helmet mounted displays; Human computer interaction; Trusses; Case-studies; Computer-aided design; Design data; Hololens 2; Industrial assemblies; Interaction design; Manufacturing; Paper prints; Visual guidance; Wooden house; Augmented reality
National Category
Human Aspects of ICT
Research subject
Work Integrated Learning; Production Technology
Identifiers
urn:nbn:se:hv:diva-21174 (URN) 10.1016/j.procir.2023.02.130 (DOI) 2-s2.0-85169933114 (Scopus ID)
Conference
33rd CIRP Design Conference, Sydney, 17–19 May 2023 (Code 191393)
Note

CC BY 4.0

Available from: 2024-01-16 Created: 2024-01-16 Last updated: 2024-01-16. Bibliographically approved
Jernsand, E. M., Kraff, H., Törngren, S. O., Adolfsson, C., Björner, E., Omondi, L., . . . Ulver, S. (2023). Tourism memories: a collaborative reflection on inclusion and exclusion. Tourism Recreation Research, 48(6), 820-830
Tourism memories: a collaborative reflection on inclusion and exclusion
2023 (English). In: Tourism Recreation Research, ISSN 0250-8281, E-ISSN 2320-0308, Vol. 48, no 6, p. 820-830. Article in journal (Refereed). Published
Abstract [en]

The purpose of this paper is to explore how people’s differentiated privileged and marginalised positions in society create instances of inclusion and exclusion in tourism. Eight authors utilised their diverse disciplinary and theoretical bases to engage in individual autoethnography and collaborative reflection on their personal experiences of being tourists and hosts. Through our Western and non-Western, White and non-White experiences, we reveal experiences from a multitude of perspectives and problematise the dominant White racial frame. The methodology illustrates unquestioned privileges and feelings of discomfort when personally faced with exclusionary practices, and creates an understanding of how individuals have different experiences of enchantment and the tourist gaze. The experience of marginalisation is serial and dialectical, which illustrates the complexity of tourism. The paper contributes to an enhanced and multifaceted understanding of tourism experiences and proposes measures to reveal issues of exclusion. Also, the use of autoethnography and collaborative reflection as methodological tools provides opportunities for researchers and practitioners to engage in reflexive conversations about discriminatory practices and how they hinder certain individuals and groups from enjoying tourism products and services.

Place, publisher, year, edition, pages
Taylor & Francis, 2023
Keywords
Tourism experiences, privileged positions, White racial frame, autoethnography, critical tourism
National Category
Business Administration; Human Geography
Research subject
Work Integrated Learning
Identifiers
urn:nbn:se:hv:diva-20007 (URN) 10.1080/02508281.2023.2207153 (DOI) 000987121000001 () 2-s2.0-85159351598 (Scopus ID)
Note

The research is part of the project The Role of Tourism in Multi-cultural Societies funded by the Swedish Research Council for Sustainable Development, FORMAS (FR-2018/0010).

Available from: 2023-06-01 Created: 2023-06-01 Last updated: 2024-01-10
Tobisková, N., Malmsköld, L. & Pederson, T. (2022). Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly: A Survey and Ideation Method. Paper presented at the 14th International Conference, VAMR 2022, held as part of the 24th HCI International Conference, HCII 2022, Virtual Event, June 26 – July 1, 2022, Proceedings, Part II. Lecture Notes in Computer Science, 13318 LNCS, 329-349
Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly: A Survey and Ideation Method
2022 (English). In: Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349, Vol. 13318 LNCS, p. 329-349. Article in journal (Refereed). Published
Abstract [en]

Industrial manual assembly is a relatively established use case for emerging head-mounted Augmented Reality (AR) platforms: operators get visual support in placing pieces depending on where they are in the assembly process. However, is vision the only suitable sensory modality for such guidance? We present a systematic review of previous work on multimodal guidance and subtle guidance approaches, confirming that explicit visual cues dominate. We then outline a three-step method for generating multisensory guidance ideas intended for real-world task support: task observation, which led to the identification of 18 steps in truss assembly; brainstorming of AR guidance approaches related to assembly and maintenance; and mapping of the brainstorming results onto the observed task. We illustrate the use of the method by deploying it in our current mission of producing AR guidance approaches for an industrial partner involved in designing and assembling wooden trusses. In this work, we went beyond standard visual AR guidance in two ways: 1) by opening up for guidance through auditory, tactile, and olfactory sensory channels, and 2) by considering subtle guidance as an alternative or complement to explicit information presentation. We present a resulting set of multisensory guidance ideas, each tied to one of the 18 steps in the observed truss assembly task. To mention a few which we intend to investigate further: smell for gradual warning about non-imminent potentially hazardous situations; 3D sound to guide operators to the location of different tools; thermo-haptics for subtle notifications about contextual events (e.g., events happening at other assembly stations). The method helped us to explore all modalities and to identify new possibilities. More work is needed to understand how different modalities can be combined and what impact distractions in different modalities have on task performance.
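As an illustration of the mapping step only (the step names and guidance ideas below are invented placeholders, not the paper's 18 observed steps or its actual idea set), such a mapping from observed task steps to candidate guidance ideas per sensory modality could be recorded in a simple structure like this:

```python
# Illustrative sketch (not from the paper): tie each observed assembly step to candidate
# multisensory guidance ideas, marking whether a cue is subtle or explicit.
from dataclasses import dataclass, field

@dataclass
class GuidanceIdea:
    modality: str        # e.g. "visual", "auditory", "tactile", "olfactory"
    subtle: bool         # subtle cue vs. explicit information presentation
    description: str

@dataclass
class TaskStep:
    name: str
    ideas: list[GuidanceIdea] = field(default_factory=list)

steps = [
    TaskStep("fetch timber", [GuidanceIdea("auditory", True, "3D sound toward the tool rack")]),
    TaskStep("position truss member", [GuidanceIdea("visual", False, "superimposed outline")]),
    # ... one entry per observed step
]

for step in steps:
    for idea in step.ideas:
        flags = "subtle" if idea.subtle else "explicit"
        print(f"{step.name}: [{idea.modality}, {flags}] {idea.description}")
```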

Place, publisher, year, edition, pages
Springer Science+Business Media B.V., 2022
Keywords
Trusses; Assembly process; Guidance; Ideation methods; Industrial assemblies; Manual assembly; Multi-modal; Multimodal Interaction; Multisensory; Subtle cue; Survey methods; Augmented reality
National Category
Robotics; Computer Systems
Research subject
Production Technology; Work Integrated Learning
Identifiers
urn:nbn:se:hv:diva-19151 (URN) 10.1007/978-3-031-06015-1_23 (DOI) 000870269100023 () 2-s2.0-85131960208 (Scopus ID)
Conference
14th International Conference, VAMR 2022, held as part of the 24th HCI International Conference, HCII 2022, Virtual Event, June 26 – July 1, 2022, Proceedings, Part II
Available from: 2022-10-31 Created: 2022-10-31 Last updated: 2024-04-08. Bibliographically approved
Kadish, D., Sarkheyli-Hägele, A., Font, J., Hägele, G., Niehorster, D. C. & Pederson, T. (2022). Towards Situation Awareness and Attention Guidance in a Multiplayer Environment using Augmented Reality and Carcassonne. In: Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '22). Paper presented at the 2022 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '22) (pp. 133-139). New York, NY, USA: Association for Computing Machinery (ACM)
Towards Situation Awareness and Attention Guidance in a Multiplayer Environment using Augmented Reality and Carcassonne
2022 (English). In: Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '22), Association for Computing Machinery (ACM), 2022, p. 133-139. Conference paper, Published paper (Refereed)
Abstract [en]

Augmented reality (AR) games are a rich environment for researching and testing computational systems that provide subtle user guidance and training. In particular, computer systems that aim to augment a user’s situation awareness benefit from the range of sensors and computing power available in AR headsets. The main focus of this work-in-progress paper is to introduce the concept of an individualized Situation Awareness-based Attention Guidance (SAAG) system for increasing humans’ situation awareness, together with an augmented reality version of the board game Carcassonne used for validating and evaluating SAAG. Furthermore, we present our initial work on developing the SAAG pipeline: the generation of game state encodings, the development and training of a game AI, and the design of the situation modeling and eye-tracking processes.
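The paper does not publish its game state encoding; purely as a hypothetical sketch (board size, tile-type count, and feature planes are invented), a fixed-size encoding for a tile-laying game such as Carcassonne might use feature planes over board positions:

```python
# Hypothetical sketch of a fixed-size game-state encoding for a tile-laying board game
# (not the encoding used in the paper). Dimensions and planes are invented placeholders.
import numpy as np

BOARD = 21                   # assumed maximum board extent, centred on the start tile
TILE_TYPES = 24              # assumed number of distinct tile types
FEATURES = TILE_TYPES + 2    # one-hot tile type + rotation plane + meeple ownership plane

def encode_state(placed_tiles, meeples):
    """placed_tiles: {(x, y): (tile_type, rotation)}, meeples: {(x, y): player_id}."""
    state = np.zeros((FEATURES, BOARD, BOARD), dtype=np.float32)
    offset = BOARD // 2
    for (x, y), (tile_type, rotation) in placed_tiles.items():
        state[tile_type, y + offset, x + offset] = 1.0
        state[TILE_TYPES, y + offset, x + offset] = rotation / 3.0  # rotations 0..3, scaled
    for (x, y), player in meeples.items():
        state[TILE_TYPES + 1, y + offset, x + offset] = player      # e.g. 1 or 2
    return state

# Example: start tile at the origin, rotation 0, player 1's meeple on it.
s = encode_state({(0, 0): (5, 0)}, {(0, 0): 1})
print(s.shape)   # (26, 21, 21)
```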

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2022
Keywords
Augmented reality; Computer games; Computing power; Eye tracking; Carcassonne; Computational system; Guidance system; Knowledge-representation; Multiplayers; Sensor power; Situation awareness; User guidance; User training; Knowledge representation
National Category
Information Systems, Social aspects
Identifiers
urn:nbn:se:hv:diva-19482 (URN) 10.1145/3505270.3558322 (DOI) 2-s2.0-85143123116 (Scopus ID)
Conference
2022 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '22). Association for Computing Machinery, New York, NY, USA
Available from: 2022-12-22 Created: 2022-12-22 Last updated: 2022-12-22. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-4005-9926
