Combining Indoor and Outdoor Positioning for Navigation in AR Environments

Authors

  • Krzysztof Skabek Faculty of Computer Science and Mathematics, Cracow University of Technology, Kraków, Poland
  • Dominika Rola Faculty of Computer Science and Mathematics, Cracow University of Technology, Kraków, Poland
  • Wojciech Zamarski Faculty of Computer Science and Mathematics, Cracow University of Technology, Kraków, Poland

Abstract

This article presents a comparative analysis of augmented reality (AR) technologies – Vuforia, Immersal, MultiSet, and the ARCore Geospatial Application Programming Interface (API) – in terms of performance, accuracy, and interference tolerance for indoor and outdoor positioning and navigation. Two test environments were used: an indoor (laboratory) setup enabling detailed module testing, and a hybrid deployment on the Cracow University of Technology (CUT) campus illustrating the feasibility of AR navigation under diverse environmental conditions. The research was conducted according to six scenarios: one covered outdoor GPS navigation, while the remaining five concerned indoor navigation. Based on the measurements, recommendations are provided for selecting AR localization platforms for mixed navigation. As part of the detailed testing, an AR navigation system combining the indoor and outdoor approaches was implemented on the CUT campus. The final implementation was developed in the Unity environment, and the software tests placed particular emphasis on transitions between indoor and outdoor navigation.
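The handover between outdoor (GNSS/Geospatial) and indoor (visual) positioning mentioned above can be thought of as a small state machine. The sketch below is illustrative only and is not the article's Unity implementation; all names and thresholds (e.g., NavigationMode, GOOD_GPS_ACCURACY_M) are assumptions introduced for illustration.

    # Illustrative sketch only: hypothetical indoor/outdoor handover logic,
    # not the system described in the article.
    from dataclasses import dataclass
    from enum import Enum, auto


    class NavigationMode(Enum):
        OUTDOOR = auto()   # GNSS / geospatial positioning
        INDOOR = auto()    # visual positioning against a pre-scanned map


    @dataclass
    class PositioningSample:
        gps_accuracy_m: float   # reported horizontal GNSS accuracy (meters)
        vps_confidence: float   # 0..1 confidence of the indoor visual localizer


    # Hysteresis thresholds (assumed values, tuned per deployment).
    GOOD_GPS_ACCURACY_M = 5.0   # GNSS good enough to trust outdoors
    BAD_GPS_ACCURACY_M = 15.0   # GNSS degraded, typical when entering a building
    MIN_VPS_CONFIDENCE = 0.6    # indoor localizer must be at least this confident


    def next_mode(mode: NavigationMode, s: PositioningSample) -> NavigationMode:
        """Decide the navigation mode for the next frame.

        Two different thresholds (hysteresis) keep the system from oscillating
        between modes near doorways, where both signals are marginal.
        """
        if mode is NavigationMode.OUTDOOR:
            # Switch indoors only when GNSS clearly degrades and the indoor
            # localizer has already recognized the interior map.
            if s.gps_accuracy_m > BAD_GPS_ACCURACY_M and s.vps_confidence >= MIN_VPS_CONFIDENCE:
                return NavigationMode.INDOOR
        else:
            # Switch back outdoors only when GNSS has clearly recovered.
            if s.gps_accuracy_m < GOOD_GPS_ACCURACY_M:
                return NavigationMode.OUTDOOR
        return mode


    if __name__ == "__main__":
        mode = NavigationMode.OUTDOOR
        walk = [  # a user walking from the campus yard into a building
            PositioningSample(3.0, 0.0),
            PositioningSample(12.0, 0.4),   # doorway: both signals marginal -> stay outdoor
            PositioningSample(25.0, 0.8),   # inside: GNSS poor, map recognized -> switch
            PositioningSample(30.0, 0.9),
        ]
        for s in walk:
            mode = next_mode(mode, s)
            print(f"gps={s.gps_accuracy_m:>5.1f} m  vps={s.vps_confidence:.1f}  ->  {mode.name}")

The asymmetric thresholds are the key design choice: a mode change requires clear evidence in one direction, so brief GNSS dropouts or momentary losses of visual tracking near building entrances do not cause the displayed route to flicker between the two navigation modes.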

Keywords:

augmented reality (AR), 3D reconstruction, photogrammetry, LiDAR (light detection and ranging), indoor and outdoor positioning, geolocalization
