Dissertations and Theses
Date of Award
2020
Document Type
Thesis
Department
Computer Science
First Advisor
Zhigang Zhu
Keywords
Indoor navigation, blind and visually impaired, hybrid modeling, route planning algorithm, task scheduling algorithm, ARKit
Abstract
We propose an integrated smartphone-based solution for indoor navigation, designed especially to assist people with special needs, such as blind and visually impaired (BVI) individuals. The system consists of three components: hybrid modeling, real-time navigation, and a client-server architecture. In the hybrid modeling component, the hybrid model of a building is created region by region and organized in a graph structure, with nodes representing destinations and landmarks and edges representing traversal paths between them. A Wi-Fi/cellular-data connectivity map, a beacon signal strength map, and a 3D visual model (with destinations and landmarks annotated) are collected while a modeler walks through the building, and then registered with the building's floorplan. In the real-time navigation component, a mobile app on the user's smartphone first downloads the beacon strength map and the data connectivity map, and uses the beacon information to coarsely localize the user to a region of a building. After the visual model of that region is downloaded to the user's phone, the visual matching module localizes the user accurately within the region. A path planning algorithm takes visual, connectivity, and user preference information into account when planning a path from the user's current location to the selected destination, and a scheduling algorithm is activated to download the visual models of neighboring regions, taking the connectivity information into consideration. The client-server architecture allows the system to scale up to a large area, such as a college campus with multiple buildings: the hybrid models are stored in the cloud and downloaded only when needed. Our current implementation uses ARKit on an iPhone to create local visual models and perform visual matching. User interfaces for both modeling and navigation are developed using visual, audio, and haptic displays for our target users. Experimental results on real-time navigation are provided to validate the proposed approach.
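The abstract's graph structure and connectivity-aware path planning can be illustrated with a minimal sketch. The code below is not from the thesis: the graph layout, edge attributes, and the way connectivity is folded into the edge cost (a penalty on poorly covered paths, scaled by a user-preference weight) are illustrative assumptions, shown here only to make the idea of planning over a node/edge building model concrete.

```python
import heapq

def plan_path(graph, start, goal, alpha=0.5):
    """Dijkstra search over a building graph whose nodes are destinations
    and landmarks and whose edges are traversal paths. Each edge carries
    (length, connectivity); connectivity in [0, 1] rates Wi-Fi/cellular
    coverage, and `alpha` (hypothetical user preference) scales how strongly
    poor coverage penalizes a path."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            break
        for nbr, (length, connectivity) in graph.get(node, {}).items():
            # Poorer coverage inflates the effective edge cost.
            cost = length + alpha * (1.0 - connectivity) * length
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if goal not in dist:
        return None
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Toy building model (illustrative names, not from the thesis):
# edge value = (path length in meters, connectivity score in [0, 1]).
building = {
    "entrance": {"lobby": (10.0, 0.9)},
    "lobby": {"elevator": (15.0, 0.3), "stairs": (20.0, 0.9)},
    "elevator": {"office_310": (5.0, 0.8)},
    "stairs": {"office_310": (8.0, 0.8)},
}
```

With a small `alpha`, the shorter elevator route wins; raising `alpha` (a user who strongly prefers good-coverage paths) shifts the planner to the longer but better-covered stairs route, which is the kind of trade-off the connectivity map enables.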
Recommended Citation
Chang, Yaohua, "Multimodal Data Integration for Real-Time Indoor Navigation Using a Smartphone" (2020). CUNY Academic Works.
https://academicworks.cuny.edu/cc_etds_theses/857
Included in
Other Computer Sciences Commons, Systems Architecture Commons, Theory and Algorithms Commons