Facilitating the movement of blind and visually impaired people
The InclusionApp mobile application uses sensor fusion and image analysis to help users navigate any city without the need for external markings.
Public space contains many architectural barriers for visually impaired people, which pose a safety risk and often lead to social exclusion. InclusionApp operates in two modes, economical and active navigation with the camera, and is the first solution on the market that is able to replace sight in this way.
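How the application decides between the two modes is not described; as a rough, hypothetical sketch, the switch might depend on user preference and battery level. The names (NavigationMode, chooseMode) and the 20% threshold below are illustrative assumptions, not taken from the product.

```kotlin
// Hypothetical sketch of the two operating modes described above.
// Names and the 20% battery threshold are illustrative assumptions.
enum class NavigationMode { ECONOMICAL, CAMERA_NAVIGATION }

fun chooseMode(batteryPercent: Int, userPrefersCamera: Boolean): NavigationMode =
    if (userPrefersCamera && batteryPercent > 20) NavigationMode.CAMERA_NAVIGATION
    else NavigationMode.ECONOMICAL

fun main() {
    println(chooseMode(batteryPercent = 65, userPrefersCamera = true))  // CAMERA_NAVIGATION
    println(chooseMode(batteryPercent = 10, userPrefersCamera = true))  // ECONOMICAL
}
```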

InclusionApp’s Functionalities
InclusionApp is an image analysis tool that helps users navigate any city without external markings. Its functionalities include:
Locating the user in an urban environment
Pedestrian positioning in urban conditions is based on the fusion of data from the user’s phone sensors, including at least GPS, accelerometer and compass, combined with image analysis, and achieves an accuracy of 1.5 m without the use of external infrastructure markings. The distinguishing feature of InclusionApp is that no additional markings in the form of graphic codes and/or dedicated beacon devices are needed.
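The exact fusion mechanics are not published; a minimal sketch of the general idea, blending a noisy GPS fix with a dead-reckoning estimate built from step (accelerometer) and heading (compass) data via a simple weighted average, could look as follows. The function names, the 0.7 m step length and the 0.8 GPS weight are assumptions for illustration only.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Local position in metres relative to an arbitrary origin.
data class Position(val x: Double, val y: Double)

// Hypothetical illustration of sensor fusion: dead reckoning from step count
// and compass heading, blended with a GPS fix. Step length (0.7 m) and the
// GPS weight (0.8) are illustrative assumptions.
fun deadReckon(last: Position, steps: Int, headingRad: Double, stepLengthM: Double = 0.7): Position {
    val d = steps * stepLengthM
    return Position(last.x + d * sin(headingRad), last.y + d * cos(headingRad))
}

fun fuse(gps: Position, reckoned: Position, gpsWeight: Double = 0.8): Position =
    Position(
        gpsWeight * gps.x + (1 - gpsWeight) * reckoned.x,
        gpsWeight * gps.y + (1 - gpsWeight) * reckoned.y
    )

fun main() {
    val previous = Position(0.0, 0.0)
    val reckoned = deadReckon(previous, steps = 4, headingRad = Math.toRadians(90.0)) // 4 steps due east
    val gpsFix = Position(3.1, 0.4)                                                   // noisy GPS reading
    println(fuse(gpsFix, reckoned))  // blended estimate, roughly Position(x=3.04, y=0.32)
}
```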

Determination of the direction, distance and way to reach the desired destination
Building on this positioning, the application determines the direction and distance to the desired destination and guides the pedestrian along the way, again without graphic codes or dedicated beacon devices.
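Once the position is known, direction and distance to a destination can be derived with standard geodesic formulas. The sketch below uses the haversine distance and the initial bearing between two latitude/longitude points; it is a generic illustration with made-up coordinates, not the application’s actual routine.

```kotlin
import kotlin.math.*

// Generic illustration: great-circle distance (haversine) and initial bearing
// from the user's position to a destination. Coordinates are made up.
const val EARTH_RADIUS_M = 6_371_000.0

fun distanceM(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))
}

fun bearingDeg(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val phi1 = Math.toRadians(lat1); val phi2 = Math.toRadians(lat2)
    val dLon = Math.toRadians(lon2 - lon1)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360) % 360
}

fun main() {
    println("distance: %.0f m".format(distanceM(52.2297, 21.0122, 52.2310, 21.0140)))
    println("bearing:  %.0f deg".format(bearingDeg(52.2297, 21.0122, 52.2310, 21.0140)))
}
```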

Informing about vehicles arriving at the stop where the user is waiting
The InclusionApp application will also help in recognizing vehicles approaching the stop by reading the vehicle’s line number through the user’s phone camera. This will make it easier for blind and visually impaired people to find the right vehicle.
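Whatever on-device recognizer the application uses for reading the number, the matching step can be sketched as below: given text recognized on an approaching vehicle, check whether it contains the line the user is waiting for and build a spoken announcement. The function names, the regex and the assumption that recognition yields plain text are illustrative.

```kotlin
// Hypothetical sketch: given text recognized on an approaching vehicle
// (by whatever on-device OCR the app uses), check whether it contains the
// line the user is waiting for and build a spoken announcement.
// Function names and the regex are illustrative assumptions.
fun extractLineNumbers(recognizedText: String): List<String> =
    Regex("""\b\d{1,3}\b""").findAll(recognizedText).map { it.value }.toList()

fun announcementFor(recognizedText: String, expectedLine: String): String {
    val lines = extractLineNumbers(recognizedText)
    return when {
        expectedLine in lines -> "Your vehicle, line $expectedLine, is arriving."
        lines.isNotEmpty()    -> "Line ${lines.first()} is arriving, not your line."
        else                  -> "A vehicle is arriving, but its number could not be read."
    }
}

fun main() {
    println(announcementFor("TRAM 17 CENTRUM", expectedLine = "17"))
    println(announcementFor("BUS 180", expectedLine = "17"))
}
```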

Route planning that takes pedestrian directions into account
The use of artificial intelligence (AI), augmented reality (AR) and machine learning (ML) to reconstruct the outside world, on the basis of which people with disabilities are precisely located and navigated, is key to enabling the target group to move around the city efficiently. The guidance therefore covers planning the entire journey through the city: from leaving the house, through getting to the stop, finding the right bus and getting off at the destination stop, to reaching the end of the journey.
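How the planner itself works is not described; the door-to-door structure can, however, be illustrated as a sequence of legs, each turned into a spoken instruction. The types, names and sample route below are hypothetical.

```kotlin
// Hypothetical sketch of a door-to-door plan as a sequence of legs,
// each turned into a spoken instruction. Types and sample data are illustrative.
sealed interface Leg
data class Walk(val from: String, val to: String, val metres: Int) : Leg
data class Ride(val line: String, val board: String, val alight: String) : Leg

fun instruction(leg: Leg): String = when (leg) {
    is Walk -> "Walk ${leg.metres} m from ${leg.from} to ${leg.to}."
    is Ride -> "Board line ${leg.line} at ${leg.board} and get off at ${leg.alight}."
}

fun main() {
    val plan = listOf(
        Walk("home", "Plac Zbawiciela stop", 240),
        Ride("17", "Plac Zbawiciela", "Centrum"),
        Walk("Centrum stop", "destination entrance", 180)
    )
    plan.map(::instruction).forEach { println(it) }
}
```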

Mobile application for Android and iOS operating in two modes: economical and camera-based
The mobile application for precise positioning of users in the urban environment, and in particular when using public transport, is available for both Android and Apple iOS.
The application complies with WCAG 2.1 (Web Content Accessibility Guidelines), the digital accessibility standard adopted across the European Union.
