Cognitive model of interaction without vision for tactile exploration of 3D maps


Keywords: spatial cognition, interaction, user-centered design, visual impairments, tactile & haptic devices.


The ANR ActivMap project (2020-2023) addresses the orientation and mobility of people with visual impairments (PVIs), which have a real impact on their independence and quality of life. The appropriation of urban space is essential to improving their autonomy, as outdoor travel is a major challenge. Synthetic, adapted representations such as tactile maps and diagrams with adapted interactions are essential for greater autonomy. However, their design follows a traditional, craft-based approach, carried out by professionals, mainly Orientation and Mobility Instructors and Tactile Document Makers, whose numbers are insufficient to cover real needs.

Today, thanks to the availability of open and collaborative data, combined with the diversification of means to produce physical artefacts augmented with adapted interactions (Fig.1), it is possible to consider the development of a set of specialized methods and tools to design multimodal interactive maps in a semi-automatic way. This research project is based on an interdisciplinary approach between human computer interaction, cognitive sciences, geographic information sciences, computer sciences, and professionals in the field of visual impairments.

Several approaches will be explored in ActivMap to adapt the automatic map design process to the specific needs of users with various visual impairments and their potential contexts of use, including different representations of space: raised-line maps and diagrams, 3D printing, verbal descriptions, etc. These supports will be augmented with multimodal (haptic or sound) interactions in interactive prototypes, whose usability will be evaluated by users such as instructors and PVIs in different contexts of use.



The PhD thesis aims at investigating how spatial knowledge can be transferred, for PVIs' needs, from the map to the real world and vice versa, in order to provide them with useful 3D maps augmented with interactions for their mobility. The expected fitness for use of the semi-automatically produced 3D maps depends on the way users mentally represent the urban spaces they move in, and on the way they interpret and understand the graphic and tactile representation of these urban spaces. PVIs should also be able to independently select and explore maps adapted to their intended task, related to the use cases of the project (for instance general spatial learning, or mobility preparation with the help of Orientation and Mobility Instructors), as well as to their own perceptual and cognitive abilities [Gir+17a].

Based on the user needs analysis, on previous evaluations of the interactive devices, and on graphic and tactile semiology, the student will focus on the design of a conceptual cognitive model of non-visual interaction with 3D urban maps. This model will be approached as an interface between the mental representation of urban spaces and their concrete representation in a 3D map augmented with interaction, in order to drive the automatic 3D map design process.

This objective implies describing and implementing the link between a mental and a tactile representation of urban spaces, at any scale, made by various PVIs to perform orientation and mobility tasks. This could rely on the identification and characterization of specific map design processes (generalization, stylization) of the following entities, in order to produce the related main proxies in the 3D maps, which could be augmented with specific interactive modalities:

Student profile

M2 in Cognitive Sciences, Psychology, or Geographic Information Sciences. This position is funded by the ANR ActivMap project.


Starting from September 2020; please apply.


The PhD thesis will be held at the IGN - LaSTIG, in the GEOVIS team (73 avenue de Paris, 94160 Saint-Mandé, France), with regular travel to IRIT and the Cherchons pour Voir lab in Toulouse, France.


