Real-time Image Registration of RGB Webcams and Colorless 3D Time-of-flight Cameras

See ColOr, whose name stands for "seeing colors with an orchestra," is a context-aware aid system for visually impaired and blind people. See ColOr proposes an instrument-color relation based on cross-modal transfers of the brain, modeled via EEG classification. The See ColOr prototype so far uses a 3D sensor (PMD[vision]® CamCube 3.0) and a touch-pad (iPad). We aim to enlarge the legibility of the nearby environment and to facilitate navigation to desired locations, exploration, and serendipitous discovery. In this project, we use audio and haptic trajectory playback to convey visual information related to spatial awareness, the revealing of boundaries, and the perception of obstacles. We also use spatialization of sound, which gives the illusion of virtual sound sources emitting from desired spatial locations.

Overall, our context-aware aid system merges three levels of assistance:

I. An exploration module that makes it possible for users to touch with their fingers the whole image captured by a range camera in real time. The color and position of touched points are encoded into instrument sounds and sonic effects, respectively. This module exploits audio and haptic trajectory feedback to convey significant visual cues. In particular, the use of spatialized sound allows understanding of spatial relations.

II. An alerting method based on range-image processing that prevents the user from stumbling by warning about unexpected entities lying in his way that could lead to a fall. This algorithm can also predict the trajectory of detected obstacles in order to maintain or suspend a warning. This method allows the blind to find a clear path in the interest of safe navigation.

III. A cognitive engine that uses state-of-the-art object recognition methods to learn natural objects. A training phase, supported by tracking and bootstrapping methods, is followed by an online search process that informs the user in real time whenever a learned object is present during exploration. Unlike modules I and II, this engine is able to perceive and depict complex visual information at a much higher level of abstraction.
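The color-to-sound encoding of module I can be sketched as follows. The abstract does not specify the actual See ColOr instrument palette or its depth-to-effect mapping, so the hue buckets, instrument names, low-saturation fallback, and the 7 m range cap below are illustrative assumptions only.

```python
import colorsys

# Hypothetical palette: one instrument timbre per hue sextant.
# The real See ColOr instrument-color relation is not given in this abstract.
INSTRUMENTS = ["oboe", "viola", "pizzicato violin",
               "flute", "trumpet", "piano"]

def color_to_instrument(r, g, b):
    """Map an RGB pixel (0-255 ints) to an instrument name by hue sextant."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2:                 # near-gray pixel: no reliable hue
        return "percussion"     # hypothetical fallback timbre
    return INSTRUMENTS[min(int(h * 6), 5)]

def depth_to_volume(depth_m, max_range_m=7.0):
    """Closer points sound louder: inverse linear mapping, clipped to [0, 1]."""
    return max(0.0, 1.0 - min(depth_m, max_range_m) / max_range_m)
```

A touched point would then be rendered by playing `color_to_instrument(...)` at loudness `depth_to_volume(...)`, spatialized to the point's direction; the spatialization itself (e.g. HRTF-based panning) is outside this sketch.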

Speaker

Juan Diego Gomez

Location

Meeting Room B502 - Blaise Pascal

Date & time

Wed 17/10/2012 - 15:30

Event type

Seminar
