PhD in VR and Room Acoustics

IJLRA (Institut Jean le Rond d’Alembert, UMR 7190 CNRS – Sorbonne Université) and IRCAM (Institut de Recherche et Coordination Acoustique/Musique, UMR 9912 STMS IRCAM – CNRS – Sorbonne Université) are looking for new candidates for the 3-year PhD thesis “Navigation aid for the visually impaired: Virtual Reality acoustic simulations for interior navigation preparation”.

The PhD student will participate in the creation and evaluation of a training system application for visually impaired individuals.

Duration: 3 years

Closing date: 1 October 2018

Degree Level: Master's degree in Computer Science, Acoustics, Architectural Acoustics, Multimodal Interfaces, or Audio Signal Processing

Major(s): Virtual reality, 3D audio, spatial sound, spatial cognition, room acoustics, visual impairments, navigation aid

 

Job Summary

 

Doctoral thesis

Navigation aid for the visually impaired: Virtual Reality acoustic simulations for interior navigation preparation

 

Laboratories: IJLRA (Institut Jean le Rond d’Alembert, UMR 7190 CNRS – Sorbonne Université) and IRCAM (Institut de Recherche et Coordination Acoustique/Musique, UMR 9912 STMS IRCAM – CNRS – Sorbonne Université)

Doctoral school: École Doctorale Sciences Mécaniques, Acoustique, Électronique et Robotique (SMAER), ED 391

Discipline: Acoustics (Virtual Reality, Audio, Interaction, Disability Assistance)

Co-supervision: Brian KATZ (DR-CNRS, IJLRA) and Markus NOISTERNIG (CR, IRCAM)

Keywords: Virtual reality, 3D audio, spatial sound, spatial cognition, room acoustics, visual impairments, navigation aid

Research context: This thesis project is situated in the context of the ANR 2018–2021 project RASPUTIN (Room Acoustic Simulations for Perceptually Realistic Uses in Real-Time Immersive and Navigation Experiences). In the domains of sound synthesis and virtual reality (VR), much effort has been placed on the quality and realism of sound source renderings, from text-to-speech to musical instruments to engine noise for use in driving and flight simulators. The same degree of effort has not been applied to the spatial aspects of sound synthesis and virtual reality, particularly with respect to the acoustics of the surrounding environment. Room acoustic simulation algorithms have for decades been improving in their ability to predict acoustic metrics, such as reverberation time, from geometrical acoustic models, at the cost of ever-higher computational requirements. However, it is only recently that the perceptual quality of these simulations has been explored beyond musical applications. In real-time systems, where the sound source, listener, and room architecture can vary in unpredictable ways, investigation of perceptual quality or realism has been hindered by the simplifications these algorithms require. This project aims to improve real-time simulation quality towards perceptual realism.

The focus of the project is the capability of a real-time acoustic simulation to provide meaningful information to a visually impaired user through virtual reality exploration. As a preparatory tool used prior to visiting a public building or museum, the virtual exploration should improve users' knowledge of the space and their navigation confidence during the on-site visit, as compared to traditional methods such as tactile maps.

The thesis work entails participating in the creation and evaluation of a training system application for visually impaired individuals. Tasks involve the development of an experimental prototype, in collaboration with project partners, with a simplified user interface for constructing the virtual environments to explore. Working with a selected user panel, who will remain engaged for the duration of the project, several test cases of interest will be identified for integration into the prototype and subsequent evaluations. The prototype will be developed by the thesis student in collaboration with Novelab (audio gaming) and IRCAM/STMS-CNRS (developers of the audio rendering engine). Design and evaluation will be carried out in collaboration with the Centre de Psychiatrie et Neurosciences and StreetLab/Institut de la Vision. The ability to communicate in French would be beneficial, but is not mandatory at the start of the project.

Evaluations will involve different experimental protocols to assess the accuracy of the mental representation of the learned environments. To assess how well metric relations are preserved, participants will carry out spatial memory tests as well as on-site navigation tasks.

Candidate profile: We are looking for dynamic, creative, and motivated candidates with scientific curiosity, strong problem-solving skills, the ability to work both independently and in a team, and the desire to push their knowledge and comfort zones into new domains. The candidate should have a Master's degree in Computer Science, Acoustics, Architectural Acoustics, Multimodal Interfaces, or Audio Signal Processing. A strong interest in spatial audio, room acoustics, and working with the visually impaired is necessary. Candidates are not expected to already have all the skills required for this multidisciplinary subject, so a willingness and ability to step rapidly into new domains, including spatial cognition and psychoacoustics, will be appreciated.

Domain: Virtual reality, Audio, Interaction, Disability Assistance

Dates: Preferred starting date between 1-Nov-2018 and 20-Dec-2018, and no later than March 2019.

Application: Interested candidates should send a CV, a transcript of Master's degree courses, a cover letter (maximum 2 pages) detailing their motivation for pursuing a PhD in general and for the project described above in particular, and contact information for 2 references whom the selection committee may contact. Incomplete applications will not be processed.

Application deadline: Complete candidature files should be submitted to brian.katz@sorbonne-universite.fr and markus.noisternig@ircam.fr before 1-Oct-2018.

VR in Engineering Education: Research Assistant/Associate – Imperial College London

Imperial College London (South Kensington Campus), Department of Chemical Engineering, offers a job opportunity as “Research Assistant/Associate in Using VR in Engineering Education”.

Duration: 12 months

Closing date: 12 August 2018

Degree Level: PhD (a UG or MSc degree in areas such as Computer Science and Engineering will also be considered)

Major(s): Computer Science, Engineering.

Job summary

The main role of the post holder is to help develop an interactive, multimodal environment for the exploration of transient, three-dimensional phenomena in mechanics-based modules, e.g. fluid dynamics, generated via in-house computational fluid dynamics software. Through this approach, students will develop a 'feel' and an intuition for the physical mechanisms underlying the flow phenomena, and for the crucial assumptions and approximations that can be made in tackling complex flows, such as those typical of a wide range of settings.

Qualifications

  • PhD, or equivalent, in an area pertinent to the subject (e.g. Computer Science, Engineering). Individuals without a PhD, but with a UG or MSc degree in areas such as Computer Science and Engineering, will also be considered, provided relevant and extensive experience in the field (e.g. industry experience) can be demonstrated.
  • Experience with the development of VR applications, signal-processing techniques, and interfacing software with hardware devices (e.g. haptic interface systems, VRPN).
  • Excellent knowledge of programming in C, C++, Matlab (or Python), and of OpenGL (or similar).
  • Knowledge of fluid mechanics is desirable, as is experience with mobile programming (e.g. Swift, C#).

How to apply

Visit the official website and click the Apply now button at the top of the page.

Further information

For further information on this position, see the dedicated website.

Virtual Reality: Research Interns – Bosch Human Machine Interaction team, California

Virtual Reality Research Interns

Bosch Human Machine Interaction team, Palo Alto, California, U.S.A.

Duration: 3 – 12 months
Start Date: As soon as possible
Degree Level: Masters or Ph.D. – Must be a current student or recent graduate (less than 1 year)
Major(s): Computer Science or related fields

Tasks & Responsibilities:

  • Conduct advanced research and engineering in the area of virtual reality, e.g. robust hand tracking, multi-modal interaction in immersive virtual environments, and HMD calibration.
  • Apply research results to real-world cases with high-quality implementations.
  • Integrate the resulting system/software into existing Bosch platforms.
  • Summarize research findings in high-quality paper and patent submissions.

Qualifications:

  • Strong background in virtual/mixed reality, computer vision, machine learning, and related fields.
  • Proficient with multiple languages and libraries (C++, OpenGL, OpenCV, Unity3D, etc.).
  • Strong research and problem solving skills.
  • Good communication, teamwork and technical writing skills.

Desired Skills (not required):

  • Publication record in top venues (CVPR/ISMAR/IEEEVR/VRST/CHI/SIGGRAPH, etc.).
  • Experience with hand tracking and gesture recognition with commodity RGB/RGB-D cameras.
  • Experience with multi-modal interaction technology for VR application.
  • Experience with HMD calibration.

How to apply:

Please email your resume to rbvisualcomp@gmail.com with prefix “[VR Research Intern 2017]” in the subject/title.

Further information:

For further information on this position, see http://www.bosch.us/content/language1/html/15320.htm

 

Mixed and Augmented Reality: University Assistant with Doctorate – ICG, Austria

University Assistant with Doctorate, with Tenure Track to Position as Associate Professor

Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria.

Working area: Mixed and Augmented Reality

This full-time position (40 h/week) is initially limited to six years and offers the possibility of a qualification agreement leading to a tenured position. Employment is expected to start on May 1, 2017, within the Institute of Computer Graphics and Vision.

The position requires a completed PhD degree in computer science or a similar field.

We expect the following qualifications:

  • Excellent scientific qualification in Mixed Reality and Augmented Reality (MR/AR), especially computer graphics and visualisation methods for MR/AR and 3D user interfaces for MR/AR
  • International scientific publications
  • Experience with acquisition, management and scientific supervision of research projects
  • Experience in establishing a research group
  • Track record of scientific achievements, such as international prizes or awards
  • Teaching experience in relevant subjects
  • Network in the scientific community

Your job will involve the following duties:

  • Research on MR/AR, with the goal of earning international scientific reputation and visibility
  • Top-level international publications
  • Acquisition of third-party funding for research (FFG, FWF, industrial projects)
  • Supervision of bachelor's and master's theses and support in the supervision of PhD students
  • Independent teaching in the bachelor's and master's programs and introducing students to the world of visual computing
  • Service in the academic administration

The position will be paid according to category B1 of the collective contract of Austrian universities.

Graz University of Technology aims to increase the employment of women in leading positions and on its research staff, and specifically encourages qualified women to apply. Please note that, all other qualifications being equal, women will be given preference in hiring.

How to apply:

Applications with supporting documents (copies of certificates and diplomas, CV, publication list, description of scientific and professional career, overview of previous and planned research and teaching, teaching evaluations if available) should be submitted preferably by email to informatik@tugraz.at and should quote the position identifier.

Closing date for applications:

18th February 2017

Position identifier: 7100/17/001