Tracking the eye of a blind patient can enhance the usability of an artificial vision system. In systems where the sensing element, i.e. the scene camera that captures the visual information, is mounted on the patient's head, the user must rely on head scanning to steer the line of sight of the implant toward the region of interest. Integrating an eye tracker into the prosthesis will enable scanning with eye movements, with the eye position setting the region of interest within the wide field of view of the scene camera. An essential requirement of an eye tracker is that it must be calibrated, and off-the-shelf calibration methods that require looking at known points in space cannot be used with blind users. Here we tested the feasibility of calibrating the eye tracker based on pupil position and the location of the percept reported by the implant recipient using a handheld marker. Pupil positions were extracted using custom image processing in a field-programmable gate array built into a glasses-mounted eye tracker. In the calibration process, electrodes were directly stimulated and the subject reported the location of the percept using a handheld marker. Linear regression was used to extract the transfer function from pupil position to gaze direction in the coordinates of the scene camera. When using the eye tracker with the proposed calibration method, patients demonstrated improved precision on a localization task with a corresponding reduction in head movements.

Vestibular perception is useful for maintaining heading direction and successful spatial navigation. In this study, we present a novel apparatus capable of delivering both rotational and translational movements, the RT-Chair. The system comprises two motors and is controlled by the user via MATLAB. To validate the measurability of vestibular perception with the RT-Chair, we ran a threshold measurement experiment with healthy participants. Our result
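
The following is a minimal sketch of the linear-regression calibration step described in the first study above: it fits an affine mapping from pupil position to gaze direction in scene-camera coordinates from pupil/percept pairs. It is not the authors' implementation; the sample data, variable names, and the use of NumPy least squares are assumptions for illustration only.

```python
import numpy as np

# Hypothetical calibration data: each row pairs a 2-D pupil position from the
# eye tracker with the percept location the subject indicated with the
# handheld marker, expressed in scene-camera pixel coordinates.
pupil_xy = np.array([[0.31, 0.42],
                     [0.48, 0.40],
                     [0.52, 0.61],
                     [0.35, 0.63],
                     [0.44, 0.51]])
percept_xy = np.array([[112.0, 180.0],
                       [260.0, 175.0],
                       [295.0, 340.0],
                       [140.0, 355.0],
                       [210.0, 265.0]])

# Augment pupil positions with a constant term so the fit includes an offset:
# gaze = [pupil_x, pupil_y, 1] @ A
X = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])

# Least-squares estimate of the 3x2 transfer matrix (linear regression).
A, *_ = np.linalg.lstsq(X, percept_xy, rcond=None)

def pupil_to_gaze(pupil):
    """Map a pupil position to a gaze point in scene-camera coordinates."""
    return np.append(pupil, 1.0) @ A

# The predicted gaze point would then select the region of interest within
# the scene camera's field of view.
print(pupil_to_gaze([0.45, 0.50]))
```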