Now you can control your iPhone/iPad with your eyes thanks to Eye Tracking

Last month, Apple announced new accessibility features, including eye tracking. Now the company has added eye tracking to the recently released iOS 18 and iPadOS 18.

This is a new feature in iOS 18 and iPadOS 18

With this option, users can control compatible iPhones and iPads using only their eyes. While the feature is designed to help people with disabilities, it can also be a convenient way for other users to interact with their devices. Calibrating the front-facing camera and activating the feature takes only a few seconds, and no additional hardware or accessories are required for tracking to work.

With this feature, users can navigate through iOS and iPadOS apps, use Dwell Control to interact with each element, and even trigger the functions of physical buttons.

Users of an iPhone 12 or later on iOS 18 can activate eye tracking by going to Settings > Accessibility > Eye Tracking. Once the switch is turned on, users follow a dot with their eyes as it moves across the screen, which calibrates the camera to their eye movement. After this brief setup, users can select and interact with items on the screen using only their eyes. To select an item, such as an app icon, the user looks at it and holds their gaze for two to three seconds.

The feature uses on-device machine learning to accurately track eye movement. The company says that all data required for this feature is processed and stored securely on the device, so there should be no privacy concerns.

Another accessibility feature in iOS 18 is Music Haptics, which allows deaf or hard-of-hearing people to experience music through the iPhone's Taptic Engine. There is also Vocal Shortcuts, which lets users assign custom utterances that the device can recognize to launch shortcuts and complete tasks.
