iOS 17 Brings Live Text to Video for Enhanced Interactivity

Apple’s upcoming iOS 17 update will bring the popular Live Text feature to video, allowing users to easily interact with text and images directly within video content. Live Text was first introduced in iOS 15 for still images, but iOS 17 expands it to function with videos as well.

What is Live Text?

Live Text uses advanced machine learning and computer vision techniques to detect text within photos or video frames. It can recognize text in multiple languages and formats, including handwritten text. Once text is detected, users can:

  • Select and highlight the text to copy and paste it elsewhere. This makes extracting information from images quicker and easier.
  • Translate any text into another language inline.
  • Make a phone call by tapping on phone numbers in the text.
  • Open web links by tapping on URLs.
  • Look up addresses on maps by selecting street addresses or location names.
  • See quick actions for dates, times, and contact info.

Overall, Live Text removes friction when needing to use or act on text and information present in images. It enables new ways to interact directly with visual content.
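Under the hood, this kind of on-device text recognition is exposed to developers through Apple's Vision framework. As a rough illustration (not Apple's actual Live Text implementation), a single video frame could be scanned for text like this in Swift; the helper name `recognizeText` is hypothetical:

```swift
import Vision
import UIKit

/// Recognize text in a single video frame (a CGImage), similar in spirit
/// to what Live Text does on-device. Hypothetical helper for illustration.
func recognizeText(in frame: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string from each detected text region.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate        // favor accuracy over speed
    request.usesLanguageCorrection = true       // fix common OCR mistakes
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Each recognized region also carries a bounding box and confidence score, which is what lets the system highlight text in place and attach actions like Call or Translate to it.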

How Live Text Works on Video in iOS 17

Previously, Live Text only worked on still photos. But with iOS 17, it can now recognize text and offer interactions within video clips and in live previews from the camera.

When viewing a video in Photos, users can pause the video, then tap and hold on text to highlight it, just as with a photo. From there, all the same Live Text actions are available: copying, translating, calling numbers, and so on.

Live Text also works directly within the Camera app. When the camera is open, any text seen by the camera is automatically detected and highlighted. Tapping on the highlights provides instant actions, eliminating the need to first capture a still photo.

Finally, Live Text functions in Quick Look on videos. So when previewing a video with Quick Look (such as from Files), you can interact with text without formally opening the video in Photos.
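For developers, the paused-frame behavior described above can be approximated with VisionKit's `ImageAnalyzer` and `ImageAnalysisInteraction` APIs (available since iOS 16). This is a sketch, not Apple's internal implementation; `imageView` and `frameImage` stand in for a view displaying a paused video frame:

```swift
import UIKit
import VisionKit

// Attach Live Text-style interaction to an image view showing a paused
// video frame. Sketch assuming iOS 16+ VisionKit; names are placeholders.
@MainActor
func enableLiveText(on imageView: UIImageView, frameImage: UIImage) async throws {
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text])
    let analysis = try await analyzer.analyze(frameImage, configuration: configuration)

    interaction.analysis = analysis                      // highlights detected text
    interaction.preferredInteractionTypes = .textSelection
}
```

Setting `preferredInteractionTypes` to `.dataDetectors` instead would surface tappable actions for phone numbers, links, and addresses rather than plain text selection.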

Benefits of Live Text for Video

Expanding Live Text to work on videos provides several notable benefits:

  • Faster information extraction – You no longer need to manually pause videos and take screenshots to grab key text/info. Live Text does this automatically.
  • Improved accessibility – People with visual impairments can get more from videos when Live Text is combined with features like VoiceOver, which can speak the recognized text aloud.
  • Enhanced video interactivity – Viewers can take direct actions on phone numbers, links, etc. right within the video playback.
  • Works across apps – Live Text works consistently in Photos, Camera, Quick Look, Safari, and any app that displays video.
  • Multilingual support – Languages like Simplified Chinese, Spanish, French, Italian, German, and English are supported.

Overall, Live Text reduces the friction of obtaining and working with text and information embedded in videos.

Which Devices Will Support Live Text for Video?

Live Text for both images and video requires Apple’s Neural Engine to run the machine learning models powering the feature. Therefore, Live Text for video will be supported on:

  • iPhone XS and later
  • iPad Pro 3rd generation and later
  • iPad Air 3rd generation and later
  • iPad 5th generation and later
  • iPad mini 5th generation and later

If you have an older iPhone or iPad without a Neural Engine, Live Text will continue to work for photos, but not videos. Upgrading to a newer device will enable the full Live Text experience.

How to Use Live Text for Video in iOS 17

Using Live Text with video in iOS 17 is straightforward. Here are the basic steps:

  1. Update your supported iPhone or iPad to iOS 17.
  2. Open the video in Photos or Quick Look that contains text you want to interact with. You can also point your camera at text.
  3. Pause or tap the video (or use camera preview) to engage Live Text on that frame. Detected text will highlight automatically.
  4. Tap a highlight to reveal the available actions, such as Translate, Call, or Copy, depending on the type of text.
  5. Perform your desired action! The text will act just like a live hyperlink or button within the video or camera viewfinder.
  6. Repeat this process as needed on any text seen during video playback or camera capture.

The process feels responsive and takes only seconds to extract and act on text from videos.
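Behind step 3, an app pausing an `AVPlayer` could grab the exact frame being displayed and hand it off to text analysis. A minimal sketch using `AVAssetImageGenerator`; the helper name is hypothetical and `player` is assumed to hold a local video:

```swift
import AVFoundation
import CoreGraphics

// Extract the frame an AVPlayer is currently paused on, so it can be fed
// to text recognition. Illustrative sketch, not Apple's implementation.
func currentFrame(of player: AVPlayer) throws -> CGImage? {
    guard let asset = player.currentItem?.asset else { return nil }
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true  // respect video orientation
    generator.requestedTimeToleranceBefore = .zero   // demand the exact frame,
    generator.requestedTimeToleranceAfter = .zero    // not a nearby keyframe
    return try generator.copyCGImage(at: player.currentTime(), actualTime: nil)
}
```

Setting both time tolerances to `.zero` matters here: with the defaults, the generator may return a nearby keyframe rather than the precise frame the user paused on.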

Tips for Using Live Text on Video Effectively

To take full advantage of the new Live Text for video capabilities in iOS 17, keep these tips in mind:

  • Pause on text-heavy frames to extract a lot of information quickly.
  • Use Live Text early on in videos to grab critical details like phone numbers.
  • Tap highlighted text instead of manually selecting it for faster results.
  • Add additional Live Text languages in Settings to expand translation coverage.
  • Try Live Text in the Camera app to act on text without capturing first.
  • Remember it works across Photos, Quick Look, Camera, and other apps.
  • Live Text requires adequate lighting and a clear view of text to detect it accurately.
  • For small text, use the zoom to enlarge it before engaging Live Text.

With some practice, using Live Text on videos can become second nature. It’s an easy way to turn static text into interactive information.

Limitations of Live Text for Video

While Live Text for video represents a major expansion of the feature’s capabilities, there are some limitations to be aware of:

  • It only works on frozen video frames, not moving video. You must manually pause to trigger it.
  • Low resolution, blurry, or distant text may not be detected.
  • Extremely small, stylized, or complex fonts can be difficult to recognize.
  • It relies on Neural Engine hardware only present in recent iOS devices.
  • Recognition and interaction speed may vary depending on length and complexity of detected text.
  • Text translated from one language to another is not always perfectly accurate.
  • Handwriting recognition is less reliable than recognition of printed text and is available in fewer languages.

In most cases, Live Text will work seamlessly, but its computer vision abilities have limits. Still, it enables interactions that were not previously possible in video content.

The Future of Live Text for Video

iOS 17 marks just the beginning of expanded Live Text capabilities on iPhone and iPad. It’s reasonable to expect Apple to continue improving and refining the technology over future iOS updates.

Here are some potential ways Live Text for video could evolve:

  • Detect text within moving video, not just paused frames
  • Recognize more handwritten and stylized fonts
  • Support text detection in more languages
  • Enable user-selectable sections of text for interaction instead of whole detected regions
  • Faster processing and interaction speeds powered by Apple silicon advancements
  • Tighter integration with other accessibility features like VoiceOver and Speak Selection
  • Wider application beyond Photos and Camera into other apps like YouTube or TikTok
  • On-device intelligence to summarize key video content based on Live Text analysis

Live Text on images and video provides a glimpse into the future of augmented reality and computer vision on mobile devices: a more seamless blend between digital content and the physical world. iOS 17 brings video into the fold, but enormous potential remains to be explored in future versions of Live Text.

Key Takeaways on Live Text for Video

  • iOS 17 expands Live Text from photos to video, detecting text in paused frames, Camera, and Quick Look.
  • Users can interact with text directly within video by copying, translating, calling, sharing and more.
  • Live Text makes videos more immersive and information easier to access.
  • The feature requires an iPhone XS or later with Neural Engine to function.
  • To use it, pause/tap a video, tap highlighted text, and choose an interaction option.
  • It enables new ways to engage with videos, but has some accuracy and language limitations.
  • Expect expanded capabilities and tighter integration over future iOS updates.

The introduction of Live Text to video marks a major step forward in iOS capabilities. While not without some limitations, the feature foreshadows even more advanced computer vision applications soon to come. iOS 17 empowers users to engage with content in videos like never before.

Frequently Asked Questions About Live Text for Video

Q: Which iPhones and iPads are compatible with Live Text for video?

A: Live Text for video requires Apple’s Neural Engine, so compatible devices include iPhone XS and later, iPad Pro 3rd generation and later, iPad Air 3rd generation and later, iPad 5th generation and later, and iPad mini 5th generation and later.

Q: Can Live Text work on any type of video content?

A: Live Text works on videos stored locally on your device, such as in the Photos app or Camera Roll. However, it will not work on videos played from external sources like YouTube, Netflix, etc.

Q: Does Live Text for video work on moving video?

A: No, unfortunately Live Text currently only recognizes text on paused or static video frames. The video must be manually paused to activate Live Text.

Q: Can Live Text detect text in low light video conditions?

A: Live Text requires adequate and consistent lighting in order to accurately detect text in videos. Low light conditions will decrease recognition accuracy.

Q: How many languages does Live Text for video support?

A: iOS 17 supports Live Text for video in English, Chinese, French, Italian, German, Portuguese, and Spanish. Support varies slightly for photos vs video.

Q: How accurate is the text translation feature of Live Text?

A: The quality of Live Text translations depends on the language. More widespread languages tend to have better translation accuracy. But no machine translation is 100% accurate.

Q: Can I use Live Text on video to extract text from handwritten notes and documents?

A: Live Text’s handwriting recognition is limited and supports fewer languages than recognition of printed text. Clearly typed text in supported languages is detected most reliably.


The launch of Live Text for video in iOS 17 brings Apple’s computer vision and augmented reality technologies to an entirely new medium. Users can now interact with videos in creative new ways thanks to automatically detected and interactive text. While the feature has limitations, its debut represents a watershed moment for enhanced video engagement on iPhone and iPad.

Looking ahead, expect Apple to rapidly iterate on Live Text for both images and video. The company has only begun tapping into the power of on-device intelligence. Seamless blending between the physical and digital is the direction we’re clearly headed. With iOS 17, videos come alive with new possibilities thanks to Live Text. The future remains bright for ever-smarter computer vision features across Apple’s platforms.
