I waited literally years to write this post because Apple was a full 4 years behind one of Android’s most popular features — and I HATE to share a feature that only works on one platform. Y’all get angry. 🙂
Pull Text from the World Around You
If you have updated to iOS 15 or you own an Android phone, you will never have to retype a website or email address again. Apple’s Live Text feature (brand new) and Android’s Google Lens (around since 2017) open a whole world of information gathering via your phone’s camera.
Note: This post focuses on the ability to pull text with Live Text, but both the Apple and Google features also let you identify objects, landmarks and more.
How to Use Apple Live Text
Look for a secret symbol the next time your iPhone detects text in an image or through your camera, then let Apple do the work for you.
Other Live Text Tricks
Live Text lets you pull info such as instant hyperlinks from website and email addresses, and the feature even lets you scan text straight into text messages.
Watch this short video for a thorough demo of Live Text:
How to Enable Live Text in Apple iOS 15
It’s probably enabled by default, but just in case… go to Settings > Camera > Live Text and make sure it’s turned on (green).
How to Use Google Lens
This is where it gets a little embarrassing for Apple. Google Lens has been around since 2017. Here’s my review of the tool from 2018. It includes a workaround so that Apple folks can use it, too. But Android phones have it built in.
Here’s a side-by-side, feature-by-feature comparison of Google Lens vs. Apple Live Text from one of my favorite tech sites, Tom’s Guide.
Are you Apple or Android?
Most people are clear about which platform they like. Are you an avid fan? Which platform? And did you know about these secret features?