Don’t Miss Calls and Texts: How to Use New Phone and Messages Filtering

Spam is one of the many banes of modern existence. While we receive more email spam than anything else, interruptions from unwanted phone calls and text messages are even more annoying. Apple has added various features over the years to help control spam calls and messages, but none have completely solved the problem. The difficulty is that, unlike email, a call or message is hard to evaluate before notifying the user, especially since phone calls occur in real time and text messages are meant for quick back-and-forths.

In iOS 26, iPadOS 26, and macOS 26 Tahoe, Apple has taken another swing at the problem with additional options to identify calls and messages from unknown sources. They work as advertised, but because everyone receives calls from unknown numbers (your doctor’s new line, or that person you exchanged numbers with at a party) and messages from unknown senders (two-factor authentication codes, notifications from a food truck that your order is ready), it’s not necessarily an easy decision to enable these options. When they’re turned on, you can miss important calls or messages as they come in, and if you don’t know where to look, you may never realize they arrived at all.

Only you can decide if the peace from not being inundated with spam calls and messages is worth the risk of missing something important. Here’s what you need to know about filtering unknown callers in Phone and unknown senders in Messages. We’ll focus on the iPhone apps here, but the interfaces are very similar on the iPad and Mac.

Unknown Callers in the Phone App

Apple offers several new ways to deal with unknown callers before they ring through and afterward, for calls you don’t pick up. Configure these options in Settings > Apps > Phone:

  • Screen Unknown Callers: To control calls before they ring through, choose either Ask Reason for Calling or Silence. Our experience is that asking callers for information works only on real people—telemarketers just hang up. Silencing unknown callers is effective, but it also means you’ll miss some legitimate calls.
  • Call Filtering: Enable these switches to move missed calls and voicemails from unknown callers to the Unknown Callers list, or even the Spam list if your carrier identifies the call as spam. These lists are accessible from the Phone app’s Filter menu.

When the two Call Filtering switches are enabled, two new lists appear in the Filter menu, accessible by tapping the Filter button in the upper right. In Unknown Callers, each call displays Delete and Mark as Known buttons. Marking a call as known moves it and future calls from that number to the main Calls list; deleting a call (and optionally reporting it as spam) maintains it as unknown for future calls. We haven’t seen any calls identified as spam yet, so we don’t quite know what that screen looks like.

Our advice:

  • If you’re concerned about missing potentially important calls, set Screen Unknown Callers to Never and turn off both Call Filtering switches.
  • As a middle ground, choose Ask Reason for Calling and turn on Spam but not Unknown Callers.
  • If you hate unwanted calls and don’t care if you miss the occasional legitimate call, choose either Ask Reason for Calling or Silence and turn on both Call Filtering switches.

Unknown Senders in the Messages App

Messages are easier to screen and filter than phone calls because they don’t occur in real time. In Settings > Apps > Messages (below left), you’ll find:

  • Screen Unknown Senders: This option creates a separate Unknown Senders list and hides those conversations from your main list.
  • Filter Spam: Enable this switch to separate explicit spam and junk messages into a Spam list. Apple’s spam filtering seems to be accurate, if conservative.

When Screen Unknown Senders is on, two additional options become available:

  • Allow Notifications: Should you be notified when a message from an unknown sender arrives? It depends on the message, and Apple lets you allow or block four types: time-sensitive messages, messages from individuals rather than organizations, transactional messages, and promotions (below center).
  • Text Message Filter: Turning on the Screen Unknown Senders and Filter Spam switches is sufficient to create Unknown Senders and Spam lists in Messages. Turning on the Text Message Filter setting (below right) additionally creates lists for Transactions and Promotions, with sub-lists for Finance, Orders, and Reminders under Transactions.

In the Messages app, tapping the Filter button at the top right reveals all the available lists. Note that the lists are exclusive—if a message appears in Promotions or Spam, for instance, it won’t also be in the main Unknown Senders list. It also seems that the more granular filtering enabled by the Text Message Filter setting applies only to new messages, not existing messages, so you won’t see much in there to start.

Our advice: How much of a difference these settings will make depends on how heavily you use Messages. For those who receive relatively few messages from unknown senders, it’s probably worth keeping only the Filter Spam setting on. As that number increases, we recommend turning on Screen Unknown Senders and enabling at least Time Sensitive, Personal, and Transactions for notifications. Check the Unknown Senders list regularly so you don’t miss anything. Only those who are drowning in text messages from unknown senders should turn on Text Message Filter to get the more granular filtering—it’s just fussier than necessary for most people.

One final note. The Phone and Messages apps on the iPhone, iPad, and Mac all have these settings, and you must set them individually on each platform. It’s best to ensure each platform displays the same lists; otherwise, you might have trouble finding the same message across devices. Also, if you have devices that aren’t yet running iOS 26, iPadOS 26, or macOS 26, you’ll still receive Messages notifications from unknown senders that are otherwise silenced on the newer operating systems.

(Featured image by iStock.com/fadfebrian)

Never Save Your Work in These Locations

In every job that involves interaction with the public, amusing “Can you believe…” stories about customers abound. They’re often triggered by seemingly reasonable behaviors that experts recognize as problematic. A well-known example from the early days of personal computing is a college student who stored his floppy disk by sticking it to his fridge with a magnet, not realizing that magnetic fields could disrupt the disk’s magnetic patterns and corrupt files. The advice from tech support? “Don’t do that.”

No one is sticking floppies to their fridge anymore, but we still occasionally see the modern equivalent: saving data or documents in places that are likely to disappear. Just as you shouldn’t write the only copy of essential information on an easily erased whiteboard, you shouldn’t store important data in any of these locations:

  • Unsaved documents: While autosave is becoming more common, it isn’t universal and often doesn’t activate until a document has been saved for the first time. When you create a new document, always save it right away, before you do anything else. Otherwise, you risk losing all your work if the app crashes, the Mac kernel panics, or the power goes out.
  • Trash: We know, we know! Who would put something in the Trash that they want to keep? But it happens. Don’t do that! On the other hand, there’s also no reason to empty your Trash regularly unless you’re low on space. A good compromise is to choose Finder > Settings > Advanced and select “Remove items from the Trash after 30 days.” This way, you’ll always have a 30-day grace period to recover mistakenly deleted items.
  • Clipboard: Most people know that the clipboard serves as a temporary holding place, overwritten with each new Copy or Cut. However, if you’re unaware of this, you might write something lengthy, use Cut to place it on the clipboard with the intention of pasting it elsewhere, and then forget to do so right away, resulting in data loss on the next use of Copy or Cut. Always paste anything you cut immediately. Many utilities (such as Copy ‘Em, Keyboard Maestro, LaunchBar, Pastebot, and Raycast) provide clipboard history so you don’t lose clipboard data immediately, but you still shouldn’t rely on it persisting indefinitely.
  • Email Drafts mailbox: There’s nothing wrong with starting an email and coming back to it later to finish—that’s the point of the Drafts mailbox. It’s also a sensible way to begin a message on one device and complete it on another. However, avoid storing anything in Drafts for an extended period, and be aware that items there may disappear without warning. (And never, ever store anything in your email Trash mailbox—it will be deleted eventually.)
  • Temporary folders: Thanks to its Unix roots, macOS includes several temporary folders, one located at /tmp and others specific to each user. These folders are cleared regularly, such as when the Mac is restarted, left idle for a long time, or when drive space is low. Storing important data in a temporary folder is a digital version of Russian roulette.
  • Downloads folder: Although the Downloads folder isn’t inherently volatile, it’s unwise to store anything important there. You might forget about that document while tidying up and accidentally delete it, or you might use a cleanup tool in the future that does it for you.
  • USB flash drives: There is nothing wrong with putting files on a USB flash drive. However, avoid storing the only copy of an important file on one, as it is too easy for the drive to be lost or damaged.
  • Public computers, virtual machines, and sandboxed environments: This scenario is unlikely but not impossible. Imagine you’re working on a public computer in a lab and save a file on the desktop. When that computer reboots, it will likely delete all data to return to a fresh state for the next user. The same could apply to a virtual machine used for testing or a sandboxed environment that you log in to remotely.

There are also a few locations that generally aren’t problematic but deserve extra attention due to the higher likelihood of losing data:

  • Third-party app folders in ~/Library: Some apps store their data in folders they maintain within your user account’s Library folder. While this is acceptable for data managed by those apps, we advise against putting anything else in these folders since it’s impossible to know how the app might deal with data it doesn’t recognize during a cleanup or major update.
  • Desktop: It’s fine to work on documents stored on the desktop, but we recommend filing them away carefully when you’re finished. If you frequently move files in and out of your desktop, it’s all too easy to delete something important accidentally. Additionally, if you have iCloud Drive’s Desktop & Documents folder syncing enabled, deleting a desktop file on one Mac also removes it from your other Macs, where you might still need it.
  • Box, Dropbox, Google Drive, iCloud Drive: Cloud storage services are entirely acceptable locations for important data, but they all offer options that store files only online, downloading them only when necessary. These options may prevent online-only files from being accessible when you’re offline or from being backed up locally. Worse, if you share cloud storage with others for collaboration, they could accidentally delete your data. Be sure to enable any available version history options and ensure everything is backed up locally.
  • External drives or network storage: Many individuals and organizations store essential files and data on external drives and network storage. This approach is perfectly valid, provided that these locations are backed up. When designing your backup system, remember to include your external drives, network servers, and NAS devices. Lastly, if an external drive is encrypted, ensure that you have a backup of both its data and the decryption key.

If you want to avoid all these issues, save your files in your Documents folder and make sure you have a solid backup strategy.

(Featured image based on an original by iStock.com/shutjane)

Make the Most of Visual Intelligence on the iPhone

The “visual intelligence” aspect of Apple Intelligence leverages the artificial intelligence capabilities of your iPhone to make what you see through the iPhone’s camera or on its screen interactive and actionable in ways that weren’t previously possible. It’s one of the most useful aspects of Apple Intelligence.

Triggering Visual Intelligence

We offer numerous examples of visual intelligence’s superpowers later in this article, but first, let’s make sure you know how to activate its two modes: camera mode and screenshot mode. Use camera mode to learn more about the world around you; use screenshot mode for help with something on your iPhone’s screen. Here’s how to trigger each mode:

  • Camera mode: Press and hold the Camera Control button on all iPhone 16 models (except the iPhone 16e), all iPhone 17 models, and the iPhone Air. Press the Camera Control again or tap the shutter button to lock the image for visual intelligence. On the iPhone 15 Pro, iPhone 15 Pro Max, and iPhone 16e, which support Apple Intelligence but lack the Camera Control, use the Action button, a Lock Screen button, or a Control Center button.
  • Screenshot mode: Simultaneously press the side button and volume up button to display an interactive preview of what was on the screen.

Visual intelligence analyzes the content of your image and provides relevant action buttons based on what it detects. While the Ask and Search options are always available, other buttons appear contextually depending on the content.

  • Ask: Tapping the Ask button lets you pose a question about the image to ChatGPT. (But we’ve found that Apple won’t pass on health-related questions or queries with certain types of sensitive data.)
  • Search: Tapping the Search button conducts a Google Image search for similar items. It may also display tabs for search results from other apps, such as Etsy or eBay.
  • Recognized objects: When visual intelligence identifies an object, such as a specific plant or animal, it displays a button that brings up more details.
  • Text: When it detects blocks of text, visual intelligence provides buttons to summarize the text, read it aloud, or translate it.
  • Dates: If it detects a date in text, visual intelligence displays an Add to Calendar button.
  • Contact info: When details like email addresses or phone numbers appear in the image, visual intelligence can help you call or message the number, or send email.
  • Addresses: When it identifies a physical address in text, visual intelligence displays a button that opens the address in Maps.
  • URLs: This works in the virtual world as well—a URL embedded in an image prompts visual intelligence to display a button that opens the website in Safari.
  • Businesses and locations: When you capture an image of a business or other location that’s known in Maps, visual intelligence can show hours, menus, reviews, and more.

Real-World Uses for Visual Intelligence

It can be challenging to think of uses for visual intelligence at first, simply because it’s a new way of engaging with the world around you and what you see on your iPhone. We’re used to taking pictures of flyers for events we want to attend, doing Google searches for things we see, asking questions of chatbots, and using specialized apps to identify plants and animals—visual intelligence can do all that and more. Here are a few practical ways to use visual intelligence today:

  • Create calendar events: Create calendar events from posters, flyers, invitations, or Web pages. When you point the camera at a poster or take a screenshot of an event page, an Add to Calendar button allows you to create an event directly from the on-screen details.
  • Find business information: Point the camera at a business to retrieve details such as hours, menu/services, phone number, and website. It’s the same information you’ll find in Maps, but it’s easier to pull up using visual intelligence.
  • Search for products: Shopping for something? Once you find an example of what you like—such as a mid-century modern sofa—take a screenshot, circle the picture with your finger, and browse the search results for similar items.
  • Summarize and read text aloud: When you’re faced with a large amount of text, especially if the font size is difficult to read, visual intelligence can provide a summary or even read it aloud. The option to have text read aloud can be particularly helpful for those with low vision.
  • Translate text: iOS offers multiple ways to translate text in unfamiliar languages, including the Translate app, but visual intelligence is often the fastest way to get a quick translation of a sign or placard.
  • Quick object identification: We’ve all wondered what some plant or animal is—using visual intelligence, you can point your camera at it to find out quickly. Just tap the name that appears to get more details.
  • Research questions: Sometimes, you know what you’re looking at but have questions about it. Instead of starting a new search, you can use visual intelligence, tap the Ask button, and pose your question to ChatGPT. Tap the response to ask a follow-up question. If you connect Apple Intelligence to your ChatGPT account in Settings > Apple Intelligence & Siri > ChatGPT, your conversations will be saved in ChatGPT, where you can review and continue the discussion.
  • Manipulate real-world data: Anything that can be photographed or captured in a screenshot can be used as data for other manipulations. For example, you could take a picture of a bookshelf and ask for a list of all the titles, or take a screenshot of a recipe and request the calorie count per serving.

How does visual intelligence compare to apps like ChatGPT, Claude, Gemini, and others? It outdoes them in two ways but falls short in one. Thanks to its deep integration with iOS and the iPhone’s Camera Control, it’s easier to activate visual intelligence than any other app. It also transfers data more effectively to other apps, such as sending URLs to Safari, phone numbers to Phone or Messages, addresses to Maps, and more. However, chatbot apps—which can also analyze photos and screenshots—are more conversational, offer more detailed information, and are willing to discuss potentially sensitive topics that Apple won’t touch, such as health and politics. We use visual intelligence for straightforward tasks, but for more complex situations, we often turn to a chatbot app instead.

(Featured image by Apple)

How to Customize the iPhone and iPad Home Screen with Liquid Glass

When describing its new Liquid Glass design language, Apple spoke only generally about how users could change the look of icons and widgets on their iPhone and iPad Home Screens to be dark, clear, or tinted, without specifying how to do that. The trick is to touch and hold an empty spot on the Home Screen to enter jiggle mode, tap Edit in the upper-left corner, and select Customize to bring up a set of controls: choose from Default, Dark, Clear, and Tinted. For Tinted, set the color and opacity using the sliders, and use the buttons at the top of the Customize panel to change the brightness, expand icons and remove names, and use either the suggested image color or pick a color with the eye dropper. The effectiveness of a Liquid Glass-enabled Home Screen will depend on how much you rely on color to identify icons at a glance.

(Featured image by Apple)

macOS 26 Tahoe Introduces New Recovery Assistant

If a Mac running macOS 26 Tahoe experiences certain kinds of problems, it might automatically restart and launch a new Recovery Assistant. It will prompt you to unlock your disk if needed and to connect to a Wi-Fi network—an Internet connection is required. Apple doesn’t specify precisely what Recovery Assistant will do to repair your Mac, but it will report that it succeeded, that it was unable to recover the Mac, or that it found no problems. In any case, you’ll need to restart your Mac, after which you may receive a notification to recover your iCloud data; initiate this process in System Settings. While we generally favor Macs being able to fix their own problems, we cannot stress enough how much more important it is to have a current backup than to rely solely on any recovery system.

(Featured image based on an original by iStock.com/Armastas)

Updated Passwords App Adds History

One small way Apple’s Passwords app lagged behind top password managers like 1Password was in its lack of a password history. It’s sometimes helpful—such as when trying to figure out why a seemingly correct password isn’t being accepted—to see previous passwords for a site and when they were changed. In macOS 26 Tahoe, iOS 26, and iPadOS 26, the Passwords app adds that feature. Click or tap View History to review the history of a particular site’s passwords.

(Featured image by iStock.com/designer491)

App Store Gains Accessibility Nutrition Labels

Apple does a good job providing accessibility options for users with vision, hearing, motor control, and other accessibility needs. Nearly everyone will benefit from these features at some point in their lives. To encourage support for Apple’s accessibility features and assist users in finding compatible apps, the App Store now includes Accessibility Nutrition Labels that indicate supported features. Developers aren’t required to support or list these features, so it may take some time before many apps display this information. Still, it’s a welcome step forward!

(Featured image by iStock.com/findfootagehq)

Apple Adds M5 Chip to MacBook Pro, iPad Pro, and Vision Pro

Apple’s fall harvest has yielded the new M5 chip, leading to updates for the entry-level 14-inch MacBook Pro, the iPad Pro lineup, and the Vision Pro. All three benefit from the M5’s significant performance boosts, but remain mostly unchanged otherwise. Availability starts on October 22. We anticipate that Apple will release M5 versions of the iMac, Mac mini, and MacBook Air in the coming months, along with M5 Pro and M5 Max chips in early to mid-2026.

With the M5 chip, Apple continues to focus on boosting AI performance. Its new 10-core GPU promises up to four times the peak GPU compute performance of the previous M4 chip—an impressive leap. The GPU also provides enhanced graphics capabilities and ray tracing that deliver results up to 45% faster than the M4. The 10-core CPU, with six efficiency cores and four performance cores, provides up to 15% faster multithreaded performance than the M4, a more typical performance increase between chip generations. Additionally, Apple increased unified memory bandwidth from 120 GBps to 153 GBps, which speeds up many different operations.
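To put those numbers side by side, here is a quick back-of-the-envelope check (the inputs are Apple’s quoted figures as cited above, not independent benchmarks):

```python
# Sanity-check the M4-to-M5 spec deltas quoted in this article.
m4_bandwidth_gbps = 120
m5_bandwidth_gbps = 153

# Relative increase in unified memory bandwidth.
bandwidth_gain = (m5_bandwidth_gbps - m4_bandwidth_gbps) / m4_bandwidth_gbps
print(f"Unified memory bandwidth: +{bandwidth_gain:.1%}")  # +27.5%

# Apple's claimed gains: modest CPU bump, large peak GPU compute multiple.
cpu_gain = 0.15          # up to 15% faster multithreaded CPU
gpu_peak_multiple = 4.0  # up to 4x peak GPU compute
print(f"CPU multithreaded: up to +{cpu_gain:.0%}")
print(f"GPU peak compute: up to {gpu_peak_multiple:.0f}x the M4")
```

As the output shows, the bandwidth jump (27.5%) is nearly twice the size of the CPU gain, underscoring how much of the M5’s improvement is aimed at memory-hungry AI workloads.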

14-inch MacBook Pro

The most mainstream of the updated devices is the entry-level 14-inch MacBook Pro, which benefits from the improved performance of the M5 chip. Apple claims improvements in various tasks ranging from 20% to 80% compared to the M4 model it replaces. Even more compelling is the comparison with the M1-based 13-inch MacBook Pro, where the M5 MacBook Pro is 2 to 7 times faster.

Apple also increased SSD performance by up to 2x, which we suspect gives the new M5 model read/write speeds comparable to those of the M4 Pro and M4 Max models of the MacBook Pro.

The final update is that the M5 MacBook Pro can now be equipped with 4 TB of storage, a $1,200 option previously reserved for the M4 Pro and M4 Max models. Those higher-end models remain for sale, and although Apple didn’t share any benchmarks, we suspect they will continue to outperform the M5 model at most tasks.

Otherwise, the new M5 MacBook Pro continues to feature three Thunderbolt 4 ports, an HDMI port, an SDXC card slot, and MagSafe 3 charging. Its 14.2-inch Liquid Retina XDR display with ProMotion remains unchanged, as does the physical design. Pricing is also the same, starting at $1,599 for 16 GB of unified memory and 512 GB of storage.

For now, the M5 model of the 14-inch MacBook Pro is notably more capable than the M4 models of the MacBook Air, a wider gap than existed between the Air and the M4 MacBook Pro it replaces. That said, you won’t go wrong with either the more affordable MacBook Air or the higher performance of the M4 Pro and M4 Max models of the MacBook Pro.

11-inch and 13-inch iPad Pro

The M5 models of the 11-inch and 13-inch iPad Pro deliver similar performance improvements over the previous M4 models, with Apple highlighting up to 2x faster AI image generation and up to 2.3x faster AI video upscaling. The enhancements are even more pronounced when compared to the M1 iPad Pro models, where benchmarked tasks are 2x to 7x faster. Additionally, the M5 iPad Pro models feature up to 2x faster storage, although capacity still tops out at 2 TB.

Apple didn’t stop there. The 256 GB and 512 GB configurations have 50% more unified memory, increasing from 8 GB to 12 GB, while the 1 TB and 2 TB configurations have 16 GB. The new models also feature enhanced external display support, enabling them to drive external 4K displays at up to 120 Hz with Adaptive Sync, which reduces latency and enhances gaming performance. Lastly, Apple introduced fast charging that brings the battery to 50% in 30 minutes with an appropriate charger.

Two other notable changes probably won’t be evident to most users. The M5 models of the iPad Pro utilize Apple’s new C1X cellular modem and N1 wireless network chip, which enable Wi-Fi 7, Bluetooth 6, and Thread networking. Apple claims faster cellular and Wi-Fi performance, along with lower power consumption, but official battery life estimates remain unchanged.

Otherwise, the new iPad Pro models remain nearly identical to their predecessors. They retain the same Ultra Retina XDR display, cameras, ports, and accessory ecosystem (Magic Keyboard and Apple Pencil). The case and industrial design are the same.

Pricing for the 11-inch iPad Pro starts at $999 for Wi-Fi models and $1,199 for Wi-Fi + Cellular models. The 13-inch iPad Pro starts at $1,299 with Wi-Fi and $1,499 for Wi-Fi and cellular connectivity.

Creative professionals already using a previous generation of the iPad Pro may consider upgrading, but for most people, we recommend either the low-cost iPad or the mid-range iPad Air.

Vision Pro

In the first hardware update to its Vision Pro “spatial computer,” Apple replaced the M2 chip with the more powerful M5. This upgrade boosts performance, allowing the Vision Pro to render 10% more pixels, resulting in a sharper image with crisper text and more detailed visuals. The M5 also increases the Vision Pro’s maximum refresh rate to 120 Hz from 100 Hz, helping to reduce motion blur. Battery life receives a slight boost, providing an additional 30 minutes of general use (up to 2.5 hours) and video playback (up to 3 hours). The only other change is a new Dual Knit Band, which Apple says is more comfortable.

Apart from the M5 and Dual Knit Band, the Vision Pro remains unchanged in form, function, and philosophy. Nor did Apple lower the Vision Pro’s price, which is still $3,499. Although these changes undoubtedly improve the Vision Pro experience a bit, they won’t change anyone’s purchasing decision.

(Featured image by Apple)

Ten Useful New Features in iOS 26’s Phone App

Although it’s easy to joke about how little we use our iPhones for actual phone calls, telephony remains a core feature that everyone depends on to some degree. In iOS 26, Apple put significant effort into improving the phone experience, delivering the most notable upgrade to the Phone app in years. Here’s what you’ll find.

Unified View

The Phone app has traditionally featured a toolbar with buttons for Favorites, Recents, Contacts, Keypad, and Voicemail, along with a separate Search field. iOS 26 retains this layout as Classic view (below, left) and introduces a new Unified view (below, right) that aims to simplify the interface by reducing the toolbar to four buttons: Calls, Contacts, Keypad, and Search, with the Calls screen combining favorites and recent calls. You can switch between these views by tapping the Filter button in the top-right corner and choosing the preferred layout. The Filter menu also lets you specify which calls appear below, including voicemail.

Call Screening

A new Call Screening feature, configurable in Settings > Apps > Phone > Screen Unknown Callers, intercepts incoming calls from unknown numbers and prompts the caller to “state their name and reason for calling” before the iPhone even rings. If the caller responds, you’ll see a transcript or snippet of their response, allowing you to decide whether to answer or ignore the call. In our experience, it mainly causes spammers and telemarketers to hang up instantly, which is just as effective at stopping the interruption.

Unknown Call Lists

The Phone app has long been able to silence calls from unknown numbers—those not in your contacts or numbers you haven’t called—and send them directly to voicemail. This feature, now called Unknown Callers, remains available in Settings > Apps > Phone > Call Filtering, as does the previous Silence Junk Callers option, now called Spam. What’s new is that when these options are turned on, lists for Unknown Callers and Spam appear in the Filter menu, so they don’t clutter your Calls list. You can delete calls from unknown callers, mark their numbers as known, or add them to Contacts so they aren’t silenced next time.

Spam Voicemail Reporting

With most spam calls going to voicemail, your inbox may fill up with unwanted messages. You’ve always been able to delete them, but now, when you view a voicemail from an unknown number, a Report Spam button appears. Tap it to report the voicemail to Apple and delete it. We don’t know if reporting spam voicemails will make any difference, but it’s more satisfying than just deleting them.

Hold Assist

The Phone app’s new automatic Hold Assist feature is somewhat hard to test, but we hope it works when you need it. When Hold Assist Detection is enabled in Settings > Apps > Phone, Apple says that if you’re placed on hold by a customer service agent, the Phone app can detect hold music, silence it, and notify you when the agent comes back on the line. You can also manually tap the More button, tap Hold Assist, and then see a transcript of the hold message while you wait. Tap Pick Up when you’re ready to talk.

Live Translation in Calls

If you need to call someone who speaks a different language (as long as it’s English, French, German, Portuguese, or Spanish), the new Live Translation feature might be helpful. Once you’re on the call, tap the More button, then tap Live Translation, and choose the languages for From and To. We highly recommend testing this feature before you actually need it, as it can take a few minutes to download a new language for the first time. After that, tap Start Translation to hear spoken translations and have your voice translated for your caller. You’ll also see a transcript of both sides of the conversation.

Type to Siri During Calls

If you desperately need to use Siri during a phone call, you can now activate Type to Siri. Make sure it’s turned on in Settings > Apple Intelligence & Siri > Talk & Type to Siri, and then double-tap the bottom edge of the iPhone to open a Siri text entry field.

Screen Sharing and SharePlay in Calls

When you’re on a call with someone using iOS 26, iPadOS 26, or macOS 26, you can now initiate Screen Sharing or SharePlay with that person during the call. Access these features from the More button. Screen Sharing can be helpful for explaining how to perform an action on the caller’s device or troubleshoot a problem, and SharePlay lets you and the caller listen to the same audio or watch the same video in a supported app.

Call History

When you view a contact from within the Phone app (not the Contacts app), a Call History option appears. Tap it to see your call history with that number, which could go back years.

Phone App Comes to macOS, iPadOS, and visionOS

Wait, did we just say you could be on a call with someone using the Phone app on a Mac or iPad? Exactly! With macOS 26, iPadOS 26, and even visionOS 26, the Phone app has expanded to the iPhone’s sibling platforms. It looks and works very similarly, with the benefit of displaying more information at once. To use one of these Phone apps, your iPhone must be on the same Wi-Fi network and signed in to the same Apple Account.

(Featured image based on an original by iStock.com/sergeyryzhov)

What’s Liquid Glass, and What to Do If You Don’t Like It

If you’ve been following Apple’s recent product releases, you’ve probably heard the term “Liquid Glass.” That’s what Apple calls its newest design language, a combination of an aesthetic look and functional philosophy for the user interface in iOS 26, iPadOS 26, and macOS 26 Tahoe, in particular, but also in watchOS 26, visionOS 26, and tvOS 26 (we think of them collectively as OS 26). Apple describes Liquid Glass as a “translucent material that reflects and refracts its surroundings, while dynamically transforming to help bring greater focus to content.” The company claims that Liquid Glass “makes apps and system experiences more expressive and delightful while being instantly familiar.” Apple even has an intro video.

Beyond the marketing speak, that means most of the controls you’ll interact with in Apple’s new operating system will be semi-transparent and appear to float above the content, blurring what’s underneath and adjusting to the underlying content’s color. That may make it easier for you to focus on your content, or it may make the interface harder to read. Liquid Glass also features subtle animations that may seem fun or make everything feel a little squishy.

Like many of Apple’s interface changes over the years, Liquid Glass has sparked strong reactions—some love it, while others dislike it. While we’ll explore the pros and cons below, it’s worth acknowledging upfront that Liquid Glass represents Apple’s vision for the future of its interfaces. Although you can adjust various settings to make it more comfortable to use (which we’ll cover shortly), Liquid Glass will be part of all Apple operating systems going forward. The good news is that, as with previous major interface changes, such as Aqua in Mac OS X and iOS 7, we’ll all adapt to it over time as Apple continues to refine and enhance the experience.

Liquid Glass Pros

Although Liquid Glass might appear to be just a fashionable cosmetic update, Apple’s designers had some serious objectives:

  • Cross-device platform consistency: Many Apple users own an iPhone, iPad, Mac, Apple Watch, and Apple TV. The interfaces for these operating systems have all evolved somewhat independently due to differences in their development, usage, and screen sizes. Liquid Glass is Apple’s effort to unify the design language across its devices, making each device feel like part of a cohesive design philosophy.
  • Focus on content: Between the translucent look that refracts the content behind it and interface elements that morph and fade when not in use, Apple designed Liquid Glass to help you focus on your content rather than cluttering the screen with controls.
  • Modern look and feel: While longtime Apple users may prioritize functionality and familiarity, one of Apple’s main goals is to attract new users by convincing them to switch from Android and Windows or encouraging them to start with Apple devices. Liquid Glass draws inspiration from futuristic devices seen in science fiction shows and movies, which might especially appeal to younger users who have grown up with these visual references.
  • Fluid animations: By adding subtle animations to Liquid Glass, Apple makes the interface come alive in a way that wouldn’t be possible otherwise. Again, that may attract new users or impress possible switchers.
  • Personalization: Some people have a strong aesthetic desire to see their interfaces be clear or tinted rather than have every icon in its own bold color. With Liquid Glass, you can customize your icons, widgets, and folders to work better in dark mode, be clear, or have any tint you prefer.
  • Technical showcase: Liquid Glass’s complex real-time blurring and lighting effects require a significant amount of processing power, but Apple’s A-series and M-series chips are up to the task. This is another case of competitive advantage—Apple is showing off by saying, “Our devices have so much power that we can use it to make the interface snazzy looking.”

Liquid Glass Cons

Just because Apple describes Liquid Glass as “delightful” doesn’t mean everyone will agree. Many users dislike change, and numerous user experience experts have criticized aspects of Liquid Glass. Some of the concerns include:

  • Readability: When Liquid Glass displays light gray text on a clear control positioned over a dark background with additional text, it becomes almost unreadable. While this is an extreme case, translucent controls floating over busy backgrounds can cause legibility issues, especially for those whose vision isn’t perfect. You can see the problem in this screenshot of iOS 26’s release notes (below, left), and the Liquid Glass controls in Photos (below, right) are notably awkward.
  • Learning curve: Apple can say that the interface changes in Liquid Glass are for the better, but there’s no denying that everyone will need to learn something new. That’s easier for some users than others, and many people will be boggled by controls moving around.
  • Matter of taste: Not everyone shares Apple’s design aesthetics. Some people find Liquid Glass to be cartoonish or distracting.
  • Inconsistent design: While Liquid Glass aims to create a uniform design language across all Apple platforms, it will take time for even Apple to update everything. Some third-party apps will never receive updates, and some developers may refuse to modify their interfaces to support Liquid Glass. We’ll be dealing with inconsistent interfaces for several years.
  • Performance issues: On some older devices, Liquid Glass may feel sluggish or drain the battery more quickly due to heavy GPU usage. Although Apple doesn’t intend to create a bad user experience for anyone, performance issues may nudge some people to upgrade sooner than they planned.

Liquid Glass to Solid Metal

You can’t turn off Liquid Glass, but three Accessibility settings will make it less liquid and less glassy on the iPhone, iPad, and Mac. There are also a few additional settings that may make the iPhone and iPad easier to use. All these settings are independent, so you can mix and match to find the combination that gives you the look you prefer. (Paths are for the iPhone and iPad; on the Mac, start with System Settings and note slight naming differences.)

  • Reduce Transparency: Turning on Settings > Accessibility > Display & Text Size > Reduce Transparency significantly improves Liquid Glass readability by making translucent panels opaque, adding solid backgrounds, and reducing blur. On the Mac, this setting restores the solid menu bar. However, Reduce Transparency may make certain aspects of the interface look awkward, such as when a previously transparent toolbar suddenly covers a much larger part of the screen.
  • Increase Contrast: Enabling Settings > Accessibility > Display & Text Size > Increase Contrast makes interface elements stand out more by sharpening borders and reducing the tendency for controls to meld with the background. Keep in mind that Increase Contrast can also significantly alter the colors of many interface elements.
  • Reduce Motion: If Liquid Glass’s animations, blurring, and parallax effects make you a little queasy, turn on Settings > Accessibility > Motion > Reduce Motion. While you’re here, turn on Prefer Cross-Fade Transitions to minimize motion for interface controls that slide in and out. A warning—without transitions, some parts of the iPhone and iPad experience might seem abrupt.

The next three settings are exclusive to the iPhone and iPad, and you’ll find them in Settings > Accessibility > Display & Text Size:

  • Bold Text: Flipping this switch makes all interface text bold, which can make it much easier to read, particularly when transparency renders thin glyphs nearly invisible.
  • Button Shapes: Fewer controls benefit from Button Shapes, but if you prefer a true button to blue text or an underline indicating that a label is tappable, turn it on.
  • On/Off Labels: This setting displays small | and O labels on all switches, which helps if the change from gray (Off) to colored (On) isn’t obvious.

Again, Liquid Glass is the foreseeable future of Apple interface design, and while it’s far from perfect right now, we anticipate Apple improving and refining it over the next few releases. You can help nudge that process in the direction you want by submitting feedback to Apple.

(Featured image by Apple)