When Apple revealed the latest iOS update at WWDC 2024, it confirmed that Apple Intelligence would arrive in beta this fall.
And now, according to Bloomberg's Mark Gurman, it seems we're finally getting our first wave of Apple Intelligence features with the iOS 18.1 update on October 28, even though it won't include everything announced thus far.
If you've been eagerly awaiting Apple's AI and excited to see how it changes the iOS experience — and if you have an iPhone 15 Pro or any of the iPhone 16 models — your time to dive in and start playing around is coming.
So without further ado, here are the top 3 Apple Intelligence features coming to iOS 18.1 this month that should get you excited.
While there are many aspects of Apple Intelligence to look forward to, the most useful might be the new "Clean Up" feature coming to the Photos app. It lets your iPhone automatically detect potentially unwanted objects (or people) in the background of your photos, and with a simple tap on the highlighted area, it removes whatever is spoiling your photo to give you a cleaner shot.
You can also manually select what gets highlighted by drawing a circle around an object, or use a brush to remove finer details if you want more precision. Apple Intelligence is designed to identify the main subject of the shot, so even if you highlight a background item for removal, the primary focus of the photo won't be affected.
Obviously, the more complex the photo, the harder it will be for Apple Intelligence to remove objects cleanly. But with the launch of iOS 18.1, it'll be exciting to test its limits and see just how far it can go in perfecting our images.
Apple Intelligence is primed to make Siri more effective at assisting users in their day-to-day life. For one, it can carry context over from previous requests, meaning if you ask one question, get an answer, and then ask another, Siri can adapt that second answer based on what you asked initially.
Apple's example of this involves setting a location for a calendar event in the first request, then following up with "What will the weather be like there?" Siri is designed to understand which location you're referring to based on the previous query.
Siri is also now loaded with information about Apple product features, meaning you can ask it questions about anything on iPhone, iPad or Mac, and it will provide "step-by-step instructions in a flash," according to the Apple Intelligence page.
Siri is likewise designed to follow requests even when you change key details partway through. For example, if you ask for a five-minute timer and then correct yourself to 10 minutes, it will act on your final request. And even if you change your mind a third time, it should still land on the last version of what you asked.
Apple Intelligence is bringing summaries to many of Apple's apps. Simply highlight a block of text and you can choose to get a summary of everything it contains.
And when you hit record in the Notes or Phone apps, the audio will be transcribed, and Apple Intelligence will then generate a summary of that transcript, hopefully condensing it into an easy-to-digest block of text.
And if you're overwhelmed by an email, you can quickly get an Apple Intelligence-powered summary. You can even view these summaries straight from your inbox, without needing to open the full message.
Best of all, notifications can now appear on your lock screen in a quick, easy-to-digest, summarized format, with potentially important notifications prioritized.