
We Basically Already Know Everything About the Galaxy Z Flip 6 and Z Fold 6

5 July 2024 at 19:30

Samsung Unpacked 2024 is nearly here. On Wednesday, July 10, Samsung will unveil the next generation of some Galaxy devices, including a successor to the Galaxy Watch 6, a new smart ring, and the Galaxy Z Fold 6 and Z Flip 6. You don't need to wait until Unpacked to check out the new foldables, however: They've basically totally leaked.

As reported by The Verge, leaker Evan Blass shared Samsung's promotional materials and spec sheets for both the Z Flip 6 and the Z Fold 6. If these documents are legit (and they appear to be), we're getting our first look at Samsung's new foldables half a week early.

Spoiler alert: Samsung hasn't made any monumental changes to either the Z Fold or the Z Flip here. These are very much subtle updates. But there are some interesting features to note.

Galaxy Z Flip 6

Samsung's latest flip phone looks to be similar to the Z Flip 5, albeit slightly lighter and thinner. The leaked marketing materials show off some interesting new features, like an addition to Interpreter Mode that displays the translated text on the front display; a new 50MP camera, up from 12MP from the Z Flip 5; and a vapor chamber for cooling.

Here's what the phone looks like on paper, according to the alleged leaks:

  • Display: 6.7-inch 2640 x 1080 OLED with a refresh rate of 120Hz (main), 3.4-inch 720 x 748 IPS (front)

  • Cameras: 12MP ultra-wide, 50MP wide, 10MP front

  • Battery: 4,000 mAh, up to 23 hours of video playback

  • SoC: Snapdragon 8 Gen 3, 12GB RAM

  • Storage: 256GB

Galaxy Z Fold 6

At first glance, Samsung isn't adding as many new features to the Z Fold 6 as it is with the Z Flip 6, according to The Verge. The Z Fold 6 doesn't get the additional Interpreter Mode, or the vapor chamber for cooling.

However, there are some interesting changes over last year's Z Fold 5: The display will allegedly reach 2,600 nits of brightness, compared to the Z Fold 5's maximum of 1,750 nits. Like the Z Flip 6, the Z Fold 6 will also get the Snapdragon 8 Gen 3, which Samsung says has an NPU increase of 42% (the part of the chip that processes AI). Samsung is also touting new AI features, such as a new AI zoom, that should theoretically result in clearer closeups.

In addition, the Z Fold 6 is also thinner and lighter than past models, and has a new aluminum frame. All that said, there aren't any radical changes here: If you have a Z Fold 5 (or perhaps an older Z Fold device) you might not feel a huge push to upgrade.

  • Displays: 7.6-inch 2160 x 1856 OLED with a refresh rate of 120Hz (main), 6.3-inch OLED (front)

  • Cameras: 12MP ultra-wide (F2.2), 50MP (F1.8), 10MP (F2.4)

  • Battery: 4,400 mAh, up to 23 hours of video playback

  • SoC: Snapdragon 8 Gen 3, 12GB RAM

  • Storage: 512GB

Hackers Now Have Access to 10 Billion Stolen Passwords

5 July 2024 at 13:30

Data leaks are an inevitability of the digital age. It's all but impossible to have accounts online without losing some of your passwords to these attacks (which is why using 2FA is so important). But it's one thing to know some of your passwords are out there somewhere; it's another thing entirely to know there are billions of our passwords conveniently rounded up for the taking.

That's exactly what new research seems to suggest: As reported by TechRadar, researchers say they found a text file, called rockyou2024.txt, containing nearly 10 billion unique passwords, all stored in plain text. That means anyone with access could scrape the list as they would a PDF and discover each and every password for themselves.

This was not a project that happened overnight: These passwords were collected over time, from various attacks and leaks over the past 20 years. Attackers added 1.5 billion of these passwords to the file from 2021 to this year alone. And because every entry is unique, that count isn't padded by repeats. It's tough to wrap your head around that many passwords.

What's the danger with these password leaks?

While it's bad enough that anyone with the list can Command+F their way into searching for any password under the sun, that's not really where the danger lies. It would simply take too long to look for specific passwords to try.

Rather, bad actors can use lists like this one to engage in brute force and credential stuffing attacks. In a brute force attack, bad actors try a large number of passwords in quick succession to try to break into an account. Credential stuffing is similar, but involves using leaked credentials—like known username/password combinations—with other accounts, as people tend to use the same password for multiple accounts. (Please don't do this.)

Bad actors don't run these attacks by hand, of course: They use computers, which can try millions of these passwords in an attempt to break into these accounts. With a database of 10 billion unique passwords, hackers will certainly have a field day running brute force and credential stuffing attacks against both individuals and organizations alike.

How to protect yourself from this password database

Hopefully, organizations take the time to shore up their defenses against attacks like these, but even as individuals, there's quite a bit we can do to protect ourselves.

First, you can use a leaked password checker to see if your credentials are available for bad actors to use, whether that's in this database or elsewhere. If you see that any of your passwords have been compromised, change them immediately.
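Leaked password checkers like Have I Been Pwned's Pwned Passwords service use a k-anonymity scheme so you never send your actual password: you hash it with SHA-1, send only the first five hex characters of the digest, and get back every known breached hash suffix sharing that prefix. A minimal sketch of the client-side logic (the mock response stands in for the real API reply, so this runs offline):

```python
import hashlib

def sha1_prefix_suffix(password: str) -> tuple[str, str]:
    """Hash the password with SHA-1 and split the hex digest for a
    k-anonymity range query: only the 5-char prefix leaves your machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def found_in_range_response(suffix: str, response_text: str) -> int:
    """Given the API's 'SUFFIX:COUNT' lines for a prefix, return how many
    times the full hash appears in known breaches (0 if absent)."""
    for line in response_text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

# Check a deliberately weak password against a mock API response.
# (The real service answers GET requests at api.pwnedpasswords.com/range/<prefix>.)
prefix, suffix = sha1_prefix_suffix("password123")
mock_response = f"{suffix}:123456\n0123456789ABCDEF0123456789ABCDEF012:2"
print(found_in_range_response(suffix, mock_response))  # → 123456
```

Because only five characters of the hash are ever transmitted, the service learns nothing useful about which password you checked.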

On that note, make sure you're using a strong and unique password for every single one of your accounts. In the event an account's credentials are leaked, bad actors won't be successful in credential stuffing, as your other accounts won't use that compromised password.

If an account supports passkeys, use that instead, as passkeys have no credentials to leak. If not, use two-factor authentication whenever possible. In the event that bad actors know your credentials, they won't be able to break into your account without access to your trusted device, whether that's a smartphone or an authenticator app.

To manage all these credentials, use a password manager. Not only will a good password manager help you, um, manage your passwords, it should come with convenient security features, like password generators, 2FA codes, and alerts when your passwords are leaked.
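Password managers generate strong passwords by drawing uniformly from a large character set with a cryptographically secure random source. In Python, a minimal generator along those lines uses the standard library's `secrets` module (this is an illustrative sketch, not any particular manager's implementation):

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation using
    the cryptographically secure `secrets` module (never `random`)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # e.g. 'k#8Qz...' -- different every run
```

A 20-character password from this 94-symbol alphabet has far more entropy than anything in a leaked wordlist, which is exactly what defeats brute force and credential stuffing.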

Apple Might Make It Easier to Replace the Battery in Your Next iPhone

3 July 2024 at 16:30

All batteries age, and the lithium-ion battery in your iPhone is no exception. Over time, the battery degrades and can no longer hold as much of a charge as it could when new. After a few years of using the phone, you may notice you only get 70% or 80% of the time you used to after taking your iPhone off the charger.

When the inevitable happens to you, you don't need to buy a new iPhone. Instead, you can simply replace the battery at a relatively inexpensive cost. Depending on your iPhone, you may not only notice an increase in time away from the charger, but also a boost in performance, since iOS slows down the processing power of your iPhone when its battery is too degraded.

But though replacing your battery is possible, it could certainly be easier. Apple currently secures the battery to the inside of your iPhone with strong adhesive. To remove it, you need to pull on a few tabs that are easy to break, making the removal process more precarious than it should be. The battery itself is also fragile, and you'll need to remove and reattach some very delicate cables. Despite all of this, it's possible to replace your battery yourself, but it's simpler to opt to take it to a repair shop. (Apple would prefer you use one of its own.)

But if reports are correct, the process could be notably easier with the iPhone 16.

The iPhone 16 may have an easy-to-remove battery

According to a report from The Information, Apple is planning on a new battery strategy for the iPhone 16. With this new line of smartphones, Apple may wrap the battery in a metal casing, rather than a foil one, allowing for a new removal process: Rather than having to pull on tabs to release the adhesive from the battery, you would send a low voltage burst of electricity through the battery casing to release it from the iPhone. If it pans out, the process sounds much safer and easier than the current system.

Apple wouldn't be doing this out of concern for its customers. Instead, it's likely in response to a new E.U. law that requires smartphones to have "replaceable batteries" by 2025. Europe has had quite an influence over Apple's decisions over the past year, requiring the company to open up many of its closed platforms, including allowing independent app stores and browsers on iOS.

Despite that pressure, only the battery will be easier to replace. There are no rumors suggesting Apple is making the rest of the iPhone repair process any simpler, so the iPhone 16 will likely still come with the usual strong adhesives on its casing that will need to be heated and broken in order to open the device.

Apple's battery changes may also improve battery capacity

These changes may mean more than just easier battery replacements. According to noted Apple leaker Ming-Chi Kuo, Apple will also increase the battery density of the iPhone 16 line by 5 to 10%. That extra boost could result in longer battery life, but seeing as Apple is rolling out presumably power-hungry Apple Intelligence features to these new iPhones, those battery gains may quickly disappear.

All Four iPhone 16s Will Probably Get Apple Intelligence

3 July 2024 at 12:30

Fifteen years in, Apple's annual iPhone releases have gotten pretty predictable—we've come to expect four new iPhones (including two larger size options), two Pro models and two standard devices. In recent years, you could also expect to need to go Pro in order to get the best chips Apple had on offer. But if reports are correct, that might be changing this year.

Every iPhone 16 model will likely get the A18 chip

As reported by MacRumors, Apple's backend code reveals that the company is working on four iPhone 16 models to release this year. That's not surprising at all, but what is surprising is all four models appear to reference the same A-series chip. In fact, Nicolás Alvarez discovered five A-series entries in this code, each with a unique identifier not tied to any existing iPhone:

  • iPhone17,1

  • iPhone17,2

  • iPhone17,3

  • iPhone17,4

  • iPhone17,5

Don't be confused by the "17"—these names refer not to the iPhone 17, but to the A18 chip Apple is developing for the iPhone 16. (Okay, maybe it's a little confusing.) The interesting bit is that all of these entries share the "iPhone17" moniker, which shows that Apple is working on five devices with the A18 chip. (Perhaps the fifth model is a new iPhone SE?)

If any of these iPhones were using a different chip, you would see that reflected in the name here. As MacRumors highlights, in Apple's backend code, the iPhone 15 is listed as "iPhone15,4," since it uses the A16 chip, while the iPhone 15 Pro is listed as "iPhone16,1," since it uses the A17 chip.
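The naming scheme is easy to mirror in code: the number before the comma tracks the chip generation, and the number after it distinguishes models. A toy lookup using the mappings cited above (note the iPhone17-to-A18 link is the leak's claim, not an Apple announcement):

```python
# Hardware identifiers group iPhones by chip generation; the family name
# (before the comma) maps to one A-series chip across all its models.
CHIP_BY_FAMILY = {
    "iPhone15": "A16",  # e.g. iPhone15,4 = iPhone 15
    "iPhone16": "A17",  # e.g. iPhone16,1 = iPhone 15 Pro
    "iPhone17": "A18",  # rumored iPhone 16 lineup (leak, not confirmed)
}

def chip_for(identifier: str) -> str:
    """Return the A-series chip for a hardware identifier, per the table."""
    family = identifier.split(",")[0]
    return CHIP_BY_FAMILY.get(family, "unknown")

print(chip_for("iPhone17,3"))  # → A18
```

Since all five leaked identifiers share the "iPhone17" family, they would all resolve to the same A18 chip under this scheme.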

To that point, it's great news that all of Apple's upcoming iPhones appear to use the same chip. For the past two product cycles, Apple has recycled the previous year's Pro chip for the standard iPhones, reserving the best A-series chip for only the latest Pro devices. The iPhone 14 and iPhone 14 Plus use the A15 chip from the iPhone 13, while the iPhone 15 and iPhone 15 Plus use the A16 chip from the iPhone 14 Pro and iPhone 14 Pro Max.

All four iPhone 16s may be Apple Intelligence-ready

While it's possible Apple will beef up the A18 chip in the iPhone 16 Pros to give consumers something to stretch their dollar for, the likely consequence of the above news is this: All four new iPhone 16s will probably be compatible with Apple Intelligence, Apple's upcoming suite of generative AI features.

Right now, the only iPhones officially compatible with Apple Intelligence are the iPhone 15 Pro and iPhone 15 Pro Max. Not even the iPhone 15 or iPhone 15 Plus are compatible, even though they're as new as the 15 Pros, because their A-series chips are a generation older. Given the recent buzz around AI, it makes sense that the company would want all new iPhones to work with Apple Intelligence going forward. (I would have assumed that to be true of the iPhone 16 and iPhone 16 Plus regardless: even if Apple stuck to its recent tradition of using last year's Pro chips in the base models, these iPhones would have shipped with the iPhone 15 Pros' A17 chip, which is Apple Intelligence-ready.)

The standard iPhone 16 might be the best value choice

Either way, it seems the standard iPhone 16 will be a good value this year for anyone wanting to balance their device's power with price. I'm sure Apple will soup up the 16 Pros in other ways, perhaps with higher-end A18 chips, superior cameras, and other exclusive features. But if you're simply looking for an iPhone that can run Apple's latest AI features without spending $1,000 for the privilege, it's looking like the iPhone 16 will be that phone.

Three New AI Features Rumored to Be Coming to Pixel 9

2 July 2024 at 16:30

Google is set to unveil its Pixel 9 lineup of smartphones in August, a full month earlier than usual. Along with new hardware, which, of course, has already leaked, the company is likely to reveal new software features for the new Pixel devices, and unsurprisingly, much of that will be AI. Lucky for us, we don't have to wait to learn about some of these features.

In an exclusive report, Android Authority details new AI features allegedly coming to Pixel 9. Android Authority says a source inside Google provided the outlet with a list of five features the company plans to highlight with the release of its new smartphone this year, via a screenshot titled "Discover Google AI at its best."

We know about two of these features already. One is Circle to Search, which, as the name implies, lets you draw a circle around an element on your smartphone's display to start a search on it. The other is Gemini, Google's AI assistant, which you can already use to replace the traditional Google Assistant on your Pixel if you want to.

While Google may push these two existing AI features as part of its Pixel 9 launch, there are three new AI features the company may also roll out in tandem with the new smartphone:

Add Me

According to the alleged Google screenshot Android Authority shared, "Add Me" is a feature that lets you "make sure everyone's included in a group photo." While that isn't particularly descriptive, one can infer that Android will use AI to edit you into a picture if you didn't make it into the frame in time.

That would be a complement to Google's existing Best Take feature, which, after taking multiple photos of a group of people, lets you choose the "best" face of each subject from each photo.

These are useful features in theory, but they do challenge the idea of the photograph itself: What's really the point of a photo if it doesn't actually represent reality? "Let's all look at that group photo we took on our trip last year. Although Greg wasn't actually there...and Melissa definitely wasn't looking at the camera...At least we're all smiling!"

Studio

"You imagine it. Pixel creates it." That's how Google describes "Studio" in Android Authority's screenshot. By the description, it sounds like a built-in AI image generator on your Pixel. That tracks, since Google is reportedly building an app called Creative Assistant, primarily for making stickers.

It's par for the course these days for tech companies to offer AI art generators, so this isn't a surprising development. I'm sure Google will inject some Pixel or Android-specific features with Studio, but for now, this is all we know.

Pixel Screenshots

It's "Pixel Screenshots" that's by far the most interesting new feature outlined here. Based on the screenshots of the feature shared by Android Authority, Pixel Screenshots is essentially Microsoft's Recall feature, but for the screenshots saved on your Pixel.

Recall, as you may recall, was designed to save a snapshot of your entire screen every few moments, so you could search for quite literally anything you ever did on your PC. It was met with widespread concern and criticism, especially once it appeared to be quite vulnerable to hacking. Microsoft has since delayed the feature's rollout.

Pixel Screenshots, on the other hand, doesn't seem to take a screenshot of your display every few seconds; rather, it scans your existing screenshots to turn them into a "searchable library." If you know you screenshotted a pair of shoes you wanted to buy, or a receipt you need to reference, you can use the feature to search for it.

When you turn the feature on, Android will save extra data for screenshots you take going forward, including web links, the names of apps, and the date and time the screenshot was taken, all to make it easier to search for those data points in the future. Interestingly, the feature says all data access and processing happens on-device, so none of your data should make its way to Google's servers.
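As a rough illustration of what such an on-device index might look like, here's a sketch in Python. The class and field names are hypothetical, not Google's API; the point is that each screenshot carries searchable metadata (app, link, timestamp, recognized text) and a query simply filters over it locally:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Screenshot:
    """Hypothetical per-screenshot record: the kinds of metadata the
    feature reportedly saves to make screenshots searchable."""
    path: str
    app: str
    link: Optional[str]
    taken_at: datetime
    text: str = ""  # recognized on-screen text, searchable like the rest

def search(index: list, query: str) -> list:
    """Return screenshots whose app, link, or text matches the query."""
    q = query.lower()
    return [s for s in index
            if q in s.app.lower()
            or q in (s.link or "").lower()
            or q in s.text.lower()]

shots = [
    Screenshot("img1.png", "Chrome", "https://shop.example/shoes",
               datetime(2024, 7, 1), "running shoes $89"),
    Screenshot("img2.png", "Gmail", None,
               datetime(2024, 7, 2), "receipt for order #1234"),
]
print([s.path for s in search(shots, "receipt")])  # → ['img2.png']
```

Keeping the index and the matching entirely on-device, as the feature claims to, is what distinguishes it from sending screenshots to a server for processing.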

On the surface, it sounds much more secure than Microsoft's solution, although it also serves a much different purpose. That said, there's a slight risk to the feature: Making screenshots easy to search opens the door for anyone with access to your phone to do the same. If you save screenshots of sensitive information, like credit card numbers, banking info, or passwords, anyone who gets into your phone could search for that data, too.

That said, Google hasn't actually announced any of these features yet. We'll just have to see which features, if any, the company does decide to bundle with the Pixel 9 when it launches later this year.

Payouts for Apple's Butterfly Keyboard Lawsuits Are Finally Coming

2 July 2024 at 10:30

Were you involved in a typing accident that left you with double keypresses or no keypress at all? Were your laptop keys sticky and/or unresponsive, in a way that left you with lasting trauma? If you were the victim of Apple's "butterfly" keyboard design on your MacBook, MacBook Air, or MacBook Pro, you may be entitled to compensation. In fact, that compensation is rolling out to victims soon.

What's up with my MacBook's keyboard?

Let's clear something up quick: This situation has nothing to do with the MacBooks Apple has released in recent years. If you have a relatively new laptop from Apple, your keyboard experience is likely just fine, if not above average. However, for a period of time (roughly 2015–2019) Apple made some questionable design choices for their lineup of notebooks—namely, the butterfly keyboard.

These keyboards ditched the traditional scissor-switch mechanism Apple used for its previously beloved MacBooks for a "butterfly switch," which reduced the travel distance necessary for each key press. This design allowed Apple to make all of its MacBooks ultra-thin, which looked sleek, but wasn't the best choice for reliable typing.

But the issue with the keyboards wasn't necessarily their thinness; it was the flawed design in general. Butterfly keyboards were prone to failures that could result in nothing happening when you pressed a key, repeated entries from a single press ("AA" appearing when you only typed "A" once), or "sticky" keys, as the keys would grind against the keyboard and get stuck in place.

Apple tried fixing the keyboards with small changes, like a thin membrane underneath the keys to prevent dust and debris from getting stuck, but none of these changes worked: The design was simply too flawed. Apple even started a repair program, so affected customers could fix their keyboards for free. The problem was that the program only lasted four years from when Apple first sold your computer at retail, not from when you bought it. If Apple sold your model in the fall of 2016, but you bought it in 2018, the program still ended in 2020.

How much can I get paid?

Enter the lawsuits: It wasn't just that Apple had sold customers a faulty keyboard; it was that the company knew the keyboards were doomed to fail, and continued to sell them anyway. Apple denies all wrongdoing, but did in the end agree to a settlement of $50 million.

If you qualify, your payment will depend on your exact situation: If you had to get your MacBook's topcase replaced at least twice within that four-year window, you could get up to $395 in the settlement. If you replaced it once, you get $125. If you just had the keycaps themselves replaced, that's $50.

That said, if you didn't make a claim, you don't get anything. The deadline to make one was March 6, 2023, so unless you stated your case back then, you're out of luck now.

Payments are coming

As spotted by MacRumors, payments for qualified victims are coming soon. The court issued a payment order on June 27, and the payments themselves will be issued by August. If you made your claim last year, be on the lookout for your payment by the end of summer.

Seven New Copilot Features in Microsoft 365

1 July 2024 at 16:00

Copilot has changed quite a bit since the start of this year. Not only does the bot have an official app, but now anyone with a Microsoft 365 account can access Copilot in apps like Word, PowerPoint, and Excel. Before, it was exclusive to business users.

If you have Microsoft 365 and Copilot Pro, or your company uses this suite of tools, there are some new Copilot features you can access right now. Microsoft announced a list of 14 new Copilot features rolling out this month, but seven of those mainly apply to admins and management. For the rest of us, the other seven are key new features to look out for:

You can generate images directly in Word and PowerPoint documents

This month, both Word and PowerPoint will let you generate images and search for stock images using Copilot via Microsoft Designer. If you prompt Copilot to generate an image, it will create and present the usual series of options, which you can choose from to insert into your doc. When using the feature in PowerPoint, Designer will incorporate the image into a "compelling slide design."

Reference PDFs, emails, and meetings in Word

Starting last month, Microsoft rolled out the ability to reference PDFs and encrypted Word docs using Copilot in Word. This month, the company will also add the ability to reference Microsoft Cloud info, which includes your emails and meetings. Those updates add on to the data types you could reference previously, including Word and PowerPoint files.

Going forward, you will be able to pull these file types into your prompts with Copilot in Word. If you call on Copilot and ask it to write you a project proposal, you can have Copilot base it on one of these files from an expanded menu. If you discussed the project in depth in a meeting, or if the notes are laid out in a PDF, Copilot will be able to analyze that doc or file and generate a report from it.

Expanded support for creating PowerPoint presentations from PDFs and encrypted Word files

Similarly, Microsoft rolled out the ability to create PowerPoint presentations from new PDF and Word file types in June. You can also reference Word docs and PowerPoint presentations in PowerPoint itself.

However you create a PowerPoint presentation with Copilot, you should notice three key improvements to how the AI handles the PowerPoint doc, including improved titles, sections, and slides; presentation structures with slides for your agenda, sections, and conclusion; and new transitions and animations.

In addition, PowerPoint chat can answer questions using Microsoft cloud, Microsoft Graph, and Bing.

New features in Excel

Microsoft announced three new Copilot features for Excel, all of which are rolling out this month:

  1. Copilot now works with data ranges "resembling tables" with headers along a single row. Microsoft says this is more efficient than before, as you won't need to format data before calling in Copilot.

  2. The edit box is available no matter which cell you have selected, so you can use Copilot without needing to worry about where you are in your spreadsheet.

  3. Copilot will be more conversational with its responses to questions about Excel data, including offering step-by-step instructions for certain tasks.

Copilot in Teams

Do you know who you're chatting with at work? You may think you do, but the more AI features take over our work programs, the less you may be interacting with a real human being.

A new Teams feature rolling out right now is Copilot integration in Teams chats and channels. After typing out your message, you can prompt Copilot to adjust your words in any way you want. Microsoft suggests prompts like "add a call to action," "make it persuasive," or "convert my message into a list and add inclusive language."

You might think your boss sent the team a hand-written message, or that you're having a fun chat with a coworker. But, for all you know, they used Copilot to change their words. You may, in fact, be chatting with an LLM.

Copilot lets you rewrite content in SharePoint

Copilot rewriting has come to SharePoint: If you use the app to create websites for your organization, you'll be able to use familiar Copilot tools to rewrite text. It doesn't seem like there's anything particularly innovative here, mind you. It seems like these are the usual rewriting tools you'd expect from generative AI. However, now you can use them directly in SharePoint. This is rolling out this month.

Copilot in Loop

Loop is Microsoft's collaborative workspace app, allowing team members to work together on a project in real time. While you can start a project from scratch with a blank canvas or from a template, Microsoft wants you to use Copilot to generate a "structured document ready for team collaboration." If you use Loop, you can try asking Copilot to set up your workspace based on whatever parameters you need. Microsoft introduced this feature in May.

Here’s Why Apple Might Be Putting Cameras in Your Next AirPods

1 July 2024 at 12:30

What do you look for in a new pair of earbuds? High-quality audio? Easy pairing to your devices? Smart features to improve daily use? Sure, all of that. How about cameras? That might sound ridiculous, but it's something Apple is potentially seriously considering—and if reports are accurate, your next pair of AirPods might come with cameras attached.

According to Ming-Chi Kuo, an analyst with a solid track record for Apple rumors, Apple is indeed planning on shipping AirPods with embedded cameras by 2026. However, the cameras wouldn't be like the ones you use to take photos and videos on your iPhone; rather, they'd resemble the infrared camera responsible for Face ID on your iPhone (not the selfie cam, but the one that scans your face and unlocks your phone for you). Kuo says Foxconn will supply the IR cameras for the AirPods, and is currently planning to produce enough modules for roughly 10 million pairs.

Why AirPods might be getting cameras

So, if the cameras attached to these AirPods aren't for snapping pictures of the sides of your head, what are they good for?

  • Spatial computing. Kuo says the idea is that these AirPods will pair with Vision Pro and subsequent headsets from Apple to both improve the devices' spatial computing and boost the performance of spatial audio. If your AirPods can map your environment, they can deliver a more realistic 3D experience when you look around the room, as sounds can be artificially boosted to sound like they're coming from a particular area.

  • In-air gestures. Kuo also says Apple is exploring "in-air gesture control" for AirPods, which isn't super specific, but I imagine the concept would allow you to control your AirPods by moving your hands near the earbuds, rather than physically touching them. If Apple kept the physical controls AirPods already have, like squeezing and swiping on the stem, incorporating in-air gestures could add extra controls to the mix, like toggling Spatial Audio, playing a specific playlist, or starting a phone call.

It's not just Kuo who thinks Apple is working on AirPods with cameras: Bloomberg's Mark Gurman also reports that Apple is interested in the idea, in tandem with other upgrades to its wearable products. I'm all for hardware that improves the overall user experience. If adding cameras to AirPods manages to do that, go for it, Apple—as long as the company doesn't try selling me on "Ear Selfies."

YouTube Is Rolling Out Five New Features for Premium Subscribers

27 June 2024 at 17:30

The free version of YouTube works perfectly well (current ad blocker debacle notwithstanding), but if you use it a lot, a Premium subscription is actually a pretty good deal. Not only do you get to skip most ads, download videos for offline playback, and gain access to YouTube Music Premium, you also have the opportunity to try new features before other YouTube users.

Today, YouTube added five new features for YouTube Premium subscribers—two are rolling out widely now, and three more are experimental features you can enable if you wish. Here's what's new.

Jump Ahead

Following a round of testing, YouTube is going ahead with Jump Ahead. The new feature uses AI to analyze the "best" parts of a video, based on where most users scrub to while watching. When you double-tap the player window to skip ahead 10 seconds, you'll now have the option to "Jump Ahead" to this "best" spot.

YouTube says this feature is rolling out first to YouTube for Android, but it will be available on the iOS app for Premium subscribers soon.
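YouTube hasn't published how Jump Ahead works under the hood, but the aggregate-scrubbing logic described might look something like this sketch: bucket the timestamps viewers seek to, then jump to the most popular spot ahead of your current position. This is a guess at the general technique, not YouTube's actual algorithm:

```python
from collections import Counter
from typing import Optional

def jump_ahead_target(seek_destinations: list,
                      current: float,
                      bucket: float = 5.0) -> Optional[float]:
    """Pick the most popular seek destination (in seconds) ahead of the
    current playback position, bucketing timestamps to smooth out noise.
    Returns None if nobody seeks past the current position."""
    ahead = [d for d in seek_destinations if d > current]
    if not ahead:
        return None
    counts = Counter(int(d // bucket) for d in ahead)
    best_bucket, _ = counts.most_common(1)[0]
    return best_bucket * bucket

# Viewers mostly skip to around the 60-second mark; from 10s in,
# Jump Ahead would land there.
print(jump_ahead_target([58, 61, 62, 63, 120, 33], current=10))  # → 60.0
```

The real feature presumably layers AI analysis of the video's content on top of raw seek statistics, but the "where most users scrub to" signal reduces to something like this aggregation.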

Picture-in-picture for YouTube Shorts

Picture-in-picture is a convenient way to watch videos on platforms like YouTube while switching to other apps. Now, Premium subscribers on Android will be able to use PIP for the TikTok-esque YouTube Shorts as well. I suppose this feature will be helpful for Shorts that aren't too short, but I can't really see a strong use case for keeping 15 to 30 second videos in PIP while doing things in other apps.

Smart downloads for Shorts (experimental)

If you opt-in to this experimental feature, YouTube will automatically download new Shorts to your smartphone so you can watch them with or without a connection. The company didn't specify, but I'd guess these Shorts would also work in PIP mode, as well.

Conversational AI (experimental)

YouTube also announced it's bringing back its conversational AI to Android devices. When available, YouTube's AI will appear underneath videos via an "Ask" button. You can ask the bot questions and request similar content to videos you're watching, even while a video is playing. I doubt this is going to be a game-changing AI by any means, but it's an interesting experiment nonetheless.

New watch page UI (experimental)

The one change to the YouTube web app for Premium subscribers is an experimental new watch page. YouTube didn't share much about it, but said the new watch page will make it easier to find new videos and engage with comments.

How to opt-in to YouTube experimental features

If you want to try out these three new experimental features, as well as any other experimental features YouTube is currently testing, head to YouTube's "New" webpage and opt-in.

ChatGPT's Free Mac App Is Actually Pretty Cool

26 June 2024 at 14:30

When OpenAI first rolled out the ChatGPT app for Mac, it was exclusive to ChatGPT Plus subscribers. Unless you paid $20 per month, you needed to stick to the web app or the one on your smartphone. As of Tuesday, however, the Mac app is now free for everyone. And, honestly, you should probably give it a go.

At first glance, OpenAI's Mac app offers the usual ChatGPT experience you're used to. When you log in, you'll find all your previous conversations saved to the sidebar, just as they are in the web and mobile apps. You can type your prompts in the text field, use the mic button to ask questions with your voice, and click the headphones icon to enter Voice mode. (Not the "Her" Voice mode, mind you: That feature has been delayed.) You can also use features like Temporary Chats (conversations that don't pull from your chat history), change your GPT model, generate images with DALL-E, and access GPTs.

A better experience than the web app

But there are some Mac-specific features that make this particular app worth using over the web option. First, in addition to uploading files and photos to ChatGPT, you can take a screenshot of any open window on your Mac directly from the app. If you click the paperclip icon and select Take Screenshot, you can choose an active window from the pop-up list to share with ChatGPT. (The first time you do this, you'll need to grant the ChatGPT app access to screen recording.)

Alternatively, you can take a screenshot of the window manually, then share it to ChatGPT as an image, but this skips a step and makes the bot feel a bit more integrated with macOS.

using screenshot tool chatgpt for mac
Credit: Jake Peterson

But what's even more convenient, in my opinion, is the ChatGPT "launcher." This launcher is essentially Spotlight search, but for ChatGPT. Using a keyboard shortcut, you can bring up a ChatGPT text field directly over any window you're currently using on macOS to start a conversation with the bot. You'll then be taken to the app to continue chatting. This basically saves you the step of switching out of the current app you're in and starting a new thread in ChatGPT; if you see something on your Mac you want to know more about, you can hit Option + Spacebar, type your query, and get started.

using the shortcut
Credit: Jake Peterson

This launcher also has the same paperclip icon as the app itself, which means you can upload files and take screenshots directly from the shortcut. If you're a ChatGPT power user, this launcher should be a welcome feature. (I don't even use ChatGPT that much, and I really like it.)

Unfortunately, OpenAI is only making the ChatGPT app available on M-series Macs—the machines running Apple silicon. If you have an older Intel-based Mac, you'll still have to head to the web app in order to use ChatGPT on your computer.

If you have a Mac with an M1 chip or newer, you can download the app from OpenAI's download site.

Here's When Google Is Unveiling the Next Pixel

25 June 2024 at 15:30

Another year, another Pixel. It’s no surprise that Google is planning to release the Pixel 9, 9 Pro, and Watch 3 at some point this fall. Every tech company refreshes its smartphones at least once a year. What’s surprising is that this year's event is happening earlier than ever.

As reported by The Verge, Google just sent out invites for its Made by Google hardware event. Google says the event will focus on Google AI, Android, and, of course, the “Pixel portfolio of devices.” While this event is usually held in September, Google is inviting people to an August announcement—Aug. 13, to be specific.

The event kicks off at 10 a.m. PT (1 p.m. ET), which is pretty standard for these tech events. But the earlier date is curious: Why is Google announcing these things a whole month ahead of schedule? It’s possible this is Google’s way of getting around rumors and leaks: Pixels tend to be leaked in their entirety by the time Made by Google rolls around, to the point where anyone keeping up with the rumors knows just about everything Google is announcing.

That said, we do have rumors about the Pixel 9, so that strategy might not be working: According to the leaks, Google is planning to pull an Apple and release four different Pixel models: a 9, a 9 Pro, a 9 Pro XL, and a 9 Pro Fold. It's also expected that the Pixels will come with the Tensor G4 chip, Google's latest-generation SoC. These devices will replace the current Pixel 8 and Pixel 8 Pro, just as the Pixel Watch 3 will replace the Watch 2.

In addition to hardware, Google will share announcements about its latest AI features and developments, as well as Android 15, which is currently in beta testing. It will be interesting to see what the company has planned for these announcements, as its latest AI endeavor, AI Overviews, didn't have the best of rollouts.

Because Google has only sent out invites to the event thus far, we don't know for certain how the company plans to stream the event for the rest of us. However, more than likely, Google will host a live stream of Made by Google on the company's YouTube page. If you want to see these announcements live, tune into YouTube.

Gemini Is Coming to the Side Panel of Your Google Apps (If You Pay)

25 June 2024 at 15:00

If you or your company pay for Workspace, you may have noticed Google's AI integration with apps like Docs, Sheets, and Drive. The company has been pushing Gemini in its products since their big rebrand from "Bard" back in February, and it appears that train isn't stopping anytime soon: Starting this week, you'll now have access to Gemini via a sidebar panel in some of Google's most-used Workspace apps.

Google announced the change in a blog post on Monday, stating that Gemini's new side panel would be available in Docs, Sheets, Slides, Drive, and Gmail—the latter of which the company announced in a separate post. The side panel sits to the right of the window, and can be called up at any time from the blue Gemini button when working in these apps.

Google says the side panel uses Gemini 1.5 Pro, the LLM the company rolled out back in February, equipped with a "longer context window and more advanced reasoning." That longer context window should be helpful when asking Gemini to analyze long documents or run through large sets of data in Drive, as it allows an LLM to handle more information at once in any given request.

Now, if you've ever used a generative AI experience—especially one from Google—this experience probably won't shock you: You'll see a pretty typical welcome screen when Gemini comes up, in addition to a series of prompt suggestions for you to ask the bot. When you pull up the side panel in a Google Doc, for example, Gemini may immediately offer you a summary of the doc, then present potential prompts, such as "Refine," "Suggest improvements," or "Rephrase." However, the prompt field at the bottom of the panel is always available for you to ask Gemini whatever you want.

Here are some of the uses Google envisions for Gemini in the side panel:

  • Docs: Help you write, summarize text, generate writing ideas, come up with content from other Google files

  • Slides: Create new slides, create images for slides, summarize existing presentations

  • Sheets: Follow and organize your data, create tables, run formulas, ask for help with tasks in the app

  • Drive: Summarize "one or two documents," ask for the highlights about a project, request a detailed report based on multiple files

  • Gmail: Summarize a thread, suggest replies to an email, advice on writing an email, ask about emails in your inbox or Drive

gemini in sheets
Credit: Google

None of these features are necessarily groundbreaking (Gemini has been generally available in Workspace since February) but Google's view is they're now available in a convenient location as you use these apps. In fact, Google announced that Gmail for Android and iOS are also getting Gemini—just not as a side panel. But while the company is convinced that adding its generative AI to its apps will have a positive impact on the end user, I'm not quite sold. After all, this is the first big AI development from Google since the company's catastrophic "AI Overviews" rollout. I, for one, am curious if Gemini will suggest that I respond to an email by sharing instructions on adding glue to pizza.

As companies like Google continue to add new AI features to their products, we're seeing the weak points in real time: Do you want to trust Gemini's summary of a presentation in Slides, or an important conversation in Gmail, when AI still makes things up and treats them like fact?

Who can try Gemini side panel in Google apps

That said, not everyone will actually see Gemini in their Workspace apps, even as Google rolls it out. As of now, Gemini's new side panel feature is only available to companies that purchase the Business and Enterprise Gemini add-on, schools that purchase the Education and Education Premium Gemini add-on, and Google One AI Premium subscribers. If you don't pay for Google's top-tier subscription, and your business or school doesn't pay for Gemini, you won't see Google's AI in Gmail. Depending on who you are, that may be a good or bad thing.

Update Your Pixel Now to Patch This Security Flaw

24 June 2024 at 13:30

Earlier this month, Google issued a security update for its line of Pixel smartphones, patching 45 vulnerabilities in Android. Security updates aren't as flashy as Feature Drops, so users might not feel as inspired to update their Pixels right away. This update, however, is one you should install ASAP.

As it turns out, among those 45 patched vulnerabilities is one that's particularly dangerous. The flaw, tracked as CVE-2024-32896, is an escalation-of-privilege vulnerability. These flaws can allow bad actors to gain access to system functions they normally wouldn't have permission for, which opens the door to dangerous attacks. While most of these flaws are caught before bad actors learn how to exploit them, the situation with CVE-2024-32896 isn't so fortunate: In the notes for this security update, Google says, "There are indications that CVE-2024-32896 may be under limited, targeted exploitation."

That makes this vulnerability an example of a "zero-day" issue—a flaw that bad actors know how to take advantage of before a patch is made available to the general public. Every Pixel that doesn't install this patch is left vulnerable to malicious users who know about this issue and want to exploit it.

Google hasn't disclosed any additional information about CVE-2024-32896, so we don't know much about how it works—that said, it sounds like a particularly nasty vulnerability. In fact, Forbes reports that the United States government has taken note of the issue, and has issued a July 4 deadline for any federal employees using a Pixel: Update your phone, or "discontinue use of the product."

GrapheneOS, which develops an open source, privacy-centric OS for smartphones, says that the patch for CVE-2024-32896 is actually the second half of a larger fix: In April, Google patched CVE-2024-29748, and according to GrapheneOS, both patches target vulnerabilities that forensic companies were exploiting.


How to patch your Pixel

To install this security patch on your Pixel, head to Settings > System > Software update. When the update is available, you can follow the on-screen instructions to install it. Alternatively, you can ask Google Assistant to "Update my phone now."

Eight Apps Apple Could Make Obsolete This Year

21 June 2024 at 15:30

Giant tech companies like Apple are constantly adding new features to their platforms, but they can't do everything. To fill the gaps, we have third-party apps: These developers can home in on features Apple products either don't have, or don't implement well, and can focus all their efforts on making those features great. It's really a win-win—that is, until Apple decides to take those great ideas and implement them into its platforms for free.

This practice happens so much, there's a name for it: sherlocking. It refers to Apple's search app, Sherlock, which took features from the third-party search app Watson. With every major iOS and macOS update, Apple introduces features that threaten or effectively replace independent programs. This year, there are eight such apps and categories clearly in the crosshairs. In fact, analysts estimate Apple's changes to iOS 18 alone could impact apps that made nearly $400 million last year. But as we'll discuss, just because Apple is introducing these features, that doesn't automatically make these apps obsolete.

Magnet

Wouldn't you know it, but the OS known as "Windows" has traditionally had better window management than macOS. For years now, it's been easy to snap windows into whatever place you want: If you want a window on the left half of the screen, and another on the right, it's easy with either a mouse drag or a keyboard shortcut. Apple has added some window management options to macOS, both in and out of full-screen mode, but it's still far behind the keyboard-shortcut simplicity Windows offers.

That's where third-party apps like Magnet come into play. These utilities basically bring Microsoft-style window management to macOS, letting you snap windows into place with keyboard shortcuts or by dragging them to specific corners of the display. For any PC users moving to Mac for the first time, apps like Magnet were a must.

That is, until WWDC, when Apple casually revealed its new window management system for the Mac. It's a simple system: Drag windows to the sides and corners of your display to snap them into place, or use keyboard shortcuts to do the same. But that simple system takes care of the majority of functions people turn to macOS window management utilities for. It's bad enough for the free programs, but considering apps like Magnet cost $4.99, this could definitely hurt the developer.

1Password

Apple has actually had a decent password management system for a while now: In recent years, iCloud Keychain has done enough for me not to consider third-party alternatives, like 1Password or Dashlane. That said, iCloud Keychain's biggest weakness was its lack of a central home: It works great in the background, automatically creating and saving new passwords, and autofilling those passwords when you need them. But when it comes to manually pulling up your credentials, having a full-fledged app definitely improves the experience.

Of course, that's what Apple is doing this year: iCloud Keychain is now an app, called Passwords, that syncs across your Apple devices. Now, you have clear separation for things like passwords, 2FA codes, passkeys, and wifi passwords, and you can access shared password collections as well. However, beyond these much-needed changes, it's still a pretty simple experience. I don't think dedicated password managers are in danger because of this new app, and existing users will likely stick with their platform of choice for the additional features they offer. But third-party apps will need to convince new users why their iPhone and Mac's Passwords app isn't good enough for them (especially since it likely is).

TapeACall

Recording phone calls has always sucked on iOS. There was never a built-in way to do it, so you needed to rely on a half-baked workaround in the free Google Voice app (which only worked for incoming calls) or pay a pricey subscription for an app like TapeACall.

Soon, however, call recording won't just be a part of iOS: You'll basically be invited to try it. Apple advertises the feature as another menu option when you're currently in a call: Just hit the record button, and iOS will record everything you and the other caller say. That likely sent a shiver down the spine of TapeACall, whose $10 per month subscription now seems a bit expensive compared to a free update to iOS 18.

That said, Apple is advertising this feature as part of Apple Intelligence, the brand name for the company's big AI features. If that's true, only the iPhone 15 Pro and 15 Pro Max (as well as future iPhones) will be able to run this phone recording feature. That leaves a sizable market for apps like TapeACall to keep marketing to. (Fingers crossed for a price cut, though.)

Grammarly

Speaking of Apple Intelligence, the company's upcoming AI assistant will be happy to help proofread your writing, and rewrite any sentence or paragraph on the fly—whether you're writing on your iPhone, iPad, or Mac.

That can't be great news for companies like Grammarly, which offer solutions across the same set of devices for checking spelling, grammar, and sentence structure as you type. Grammarly has even rolled out AI writing tools of its own: At the time, it might have seemed like a competitive move against options like ChatGPT or Gemini. (Why copy and paste your text into a chatbot when a Grammarly extension can do it for you directly in the text field?) But now that Apple also has an AI writing bot on the horizon, the question becomes: Why download the extension?

Of course, just as with the TapeACall conversation, there's going to be a limited audience for Apple's AI features at first. Apple Intelligence is only available on the iPhone 15 Pros and M-series Macs, which means any writers on an Intel Mac will still want to keep their proofreader-of-choice.

Newji

Apple Intelligence is generative AI, which means it has to have an AI art component. Among those new features is the ability to generate new emojis to share in chats. As far as AI art goes, it seems harmless, and even fun, in case the existing emoji options don't quite match the vibe you're going for.

That's kind of a bummer for apps like Newji, though. It basically works exactly like Apple's new feature does: You prompt the AI with what you want your emoji to be (Newji's flagship example is "big rat dragging a pizza slice"), and it generates options for you to choose from. Luckily for Newji, Apple Intelligence is slow-going, and won't be available on most iPhones—at least for now. So, the company has some time before more people start buying Apple Intelligence-compatible iPhones.

AllTrails

New to the Maps app across the entire Apple ecosystem is a set of hiking features: The update brings downloadable topographical maps to the app, as well as thousands of hikes you can save offline. Even when you don't have service, these offline maps and hikes offer turn-by-turn navigation with voice, as if you were pulling from a live directions feed. You can even create your own routes, if you want.

Hmm. Sounds suspiciously similar to AllTrails, doesn't it? Luckily for that company, AllTrails has a huge user base already in place, so it can offer a richer experience than Apple Maps, at least at the start. But seeing as the iPhone is massively popular in the U.S., the more hikers turn to Apple Maps for hiking, the larger that community could grow. And, unlike some other options on this list, every Apple device compatible with this year's updates gets these features, as they aren't tied to Apple Intelligence. This will be one to watch.

Otter.ai

Transcriptions are another non-Apple Intelligence feature coming to Apple devices this year. (Still powered by AI, though.) When you make an audio recording in Voice Memos (or Notes), iOS or macOS will transcribe it for you. It's a big perk: You can quickly review a conversation you recorded, or perhaps a presentation or lecture, and search for a specific topic that was mentioned.

Of course, it's a big perk of services like Otter.ai, too. One might think that Apple's AI transcriptions threaten Otter.ai and its ilk, but this is one category I see being largely unaffected for now. Otter.ai specifically is so feature-filled and so integrated with various work suites that it's likely insulated from Apple's new features here. I see Otter losing the most business from new transcribers, who just want a quick way to review a voice memo: Why bother looking for a third-party solution when the transcription now appears directly with your recording on your iPhone or Mac?

Bezel

Of all the apps on this list, Bezel might be the most in trouble. With macOS 15, Apple is adding iPhone screen mirroring. That means you can wirelessly view and control your iPhone's display from your Mac, all while your iPhone remains locked and put away.

Bezel is undoubtedly the most popular third-party option for mirroring your iPhone's display to your Mac, but it might not be able to compete against macOS Sequoia. For one, Bezel requires a cable, while macOS supports wireless iPhone mirroring. But the larger issue is that Bezel costs $29 for use on one Mac, and $69 for up to three Macs. Meanwhile, Apple's screen mirroring feature is free with an update to macOS 15 on any supported Mac. It's definitely a tough situation for Bezel.

But again, just because Apple adds a new feature to iOS and macOS, that doesn't mean third-party options that offer the same feature are toast. The App Store is filled with apps that sell themselves on features Apple has had baked into its platforms for years, and they succeed by offering a different (or perhaps improved) experience from Apple. I think most of these apps have that same opportunity, but really, it'll come down to what the users want.

Use This Workaround to Send High Quality Photos and Videos on WhatsApp

20 June 2024 at 19:00

WhatsApp might be the most popular chat app in the world, but it hasn’t always been the best for sending photos and videos. The app traditionally had a 16MB limit on any media you sent, and, even then, compressed it to save space. That compression resulted in lower-quality images and videos, which is frustrating in a time when smartphones have incredible cameras.

It's getting better, though. Mark Zuckerberg announced last year that WhatsApp supports high-quality photo sharing—although you might have missed the option if you weren’t looking for it. The update didn’t include support for HD videos, however, until the company quietly updated the app a week later.

HD quality is becoming the default

Fast forward to June 2024, and it seems WhatsApp is finally ready to commit to high-quality media: As reported by Android Police, Meta is now rolling out the ability to send high-quality photos and videos by default. That means that, once the update hits your app, your photos and videos should share in HD without you having to do anything. (Previously, you needed to tap the "HD quality" option every time, which was frustrating for anyone who always wanted to send their media in high quality.)

You can check whether you have this setting enabled from Settings > Storage and data > Media upload quality. Make sure "HD quality" is selected. WhatsApp will warn you that HD quality media may take longer to send, and that it could be up to six times larger, which means it may eat into your data plan more quickly. With this setting enabled, you should see the HD option highlighted before you send your photo or video.

HD quality isn't uncompressed

However, “HD” media isn’t exactly what you might think it is. Videos max out at 720p, even if your original video was recorded in 1080p or 4K, which means WhatsApp is still compressing the video quite a lot. Still, it’s better than standard quality, which drops the resolution to around 480p. Likewise, WhatsApp still applies some compression to photos sent via the HD quality setting, so you won’t be able to send photos in their native resolution with this method, either.

Use this loophole to send full resolution photos and videos on WhatsApp

WhatsApp actually has a better solution for sending high-res content: Rather than send your videos as videos, send them as documents. This has been the best way to send full-res media for a while, as WhatsApp previously had a 100MB limit on documents, and just about anything can be a “document.” Recently, that limit jumped to 2GB per file, which makes it possible to send most (if not all) of your photos and videos in their full resolution to whoever you want in WhatsApp.

To send a video file via this method, open a WhatsApp conversation, tap the attachment icon (Android) or the (+) (iOS), choose “Document,” then choose the files you want to share. WhatsApp will send the files without compression, so you can share your content in its full quality (as long as it’s under 2GB). To preserve the quality of anything larger than 2GB, you’ll need to use another sharing method, like Dropbox or Google Drive.

Update Your Windows PC to Avoid This Wifi Security Flaw

20 June 2024 at 17:00

Microsoft's latest Patch Tuesday update has a series of fixes for bugs in both Windows 10 and Windows 11. One of these vulnerabilities is particularly troubling, though, as it allows bad actors to hack your PC so long as they're within wifi range.

As reported by The Register, Microsoft patched 49 security flaws with its latest Patch Tuesday update, but three are of key interest:

  • The first, which Microsoft says is public (but not exploited), is tracked as CVE-2023-50868, and can allow a bad actor to push your CPU to the point where it stops functioning correctly.

  • The second, CVE-2024-30080, concerns Microsoft Message Queuing: This flaw allows a remote attacker to send a malicious data packet to a Windows system and execute arbitrary code on that system. This one doesn't necessarily affect individual users as much, but Microsoft did give it a high severity rating, and while it hasn't been exploited yet, the company thinks exploitation is more than likely.

  • The last flaw seems most pressing: CVE-2024-30078 is a vulnerability affecting wifi drivers. The company says a bad actor can send a malicious data packet to a machine using a wifi networking adapter, which would allow them to execute arbitrary code. In practice, this could let someone within wifi range of another user hack their computer from that wifi connection alone. And since this affects many different versions of Windows, attackers will likely try to exploit this flaw as soon as possible.

It's a chilling concept: If someone learns how to exploit this flaw, they could use it to attack other Windows PCs in their immediate vicinity. Imagine the field day a hacker could have going to a high-density area of laptop users like a coffee shop or shared workspace. Fortunately, the latest security updates for both Windows 10 and Windows 11 patch these issues, so once you're updated, you're safe to return to your office in the corner of the café.

How to install the latest patches on your Windows PC

If you're running Windows 11, head to Start > Settings > Windows Update. On Windows 10, head to Start > Settings > Update & Security > Windows Update. Either way, hit Check for updates. Once available, download and install it on your PC.

Apple’s Explanation for Why You Need an iPhone 15 Pro to Use Apple Intelligence Seems Sus

20 June 2024 at 15:00

"AI for the rest of us." That's how Apple advertises Apple Intelligence on its website, the company's upcoming generative AI experience. The problem is, that tagline only applies if you have the right device: namely, a newer Mac, or a brand-new iPhone.

Apple Intelligence is chock-full of features we haven't seen on iOS, iPadOS, and macOS before. Following in the footsteps of ChatGPT and Gemini, Apple Intelligence is capable of image generation, text workshopping, proofreading, intelligent summaries, as well as enhancing Siri in ways that make the digital assistant, you know, actually assist you.

In order to run these features, Apple is only making Apple Intelligence available on select iPhones, iPads, and Macs. For the latter two categories, it's a rather wide net: Only M-series iPads and Macs can run Apple Intelligence. Sure, that leaves out plenty of the Intel Macs still in use today, as well as the iPads running Apple's A-series chips, but the company has been selling M-series devices since 2020. Many Mac users have already adopted Apple silicon, which means they'll see these AI features when they update to macOS Sequoia in the fall—or, at least, the features Apple has managed to roll out by then.

However, things aren't so liberal on the iOS side of things. Only those of us with an iPhone 15 Pro or 15 Pro Max can run Apple Intelligence when it's available with a future version of iOS 18. That's because Apple requires the A17 Pro chip for running Apple Intelligence on iOS, which the company has only put into these particular iPhones so far. Even the iPhone 15 and 15 Plus, which launched at the same time as the Pros, can't run Apple Intelligence, because they're using the previous year's A16 Bionic chip.

Why Apple Intelligence is only available on newer Apple devices

Apple's stance is that Apple Intelligence is so demanding that it needs to run on the most powerful hardware the company currently has available. A large part of that is the processing power the desktop-class M-series chips have, as well as the minimum 8GB of unified RAM. (The iPhone 15 Pro also comes with 8GB of RAM.) But the main component as far as Apple Intelligence is concerned is likely the Neural Engine: While Apple has included a Neural Engine in all iPhone chips since the A11 Bionic in the iPhone X, 8, and 8 Plus, Apple only started adding a Neural Engine to the Mac with the M1.

That stance is largely reflected in an interview between John Gruber of Daring Fireball and Apple's marketing chief Greg Joswiak. Joswiak had this to say when asked why older Apple devices couldn't run Apple Intelligence:

So these models, when you run them at run times, it's called inference, and the inference of large language models is incredibly computationally expensive. And so it's a combination of bandwidth in the device, it's the size of the Apple Neural Engine, it's the oomph in the device to actually do these models fast enough to be useful. You could, in theory, run these models on a very old device, but it would be so slow that it would not be useful.

Essentially, Apple feels that a compromised Apple Intelligence experience isn't one worth having at all, and only wants the feature running on hardware that can "handle it." So, no Apple Intelligence for Intel Macs, nor an iPhone other than the 15 Pro.

Apple Intelligence should probably be able to run on more devices

While there is sense to that argument, it's definitely easy to take the cynical view here and assume Apple is trying to push customers into buying a new iPhone or Mac. I don't really think that's the case, but I don't buy the idea that Apple Intelligence can only run on these devices. Keeping Apple Intelligence to the M-series Macs makes the most sense to me: These are the Macs with Apple's Neural Engine, so it's easiest to get these AI features up and running.

It's the iPhone and iPad side of things that rubs me the wrong way. These devices have Neural Engines built into their SoCs. Sure, they might not be as powerful as the Neural Engine in the iPhone 15 Pro (Apple says the A17 Pro's Neural Engine is up to twice as fast as the A16's), but I have trouble believing an Apple Neural Engine from 2022 isn't fast enough to handle features a chip made in 2023 can. I also wouldn't be surprised if Apple could get Apple Intelligence working well on a higher-end Intel Mac, but at least those machines don't have Neural Engines at all, so the cutoff makes more sense there.

Not to mention, not all the processing is going to be happening on-device anyway. When iOS or macOS thinks a process is too intensive for the A17 Pro or M-series chip to handle itself, it outsources that processing to the cloud—albeit, in Apple fashion, as privately as possible. Even if the A16 Bionic can't handle as many local AI processes as the A17 Pro, how much would the experience be downgraded by outsourcing more of those processes to the cloud?

Who wants Apple Intelligence anyway?

But here's the thing: Even if Apple is choosing to omit Apple Intelligence from the iPhone 15 and earlier unnecessarily, I don't think it's to sell more iPhone 15 Pros. I think the company simply doesn't want to waste resources optimizing a feature that doesn't have a ton of demand. Despite ChatGPT's popularity and notoriety, I don't see "more AI" as something most iPhone and Mac customers are looking for in their devices. I think most customers buy a new iPhone or new Mac for the essential features, like keeping up with friends (especially over iMessage), taking solid photos, and using their favorite apps. AI features baked into the OS could be a plus, but it's tough to say when there's really no precedent yet for consumers purchasing hardware made for AI.

Personally, if I had an Intel Mac or an iPhone 14 Pro that was working fine, I wouldn't see this as a reason to upgrade, even if Siri sounds more useful now. I think Apple knows that, and doesn't want to waste time developing these features for older devices. It probably doesn't have the resources for it anyway: The company is staggering the release of key AI features, like Siri's upgrades, so it has time to make sure everything works as it should before committing to the next set of AI options.

While Apple Intelligence might be the feature set grabbing most of the headlines, most people are going to update their iPhones and find other useful changes instead—some indeed powered by AI. You'll have the option to totally customize your Home Screen now, with control over where app icons go and even what they look like. You'll be able to send messages over satellite without a cellular connection, and you'll find new text effects and options in Messages. You'll even be able to mirror your iPhone's display to your Mac, if that's something you want to do.

The point is, there are a lot of new features coming to iPhones and Macs compatible with iOS 18 and macOS 15—even if Apple Intelligence isn't among them. I get Apple's reasoning here, and while I bet the company could run Apple Intelligence on older devices, I don't think you're going to be missing out on much. We'll have to see once Apple Intelligence does actually arrive—one piece at a time.
