Seeing a retina screen for the first time is so impressive that it ruins the experience of using regular screens. Spatial audio is better than stereo, but it is not impressive. I wonder if stereo would sound bland after getting used to spatial audio.
The iPad Air 2 was released in October 2014. It will get the latest operating system, iPadOS 15, seven years after its introduction. And the hardware is still solid — I use one daily. I have never used a product with a higher return on investment.
Everyone must have got the memo by now: Yes, AirPods Pro fit into the little jeans pocket. Having them available without the discomfort of carrying them in my regular pockets, and without having to dig my hand deep to retrieve them, is an integral part of the AirPods experience.
As much as AirPods are about convenience, putting them on in the middle of a call was a hurdle. I had to support the phone with my shoulder and upper arm while taking one of the AirPods out of its case, being careful not to drop anything and not to cramp a muscle in the meantime. In contrast, the AirPods Pro case allows one-handed use. It can rest on its cap and stay open. This helps when one of your hands is holding your phone, but it saves your day when you don't have another hand. Not that I don't, but accessible design is about more than addressing disabilities. Kudos to Apple for consistently expanding their scope of accessibility.
The new case design is not an absolute improvement over the original. Most notably, having the slots toward the outer side makes putting the pair back in difficult. It's especially difficult to put back the opposite one (the right AirPod when using your left hand, and vice versa). You'd better switch hands for that.
Noise Cancellation, Transparency, and Fit
I understood what a good upgrade AirPods Pro were the first time I tried noise cancellation outdoors. With AirPods, I had to increase the volume to beat the traffic noise. With AirPods Pro, I decrease the volume because noise cancellation takes care of the noise. If you use your AirPods in noisy environments, you should upgrade to AirPods Pro.
Transparency is the twin sibling of noise cancellation. It seems like a nice-to-have feature, but it's essential. AirPods Pro isolate environmental noise significantly even without noise cancellation; it's impossible to hold a face-to-face conversation while wearing them. One option is to take them off, but that's not ideal. Transparency is the feature that lets you take your AirPods Pro off without taking them off.
Everyone has been quick to note that transparency is potentially a great accessibility feature. Apple never rushes with these kinds of features. I'm confident we'll wait a few generations of hardware or software before users can control transparency.
Contrary to other reviewers, I wasn't happy with the Ear Tip Fit Test. I first did the test with the default medium-sized tips. The result was good. Then I thought the small-sized tips might fit better. They did indeed fit better, and they tested positive. Then I began suspecting that the test was a scam and tried the large-sized tips. They didn't fit me, but the test said they did. Ear Tip Seal Test would be a more apt name.
Tap vs Squeeze and Controls
AirPods Pro use a squeezing gesture instead of the tapping gesture of AirPods. When tapping didn't work, I had no idea whether the tap didn't register, there was a connection problem with the phone, or the phone wasn't responding as quickly as it should. I had a habit of triple-tapping, hoping that any two of the three taps would register as a double-tap. The haptic feedback of squeezing lets you know whether you squeezed successfully.1 However, squeezing is a more complex gesture than tapping. It requires finer motor skills — accessibility, remember? Occasionally my squeeze doesn't register, perhaps because I squeeze it wrong. Squeezing is more difficult when wearing a hoodie or gloves. And I hate touching my AirPods Pro when my fingers are dirty in the kitchen. With AirPods, I could tap with my knuckles.
I don't see why squeezing and tapping have to be mutually exclusive. Each could have a different function. Considering that you can't customize even the squeezing functions (except tap-and-hold), this seems a far-fetched request. I would love to see different gestures made available to apps so that we could navigate podcast chapters, change playback speed, and so on. That would make AirPods Pro a real pro device.
I wish customizing controls were only a problem for professional use. When you use your AirPods Pro with noise cancellation on, it's a matter of regular annoyance. 90% of the time, I pause playback because someone speaks to me. I also have to turn transparency on to hear them. The awkward silence until I speak back takes forever. And I reply: "Sorry, I couldn't hear you." After the conversation, I do the tap-dance again to resume playback and turn noise cancellation back on. The worst part is yet to come: soon after, I need to have another conversation. How did AirPods Pro ship without a solution, when it could be as simple as a single gesture that switches between play-with-transparency-off and pause-with-transparency-on?
I've spent many words on a headphone review without mentioning sound quality. The reason is twofold. First, AirPods are much more than headphones; they are about availability, usability, and convenience. Second, the sound quality is so good that it's not worth talking much about. Compared to AirPods, AirPods Pro sound much better.
If you haven't got the other memo by now: No, AirPods Pro have no auto-switching between devices.
The lack of a taptic engine in AirPods is no excuse for not having any feedback. Auditory feedback would work as well. ↩
iPadOS didn't appeal to me when it was announced. Advanced multitasking, a faster Apple Pencil, Sidecar, and external storage support are great features, but I'm not a pro user. I use my iPad primarily to read news and books, watch YouTube, and browse the Web. When I began using iPadOS, however, I was surprised to see how many small improvements added up to a much smoother experience. iPadOS was greater than the sum of its parts.
I use a 5-year-old iPad Air 2. It has performed great since day one, but I've never been happy with app switching. The swipe-with-four-fingers gesture had a gross animation and was unresponsive for up to a few seconds. With the performance improvements in iOS 12, app switching became usable. I still unconsciously dreaded using it, perhaps because its responsiveness wasn't reliable. With iPadOS, app switching has crossed the critical threshold of usability. Apps are almost immediately available every time I switch. Just as command-tab is second nature on macOS, a swipe from the bottom of the screen has become second nature to me on iPadOS. This is Apple at its best: improving a core feature of the software on 5-year-old hardware.
Desktop browsing was a feature announced on stage, but I didn't get it back then. The majority of Web sites already presented their desktop versions on iPads, and those that didn't could be fixed by manually requesting the desktop version. Was it just going to be the default now? The problem was that these desktop versions only looked like desktop versions; they didn't work like them. YouTube was the biggest culprit. It didn't allow full-screen videos on the iPad. Desktop browsing fixes that. Now I can watch YouTube videos full screen, change playback speed, and play them with the screen off to just listen to the audio.
I bought my first iPad as a graduate student. I had been using my Mac to read and highlight research papers, books, and course notes, and the iPad was the perfect medium for that. Guess what? iOS didn't support PDF annotation. Bummer. I've never understood how such a basic feature could be omitted for so long. iPadOS finally brings PDF annotation.
Now that I've changed the tone, I can rant about the new mode of text editing. How am I supposed to learn about it in the first place? Why do I have to search the Web to find out how to place the cursor in the search bar? I can place the cursor on a word boundary with a tap in the Notes app. Why doesn't the Safari search bar work that way? Oh, it's because search terms are selected by default in an obscure manner. By the way, I can place the cursor inside a word when the keyboard is off in Notes. Why can't I do that when the keyboard is on? To be honest, I never enjoyed the old way of editing text, with its heavy use of the magnifying glass and its tapping and dragging with millimeter precision. But at least how it worked was clear.
There's no single Apple behind the products and services we use. Every part of the ecosystem is the outcome of different dynamics. Failure is a natural part of growth. I love the Apple that listens to feedback, iterates, and perfects. But I'm afraid the stubborn parts might take over.
Apple Watch Series 4: can generate an ECG — can’t track sleep.
iOS 12's performance makes old versions seem broken.
Siri’s speech quality improved significantly in iOS 11. But why can’t she read text?
I listen to 12 – 13 hours of podcasts and audiobooks weekly — thanks to the Smart Speed feature in Overcast, it doesn't take that long. One day I realized that I could listen to any text on the system. It would be nice to listen to news articles, but it would be awesome to listen to an ebook. Listening to synthesized speech is not the best experience, but the new Siri voice should make it much better, no?
Unfortunately, the new Siri voice doesn't work for text-to-speech — even though it works for VoiceOver. It appears as a valid option, but selecting it causes the old Siri voice (Samantha) to speak. After four major iOS 11 updates, this bug still persists. I hope iOS 12 fixes it.
Why Siri is not available as a text-to-speech option on macOS deserves its own letter.
AirPods impressed me. My first experience with them was so good that I immediately wanted to blog about it. And that’s how I began blogging.
I had already seen that AirPods paired easily. I was nevertheless startled when I paired them with my iPhone. Pairing looks easy when you watch it, but it's effortless when you do it yourself.
After a few minutes of use, I began dreaming about the possibilities this device enables. And that's how you know a product is great.
I decided to buy AirPods because I was tired of putting my headphones on and taking them off whenever I left my desk. I reach over to my bottom drawer or fill my mug with water many times a day. Putting headphones on and off for trivial tasks is more annoying than doing so for larger ones. AirPods free me from the distraction of managing my headphones and let me focus.
Switching between devices was another headphone-management problem I had: unplug the headphones from the current device, plug them into the next one, and repeat as many times as necessary. I have the habit of using my iPad during work breaks, so I switch often enough for it to be annoying. AirPods are better than wired headphones in this regard; a swipe and two taps are enough to connect AirPods to another iOS device. But it's still annoying, especially when switching takes 5 – 10 seconds. It doesn't always take that long, but it does often enough. Auto-switching, which was previously announced but retracted before shipping, is a must.
I listen to podcasts whenever I walk or do chores. I always tuck headphone cables inside my shirt to reduce friction and keep them from flapping around. Cables drive me mad by constantly pulling the buds down from my ears; I have to refit the headphones regularly. AirPods are truly wireless; there is no cable to pull them down. Truly wireless operation was also the reason I chose AirPods over BeatsX.
I was heartbroken when my dreams about how useful AirPods would be turned out to be false. When AirPods are connected to my Mac, I don't have the option of answering a phone call over them. Yes, I can answer it on my Mac via Handoff, but why should I? I want the freedom of grabbing my phone and stepping outside my open office to take a walk without disturbing my peers.
AirPods, by design, are not suitable for noisy environments. Even the most basic in-canal headphones isolate noise better. Unfortunately, you have to turn the volume up to hear better, which only causes more damage to your hearing. I am looking forward to hearing about in-canal AirPods.
You have to reach for your iPhone to adjust the volume of AirPods. When you change environments, from indoors to outdoors, from walking to commuting, and back, you frequently need to adjust the volume, and reaching for your iPhone becomes troublesome. An Apple Watch is a convenient option if you have one, but switching to the Now Playing app is harder than pressing your iPhone's volume buttons through your pants. The best way to adjust AirPods volume would be with the Apple Watch crown, directly from the watch face.
Other problems are minor. AirPods play a tune when they connect to or disconnect from a device. The connection tune helps when nothing is playing, but it's unnecessary when you put your AirPods on while already listening to something on your device's speakers. AirPods would do better to play the audio as soon as they connect rather than spend time announcing a connection that is already obvious. The relationship between Apple Watch and AirPods is very strange. When I'm listening to music on my iPhone via AirPods, a watch timer silences the AirPods when it goes off. I understand that this might help get my attention, but it should resume the audio after I dismiss the alert, which it doesn't. Resuming audio is in fact a more general problem: when you answer a phone call while listening to music via AirPods, the music doesn't resume after you hang up. I am confident these issues will be fixed in the next major release of iOS.
Speaking of AirPods improvements in the next major iOS release, we haven't heard much about AirPods at WWDC, except that double tap can be assigned different functions for each ear. I hope the lack of announcements is because there were more important things to talk about, not because there are no new features to come. I would love to see this great product become greater.