Which Mac Do You Need?

When you start shopping for a Macintosh (or Mac) computer, there are a lot of things to consider: Laptop or Desktop? Which computer chip? How much memory? How much storage? Let’s focus on these and other features to consider.

But first, let’s consider the most important factor: How will you be using your computer?

Who Are You?

You might be a person who does light computing. Email, web surfing, shopping, and a video call with family and friends from time to time.

You might be a hobbyist who does the light computing tasks as well as photo editing and throwing together a video compilation from time to time. You might be a serious hobbyist who also edits complex videos with multiple clips, transitions, and color correction from time to time.

You might be a professional who relies heavily on your Mac to help you generate professional graphics, audio, and video projects so you can get paid from time to time.

To be honest, if you’re a professional-level Mac user, you know what you need. So this article really isn’t intended for you. Go get the heavy-hitting machine and go forth to create great things!

For the rest of us, you might be wondering how much power you need for your everyday tasks. You might be wondering if you need to max out the options to be certain you can edit your home videos in iMovie. This article is for you.

Upgradability

If you’re coming from the PC world, you’re familiar with upgrading computer components after you buy a computer. Apple hardware is different. Essentially all components are permanently assembled into the computer: storage and memory are often soldered to the logic board, and batteries are not user-replaceable. This allows Apple computers to be smaller, lighter, and more reliable. And this practice is making its way to the PC world, too, especially for laptops.

So it’s important to understand the different components that will go into your Mac, because you’re going to be stuck with them.

Desktop versus Laptop

Are you going to do your computing in one place? If so, consider a desktop computer.

Are you more comfortable doing your computing at different places in your home? And do you like to pack up your computer and take it with you to coffee shops or to work? If so, then a laptop will likely be your best option.

For laptops, weight should be considered. Generally, models with smaller screens (13″ and 14″) are lighter at about 3 lbs (1.4 kg). Models with larger screens are bigger and heavier, weighing in at about 4 lbs (1.8 kg), so they’re not as easy to pack up and take with you. An extra pound doesn’t sound like a lot … until you’re carrying it for a while. Trust me.

Preferred Display Size

MacBook laptops and iMac all-in-one desktops come with their own displays. And Apple uses premium displays and technologies. Sizes range from 13″ to 24″, and they’re gorgeous.

If you have your own display or have a specific display requirement, then Apple also offers the Mac mini and Mac Studio desktop computers to which you can attach your own display. There’s also a Mac Pro, but it is expensive and not intended for mortals like you and me.

Keep in mind that all Mac desktops and laptops can support multiple displays. You can attach a 4K monitor to your MacBook, and use it with the built-in display. You can have two or more monitors connected to your Mac Studio. Depending on the Mac model, there may be many more possible display configurations.

Form Factor Performance 

Historically, desktops have usually performed better than their laptop counterparts. Desktops are plugged directly into a power outlet, and they dissipate heat better than laptops, which allows manufacturers to use larger, more powerful components. Laptops need to be lightweight and run on batteries for several hours, so their components generally need to be small and very power efficient, sacrificing performance.

In 2020, Apple started using its own processing chips. The company’s CPU designs offer incredible efficiency resulting in very long battery life for laptops, and the computing performance is inarguably stellar.

Some benchmark tests report some cases of diminished performance in MacBooks due to thermal throttling. The Apple laptops have very small enclosures which challenge heat dispersion, but the lowered performance is negligible. In the vast majority of computing tasks, you won’t notice any difference in performance between Mac desktops and laptops.

Connectivity

Apple’s lower-end models offer fewer ports. The number and variety of ports increase as you move up the Mac model lines.

The MacBook Air has just a couple of ports. The 14″ and 16″ models of the MacBook Pro have many more. The Mac mini has seven ports. The Mac Studio has twelve ports, and they’re more capable than the Mac mini’s ports.

So consider how many devices you’ll want to physically attach to your computer. How many monitors? How many peripherals like keyboards, mice, and printers? Is having an SD card reader important? Consider these carefully. And keep in mind you can use port expansion dongles (such as USB hubs) on computers with fewer ports. For example, with an SD card reader dongle, you can use your camera’s SD cards on your Mac mini.

Storage

Apple’s storage options are painfully expensive. There’s no way around that. But you are getting quality. The solid state drives (SSDs) in Macs are very fast and very reliable over a long period of time.

Get as much storage as you need if you can afford it. You can save money with a smaller drive, but feeling limited for data storage might repeatedly frustrate you. And the good feeling from cost savings could evaporate quickly.

For exclusively light duty computing, the smallest storage option could easily suffice.

If you store a large music collection or have a lot of digital photos, then you will want more storage. You might want significantly more storage. As a rule of thumb, I suggest discovering how much storage you currently use (say, 400 GB), then double it (so 800 GB), and then round up to the next storage option (1 TB).
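That rule of thumb is easy to sketch in code. Here’s a minimal example — note that the list of storage tiers below is an assumption for illustration, not Apple’s exact option list for any given model:

```python
# Sketch of the storage rule of thumb: take your current usage,
# double it, then round up to the next available storage tier.
# Tier sizes (in GB) are illustrative assumptions.

STORAGE_TIERS_GB = [256, 512, 1024, 2048, 4096, 8192]

def recommended_storage(current_usage_gb: float) -> int:
    """Return the smallest tier that is at least double current usage."""
    target = current_usage_gb * 2
    for tier in STORAGE_TIERS_GB:
        if tier >= target:
            return tier
    return STORAGE_TIERS_GB[-1]  # already beyond the largest tier

# The example from the text: 400 GB used -> double to 800 GB -> 1 TB tier.
print(recommended_storage(400))  # 1024
```
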

I would argue that storage should be the first option to consider upgrading when configuring your Mac. If you can afford only one upgrade, make a storage upgrade.

CPUs and GPUs

As mentioned earlier, Apple debuted its own processor chips in 2020. And they really are great. As of July 2022, Apple has a full M1 line of chips and a single M2 chip. The line-up is pretty straightforward.

  • M1 – 8 core CPU, 7 or 8 core GPU, 16 core Neural Engine
  • M2 – 8 core CPU, 8 or 10 core GPU, 16 core Neural Engine
  • M1 Pro – 8 or 10 core CPU, 14 or 16 core GPU, 16 core Neural Engine
  • M1 Max – 10 core CPU, 24 or 32 core GPU, 16 core Neural Engine
  • M1 Ultra – 20 core CPU, 48 core GPU, 32 core Neural Engine

As you go up the model lines, you get some more CPU power. The difference between 8 and 10 CPU cores will be negligible unless you’re a business/professional user. Also notice that the M2 chip is not “twice as good” as the M1 chip. Don’t let the “2” mislead you.

Also, as you go up the line, you get more graphics power. This doesn’t translate to running higher resolutions or physically larger monitors. This graphics power means you will be able to tackle very complex graphics and video editing tasks with greater ease, and you can export your videos very, very quickly. That said, even the base chips can handle most complex editing tasks, including for many creative professionals. Keep this in mind if you’re a hobbyist user.

For the simplest everyday computing tasks, the base M1 and M2 chips will effortlessly handle them. In the entry-level models of the MacBook Air and the iMac, Apple offers these chips with one or two fewer graphics cores. For this level of computer, you will not notice a performance difference. If you’re offered this lower-end chip, then consider it and save a little money.

The Pro model will offer better graphics and video performance for the hobbyist artist and video editor.

The Max model will provide extra performance for the computer user who uses graphics tools and performs video editing more frequently.

The Ultra model is overkill for anyone who is not running a graphics and/or video editing business. If you’re not in this league, then do not consider the Ultra chip. If you’re wondering whether you’re in this league, then you’re not.

Briefly, I want to provide a little more perspective on GPUs. For me, I really appreciate editing videos where I can freely move the playhead anywhere in the timeline (quickly or slowly) without dropping frames. This performance can be easily achieved with an M1 Pro chip with 14 GPU cores.

Neural Engine

Now, I won’t pretend to fully understand neural engines, but I know a little bit. App developers can leverage the neural engine to achieve some impressive results. It can help artists select subjects even on very complex backgrounds by using artificial intelligence. I think of it as the computing element capable of fuzzy logic. And I know there’s an engineer out there who’s probably ready to smack me for improperly using the term “fuzzy logic.”

Unified Memory (formerly Random Access Memory)

Apple includes computer memory on the same chip package as the processor. The overall chip architecture enables very fast access among the key computer components, and it results in a computer that runs very fast. The memory is shared among all those components, so Apple calls it Unified Memory.

For many models, 8 GB of Unified Memory is the entry-level configuration. For all everyday tasks, and even some hobbyist-level tasks, this amount of memory is adequate.

For computer users who run a few truly professional apps (graphics and video) at once, 16 GB is likely to perform very well.

When it comes to 32 GB of memory or more, those Macs are ready to work smoothly with many top-level professional graphics apps as well as video editors with extremely complex timelines (4K video clips, color correction, and transitions). If you are wondering whether you need 32 GB or more of Unified Memory, then 16 GB will likely serve you well.

Mac Models to Consider in 2022

So after reviewing all this information, which model, generally speaking, is best for which type of computer user?

Well, if we’re speaking generally, let’s speak in terms of Macs with standard configurations for each of the types of users: Everyday users, Hobbyist users, and Professional/Business users.

Everyday users: Aim for the MacBook Air, iMac, or Mac mini. With Apple’s chips in these models, they are essentially powerhouses for everyday computing.

Hobbyist users: Focus on a Mac mini with an upgrade of 16 GB Unified Memory or a MacBook Pro with an M1 Pro processor.

Serious Hobbyist user: If you just know your Mac will need a bit more gusto to do your tasks, then look closely at the MacBook Pro with the M1 Max processor or the Mac Studio. Both have a heavy hitting processor with 32 GB of Unified Memory. If you need more than this, then you’re probably a business owner in the graphics and/or video field.

I have a quick warning about the 13″ MacBook Pro. This model sits in a weird place, and I argue that most people should not consider it. Consider the MacBook Air with its similar specs, or consider the MacBook Pro models with M1 Pro chips. Unless the unique feature of the 13″ MacBook Pro (the Touch Bar) exactly meets your needs, other models are likely a better solution.

I hope this discussion has given you a bit of insight into each area that affects a Mac’s performance. If you feel you need to fine-tune your thoughts before making a decision, reach out to an Apple Specialist. Make an appointment at an Apple Store if one is near you, or reach out via the Apple website.

All the best to you in your Mac purchase!

Improving Apple’s Product Names

Over the years, Apple has come up with excellent names for its products. Monikers like iPhone, Mac, and iPad are immediately recognizable and descriptive enough to know what these products are.

When it comes to sub-branding in the product lines, Apple is clumsy. And this clumsiness leads to consumer confusion. For instance, what “Air” means for MacBook is different from what “Air” means to iPad. Another example: “Pro” and “Max” relating to iPhone is very different from what these terms mean for the M-series chips that Apple installs in the Mac, iMac, and MacBook products.

Every now and then, confusion due to clumsy marketing creeps into conversations I have with friends and family members who are seeking input on Apple products.

Them: “So MacBook Air and iPad Air are the least expensive models, right?”

Me: “Not exactly. The MacBook Air is the least expensive MacBook, but the iPad Air is more expensive than the base model iPad.”

Them: [Looking at me confused]

So one day I found myself thinking about the Apple product lines as they are and how I wish they were. Yes, this probably sounds dumb. I don’t have all that much free time in my day, but I can’t stop thinking about such dumbness in the universe, especially when said dumbness comes from a company with a multi-trillion dollar market cap.

My Wish List for Apple Product Names

iPhones

  • iPhone SE
  • iPhone 14
  • iPhone 14 Plus
  • iPhone 14 Pro
  • iPhone 14 Pro Max

iPads

  • iPad Air [formerly base iPad]
  • iPad mini
  • iPad [formerly iPad Air]
  • iPad Pro (11″ & 12.9″)

MacBook Laptops

  • MacBook Air
  • MacBook (13″; M1-based) [formerly MacBook Pro 13″]
  • MacBook Pro (14″ & 16″; M1 Pro & M1 Max)

iMac All-in-One Desktops

  • iMac (24″; M1-based)
  • iMac Pro (27″+; M1 Pro & M1 Max)

Mac Desktops

  • Mac mini (M1-based)
  • Mac Pro (M1 Pro & M1 Max)
  • Mac Pro Max (multiple M1 Pro & M1 Max chips)

M-Series Chips

  • M1
  • M1X [formerly M1 Pro]
  • M1Z [formerly M1 Max]

Okay, it’s off my chest. I’ll let you know when Tim Cook calls to express his gratitude to me…

My Travails with Xfinity xFi

[Image: Stylized rendering of the xFi Gateway device with a swirl of light around it]

The Coronavirus pandemic changed the way my family and I used our Internet service. Spoilers: Our data use went WAY up.

At first, Xfinity (aka, Comcast) lifted data caps, and that was appreciated. Then in July, they reimposed the data limits, so I had to make some adjustments. I decided to dump my personally owned equipment and adopt the xFi Gateway.

It was impressive. And it was terrible.

Adopting the xFi Gateway

In July 2020, Xfinity announced it would charge for data beyond 1.2 TB each month. And we were definitely using more than that. Xfinity offered Unlimited Data for $30/month with your own equipment, or $25/month with xFi Complete, which included the xFi Gateway and its mesh networking abilities.

I’ve always preferred to own my own modem and router for our Internet service. I can access all the router settings, and I can research how to solve issues when they arise.

But I figured I’d give Xfinity a shot. (Cue the foreboding music here.)

xFi Gateway with xFi Pods

I received the xFi Gateway and deployed it easily. The mobile app really is good. I could label devices and assign them to people in our home. This allowed me to set schedules and even unceremoniously yank a connection when a kid became disrespectful. That said, I never set a schedule or disconnected my offspring.

After a week or so, the gateway’s self-diagnostics qualified me for xFi Pods, the mesh network components. They worked really well, too.

Until…

The Unexplained Blips & Xfinity’s Tech Support

Whether wired or wireless, all connected devices would experience about three blips per week. These “blips” weren’t just simple bandwidth congestion where the connection would stutter and recover. These instances were hard connection resets. When a blip occurred while anyone was using Zoom or Microsoft Teams, we were disconnected from the meetings completely, forcing us to reconnect from scratch. This was not conducive to telework and virtual learning.

I forgave the blips, at first. But they became more frequent. So I started calling Xfinity Internet Support.

It went exactly how you’d expect.

I would describe my issue to the reps. I should have recorded it, because I repeated it more times than I can count.

In the beginning, the Tier 1 techs would vaguely troubleshoot my gateway and pronounce success. And a few days later, the blips would return.

After calling several times, I started getting more advanced technicians who were more transparent with their troubleshooting. But the results were the same: After a few days, the blips returned.

Then I had the Big Conversation with one of the advanced techs. We discussed and troubleshot many things. And then the conversation abruptly ended with his diagnosis.

He claimed I had too many devices attached to my network, and he suggested that I upgrade to a faster Internet package. And then he essentially stopped listening to me.

I respectfully disagreed with him and explained that my 5-year-old non-mesh router had handled the same number of connected devices (e.g., computers, tablets, phones, IoT devices) without any issues. Unfazed, the tech repeated that I needed faster Internet.

My Solution

If the tech could quit on me, then I could quit on the xFi Gateway. I packed it up and returned it.

I dug out my old modem and reconnected it. I enjoyed the call to Xfinity support so they could recognize the modem’s MAC address. I connected the tried-and-true modem to a new Netgear Orbi mesh network router.

Guess how many blips we’ve had…

Zero. Not a single one.

[Image: Arris modem with Netgear Orbi mesh router system]

So this blog post is an ode I wish I could sing outside the home of that dismissive Xfinity support technician.

Analogy: A Backdoor to Encryption

There is a lot of discussion around device security and using encryption for data storage and transmission. Security and privacy are good things. However, recent investigations into homicides and terrorist activities have led law enforcement officials to seek assistance in breaking the encryption on alleged perpetrators’ smartphones. Specifically, US federal government officials have pressured Apple, Inc., to assist in breaking encryption on iPhones owned by alleged domestic terrorists.

I think a lot of people jump to a quick conclusion that breaking encryption in these instances is a good thing. Further, they feel implementing a government backdoor to easily bypass encryption is probably a good idea, too.

Unfortunately, these people are wrong.

Here’s an analogy:

Let’s say you have a door lock on your house with a 4-digit code to unlock it. Now, let’s say there is one code that only law enforcement can use to gain access to your home. You are not given that secret code, of course. It’s only for law enforcement officials. Even if you fully trust them, how comfortable are you with this scenario?

Once criminals know this code exists, how long would it take for them to learn that code? Answer: Not long. And then your home is vulnerable, and you have no way to update your lock to prevent criminals from entering your home whenever they wish.

So, we can increase your security by going from a 4-digit code to a 16-digit code (this is analogous to implementing stronger encryption). Now, it is much more difficult to guess your home’s door lock code. Meanwhile, law enforcement still has a single 16-digit code that can gain entry into your home.

How long before criminals would learn this more complex code? Answer: Again, not long.
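To put rough numbers on the analogy: going from 4 digits to 16 digits only grows the keyspace, and no keyspace helps once a fixed master code leaks. Here’s a quick back-of-the-envelope sketch — the guesses-per-second rate is an arbitrary assumption for illustration:

```python
def keyspace(digits: int) -> int:
    """Number of possible codes for a lock with this many decimal digits."""
    return 10 ** digits

GUESSES_PER_SECOND = 1_000  # arbitrary assumption for illustration
SECONDS_PER_YEAR = 31_557_600

for digits in (4, 16):
    codes = keyspace(digits)
    years = codes / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{digits}-digit lock: {codes:,} possible codes, "
          f"~{years:.6g} years to try them all")

# The catch: a single fixed backdoor code needs zero guesses once it
# leaks, no matter how long the legitimate codes are.
```

Brute-forcing every 4-digit code at that rate takes seconds; every 16-digit code takes hundreds of thousands of years. The backdoor bypasses both equally.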

So before you conclude that a backdoor to encryption is a fine solution for trusted law enforcement, think about this analogy.

If we find backdoors acceptable, then it defeats the entire purpose for encryption and security. And if that is acceptable, then we should abolish encryption and be comfortable with the lack of privacy.

I’m not being facetious. We can exist without the privacy and security afforded by encryption. But let’s not live with the illusion of privacy and security when it isn’t authentic.

[Photo by Sebastian Scholz (Nuki) on Unsplash]

Did Snoke Preview the Whole Sequel Trilogy Back in 2015?

I recently re-watched The Force Awakens, and I found a couple of lines from Snoke very interesting.

At the 49-minute mark, he is addressing General Hux and Kylo Ren about allowing BB-8 to be returned to the Resistance.

I’m not going to quote the movie’s villain verbatim, but he basically complained that if the Resistance can locate “the last Jedi” then “the new Jedi will rise.”

I did a double-take. Did Snoke preview the entire sequel trilogy back in 2015?

The 2017 episode was named The Last Jedi, and the final episode of the saga is named The Rise of Skywalker.

Back before Episode VII, I remember reading that Kathleen Kennedy and JJ Abrams essentially laid out the story for the sequel trilogy early on. And when I read about the departures of creators who took too many creative liberties with Star Wars (Gareth Edwards, Colin Trevorrow, Lord and Miller), I think Rian Johnson might be getting a lot of undeserved flak over The Last Jedi. I suspect he wrote and directed the movie with careful oversight from Kennedy and Abrams. And I, personally, don’t think The Last Jedi was a bad movie, at all. Remember, not everyone liked The Empire Strikes Back when it was originally released.

Anyway, I don’t want to turn into a Star Wars apologist, so I will just end this article here.

Jony Ive is Great but Not Perfect

Jony Ive flourished under Steve Jobs’ leadership. In those days, Jobs was focused on simplicity, and Ive was masterful at creating it.

Some achievements were clearly great. Others arguable. A few were hamhanded.

There’s a saying that art can be whatever it wants, but design has to work.

The Misses

The mouse puck, the original iPod shuffle, and the Apple TV remote are exercises in forcing art to design.

Beauty Meets Stink

The mouse was an ergonomic disaster. The third generation iPod shuffle lacked buttons (yes, really). The Apple TV remote is the manifestation of a palindrome: Which way is up? (I always, always grab the thing incorrectly.)

In the pursuit of thinness, Ive drove the effort to re-engineer the key mechanism on MacBook laptop keyboards. The traditional scissor key mechanism was eschewed, and the butterfly key was developed. This new invention reduced vertical space requirements from 1 mm to 0.5 mm. The butterfly keyboard has gone down in infamy as a true liability for Apple. The widespread reliability issues have spawned a multi-year warranty program to quickly replace failed keyboards for customers. One new MacBook hardware update was even added to this warranty program on the same day it was introduced. All to shave half a millimeter of MacBook thickness.

The Mad Pursuit of Hardware Thinness

Amazing that Scott Forstall lost his job for not apologizing for Apple Maps, but Jony Ive stayed completely under the radar on this one.

Painful But Good

Anytime he could, Jobs tried to advance the computing industry in many ways, and Ive realized many of those goals.

The iMac lost its floppy disk drive and then its CD-ROM/DVD drive. Apple dropped these features first, and the industry eventually followed. Now that they’re gone, no one misses them.

These days, the pinch in convenience is the Thunderbolt 3/USB-C port. The evolution to this port is inevitable and comes closer to reality as each month goes by. Like before, Apple was the first mainstream hardware maker to unceremoniously dump all other ports for this new one. In a few years, no one will miss USB-A and the mini-USB, micro-USB, etc., ports and cables.

The Hits

Ive’s design skills were a vital part in Apple’s return to prominence. The industry-standard beige box was disrupted by colorful iMacs.

The iPod liberated our music from our immobile desktop computers with a revolutionary interface to access thousands of tunes in a device that literally fit in Jobs’ back pocket.

Inarguable Hits

And then there was iPhone.

A truly momentous device that revolutionized the awful mobile phone and personal info manager industry. It went on to conveniently bring communication technology to millions across the United States and billions around the world.

Now, that is a real legacy. So thank you, Jony!

2018 Mac mini Disappoints [Updated]

[Image: 2018 Mac mini next to a Shame meter]

On October 30, 2018, Apple debuted an updated Mac mini. The Cupertino kids did an admirable job updating the product, and the new price ($799, previously $499) seemed within normal Apple tolerances.

But the devil is in the specs.

If this is a desktop computer, the entry level specs aren’t impressive. And once you bump the options into mid-level desktop territory, you’re in for sticker shock.

In my opinion, a desktop computer serves as a desktop with these minimum specifications: A Core i5, 8 GB RAM, and 1 TB storage.

Let’s compare Apple’s entry-level device with something more serviceable:

  Mac mini (entry level)    Mac mini (desktop minimum)
  Core i3 (4 core)          Core i5 (6 core)
  8 GB RAM                  8 GB RAM
  128 GB SSD                1 TB SSD
  $799                      $1,699

Bumping the specs in just two areas adds $900 to the cost. Should a faster processor and a larger solid-state drive really push the price beyond two entry-level Mac mini devices combined?
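The arithmetic behind that complaint, using the prices from the comparison above:

```python
# Prices from the 2018 Mac mini comparison (USD).
entry_level = 799
desktop_minimum = 1_699

# Cost of the CPU and SSD bumps alone.
upgrade_cost = desktop_minimum - entry_level
print(upgrade_cost)  # 900

# The upgraded configuration costs more than two entry-level units ($1,598).
print(desktop_minimum > 2 * entry_level)  # True
```
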

I had hoped that the cost with the upgrades I sought would have been $1,199. If it had, I’d have placed my order.

Apple says it creates devices that provide the experience its customers desire. That reads as lip service when you look at the paltry entry-level specs.

C’mon, Apple. You shouldn’t seek to fleece your customers to provide the expected “experience” you propose.

UPDATE … At the end of last summer (2019), I pulled the trigger on a Mac mini. I bought an Apple refurbished model with an i7 processor, 16 GB of RAM, and 1 TB of storage. The savings were close to $400 off new retail. Not bad.

Still, I feel Apple continues to offer utterly lacking storage options in its base Mac models. Living with 128 GB or even 256 GB of storage is not an acceptable “experience.” Solid state storage costs have dropped dramatically, and Apple should offer better base options to its customers.

Are Star Wars Fans Ruining Star Wars? [Rant]

[Image: Star Wars: What’s The Big Deal?]

When Star Wars: The Last Jedi hit theaters in December 2017, a lot of fans revolted. Now, summer 2018 brings the release of Solo: A Star Wars Story, and legions of fans have lost their minds.

I think the fans are no longer interested in experiencing stories; now they only want to see specific backstories of any characters they find interesting. And it had better be the exact version of the backstory they expect. Any story advancement had better be exactly as expected, too. And we’d better see everything spelled out as expected for the original trilogy trio (Luke, Leia, and Han). If anything unexpected occurs, then there will be extraordinary fan backlash.

SPOILERS AHEAD

So we saw the demise of Snoke at the hands of Kylo Ren. That was a delightful twist, unless you were a Star Wars fanatic who badly wanted to learn more about Snoke. Like who his mom was, what public school he went to, and what intergalactic bully turned him into a deformed force-user bent on taking over the galaxy when the Empire couldn’t pull it off.

Um, who friggin’ cares? I want to see a story. I like surprises. And I feel the new characters deserve the primary focus of the current trilogy. Comparatively, there was nothing wrong with The Last Jedi (we all remember the trainwrecks of Episodes I and II, right?). And it was much more ambitious than the retread that was The Force Awakens.

And what’s up with the hate for Solo? Alden Ehrenreich and Donald Glover did excellent jobs with legendary characters. They brought new dimensions to Han and Lando: youthful optimism, smugness, and a bit of rivalry. And they went up against an interesting set of antagonists.

But the fans want Lucasfilm management thrown out. And even Disney stock has taken a hit. All of this hate over popcorn movies?!

C’mon, fellow Star Wars fans! Grow. Up. Please.

PS- I hear you. I have no idea why I take the time to write this stuff, either.

Sexual Harassment in Tech (and in General)

[Image: Sarah Lane]

I usually don’t post on topics like this, but sexual harassment is not a political topic and it’s not up for interpretation. It’s always wrong.

So many of my fellow males excuse their behavior with “Hey, I was just joking” or “She shouldn’t be so touchy about it.” That’s not the right way to look at it. It’s not whether the guy thinks the behavior is one way or the other, it’s how the person on the receiving end perceives it.

Just because you didn’t mean for it to be damaging doesn’t mean it wasn’t damaging.

Be sensitive. Be considerate. Have empathy.

Tech podcaster Sarah Lane recently published a thoughtful and very candid article on Medium (her article contains mature themes). It’s worth your time to better understand how some behaviors and remarks are received. And how our culture intimidates women who speak out about it.

Sarah Lane currently co-hosts Daily Tech News Show with Tom Merritt. They are a remarkable team, and the podcast is one of the best tech news sources available.

Holding Off on Apple’s iPhone X [Updated]

[Image: iPhone X]

Apple had to work too hard to get the iPhone X ready.

When the company couldn’t roll out the iPhone X alongside the iterative iPhone 8 and 8 Plus models back in September, that was the first sign. When KGI Securities guru Ming-Chi Kuo kept releasing pessimistic supply chain reports, those were more bad omens. And then, on October 25, Bloomberg reported this:

“As Wall Street analysts and fan blogs watched for signs that the company would stumble, Apple came up with a solution: It quietly told suppliers they could reduce the accuracy of the face-recognition technology to make it easier to manufacture…”

I doubt I was the only person who sold Apple stock that day.

So on iPhone X pre-order night, I slept. And I plan to just go about my day on November 3 when the iPhone X hits stores.

Why? I predict a ridiculous level of hype. I predict short supply until January or February 2018. I predict quality production issues. I predict Face ID problems (I’ll be happy to be wrong). I predict the iPhone X will be smoothing out by March 2018, and that’s when I’ll consider buying one.

Or not. I keep looking at my 6s Plus and thinking there’s nothing wrong or lacking with it.

Update (Nov 5, 2017): It seems my concerns about the performance of Face ID were overblown. Most reviews are decidedly favorable on this new technology. Meanwhile, I found Nilay Patel’s remark about Face ID in bright sunlight amusing:

Recent Apple products have tended to demand people adapt to them instead of being adapted to people, and it was hard not to think about that as I stood in the sunlight, waving a thousand-dollar phone ever closer to my face.