Apple Silicon, some speculation

Some musings on what we might see from Apple in 2021.
When Apple Silicon was released, it impressed with both its power and its power efficiency. Here was a system that matched or exceeded the capabilities of the current Apple MacBook Pro and MacBook Air, but with vastly superior battery life. And, because of Rosetta 2, it even ran existing Intel 64-bit code—and astonishingly, did so as fast as, or even faster than, the equivalent Apple Intel models.

Now, there are some caveats on all of the above. There are some things the current M1 SoC doesn’t do as well as the old systems:

  • Maximum of 16GB of memory, with no upgrades possible.
  • Only built-in graphics, no support for external graphics.
  • Only two displays supported: either the built-in display plus one external for laptops, or two external monitors for the Mac mini (cf. three monitors on current Intel systems)
  • Doesn’t run Windows
  • Doesn’t run other Intel operating systems in a virtual machine


The memory looks at first rather like an “Apple Tax”, with the memory soldered to the motherboard. You can’t add extra memory DIMMs later. As computer memory is a highly commoditised item, this means you pay more—sometimes considerably more—for memory.

However, this “Unified Memory”, as Apple refers to it, has considerable advantages over DIMM memory. Putting the memory in the same package as the processor leads to much greater bandwidth and much lower latency, which obviates the need for dedicated graphics memory. Normally, using shared graphics memory is a performance compromise—lower cost but lower performance—not so with Unified Memory.

This sounds fine, but creating a chip with lots of specialised units (CPUs, Neural engine, GPUs, Networking) and lots of plain old memory (a memory cell, multiplied by several billion) sounds a bit problematic.

Then, I saw photographs of the SoC and the penny dropped. The SoC is not one chip, it’s two silicon chips joined together on a carrier. One half (well, closer to ⅔) is what I’ll call the CPUs, for simplicity’s sake. The other is plain old memory.

Apple M1

And thinking about that leads to speculation about the next systems to come from Apple, and what capabilities they might have.

The Shape of Things to Come

Firstly, let’s consider the Apple product line, excluding iPhones, wearables and HomePods.

  • MacBooks (plain, Pro and Air)
  • Mac mini
  • iMac
  • Mac Pro
  • Apple TV

Of those, only the first two categories were released with Apple Silicon. That makes sense, since the Mac mini is essentially a laptop without a screen, so really the M1 was released for Apple’s laptops with the Mac mini coming along for the ride.

The Apple TV already uses an ARM chip, so it’s really an iPhone without a screen (more nearly, an iPod touch without a screen). It’s been years since the last Apple TV upgrade, so arguably it’s due a refresh, but there is probably no hurry compared to the iMac and Mac Pro.

So, let’s look a little closer at the M1 SoC and see what its strengths and weaknesses are.

Strengths:

  • Unified memory
  • Thunderbolt & USB with dedicated controllers per port (current Mac models share a controller between two ports)
  • 4 high performance cores
  • 4 high efficiency cores
  • An 8 core GPU that matches bottom end discrete GPUs

Weaknesses:

  • 16GB memory max
  • Supports only two displays
  • No eGPU support

The eGPU support is almost certainly simply a lack of ARM64 drivers, and so can be fixed with a software upgrade, which I’m betting will happen.

I’m also betting the current M1 SoC chip design doesn’t allow for more memory.

So, any future M1 upgrades might be a little faster (higher clock rate, or other internal tweaks), in an M1X or similar, but are unlikely to have more cores or a vastly faster GPU.

What does that mean for future models?

There may be more laptop releases in 2021 using an M1 or ‘M1X’ with better specs, but they won’t support more displays, more memory, or more CPU cores.

Unified Memory

Unified memory isn’t going away. There won’t be any Apple Silicon models released with support for DIMMs, not even on the Mac Pro. The latency and bandwidth advantages are just too big.

However, given that a good fraction of the M1 package is RAM, you can easily see the RAM doubling (RAM on both sides, with the processor chip a sandwich in the middle) for an M2 with 32GB; or quadrupling, with RAM top and bottom as well, for an M2 with 64GB; or even increasing eightfold, with the RAM entirely encircling the CPU, for an M2 with 128GB.

So the next system will probably support 32GB, with 64GB and 128GB a possibility. Not bad, but still not up to the 1.5TB supported by the current Mac Pro.


All that RAM needs something to use it, so a bigger chip is likely to have more high performance cores: probably retaining the 4 high efficiency cores, but with 6, 8 or 12 high performance cores.


The next system will almost certainly support three monitors, maybe four.

I imagine that Apple is working on GPUs that are at least twice as powerful as the M1, and on support for eGPUs of course.

Windows and other guests

That leaves the problem of running Windows as a guest operating system, or emulating Intel code well enough to run other operating systems.

Since Microsoft has a version of Windows that runs on ARM64, we will see virtualisation solutions such as Parallels or VMware that run virtual machines, but for ARM64 guests.

Windows also has its own emulation software that works like Rosetta, emulating x86 on ARM. That means there should soon be an acceptable solution for Windows 10, Linux, BSD and other operating systems that have native ARM64 support.

CodeWeavers have announced CrossOver support for Apple Silicon, which is an alternative solution.

However, that leaves x86 code for older versions of Windows and macOS, for those who want or need to keep running such things, and for whom virtual machines have been the perfect answer. The response to this is unclear, but there is probably a lot of work going on right now to get x86 emulators on ARM64 up to speed (or possibly licensing one of the existing ones).


So, what are we going to see released in 2021? Based on the above, here are my predictions for the next wave of Apple Silicon.


The next model to be upgraded (not counting M1X speed bumps for the laptops) will be the iMac, with an ‘M2’ chipset.

  • 12 CPUs (8 high performance and 4 high efficiency)
  • a GPU with double the current performance (16 cores)
  • Support for at least three monitors (built-in plus two more)
  • Support for external GPUs
  • Memory will go up to a maximum of 64GB, possibly 128GB.


  • Possibly a low-end Mac Pro with 12 cores and 128GB will also be announced.


  • The full-noise Mac Pros will await the ‘M3’, in 2022.
  • The Apple TV might get a refresh, but it won’t need any of the power of the M2 and beyond SoCs.


I’m just speculating, from a little country half a world away from Cupertino (literally, half a world away). I don’t imagine the actual specs will be much less than that, but I would not be thunderstruck to find that the Apple engineers have something even more impressive waiting in the wings. I guess we’ll just have to wait and see.

macOS Server 5.3

macOS Server 5.3 contains a few traps for the unwary—traps which don’t appear to be mentioned in the release notes.

  1. It only installs on 10.12.4 (or later, one assumes). This is mentioned in the release notes, but not in the App Store notes. NOTE: 10.12.3 and earlier are not supported.
  2. It will control a remote server running version 5.2 on macOS 10.12
  3. It will not control a remote server, running any version, on OS X 10.11!

This last point is a horrible gotcha! If you are running a server on a previous OS X version—perhaps because it is on older hardware which can’t be updated to macOS 10.12—and you update to Server 5.3 on another machine, you can no longer control/manage the server instance running on the El Capitan machine.

Server 5.2 and Open Directory

Server 5.2 has its own version issues, or rather Open Directory does.
Server 5.2 will happily run on 10.11 and 10.12. However, if you had a Master/Replica on OS X 10.11/Server 5.2, and you upgrade to a mixture of OS X 10.11/Server 5.2 and macOS 10.12/Server 5.2, the Master/Replica breaks, simply because Open Directory insists (for no good reason that I can see) that they be running the same OS X version!

Server 5.2 can control Server 5.3

If you have a system running Server 5.2, it will happily control remote instances of:

  • Server 5.2 on OS X 10.11
  • Server 5.2 on macOS 10.12
  • Server 5.3 on macOS 10.12
Target (macOS / Server)     Manager 5.2     Manager 5.3
10.11 / 5.2                 ✓               ✖︎
10.11 / 5.3                 N/A             N/A
10.12 / 5.2                 ✓               ✓
10.12 / 5.3                 ✓               ✓

Let’s Encrypt OS X Server

(Or, letsencrypt macOS Server if you prefer)

I have been using CACert as my free SSL certificate provider for some time now, and it’s fine, with one exception—CACert root certificates are not trusted by default by many systems, including, most significantly, iOS and Android. That in turn means that I can’t retrieve email off my home server from a company provided iPhone, since the company mandated security profile demands SSL authentication.

Letsencrypt aims to address this problem (among others) with their free certificates, which are trusted by Android and iOS. However, Letsencrypt uses a highly automated system (to make things easy for the user) which originally did not support OS X (macOS).

I recently decided to revisit Letsencrypt and have indeed managed to get it to do what I want, albeit with some interesting discoveries along the way.

Measure Twice, Cut Once

Since I did not wish to blow up my existing home server, especially the current certificate, I decided to test things out in a Parallels virtual machine. So, I duly fired up Parallels and started a clean install of El Capitan.

Other things can go wrong

It failed. It said the installer image could not be verified. So I tried a backup image, and got the same result. So I tried a Yosemite install, and it failed with the same error.

A little research showed a possible reason, so I tried resetting the clock, but to no avail. Then I had a light bulb moment. The page says “the date must be correct in order to install OS X, specifically the year, because if the date set is prior to the release of OS X, the error will trigger.” It turns out that it’s important not only that the date not be too far in the past; it can’t be too recent either. In particular, the current date is too recent to install older releases!

The solution is to decouple the clock from the Parallels virtual machine (Parallels will keep the virtual clock synchronised to the real machine’s clock) and then set the date back a year or so to a little after the release of the relevant operating system. Voilà! It installed.
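For the record, a minimal sketch of the clock trick (the exact date is illustrative; macOS’s BSD `date` command takes the form [mm][dd][HH][MM][yy] when setting the clock):

```shell
# In the VM, with Parallels time sync disabled, set the clock to
# shortly after the OS release (here, 1 Oct 2015, just after the
# release of El Capitan -- adjust to suit the OS being installed).
sudo date 1001120015
```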

I then ran the certbot certonly script in the VM and, after a little fiddling, got things installed.

The Real Thing (with the fiddling done)

The certbot page for Apache on OS X shows how to create the certificate for OS X. It doesn’t work—or at least not on OS X Server.

The problem is neatly explained in the file /Library/Server/Web/Config/apache2/httpd_server_app.conf in the comments at the top:

# macOS Server
# When macOS Server is installed and set up (promoted), this file is copied
# to /Library/Server/Web/Config/apache2/httpd_server_app.conf. Both macOS
# and macOS Server use the same httpd executable, but macOS uses the config
# file in /etc/apache2/httpd.conf while macOS Server’s Websites service uses
# this config file.

The $ certbot --apache command works on the file in the macOS config file, not the macOS Server config file.

The solution is to use the $ certbot certonly command, and then select webroot, as follows:
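Concretely, a hedged sketch of the invocation (the domain name is illustrative; the webroot path is the OS X Server default site root):

```shell
# Obtain a certificate only (no Apache config changes), using the
# webroot authenticator so the challenge files are served by the
# Websites service. example.com is a placeholder for your domain.
sudo certbot certonly --webroot \
    -w /Library/Server/Web/Data/Sites/Default \
    -d example.com
```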


Place the files in /Library/Server/Web/Data/Sites/Default

Import the files into Server’s Certificates and all is good.

El Capitan and me

I’ve installed OS X 10.11 (El Capitan) on three machines in the household, and while it’s just fine mostly, I have one—significant—problem and some things I’ve learned along the way.

OS X Server

Just before El Capitan arrived, Apple released OS X Server 5 (which was rapidly bumped to 5.0.4). This Server release brings a much welcomed change. Unlike previous versions it runs on both Yosemite and El Capitan. Previously, upgrading Server has been somewhat of a pain, since the machines running Server (both headless Mac Minis in my case) and the machine running Server simply as a console, have all had to be in lockstep. The console machine could not talk to a newer, or older, Server instance, and as soon as you upgraded OS X on any machine you had to upgrade Server on that machine as well. Essentially, that meant that you had to upgrade Mac OS X and OS X Server simultaneously on all machines. (The clients are OK: Server will work with them more or less regardless of their OS version).

Great! I was able to upgrade all three machines to the new version of Server in anticipation of the later upgrade to El Capitan.

The fragility of Open Directory

Alas, things were not quite so simple. OS X Server seems to have a long standing problem, where the Master and Replica OD instances get confused. In this case, the Master decided it didn’t have a Replica any more, and the Replica decided it couldn’t run OD. OD was always off, and if you turned it on it wouldn’t offer to create a new Master or join a Replica, it just turned itself off. However, it still worked (network users from the Master were still available on the Replica). So I ignored that, pending El Capitan.

I started by upgrading the Master Server to El Capitan, which worked fine (it took a little over 30 minutes). It then needed to upgrade OS X Server when I first ran Server again.

So, while Server 5.0 runs on OS X 10.10 and 10.11, it’s not quite the same thing. While this was going on, the Replica decided its (unacknowledged) Master had vanished and immediately forgot the network users! (It was prepared, now, to replicate a master or create a new Master).

After the El Capitan upgrade, OD Replication was still broken (Replica had the network users back but did not appear linked to Master), so I did what I’ve done before: forcibly remove the Replica and add it back to the master, using

sudo slapconfig -destroyldapserver diradmin

But, when I tried to add the Replica, it refused, saying that the OS X versions of the Server had to be the same!

So, lesson learned. All the Open Directory servers have to be running the same OS X version.

Which raises the question: WHY? Is this really necessary? It’s extremely irritating!

I upgraded the Replica server to El Capitan (which is also running O3X ZFS, so I was prepared for trouble), fortunately without incident (including the ZFS upgrade), and all is now fine with my servers.

El Capitan: Mysterious Hangs at Boot time

I then upgraded the main machine to El Capitan. I had some software that I was a little concerned about (Adobe CS3) but the collective wisdom of the web seemed to indicate that it was OK, so I went ahead.

Incompatible Software

The upgrade went fine, with only a few items put into the “Incompatible Software” folder:

  • GlimmerBlocker (LaunchDaemon and PrefPane)
  • GPGMail.mailbundle
  • WacomTablet.prefPane

I don’t care about the Wacom Tablet. GPGMail and GlimmerBlocker claim to be OK with El Capitan, so I reinstalled the latest versions

BOINC also would not run and asked to be re-installed (as is usual for BOINC after an OS X upgrade).

Reboot to hung screen

Then I restarted, and the machine hung.

It sits at the Apple boot screen with the progress bar at zero (no pixels of progress at all).

I restarted it several times with the same result. I restarted (Command R) into recovery mode and ran Disk First Aid. This worked and reported no problems. Then I restarted again.

It hung at the Boot Screen again.

I restarted with Verbose mode (Command V) and Single User (Command S), and it showed a Panic (but not a Kernel Panic) and stopped. Single User mode would not accept typed input.

So I reinstalled El Capitan from the Recovery Boot, which worked. I noted that it removed GlimmerBlocker, again. I put it back.

I put this down to a one-off until the machine restarted (for reasons unknown) and returned me to the same hung boot screen, with the same symptoms (can boot into Recovery; Disk First Aid shows no issues; Panic in Single User Mode). I resolved the problem the same way, by a reinstall. And I’m typing this blog post using that machine.

However, I’ve not reinstalled GlimmerBlocker, or BOINC, or GPGMail, or anything else that stopped working and asked to be reinstalled. We’ll see if it continues to work, and if so I’ll consider adding back items one by one.

To be continued…

rubycocoa + rvm + Mavericks fixed!

As I wrote previously, rubycocoa did not work properly with Mavericks.

Well, I’m very pleased to discover that it’s been fixed, with a new version of rubycocoa available at SourceForge.


Before the fix:

% ./testRubyCocoa.rb
/System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:55:in `require': cannot load such file -- osx/cocoa (LoadError)
   from /System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:55:in `require'
   from ./testRubyCocoa.rb:9:in `<main>'


After the fix:

% ./testRubyCocoa.rb
Module RubyCocoa awakes!

testRubyCocoa.rb is:

#!/usr/bin/env ruby
require "osx/cocoa"
include OSX
OSX.ns_import :NSString

module TestRubyCocoa
   puts "Module RubyCocoa awakes!"
end

OS X Server and iOS 7 are surprisingly unfriendly

Actually, that’s iOS versions up to and including iOS 7.1, and OS X Server 3 (3.0.3 to be precise). I.e. Mavericks.

I have a home server, just for the family. It’s got the family mail accounts, which over the years adds up to many GB of mail. There are also iCloud accounts (and various others), but for all sorts of reasons I’m quite happy to have this email on our home server (backed up regularly and frequently by the way).

I also have a personal domain—this one as it happens, although the web host is not on the home server, it’s an external service provider that specialises in web hosting. Of course I also have another set of email addresses with them.

This all works splendidly when we are at home, but we’d also like to be able to access the home server on the go, from a laptop or a smart phone. And that’s where I run into some troubles.

I have a company provided iPhone, which is a delight to use. I can access my work email, and also my iCloud email. The obvious next step would be to add my home email (on the home server) and my email through my hosting provider.

This doesn’t work.

iOS is very particular in two respects about email.

  1. It doesn’t like using untrusted SSL certificates (and the profile installed by my company locks down this requirement)
  2. It has a restricted list of CA root certificate providers that it trusts.

OS X Server then weighs in with its own preferences, in that it really wants you to use SSL, and will refuse to allow a plaintext login (password) unless you use SSL.

These are all very worthy restrictions, but they add up to inconvenience and expense.

No iOS email access to my hosted email

I can’t access my email from iOS at my hosting provider, because they use a generic SSL certificate rather than one for my domain. They say on their instructions for setting up mail access:

Please note that it is currently not possible to configure custom SSL certificates for secure email connections.

Please ignore any certificate warnings you receive.

Your connection will still be encrypted to prevent eavesdropping if using SSL and STARTTLS.

No iOS email access to my home email

I can’t use plaintext authentication, even over a fully encrypted IPSEC tunnel (i.e. OS X Server VPN) because OS X Server won’t let me, and I can’t use a self-signed SSL certificate because it’s not in the trusted list. I can’t even add the certificate to the trusted list via a profile because the company imposed restrictions don’t trust my additions.

In fact, to keep iOS happy, I’m pretty much restricted to the list of Apple blessed CA Root providers. And they want quite a lot of money, per year.

If you go to the cheap SSL providers, you’ll find that a number of the budget providers are not in the (current) iOS trusted list. The magic phrase to look for is mobile support.

Note that this means I can’t even access the home server, at home, over WiFi, from iOS, without paying fees to an organisation that assumes I’m doing e-commerce and charges accordingly.


If anyone knows a solution to this, I’d love to hear it, but thus far the interwebs have not been encouraging, bearing in mind that this is a company phone with pretty reasonable restrictions.





The iOS documentation on certificate handling is blunt:

“When false, automatically rejects untrusted HTTPS certificates without prompting the user.
Availability: Available in iOS 5.0 and later.”

All I want is for OS X Server and iOS to figure out that they are ‘locally’ connected (e.g. LAN or VPN) and talk accordingly.

rubycocoa + rvm + Mavericks = fail

I am very happy with OS X Mavericks, by and large. It’s slightly more refined than Mountain Lion, and noticeably faster.

I also noted with pleasure that it installed Ruby 2.0 by default, instead of maintaining the out of date Ruby 1.8.

However, yesterday I tried running a script that invokes a module I wrote that uses rubycocoa. It doesn’t work. A little digging reveals that rubycocoa is not (yet?) supported on either Ruby 1.9 or 2.0. While there is a Mavericks update for rubycocoa, it just makes it work with version 1.8. I could live with that, and rvm usually does a good job of managing this sort of thing, but in the case of rubycocoa, rvm has come up short. If I explicitly invoke the script with


rvm use 1.8

it works, but running it via the standard shebang

#!/usr/bin/env ruby

fails with

/System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:45:in `require': cannot load such file -- osx/cocoa (LoadError)
    from /System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:45:in `require'



Things I’d like to see from Apple, part 5

My first musings on what I’d like to see from Apple were basically a home server edition of Mac OS X Server, and suitable hardware, which, as I said, was “pretty much a Mac Mini, with the exception of the 802.11n”.

That was in 2008. Since then, the Mac Mini has moved on, with the addition of 802.11n, plus Thunderbolt and USB 3.0.

Mac OS X Server has also moved on, and in many ways what Apple is now delivering is a home server, with, if not a Squid cache, at least App Store caching, a mail server, Time Machine, etc. There are excellent reviews of Mac OS X Server (Mountain Lion) at Ars Technica (updated in Jan 2013 when new features were released).

Significantly, the price has plummeted to being eminently affordable. It’s basically US$20.00—the same price as Mountain Lion itself.

That being the case, I’ve installed it as the home server, and I’m happy with it in that mode. I’ve repurposed my Mac Mini that was running as the Time Machine server (that function has gone to another machine) and it happily acts as the family mail server together with some other functions such as providing a VPN (so we can check our mail on the road).

Mac OS X Server is now basically just an app, with all the server components hidden away inside the App bundle. It is also very encouraging to see Apple adding new functionality, like the Caching server which appeared as a new feature in the first Server update.

That leads to a little list of things I’d like to see added (or fixed):

  • Cache Server reporting. At the moment there is no display of what things are in the cache (which I’d like to know for curiosity’s sake).
  • More caching. At the moment, iTunes purchases are not cached, nor is general web browsing.
  • Fetchmail support. I pull mail onto the server for all the family members off multiple ISP accounts. Among other things, this keeps GB—and years—of mail on a local server which is not subject to merger, change of terms of service, bankruptcy or pre-emptive shutdown by foreign powers. It may be antediluvian and I should probably have the mail SMTP direct to my own domain, but in the meanwhile it suits me quite well and I surely can’t be alone in this.
  • The Mail server stops. Well, actually, it doesn’t. The Server app says that it has, but the server is really running quite happily.
  • Documentation. Really, the Apple documentation is inadequate. In transition from an enterprise product to a home product the documentation got left behind. I bought Server because I figured the price was low enough that if it wasn’t useful to me I would not have wasted much. I couldn’t figure out how useful it would be from reading the documentation. In the end, the web provided more useful how to and advice than the Apple documentation ever did.
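To make the fetchmail wish concrete, this is the classic sort of run control file I have in mind (the hostname, account and local user names are purely illustrative):

```
# ~/.fetchmailrc: poll an ISP mailbox and deliver to a local user
poll mail.example-isp.com protocol IMAP
    user "alice@example-isp.com" password "secret"
    is "alice" here
    ssl
```

One stanza per ISP account would let the server pull everyone’s mail down locally, which is exactly what I do by hand today.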