Aftermath

So last week was incredibly stupid.

Broken is #1 on Hacker News

I’ve had one thing I’ve written reach the #1 spot on Hacker News before, plus an old website that reached the top of the trending del.icio.us charts, as well as a few moronic tweets that garnered a couple hundred likes, and one appearance on Slashdot back in the day.

But the last seven days were my first real taste of how awful it must be to actually be, you know, internet famous.

So, because all this was new to me and I’m a data geek, I thought I’d share some final numbers. And if you’re a HN visitor, make sure you stick around for a fun surprise.

But first…

Let me say this. I’m often highly critical of Apple. Because I believe in their vision of computing, and also because my livelihood depends on their ecosystem remaining healthy. I’ve written many times on this blog about dumb bugs I’ve encountered on macOS and iOS. Even going so far as to refer to Photos.app as “fuckery”. And I’m no stranger to writing pithy, negative, reactionary tweets, either.

But here’s the thing. My big post from last week? The one you’re probably aware of if you’re reading this site right now? It wasn’t fair.

Don’t get me wrong. I fully stand behind every criticism I leveled at Apple. From the specific bugs, to the broader statements about detecting a lack of focus on the Mac in recent years, to my final thesis about their lock-step, annual release cycle hurting the company’s ability to maintain software quality.

But the parts that weren’t fair, the parts I regret, are my direct insults at those in charge. I’m all for eating the rich and all that, but in a general, class-warfare sense. When I’ve written mean things about Apple previously, I kept my words pointed at the company as a whole. Last week I took aim at a specific, small group of people. I forgot that for many of them, the Mac has been their life’s work. I’ve attended WWDC eleven times and had the opportunity to talk with quite a few Apple executives. They were great. And I don’t think they were just gaslighting me either. So I shouldn’t attack them for putting so much effort into a product that I also love – even when I vehemently disagree with them at times. Some folks inside the company reminded me of that in the past few days and called me out on my shit. And so, I’m sorry. I’ll do better.

Here’s what you came here for…

I guess I’ll go through this in the order it happened.

After a ridiculously stupid first fifteen minutes on Catalina, I posted this dumb screenshot and tweet as I was walking out the door to dinner. I thought I was exceedingly clever with my phrasing. Turns out, the internet did, too.

By the time we got our drinks, my phone was buzzing every five to ten seconds with another like or retweet. A little while later it had ramped up to a Twitter notification (literally) every second. And by the time dinner was over, those notifications stopped saying “@SoAndSo liked your tweet”. Instead, they changed to “15 people liked your tweet”. Every. Second.

At first it was fun. But by the time I was sitting in my kids’ room that night waiting for them to fall asleep, I realized my phone was simply unusable because the interruptions never stopped. Even after I turned off alerts, it was impossible to wade through all the noise in my mentions to find legit comments from people I was interested in. I’ve never used Twitter’s “quality filter” before, because I’m not popular enough. But now I realize that feature is absolutely necessary for high-profile folks to actually use the platform.

Here are the final engagement numbers, according to Twitter, seven days later:

Tweet Engagement Statistics

And here’s a fun Twitter visualization. It’s me scrolling as fast as I can through my activity timeline over the past four days – Twitter won’t let me go back the full seven since all this started.

Scrolling Twitter activity timeline

And when watching that, keep in mind that the vast majority of those table cells are notifications like “12 people retweeted you”, not just a single “@timapple liked your tweet”.

So that tweet blew up on Monday. On Wednesday night I got the bright idea to write a lengthy followup about the Catalina bugs I ran into and my thoughts on the state of the Mac in general. 2,500 words over an hour or so. Do a quick check for spelling mistakes. Publish. Go to sleep.

Around 11am the next day, I received a Twitter DM from an Apple engineering manager who (I thought) had somehow stumbled across my post. And they very kindly asked if I could jump on a call to go into more details about my post and the bugs I identified. They were super awesome and I’m looking forward to working with them more this week.

But I did find it odd that it hit their radar (no pun intended) so quickly and that they would be so immediately proactive about reaching out. And that’s when, just a few minutes later, Vigil texted me that my web server had fallen over.

Both this blog and my company website were inaccessible. And so were all the other sites I host on this machine. I couldn’t even SSH in. I logged into Linode to check the recovery console or maybe just reboot the machine, but happened to glance at the recent activity graph.

Here are the last 30 days of network averages for this server – with the first big spike corresponding to the morning after I published the post.

I’m not an expert, but I’m hardly a server-side dummy either. My $20/month VPS should have been able to handle this amount of traffic. Why didn’t it? Because after running WordPress since 2006, I switched to Ghost earlier this year. Not because I was dissatisfied with WP, but because I just wanted to learn more about Node and thought it would be a good way to get my feet wet. Well, that experiment ended literally last Sunday night when I decided to switch back to WordPress…and forgot to turn on caching.

While I did go through the trouble to make sure all of my static assets were coming from a CDN…I. Forgot. To. Turn. On. WordPress. Caching.

Luckily, my domain was already using Cloudflare for DNS. So I immediately logged in and turned on their caching layer (it had been off while I was testing my WordPress migration earlier that week), and within ten minutes my server naturally began responding again. Thanks, Cloudflare!
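(If you’re wondering what “turning on caching” even means here: page-caching plugins like WP Super Cache hinge on a single constant in wp-config.php, which tells WordPress to load the plugin’s advanced-cache.php drop-in. The plugin normally sets it on activation – a step that’s easy to lose in a fresh migration. A sketch:)

```php
<?php
// In wp-config.php: page-caching plugins (WP Super Cache, W3 Total Cache, etc.)
// require this constant so WordPress loads their advanced-cache.php drop-in
// before serving a request. Worth double-checking after a migration.
define( 'WP_CACHE', true );
```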

And what does Cloudflare have to say about my traffic?

Cloudflare Traffic Graph

513,286 requests. And keep in mind, because all of my static assets are served from another CDN (Hi, Bunny!), the only requests being served through Cloudflare were the HTML for my post.

And just for the sake of completeness, here’s what BunnyCDN‘s graphs look like for that time period…

BunnyCDN Traffic Graphs

So, those are the overall technical stats for the week. Before I get to the really interesting surprise at the end, let’s take a look at a few more unique / esoteric observations.

First, beyond the obvious conversations around Catalina that my tweet and post brought up, what was the topic that people emailed, @’d, and DM’d me about the most?

If you look closely at the Catalina screenshot I posted, you’ll see my desktop’s wallpaper is a hi-res satellite image of North America. Across all the above communication channels, I had nearly fifty different people ask me where it came from and if I could send it to them.

I was sorry to reply that I don’t have access to the image. Because it’s not a picture (well, yes, it is technically). It’s an app called Downlink made by the fantastic Anthony Colangelo. It sets your Mac wallpaper to a live satellite image of the continent of your choice every twenty minutes. And it’s free! Go get it!

Because I’m not normally popular, I’ve mostly forgotten that WordPress spam is a thing. But holeee shit. I don’t have comments enabled on my blog because I would rather folks respond by writing on their own blogs and linking to mine. You know, like how the web used to work before Facebook and Twitter destroyed all the good in the world.

But as my post climbed higher and higher on Hacker News, I started getting emails from my blog’s WordPress install telling me about new pingbacks. A small percentage were legit posts on other websites discussing what I wrote, but most – nearly 200! – were spam blogs that just regurgitated what another, credible website wrote, or even each other. And they were all shitty, awful wordpress.com subdomains like the following. (I’m posting an image of their URLs because I refuse to send them any google juice.)

Wordpress Pingback Spam

I know spammers are relentless and will ultimately ruin everything, and I certainly don’t envy Facebook’s and Twitter’s bot problem, but is WordPress.com (with the help of Akismet) not capable of shutting this stuff down? I don’t know. It’s not my area of expertise.

Moving on. With so many people visiting my personal website, did any of them explore further and reach my company website? Did it affect my app sales at all?

Yes!

Here are visits to my company website over the past 30 days:

clickontyler.com visits graph

And, more excitingly, here are my app sales for that period:

App sales graph

You can see the spike in sales and then how quickly they fall off again back towards normal. That bump certainly won’t make me rich, but it was a nice surprise.

Given that this post – and most of my blog and company website – are about the Apple ecosystem, what did the device breakdown by brand look like?

Device breakdown by brand

Apple devices (both macOS and iOS) were 66.7%. Google (specifically Pixels, not Android as a whole) was 5.4%. The unknown segment was 12.2%.

43% of visits came from the US, with San Francisco being the most popular city – not surprisingly.

Ok, I’m almost ready for the real fun statistic. But one more thing before I get there. What channel sent the most traffic my way?

Top referring channels

Hacker News sent by far the most traffic – nearly 3:1 above Twitter in second place.

But of those acquisition channels, what source provided the most engaged visitors? This is the question I’m most interested in and what I’ve been teasing throughout this entire summary post. What I want to know is average time spent by referring website. Some of these results make sense. And others I found incredibly surprising.

I’ve segmented these time-spent numbers to only apply to visitors of my popular blog post – visitors who entered my site via the short post from earlier in the week (the one that basically just reposted my viral tweet) aren’t included.

  1. instapaper.com – 7m 12s. People who took the time to bookmark my post for reading later must have really intended to read it.
  2. clickontyler.com – 5m 36s. This isn’t that surprising given that these are visitors coming from my company’s website who likely already have some passing interest in my content. (And I did make sure to filter out visits from my own IP address.)
  3. mjtsai.com – 4m 52s. For those of you not familiar, Michael Tsai has run one of the definitive Apple developer link blogs for years. Not only does he round up the best sources of information, he arranges them in a narrative structure that reads like a news article with his own commentary interspersed.
  4. relay.fm – 4m 49s. The kind folks over at the Connected podcast listed me in their show notes. I’m kinda scared to hear what their commentary was.
  5. newsblur.com – 3m 13s. My preferred RSS web app.
  6. theregister.co.uk – 2m 26s. Their reporter emailed me for a quote. I declined. So they published my email declining their request for a quote. I’m super media savvy.
  7. pinboard.in – 1m 57s. My favorite bookmarking service.

And now I have to skip through a couple pages in my stats app to get to the good stuff. Twitter and Hacker News.

Twitter, by nature, is probably going to rank low on time spent. And it did. 46s.

Hacker News?

There were two stories on HN last week linking to my website. The first amassed 315 comments. The second has 536 currently. And HN is generally known for having a good, mostly thoughtful comments section.

So, 851 comments – many of them full paragraphs or longer – about a 2,500-word blog post that should probably take an average of ten minutes to read.

Forty. Two. Seconds.

Broken

This isn’t the blog post I intended to write. In fact, last night I drafted one about my problems sending background push notifications with Amazon SNS (coming soon!). And after the ridiculously over-the-top shit-storm that blew up over my dumb tweet earlier this week, the last thing I wanted to do was step back into that arena. But this needs to be said. First, though…

I’m an Apple developer. It’s the specific nerd sub-culture that I identify with the strongest. I’ve been writing and selling my own software for macOS since 2003 – back when it was still Mac OS X – back when apps were called software. And I had apps in the iOS and Mac App Stores on day one of their respective openings. I’m not rich from it, and I don’t claim to even be that successful. But for a few wonderful years my own apps were my full time income. I’m not part of the old guard of Mac developers, but I’ve been around the block and doing this for over a decade and a half. I hope I’ve earned the right to spout off my stupid opinions on the internet occasionally.

And as an Apple software developer, I live through the Summer beta periods. On my secondary machine. And, in recent years, within virtual machines that allow me to do more intricate testing. I’ve seen easy-going, mostly spit-and-polish releases as well as more substantial user-facing and under-the-hood ones.

But Catalina has been different in two particularly gruesome ways that get even worse when combined.

The first is purely about stability and functionality. The early betas of Catalina were really, really broken. But that’s OK! That’s what betas are for. And while I can only speak for myself, I think most developers are more than happy to offer input to Apple and report bugs. So I’m totally fine using a wonky OS for a few months on a spare machine while I test my own software in addition to Apple’s.

But here’s the bad part.

Apple is becoming (already is?) a services company. And, let’s face it. Apple has never been good at anything involving the internet. I feel like they could have all the money and engineers in the world (which they basically already do) and still never completely get their services right because it’s just not in their DNA. Applications are. Hardware is. But put a network layer in there and they crap themselves. (Ok, not in every case. I’m obviously exaggerating to make a point. But the overall track record is iffy at best.)

And so when they decide to overhaul how CloudKit and iCloud Drive work and then merge those changes into an already buggier-than-usual beta OS, disaster can ensue. Because now those bugs – file corruptions, missing data, broken APIs and fundamental things that simply stop syncing – can spill over and infect your other Macs running a stable OS.

It’s my own fault for not knowing any better and signing into my Catalina machine with my personal Apple ID, but I needed to do some iCloud development this Summer and using my own ID just made things simpler. But after I ended up with (not joking) two-hundred duplicated ~/Documents directories – each with a random assortment of duplicated files of different revisions – I swore off dealing with Catalina and iCloud for the rest of the Summer. I put all of those new features on hold and planned to pick them back up after the GM when everything stabilized. I signed out of iCloud on every Catalina machine and VM and assumed Apple would get their problems sorted by Fall.

And I wasn’t alone in that assessment and strategy. Just google around for developer blog posts and tweets from the beta 3-ish time period. And that’s the puzzling and quite scary thing about all of this and the ultimate point I want to make. We (developers) were making it loud and clear that this stuff was very, very broken. And, somehow, someone at Apple made the call that it was OK to release a Public beta onto the world. A buggy, broken OS is one thing. Users installing it should know to beware. But a buggy, broken OS that also puts their data in jeopardy both on that machine and all their others linked by an Apple ID is unconscionable.

And still the betas marched on. And eventually it seemed like Apple realized what they were up against and threw in the towel and reverted the OS-level iCloud changes – much like discoveryd a number of years ago.

Again, I just read and heard about all of this. I was completely off iCloud on Catalina at this point and assumed the massive rollback would fix things.

So when Apple officially released Catalina to the public this week without so much as a press release or heads-up to developers (yes, there had been a GM build, but still, would an email to developers ahead of time have been so difficult?), I was ready to upgrade and go all-in.

Perhaps I was being naive, but I truly care about the experience my software customers have. And that means I have to live with the same system they do – even if that means dealing with OS bugs that just couldn’t be fixed in time for the .0 release.

But, damn.

I’ll go through some of the highlights (lowlights?) I’ve run into below, but I guess this is my thesis: the final (well, first) Catalina release, along with the outright awful public beta, makes me think one thing. Apple’s insistence on their annual, big-splash release cycle is fundamentally breaking engineering. I know I’m not privy to their internal decision making, and that software features depending on hardware releases (and vice-versa) are planned and timed years – if not half-decades – in advance. But I can think of no other explanation than that Marketing alone is in charge of when things ship. Why else would stuff so completely broken, so lacking the attention to detail that Apple is known for and (ahem) markets themselves on, have shipped, if not to meet an arbitrary deadline? Apple has so many balls in the air – and this metaphor doesn’t really make sense now that I’m typing it – but they’re all interconnected now that Apple is a services company. And as a services company they must find a way to ship features, fixes, and updates outside of the run-up to the holiday season. They need to be more (and, oh god, this word makes me want to vomit) agile.

An Annotated Summary of the Catalina Crap I’ve Noticed So Far

Allow or Deny

Let’s start with my now infamous tweet from the other day. (I’m an influencer!) This screenshot has absolutely been manipulated to make a point, but everything in it is real. It’s all of the security warnings and permission dialogs that I ran into (and screenshotted and arranged for maximum effect) during my iMac’s first startup after installing Catalina as well as about ten minutes of poking around and launching a few apps.

Hoo, boy.

The point I was hoping (but probably failed) to make is this: there are many thousands of people inside Apple way smarter than me – and a frightening pop-up frenzy that will absolutely condition non-technical users to blindly click “Allow” is the best solution they could arrive at, or ship in time?

Maybe they did countless user studies and determined this really is the safest and best approach. But I doubt it. I think it was a combination of poor management, hard deadlines, and probably a cavalcade of upper management and C-level executives who only use iOS as their daily driver and simply lack the imagination, experience, and technical vision to realize a modal pop-up flow that (kind of) works on a touch device does not scale to an overlapping, multiple-window, keyboard and cursor driven interface, i.e., the desktop computer.

“Security”

Let me go ahead and silence the Hacker News crowd and openly admit that, yes, I’m a geek, a developer, a technical person, and most definitely not a normal user.

That said, there needs to be an I’m-Not-A-Dummy switch in System Preferences because all my shit’s broken and I can find zero guidance from Apple on how to fix it.

I have a number of background jobs and processes on my iMac, which I basically treat as an always-on, home server. Some are run via cron, others by launchd. Some are run under my user account, others as root. A few examples:

I have an AppleScript that runs every ten minutes and downloads photos from a server and imports them into Photos.app. After upgrading to Catalina, it failed every time. I stopped cron so I could debug and run it manually. The first time I execute it, Terminal.app asks for permission to access my ~/Photos directory. OK. Then it prompts to allow Terminal.app to control Photos.app. OK. And, finally, and I’m not sure why given I already granted permission for the ~/Photos directory, it asks for permission to control Finder.

iTerm would like to control Finder

sigh

With all those permissions granted, I add a few log statements and turn cron back on. The job runs. And fails. Again. Because even though I granted permission moments ago, now that it’s being run in a slightly different way, Catalina decides to lock it down again. How is this decided by macOS and how do I fix it? My googling has failed me so far.
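The job itself is nothing exotic. Under Catalina, even a boring crontab line like this one (script path and log location hypothetical) now trips the new permission machinery – and apparently you may also have to grant /usr/sbin/cron Full Disk Access in System Preferences before jobs touching protected folders will run at all:

```shell
# Run the Photos import every ten minutes, logging output for debugging.
*/10 * * * * /usr/bin/osascript "$HOME/scripts/import-photos.applescript" >> "$HOME/Library/Logs/import-photos.log" 2>&1
```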

Next. Because I’m an idiot with reasons, I have a python daemon that launches as root via launchd and remains running in the background. It is now silently failing because it isn’t allowed to access an external USB drive.
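For reference, a daemon of that sort is wired up with a launchd plist roughly like this (label and paths hypothetical). And because a background process can’t pop a consent dialog, the Catalina fix apparently involves granting the daemon’s executable Full Disk Access (or the new Removable Volumes permission) by hand in Security & Privacy:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.mydaemon</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/python3</string>
        <string>/usr/local/libexec/mydaemon.py</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```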

Oh, and while debugging the AppleScript example from a paragraph above, every time I saved my cron changes in vim, the system would throw up a dialog asking for my permission to allow Terminal to modify, you know, my own personal crontab that I explicitly invoked an editing session of. (Although I’m pretty sure this was also a thing in Mojave. But the point still stands.)

I guess Apple is trying to protect less-technical customers who might inadvertently install a malicious recurring background process as root? Or accidentally read a file from an external volume while running a shell script that this non-technical user opened up a Terminal, edited, made executable, and invoked themselves? I suppose there’s an attack vector there.

iCloud Password Shenanigans

After upgrading to Catalina, like basically every other recent macOS release, I found myself logged out of iCloud. Facebook and Gmail have never once in my life expired my session on purpose. But since my iCloud data is stored on an encrypted, non-removable hard drive, protected by a T2 chip and biometric security, I can see why it’s best if Apple logs me out every time I install a software update.

OK. Fine. I’ll log back in, but of course iCloud rejects my password in System Preferences so many times that they eventually lock me out and force me through a forgot password flow just so I can change my password back to what it always was.

That done and finally logged back in, all of my iOS devices start beeping and I find this:

My Apple ID is being used on a new device apparently.

To be expected given that I just logged in on a new(?) device. But why does my iMac’s hostname now have a “(2)” appended to it? Reasons, I’m sure.

Local Account Password Shenanigans

After the Catalina upgrade and spending some time getting my apps, settings, etc. kind of back to normal, my wife tried to login to her account on that iMac.

Everything went fine, and we got some of her software updated, too.

But then the Mac went to sleep.

Her local Mac account password is a simple, all-lowercase English word without spaces or numbers or special characters. macOS wouldn’t accept it when she tried to login again.

It wouldn’t accept it when I typed it in. Nor when I decided the keyboard must have somehow malfunctioned during the last half hour and – thinking I was extremely clever – tried to log in as her via a remote screen sharing session. That failed as well.

That was about 36 hours ago and the problem persists through multiple restarts. I don’t have it in me right now to try and debug this. But I’m not worried. All of her data is backed up. I’m ready to blow away her account and create a new one. But… Apple?

Photos.app

It took around eight hours for Photos.app to upgrade my 200GB iCloud Photos library the first time I opened it on Catalina. Since then, across multiple reboots, it simply refuses to update with new photos added to iCloud from other devices. Or upload new photos to iCloud that I imported directly on that machine. It just says “Updating…”, forever.

More Little Things

I found earlier today that I couldn’t restart my Mac because an application was still running. The only thing open (but idle) was Xcode. I did a quick ⌥⌘⎋ and discovered this:

System Preferences is not responding

I had no idea System Preferences was even running. It wasn’t visible in the Dock?

As I mentioned earlier, I had to sign in to iCloud again (a few times) after upgrading. A day later, this popped up while I was using TextMate:

Can't connect to FaceTime

🤷‍♀️

After upgrading to Catalina, macOS made me reauthorize every app that wanted to send me notifications. Ironically, the following alert appears every time I reboot despite always dismissing it using the most definitive option Apple provides and never giving whatever-process-is-showing-it permission to notify me of anything in the first place:

Welcome to macOS Catalina. You're in for a treat!

Anyway…

I love the Mac and everything its software and hardware stand for. The iMac Pro and new Mac mini are phenomenal. The revamped Mac Pro (six years? really?) is a damn beast. And, honestly, I don’t even mind USB-C.

But the keyboards, the literally hundreds if not thousands of predatory scams on the Mac App Store, whatever the fuck is going on with Messages.app on macOS, iCloud Drive, the boneheaded, arrogant, literally-put-on-the-consumer-facing-marketing-website claim that iPad-to-Mac with Catalyst was merely a checkbox, all the dumb, stupid little bugs I mentioned above, and the truckload of other paper-cuts I’m sure to run into once I’m on Catalina for more than 48 hours…

My god.

It is absolutely clear that the Mac is far outside what the upper ranks of Apple are focusing on.

I’m not trying to throw Engineering under the bus. I’m friends with many wonderful, talented, hard-working, and caring Apple developers who want the Mac to fucking thrive. What I am doing is explicitly shitting on management and blaming the executive team for allowing all of the above to ship.

macOS 10.15 Vista

I completely realize and wholeheartedly own up to the fact that I’m a geek and a Mac power user above and beyond what normal muggles will ever experience. Nonetheless, this is the first-run experience I was greeted with this afternoon after upgrading to Catalina.

macOS Catalina First-run Experience Screenshot

Another update…

It’s now 48 hours later. Ignore my original update below. Go read this instead.

Update four hours later…

Well, that escalated quickly.

I’ve been on Twitter for twelve years, and OF COURSE my two biggest tweets would be me making a dumb joke and dunking on Apple.

Anyway, the screenshot above deserves a full explanation.

First of all, it’s about 95% accurate.

I took it after upgrading an existing Mojave system to Catalina this afternoon. Once the installer finished and I worked my way through the usual post-installation prompts/windows/whatever, I left and took my son to go get a flu shot. (Glamorous life of a father and all that.)

When I came back about forty minutes later, that’s basically how the screen looked. I thought it was mildly funny and began arranging all of the permission dialogs so they didn’t overlap.

And that’s when all the “XXXX would like to show notifications” prompts appeared. So I took another screenshot.

Soon after that, I realized that – like with nearly every macOS update – I had been logged out of iCloud, which meant time for a screenshot yet again.

I only spent about ten minutes on that system today. But it was enough time to capture all of these papercuts and combine them into one truly-awful über screenshot.

I want to make clear that I’m not blaming the talented Apple engineers who obviously worked their butts off on Catalina just like they do every release.

My side-eye is squarely directed at the managers and Marketers who push for such an insane release cycle. And also at the executives who – shiny new Mac Pro and XDR display be damned – obviously don’t see the Mac as a priority any longer.

From laptop keyboards to the Mac App Store to the insane new focus on Services, it’s clear the higher-ups at our favorite fruit company just don’t give a shit any longer.

And I’m not angry.

I’m just sad.

A Stupid-Simple Automated iOS Build Script

I’ve worked with a bunch of different automated iOS build systems over the years, at the various companies I’ve worked for and with my own apps. In the early days of the App Store, many of these were completely home grown. As the toolchain matured, I’ve dealt with Xcode bots as well as dedicated SaaS companies that provide build farms, like Microsoft App Center and Bitrise. I’ve also had the horrible misfortune of being tasked with maintaining a dilapidated Frankenstein of a Jenkins installation that talked to an underpowered Mac mini over a shoddy VPN connection.

What I’ve learned from all those setups is that as useful as they are, they’re generally a bitch to maintain once they reach even a moderate level of complexity. So I tend to shy away from them until there’s a real need.

Over the last few weeks at my current job, that need has presented itself in two ways.

  1. We added a watchOS target to an existing app already in the App Store. For reasons I don’t completely understand even after hours of debugging, it completely broke my co-worker’s ability to submit builds to App Store Connect. We did a fresh clone of the project, blew away Derived Data, deleted and reinstalled every certificate and provisioning profile. Nothing worked until it seemingly fixed itself about a week later for no apparent reason.
  2. For reasons I don’t want to (and can’t) really go into, we have about fifty WiFi networks broadcasting through our small office space – many of them physically moving around at different times. That, plus what we think is a shitty Comcast modem, means the WiFi we actually connect to randomly fluctuates between passable, to mostly broken, to everyone just giving up and tethering to their phones. The result is that it can take multiple tries and multiple hours to successfully upload our 250MB .ipa to App Store Connect – if it even works at all.

The solution to these two problems? An automatic, repeatable way to produce builds located somewhere else with a good network connection.

Like at many companies, our executives don’t want to use a 3rd-party build service because they don’t want our source code in someone else’s control. So that meant we needed to build something ourselves. And while we may eventually pony up for a hosted Mac mini somewhere, for now, during this just-get-it-working phase, I decided to go the pragmatic route and set up a build system on my (mostly idle) iMac Pro at home, which sits behind a very nice Comcast Business connection.

I’m not a DevOps expert. And I’m certainly not an expert when it comes to the thousands of arcane Xcode build settings. But over the years I’ve become very good at diagnosing code signing issues and scripting various bits and bobs together on macOS.

So I spent a couple nights piecing together a straightforward, stupid-simple build script that does exactly the minimum necessary to accomplish our goals. Those being 1) the ability to execute a reproducible build on-demand, and 2) automatically build, sign, and submit to Apple on every commit to a specific release branch.

The result is this GitHub project. It’s a single bash script and works exactly the way my own brain expects tools of this nature to work.

For a manual build, you pass the script a JSON file containing various build settings, and it builds the project and then (optionally) submits to Apple.
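I haven’t reproduced the script’s actual settings file here – the key names below are invented for illustration, not necessarily what the real script expects – but the general shape is a flat JSON object of build options:

```json
{
  "scheme": "MyApp",
  "configuration": "Release",
  "exportMethod": "app-store",
  "submitToAppStore": true
}
```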

For automatic builds, I’ve included a sample launchd .plist that checks for new commits every minute and, if any are found, kicks off the build process.
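This isn’t the actual sample from the repo – the label and paths here are placeholders – but a minimal launchd job of that shape looks something like this, with `StartInterval` doing the once-a-minute polling:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>Label</key>
	<string>com.example.buildbot</string>
	<key>ProgramArguments</key>
	<array>
		<string>/Users/me/build/check-for-commits.sh</string>
	</array>
	<!-- Run the check script once a minute -->
	<key>StartInterval</key>
	<integer>60</integer>
</dict>
</plist>
```

Load it with `launchctl load` and launchd takes care of the scheduling from there.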

Oh, and as the build progresses through all the various steps, the script can optionally update you with its progress in the Slack channel of your choosing. Even better – and I’m quite proud of this – if an error occurs, it will post the full stdout and stderr log files as Slack attachments so the whole team can immediately debug and see what went wrong without having to SSH into a remote build server.
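The mechanism behind those progress messages is just Slack’s incoming-webhook API. A minimal sketch – the real script’s message format is surely richer, and `SLACK_WEBHOOK_URL` is assumed to be configured in the environment:

```shell
#!/bin/sh
# Minimal sketch of posting build progress to Slack via an incoming
# webhook. SLACK_WEBHOOK_URL is an assumed environment variable.

slack_payload() {
  # Build the minimal JSON body that Slack's incoming webhooks accept.
  printf '{"text": "%s"}' "$1"
}

slack_notify() {
  # Skip silently when no webhook is configured, so the build
  # script still works without Slack.
  [ -n "$SLACK_WEBHOOK_URL" ] || return 0
  curl -s -X POST -H 'Content-Type: application/json' \
    --data "$(slack_payload "$1")" "$SLACK_WEBHOOK_URL" > /dev/null
}

slack_notify "Build started on $(hostname)"
```

Posting the stdout/stderr logs as attachments on failure goes through a different Slack endpoint (file uploads), but the same pattern applies.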

I think the whole setup is really great. It suits our needs perfectly. I make no claims about it being the right choice for your situation, or that I even did anything remotely unique / interesting when I pieced it together. There are a thousand build scripts out there; this one happens to be mine. I guess what I’m saying is please don’t make fun of my shell scripting abilities.

Check out the README for more details and usage instructions.

More Apple Photos Fuckery

Back in May I posted a Twitter rant about how iCloud Photos was fucking up videos I shot on my phone after upgrading to iOS 12.3. I’m happy to report that hasn’t happened again. But now I’m running into this…

Every two months I upload a bunch of photos to Shutterfly and mail physical prints of my kids to my 95-year-old grandmother – their great-grandmother. The Shutterfly app for iOS makes this very easy. Just select the photos from your library, wait a few minutes for them to upload, and then tap on my grandmother’s saved mailing address and hit send. The whole process takes about five minutes and it brings her a whole lotta joy.

The issue I’m running into is that I’m only able to choose photos from my library. Because I’m doing this on my phone and because Apple doesn’t understand that families might want to see each other’s photos, I can’t pick any pictures my wife may have taken of our kids.

But, as I’ve written about, all of the photos we both take are stored in Google Photos. Great. So today I went to Google Photos, selected about sixty of the best shots, and downloaded them to my Mac.

The next obvious step? Upload those to Shutterfly and ship them off to my grandmother.

Except that Shutterfly’s website uploader doesn’t work in Safari. Like everyone else doing “modern” web development these days, I’m sure they’re only testing in Chrome. Fine.

No big deal. I’ll just import the photos into an album in Photos.app on my Mac, which will then sync to my iPhone, and then I can use the Shutterfly app.

And here’s where Apple’s latest Photos.app fuckery comes into play…

That’s right. I dragged photos from Finder into an empty album in Photos.app and watched as they were imported and then subsequently deleted.

I did this five times with the same result before I finally thought to record the whole fiasco for posterity. No matter whether I dragged from Finder or used the “Import…” menu item – same thing. I even checked the special “Imports” album in the Photos.app sidebar, which shows the most recent group of photos you’ve imported. Nope. Not there either.

Look. It’s not all bad news. iCloud’s shared photo streams? Those are rock-solid and amazing. Our family and friends hardly ever post to Facebook or Instagram anymore. We almost exclusively share through Apple’s shared albums. And like someone recently remarked on Twitter, the photos/videos we share get way more meaningful engagement (likes + comments) from our close circle of friends and family than they ever did via Facebook.

But jesus fucking christ, Apple. Have you really gone so all-in on “iOS is the future” that you’ve abandoned Mac Q/A? Is there anyone on the executive team that actually uses a Mac as their daily driver? I get the feeling that the only reason iCloud shared albums work as well as they do is because they happen to also be a major feature of iOS. But Photos.app on Mac? Or Messages.app on Mac? Pffft. Fuck that. iPad OS is going to have multiple windows soon. Who needs a Mac?

Reviving an Old Mac App

A long time ago, on a Mac far, far away…

In 2011 I had an idea for a tiny little Mac app called CommandQ.

I’m a terrible touch-typist and it just so happens that on U.S. style keyboard layouts, the Q and W keys are right next to each other. That means I’d often press ⌘W to close a window and accidentally hit ⌘Q – quitting the entire app instead of just the window.

So I spent a few hours each night over the course of a week or two and built a tiny little Mac app that intercepts your ⌘Q presses and stops the frontmost app from quitting. Instead, it shows a window with a timer. If you keep holding down ⌘Q until the timer finishes, it will go ahead and quit the app just like you wanted. But if you let go before time runs out, you can avoid a dumb mistake.

I named the app CommandQ and put it for sale on my company’s website for a few bucks. It was unlike anything I had built before. All of my other apps were more complex and primarily focused on helping web developers and designers do their jobs better.

But my existing user base welcomed CommandQ with open arms. I iterated and added a whitelist / blacklist feature to let you choose which apps CommandQ works with or ignores.

And then I left the app alone – I haven’t shipped a single update for it since mid-2012.

But, to my pleasant surprise, in the eight years since its release, my little utility app has gained over 20,000 customers!

It’s also brought me a lot of guilt.

The code was so old (pre-ARC Objective-C, targeted against the 10.5 SDK, and linking against OpenSSL) that every time I thought about trying to fix a few bugs, I got blocked just trying to get the project to compile on modern versions of Xcode and macOS. Not being able to add some requested features or even fix the one crashing bug just made me feel awful and even less motivated to actually improve the app.

So, CommandQ languished kinda-sorta-working for a number of years, but people still kept buying it and emailing to say how much they enjoyed the app.

With WWDC 2019 a few weeks behind us, it’s again that time of year where I spend my nights testing my Mac apps for bugs against Apple’s upcoming OS. And I quickly discovered that CommandQ is very much, incredibly broken on 10.15 Catalina.

What to do?

I don’t normally suggest rewriting a working app, but with CommandQ being such a small codebase, I figured why not? So I spent the next week rewriting it in Swift with modern macOS technologies.

The result is CommandQ 2.0.

I’ve fixed all the known bugs and crashers, added the ability to prevent accidentally closing windows (⌘W), and pushed the app through Apple’s notarization process as well.

If you’re an old CommandQ user, I hope you enjoy this new version for many more years to come. You’ll be able to upgrade to version 2.0 for a one-time purchase of $6.99 – no subscriptions here! Check your registered email next week for the coupon code I’ll be sending to claim your discount.

If you’re new to CommandQ and think it sounds useful, you can download a free 14 day trial and then purchase a license for $9.99.

The Mac Won’t Be Sherlocked

With last week’s WWDC news announcing that Catalyst (Marzipan) is now an official thing, there have been a metric crap-ton of Twitter Hot Takes™ declaring UIKit the one true way forward.

I’m not going to debate that.

Instead, I just want to point out that not everything in computing revolves around a 44pt tap target by highlighting a few amazing Mac apps from oft-overlooked developers who do a phenomenal job adhering to the very best parts of macOS while continuing to push the platform forward.

Let’s begin.

Acorn

First up, Acorn by Flying Meat. For $29, an absolute bargain, you get a top-notch, native image editor designed exclusively for the Mac that is sure to make ex-Photoshop users feel immediately at home. It’s a first-class Mac citizen, not some cross-platform Frankenstein, which means it opens lightning-fast, is easy on your battery life, and can take advantage of the latest graphics tech that Apple offers.

I use Acorn almost every day for editing images before I drop them into my Xcode projects and for optimizing photos I post to this blog. Is it as powerful as Photoshop? No. Of course not. But in the many years I’ve relied on Acorn (as a power user, not as a professional designer) I can’t remember a single time I’ve needed to reach for an Adobe app.

It’s a prime example showcasing how a solo developer can build an outstanding app when they take full advantage of Apple’s powerful frameworks.

Anesidora

Anesidora is a native Mac app for listening to Pandora.

The streaming music giant has long offered a pretty good iOS app, but only recently launched their Mac version. Unfortunately, it’s an Electron turd. (Man, I wish I had coined that phrase.) Luckily for you and me, Adam Różyński has built this wonderful app directly on top of Pandora’s API.

What I love most about Anesidora, besides the beautiful UI, is how well it integrates with all the various macOS nooks and crannies. You can see track changes in Notification Center, assign global hotkeys, Apple’s keyboard media keys do the right thing, and it will even pause the music if you take an AirPod out of your ear.

And because it’s a real Mac app, there’s no need to fire up a full instance of Chrome and Node just to play some music. Right now as I’m typing this and listening to Talking Heads, Anesidora is using 92MB of RAM and virtually no CPU. For comparison, Safari, idling in the background with no windows open, is using 105MB.

DEVONthink

I’ve written twice about DEVONthink recently, but holy damn do I ever love this app. It’s been around for years – I’ve been a customer since at least 2009. It’s where scanned copies of all the documents that would normally be in my fireproof safe are stored, along with any piece of paperwork I might need to refer to in the future, tons and tons of reference material, archived bookmarks, and more. And whether PDF, image, or what-have-you, it’s all indexed, made searchable, and synced across every Mac and iOS device.

DEVONthink has a truly deep and powerful feature-set. But the reason I want to highlight it in this post is because it’s a perfect example of how you can use AppKit to build a dense, information-rich, highly usable interface. UIKit is wonderful for its simplicity, discoverability, and ease of use, and you can certainly design powerful, professional apps with it, but that’s not its default state. Not even on iPad. AppKit, however, naturally lends itself to concurrent areas of focus and visual hierarchies. You can see more of your work at once and do more with it. Part of that, of course, is that AppKit can run on giant screens compared to what UIKit has traditionally been constrained to. But it’s also a matter of a different philosophy between the two (competing?) frameworks.

Fantastical

Fantastical is the calendar app that Apple should have shipped with macOS. Flexibits goes beyond the basics and shows how powerful a pro version of Apple’s consumer-focused software can be.

Whenever I open a new Mac app, the first two things I do are to look at every menu item and then explore all the settings available in Preferences. Nothing gives me more joy (or sick pleasure) than hitting ⌘+comma and seeing a big, beautiful Preferences window with multiple panes full of fiddly settings I can tweak to my liking. I’m not advocating that developers should provide a switch for every option they were too afraid to make a decision about – sane defaults are a good thing and there’s paralysis in choice, etc – but flexible third-party software is a hallmark of the Mac experience.

Fantastical takes the solid foundation Apple built into Calendar.app (née iCal) and extends it for power users in the ways indie developers are famous for.

Keyboard Maestro

As a geek first, a developer second, and someone who just wants to get shit done third, I love how scriptable macOS is – how automation is part of its DNA.

At the lowest levels you’ve got full access to a Unix environment with Bash (haha) and all the other standard scripting languages and tools. (For now.)

And at the graphical level you’ve got AppleScript and its friendly cohort Automator, as well as the Accessibility frameworks. The former allow novice users to bend the system to their will without having to be real programmers, while the latter give developers powerful means to manipulate, control, and extend other apps in ways the original developers would never have thought of. (Love OmniFocus? Wait til you hear how it began.)

Keyboard Maestro is a shining example of a best-in-class automation tool for Mac. No matter how I try and describe the app, I’ll be doing a disservice to how insanely powerful it really is. Any time I come across a repetitive task, or wish another app behaved in a different way, or wondered if I could connect this with that, chances are, there’s a way to do it with Keyboard Maestro. It’s probably best I just let the expert tell you how it’s done.

(And it’s not just Keyboard Maestro, there’s FastScripts, Hazel, TextExpander, BetterTouchTool, Script Debugger, and all the various apps that followed in the footsteps of Quicksilver.)

Anyway

I love my phone. I love my iPad. The watch is pretty great and tvOS is tolerable.

But the Mac.

Wow, did they get a lot of things right 35 years ago. It just fits how my brain works.

I’m not afraid of change. I don’t care if I build my UI with drag and drop, purely in code, or with a whizz-bang new DSL. What matters to me is the end result. And after a decade of near stagnation, I want to see macOS pushed forward in bigger and better ways.

Taking an iPad app and adding a menubar, sidebar, mouse support, and even multiple windows may be fine for Twitter, but it’s not going to cut it for end users who expect a level of power, finesse, flexibility, and soul that the Mac exudes. At least not in the Catalina release.

Steve told us there will be trucks. And there will be cars. It’s OK if they’re not the same. I know which one I want to be driving.

My Favorite Email Spam Filtering Rule

All of my email is hosted at FastMail across two domains:

And across those two domains I basically have just three email addresses. [email protected] is my primary email address that I migrated to when I switched away from Gmail five years ago. And then there’s [email protected], which was originally my customer support address but has since been (mostly) replaced by [email protected].

However, one of the really cool things you can do when you accept email at your own domain name is a catch-all address. This means that [email protected] and [email protected] will be delivered to me.

This is great because I can easily create one-off or throw-away addresses like [email protected] or [email protected] that I can filter or block entirely. (You can also do this with Gmail by using [email protected]. Unfortunately, because some web developers are stupid and others are outright malicious, many websites will reject emails containing a +.)

The downside is that spammers are just bizarre. I’ll get random spam sent to [email protected] and [email protected], as well as seemingly legit messages sent to [email protected]. (An address I’ve never used, so someone is obviously trying to correlate names to domains.) It’s clear some spammers are just blindly sending to random addresses, while others are bots (people?) trying to intelligently guess possible addresses.

Luckily, FastMail’s spam filters are great, so I don’t ever see most of that junk. But a lot of the more legitimate looking ones do make it through. How do I filter those out?

I could simply block anything not sent to my real address, but I like having the option of using the catch-all feature, as I make use of it quite frequently. Another option might be to set up a whitelist of allowed recipient addresses, but that would quickly become a pain to remember to update every time I gave out a new address.

The solution I came up with is simple. (It’s hardly innovative, and I doubt I’m the first person to come up with this method, but I thought it worth sharing.)


I created a rule that moves any email not addressed to one of my primary emails to a folder called Aliases. This serves three purposes:

  1. It allows me to continue using my domain name’s catch-all email address feature, but keeps the truly bizarre as well as possibly legit spam from clogging up my inbox.
  2. Much like SaneBox‘s @SaneLater feature, I can check-in on this other folder at my leisure because I know that any email that ends up there is either unimportant or plain spam.
  3. It lets me quickly see at a glance – and set up a rule to block – any repeated, bogus emails. If these types of emails were mixed in with the ones sent to my real address in my inbox, it would be harder to spot the invalid catch-all ones – especially on mobile devices, which typically don’t show the full to: address.
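The rule itself is just a predicate over the recipient address. As a rough sketch in shell – the addresses here are placeholders, and FastMail actually expresses rules as Sieve scripts rather than anything like this:

```shell
#!/bin/sh
# Decide which folder an incoming message belongs in, based solely on
# the recipient address. All addresses below are placeholders.

route_folder() {
  case "$1" in
    # Mail to a primary address stays in the inbox.
    me@example.com|support@example.com|help@example.com)
      echo "INBOX" ;;
    # Anything else caught by the domain's catch-all goes to Aliases.
    *@example.com)
      echo "Aliases" ;;
  esac
}
```

Any one-off alias – real or invented by a spammer – falls through to the catch-all branch and lands in the Aliases folder instead of the inbox.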

Like I said, this isn’t exactly rocket science. But it’s a nice improvement I made a few months ago, and it has saved me a good bit of time dealing with those extra-obnoxious emails that slip through my spam filter.

Automatically back up the full contents of your Pinboard (and Pocket) bookmarks in DEVONthink

This is a followup to my post last week about archiving your existing Pinboard bookmarks into DEVONthink. I wanted to clarify two points and also explain the new workflow I’ve set up to automatically archive any new websites I bookmark – whether in Pinboard or Pocket.

  • First, in my last post, I said “I recently stopped using Pinboard as my primary bookmarking service”. I misspoke. I’m still using Pinboard to bookmark websites I want to remember as I come across them. However, I’m no longer paying for their add-on archiving service. It’s a great feature, but it had just a little too much friction for me to make genuine use of it. I like my new solution, which I’ll detail below.
  • Second, I just want to give a heartfelt shout-out to Pinboard for simply being an amazing example of a phenomenal service that a solo developer can build “the right way” over many years into a sustainable, profitable business without relying on venture capital. While not nearly as successful, it’s the same path I’ve tried to follow with my own business.

Anyway, here’s what I’m doing now.

With all of my previous bookmarks safely archived in DEVONthink, I turned my attention to automatically importing new ones as well.

I’m using the new version 3 beta of DEVONthink and discovered it has a new “Smart Rules” feature. (At least I think it’s a new feature. If it’s not, how in the world did I miss it for so many years?) Smart Rules are like the typical NSPredicate-based Smart Folders you see in other Mac apps such as Mail. But instead of simply showing you a filtered view of your data, Smart Rules allow you to perform a chain of actions on items that match your criteria.

Both Pinboard and Pocket offer RSS feeds of your bookmarks. And DEVONthink allows you to subscribe to a feed and import its items into your database.

So, I added my Pinboard and Pocket RSS feeds into DEVONthink, and then created a Smart Rule that runs whenever a new item is imported. The rule takes the URL of the new item, converts it into a .webarchive, and moves it into a pre-determined group for permanent storage.

The result is that I now have an automatic snapshot of everything I bookmark as it appeared at the time I saved it, which can be searched and retrieved via DEVONthink’s amazing full-text search engine, or exported as a PDF at a later date if I ever have the need. And with DEVONthink To Go, those archives are also available across all of my iOS devices as well.

After you’ve added your RSS feeds to DEVONthink, here’s a screenshot of the Smart Rule I’m using to do the archiving.


How to Import Your Pinboard Bookmarks Into DEVONthink and Convert Them to Searchable Web Archives

Pinboard is a web-based bookmarking service that can optionally crawl the websites you save and store a complete copy of how they appeared at that time.

Because Pinboard is a good web citizen, they allow you to request an archive of all of your bookmarks and their saved contents as a tar.gz file.

I recently stopped using Pinboard as my primary bookmarking service and wanted to export my data and store it somewhere in a searchable, archived format.

I already use DEVONthink to archive and search all of my scanned documents and PDFs, so it seemed like a natural choice as it also supports just about any other file format – including macOS web archives.

The backup archive that Pinboard gives you contains a folder for each of your bookmarks containing the complete contents of the scraped website as well as a JSON-formatted manifest file of metadata.

I spent a few hours trying to wrangle everything into DEVONthink using some AppleScript trickery, but was never successful. But then two thoughts occurred to me:

  1. You can save a URL to your DEVONthink database and then use a menu command to scrape the website into a PDF or .webarchive.
  2. .webloc files can refer to any URL scheme – including file://.

What if I generated a bunch of .webloc files – each one pointing to the location on disk of my Pinboard bookmarks? And then imported the .weblocs into DEVONthink and told it to crawl those URLs?

It worked!

And if you also happen to have this rather unique need, well, I’ve made the PHP script that does it all for you available on GitHub.

The PHP script in the repo will read the contents of your Pinboard archive and generate a .webloc file for each bookmark. Those files can then be imported into DEVONthink as file:// URLs pointing to the archived web content on disk. Then, DEVONthink can “crawl” those file:// URLs and convert them into searchable web archives. Afterwards, the .webloc files can be deleted.
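The actual script in the repo is PHP, but the .webloc trick itself fits in a few lines of shell, since a .webloc is just an XML plist containing a single URL key. (The paths here are made up for illustration.)

```shell
#!/bin/sh
# Generate a .webloc file pointing at an arbitrary URL – including a
# file:// URL for content already on disk. A .webloc is a plain XML
# plist with a single URL key.

make_webloc() {
  # $1 = URL to store, $2 = output path for the .webloc
  cat > "$2" <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>URL</key>
  <string>$1</string>
</dict>
</plist>
EOF
}

make_webloc "file:///tmp/pinboard-archive/example/index.html" /tmp/example.webloc
```

Point one of these at each bookmark’s folder in the Pinboard archive, and DEVONthink has everything it needs to crawl and convert.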

On my iMac Pro with a fast internet connection, importing 3,500+ bookmarks and their 2GB worth of web content took about four hours. After it was finished, I had a fully searchable archive of all of my Pinboard bookmarks that can be synced across all of my Macs and iDevices.

Hopefully someone else will find this script useful.