Roar Notifications

As much as it is a job and source of income, for me, building software is also a way to relax, a form of self-expression, play, and in the best moments – joy.

Today, I want to show off a project so far along the joy side of that spectrum that it blows past being silly and borders on pure ridiculousness.

Let’s reskin Notification Center on macOS.

And make it look like Winamp.

Last year, I wrote two posts titled 240 Invisible Pixels and Surtainly Not. On the surface, I complained about how difficult it is to interact with the Big Sur redesign of the menu bar and Notification Center alerts. I believe Apple has fallen victim to a design fad where they unintentionally make software harder to use in their quest to simplify the design visually.

As Michael Tsai recently put it:

Making things look simple by hiding things doesn’t actually make them simple.

Affordances, color, contrast, delineation – hell, even discernible buttons – have fallen out of favor in Apple’s modern design language to a point where function often takes a backseat to form – especially on macOS.

But ignoring recent usability regressions in favor of clean-looking design, what kills me the most is how joy, personality, whimsy, fun, and goddam delight are lost in service to aesthetics.

Modern software, with Apple leading the charge, feels…sterile.

How much design and engineering effort has shifted from delighting users to finding new ways to interrupt their workflows with dark patterns leading to greater services revenue? It all feels gross.

Speaking of which, did you know you can save 3% on all your Apple purchases with Apple Card?1 Apply and use in minutes.2

And so every time I have to tell Apple “Not Now” when “No” should be a choice, or when I have to guess where to hover the mouse to reveal what should be a call to action, I get frustrated.

I could let those things keep bothering me. Or I can tell the latest design trends to go fuck themselves and build something stupid and silly and fun and delightful.

Does everyone remember Growl? It was a fantastic 3rd-party app notification framework for OS X that Apple aped Notification Center from. (It’s OK. Some apps are meant to be Sherlocked.)

The best part about Growl was the plethora of built-in and community-contributed skins. It was so geeky and personal, and you could make app notifications look like anything.

Last month, after a reader came up with a better way to grab two-factor authentication codes from Messages.app, I started thinking about what other information might be hanging around in the bowels of macOS – just waiting to be explored.

I eventually stumbled upon the SQLite database that holds app notifications. To my surprise, all the contents of your notifications are open and readable. And that’s when Growl came to mind. Could I build my own poor man’s Notification Center with different skins and themes?

It turns out, yes.
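
If you want to poke around yourself, here’s roughly what reading the most recent notification looks like. This is a minimal sketch based on the db2 location and schema I found on my machine – the table and key names aren’t documented anywhere, Apple could change them in any release, and you’ll likely need Full Disk Access for the read to succeed.

import Foundation
import SQLite3

// Notification Center's database lives in the per-user Darwin directory
// (the same path that `getconf DARWIN_USER_DIR` prints in Terminal).
var buf = [CChar](repeating: 0, count: Int(PATH_MAX))
confstr(_CS_DARWIN_USER_DIR, &buf, buf.count)
let dbPath = (String(cString: buf) as NSString)
    .appendingPathComponent("com.apple.notificationcenter/db2/db")

var db: OpaquePointer?
guard sqlite3_open(dbPath, &db) == SQLITE_OK else { fatalError("Can't open \(dbPath)") }

// Each row of the `record` table stores one notification as a binary plist.
// (Table, column, and key names are what I saw while spelunking – no promises.)
var stmt: OpaquePointer?
let sql = "SELECT data FROM record ORDER BY delivered_date DESC LIMIT 1"
if sqlite3_prepare_v2(db, sql, -1, &stmt, nil) == SQLITE_OK,
   sqlite3_step(stmt) == SQLITE_ROW,
   let blob = sqlite3_column_blob(stmt, 0) {
    let data = Data(bytes: blob, count: Int(sqlite3_column_bytes(stmt, 0)))
    if let record = (try? PropertyListSerialization.propertyList(from: data, options: [], format: nil)) as? [String: Any],
       let req = record["req"] as? [String: Any] {
        // The interesting bits hide behind terse keys like "titl" and "body".
        print(req["titl"] ?? "", req["subt"] ?? "", req["body"] ?? "")
    }
}
sqlite3_finalize(stmt)
sqlite3_close(db)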

Could I layer on some extra features, too? Yep.

Say hello to Roar.

Roar is a little Mac app that watches your Notification Center database. When new notifications arrive, it takes over and displays them with one of five ridiculous skins.
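
The “watches” part is less magical than it sounds. A dispatch source can tell you when the database file changes – here’s a sketch, reusing the dbPath from the earlier snippet. (I’m not claiming this is exactly how Roar does it. SQLite often writes to a -wal journal instead of the main file, which is part of why polling – and the delay I’ll complain about later – enters the picture.)

import Foundation

// Hypothetical helper: run a callback whenever a file is written to.
func watch(path: String, onChange: @escaping () -> Void) -> DispatchSourceFileSystemObject {
    let fd = open(path, O_EVTONLY)
    let source = DispatchSource.makeFileSystemObjectSource(fileDescriptor: fd,
                                                           eventMask: .write,
                                                           queue: .main)
    source.setEventHandler(handler: onChange)
    source.setCancelHandler { close(fd) }
    source.resume()
    return source
}

let watcher = watch(path: dbPath) {
    // Re-run the query from the earlier sketch and diff against
    // the last notification we displayed.
    print("Notification database changed")
}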

Bezel

The first is a straight-up reimplementation of Growl’s old in-your-face Bezel style.

Music Video

Growl’s Music Video display style was my absolute favorite for the longest time.

Macintosh

This style displays a classic Macintosh window – complete with the best freeware version of Chicago I was able to find with a quick Google search.

iTunes Widget

When I showed Mario Guzman what I was up to with this project, his immediate reaction was to offer a skin based on his pixel-perfect Music Widget.

Winamp

Finally, if you grew up in the 90s, how can you say “no” to an animated Winamp skin?

But wait, there’s more

My original complaint in 240 Invisible Pixels was how difficult it is to interact with notifications on recent versions of macOS, given their hidden buttons and tiny click targets. What if we take the mouse pointer out of the equation and make common notification actions keyboard-driven?

When I need to interact with a notification, I typically do one of four things:

  1. Open the associated app
  2. Copy a two-factor auth code
  3. Open a URL in the notification
  4. Copy the notification’s contents

Roar lets you do all of these things with a global keyboard shortcut. When a notification arrives, press the hotkey to reveal a menu with four choices.

Press ⌘1 – 4 to select an action (or Escape to dismiss) and run it against the most recent notification.

⌘1 will activate the app associated with the notification. Basically, the same behavior as clicking a notification.

⌘2 will copy the first two-factor auth code it can find. In the screenshot above, that action is grayed out to indicate that no code was found ahead of time.

⌘3 copies the contents of the notification to your clipboard.

⌘4 will open any URLs in the notification in your default browser.
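
For the curious, the ⌘2 action boils down to pattern matching. Here’s a sketch of the idea – not Roar’s literal code, and real-world messages need more heuristics than a single regex:

import Foundation

// A hypothetical extractor: treat the first standalone run of 4–8 digits
// in the notification text as the two-factor code.
func twoFactorCode(in text: String) -> String? {
    guard let regex = try? NSRegularExpression(pattern: "\\b\\d{4,8}\\b") else { return nil }
    let range = NSRange(text.startIndex..., in: text)
    guard let match = regex.firstMatch(in: text, range: range),
          let swiftRange = Range(match.range, in: text) else { return nil }
    return String(text[swiftRange])
}

twoFactorCode(in: "Your Apple ID code is 382914. Don't share it.") // "382914"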

Don’t take this seriously

Everything shown above works. The code is on GitHub, and I’ve made a notarized build available if you’d like to try Roar.

That said, because the app relies on scraping your notifications from Apple’s database, Roar is at the mercy of how often the system saves notifications to disk. In my testing, that delay varies. Sometimes Roar will appear immediately after the original notification. Other times it may be ten seconds.

Is Roar meant to be an actual Notification Center replacement? No. Of course not. This all just started as a thought experiment and turned into a fun diversion over a holiday break from work.

A Better Way to Copy Two-Factor Codes on macOS

Back in June, I published a completely un-serious post that described a ridiculous Rube Goldberg approach to grabbing two-factor authentication codes from your text messages on macOS using Keyboard Maestro (for those of us who don’t use Safari).

How dumb was it? Let’s just say that it relied on taking a screenshot of Notification Center and parsing the code out of the image.

A joke, yes, but also a fun distraction one evening.

To my surprise, very nice reader @ajorpheus provided a real solution in the comment section earlier today.

Their solution is to grab the most recent text from Messages.app’s actual SQLite database and parse the token from that. Not only is this way, way faster – it’s far less error-prone as well.

Here’s the shell script they provided:

sqlite3 "$HOME/Library/Messages/chat.db"   "select text from message order by date desc limit 1" | perl -lpe "s/.* (\d+) .*/\1/g"

With that, it’s just a matter of dropping the script into a Keyboard Maestro macro like this:

Two-factor auth token Keyboard Maestro macro screenshot

Besides the script itself, all I added was a step to copy the extracted 2FA token to the system clipboard and another to display the token in a system notification so you can confirm the macro ran (and worked?) before pasting.

You can download the macro here.

Please note: The Messages.app database is a protected file on your system. For Keyboard Maestro to access it, you’ll need to grant the app Full Disk Access in System Preferences → Security & Privacy → Privacy.

(Remember to scroll slowly through that list when looking for Keyboard Maestro. It’s been how many years and still none of the Privacy sections sort their lists alphabetically?)

And big thanks to @ajorpheus for coming up with a better way to turn a dumb idea into something useful and reliable.

Retina Studio

Here’s the thing.

When I tell people, “I started my app business in 2007”, that’s not true. I never meant to start a business – it just happened. Because if I had sat down one afternoon and thought, “I’m going to begin selling software online today,” I sure as hell wouldn’t have intentionally named my company Click On Tyler.

For fourteen years, I’ve hated that name.

I began learning Objective-C and Cocoa in 2003. I tinkered around with tiny little projects for a few years. And then, in early 2007, I had the idea for VirtualHostX because I needed it for my day job as a web developer. When I finished the app in August, I had to put it up for sale somewhere.

Back up.

Two years before that, I was on my lunch break browsing recently expired domain names (I’m a nerd) and saw clickontyler.com was available. There weren’t touch screens back then – everything you did with a computer involved clicking. (That’s not true, but you know what I mean.) The domain name sounded fun, so I bought it. Nothing ever came of the website other than an awful About Me-style page.

So when I needed a place to sell VirtualHostX, clickontyler.com was the only website I had up and running. I put the app on the front page next to a PayPal button.

That was it. The name stuck. 50,000 customers. For a few years, my full-time job. And I’ve secretly despised it every day.

In fourteen years, I never even decided if the “On” in Click On Tyler should use a capital or lowercase O. And putting my first name into a company name (unintentionally or not) has always made me feel like my branding is one step removed from a solo divorce attorney calling themselves the First Name Group, PLLC. Every time I spoke “click on tyler” out loud to someone, I’d cringe a little bit.

Honestly, Apple ditching Intel CPUs and transitioning to their own chips is the best thing that could have happened to me.

What I mean is, the migration to Apple Silicon killed off VirtualHostX because of its dependency on VirtualBox (made by Oracle). I looked into every technical avenue I could think of to keep the app alive, but there’s no path forward that makes business sense for me.

I spent much of 2021 thinking about what to do now that VirtualHostX has reached its end of life. And I realized, holy crap, despite VHX being what I thought of as my flagship product, I have six other Mac apps that fall outside my original niche of web developer-focused software.

It was my wife who finally said, “Sunset VHX and focus on the rest. Start over.”

As usual, she was right.

And that’s what I did this Summer. I paused and took time to start over with a company name I actually like and that I hope carries me forward another fourteen years or longer.

Retina Studio

Hello.

Triple Tap to Capture

Have you ever triple tapped the back of your iPhone? No?

I wish I could remember who on Twitter pointed out this Accessibility feature, but I want to highlight it here – along with how I use it – because it’s such a fun shortcut for automation nerds.

Watch this.

At any time, from any app, whenever I need to remember something, I can tap the back of my phone three times. An input dialog will appear, wait for me to type in what I need to do, and file away those items into OmniFocus for later. I never have to leave the app I’m currently using or even launch OmniFocus at all.

Here’s how.

Settings.app → Accessibility → Touch → Back Tap

Back Tap gesture options screenshot

From here, iOS lets you assign actions to two different gestures: double tap or triple tap. You tap the back of your phone with a finger in quick succession, just like you would double click a mouse button.

Options include helpful system commands like

  • Open Camera
  • Lock Screen
  • Lock Rotation
  • Take a Screenshot
  • Turn on the Flashlight

as well as enabling iOS’s many Accessibility affordances and, for our purposes, running a Shortcut.

Back Tap command options screenshot

(I’ll go ahead and add here that I only use the triple tap option because I found double tapping too easy to trigger accidentally during normal day-to-day phone usage.)

If you have a Shortcut that you use frequently or want immediate access to, these tap gestures are a fantastic way to launch it.

As for my task-capturing Shortcut, it’s pretty simple:

  1. It prompts for input
  2. Splits what you enter into individual lines
  3. And creates a new OmniFocus task from each one
Screenshot of Capture Shortcut steps
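
If Shortcuts isn’t your thing, OmniFocus also exposes an omnifocus:///add URL scheme you could drive from code instead. Here’s a sketch of the same capture – I’m only using the name parameter here; see Omni’s documentation for the rest.

import UIKit

// Hypothetical sketch: file each line of input as its own OmniFocus task
// using the app's omnifocus:///add URL scheme.
func capture(_ input: String) {
    for line in input.split(separator: "\n") {
        var components = URLComponents(string: "omnifocus:///add")!
        components.queryItems = [URLQueryItem(name: "name", value: String(line))]
        if let url = components.url {
            UIApplication.shared.open(url)
        }
    }
}

Though, fair warning: bouncing over to OmniFocus once per task is exactly the friction the Shortcut avoids.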

You can download the Shortcut here.

Capture Thing

Brett inadvertently forced my hand to publish the app featured in tonight’s post, tweeting:

If you take enough detours to hack around with a side project, over a few years you’ll eventually get to 2.0 on a project you’re not even sure anyone else uses 🙂

To which I had an immediate face-palm reaction when I realized

  1. How did I miss version 1?
  2. My goodness, doing is so much more scriptable and robust than the weird-ass Mac app I built for myself to scratch a similar itch.

If I’m understanding the intent of doing correctly (and I’m writing this after only reading through the v2 announcement and skimming around the documentation), I think his project and mine (which I’ll get to in a bit) come from a shared need to record and track what you’re doing (or have done).

Where our two approaches diverge is that Brett went full-on nerdcore, building a reporting engine into doing that can track time and generate statistics about what you’ve been…doing. The later, last, and recent subcommands look like a brilliant ad-hoc reminders list for getting back into your last work context.

I’ve been big into journaling for close to a decade now – at least in my personal life. But I’ve never been able to build up the same habit in my work / professional life – even though I know I would reap benefits there, too.

I’ve tried all sorts of workflows to make journaling my workday a regular and frictionless routine — everything from a Day One hotkey to some convoluted Keyboard Maestro macros and Drafts.app actions.

None of them stuck.

But what finally did work for me (at least for the last six months or so) is a tiny little Mac app called Capture Thing. (Sorry, I pick horrible names for projects I never intend to share with others.)

Here’s the app’s only window:

Capture app window

While I’m working, whenever I want to record progress, take a note, or jot something down for future reference – I press a keyboard hotkey to summon the capture window. Fill in the details (as much or as little as I want – all fields are optional) and ⌘S to save the entry. The window disappears, and I’m right back where I was in some other app.

Where do these entries go? And what are all the options?

I have a folder in Dropbox called Capture. Inside that, the app automatically generates year/month folders as time goes on. And a new journal file for each day I capture info. Like this:

Finder folder structure

Each journal file is (of course) a plain text Markdown document in a consistent format that renders into something friendly and human-readable for later viewing and searching.
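
There’s nothing clever behind that layout – just date formatting. A sketch of the date-to-path logic (the Dropbox/Capture location is simply my personal convention):

import Foundation

// Build e.g. ~/Dropbox/Capture/2021/11/2021-11-22.md for a given date,
// creating the year/month folders along the way.
func journalURL(for date: Date = Date()) -> URL {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy"
    let year = formatter.string(from: date)
    formatter.dateFormat = "MM"
    let month = formatter.string(from: date)
    let folder = FileManager.default.homeDirectoryForCurrentUser
        .appendingPathComponent("Dropbox/Capture")
        .appendingPathComponent(year)
        .appendingPathComponent(month)
    try? FileManager.default.createDirectory(at: folder, withIntermediateDirectories: true)
    formatter.dateFormat = "yyyy-MM-dd"
    return folder.appendingPathComponent(formatter.string(from: date) + ".md")
}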

A complete entry (including every option) might look like this.

# 10:12 PM
Writing a blog post about Capture Thing
----------
I need to come up with some decent examples to share that don't expose all of my top-secret work notes.

![Screenshot 1](attachments/2021-11-22 - Screenshot 1637640770_1.jpg)
![Screenshot 2](attachments/2021-11-22 - Screenshot 1637640770_2.jpg)

### Attachments
* [doing-tweet.png](attachments/2021-11-22 1637640770doing-tweet.png)

### All Browser Tabs
* [Add New Post ‹ tyler.io — WordPress](https://tyler.io/wp-admin/post-new.php)
* [Brett Terpstra is fully microchipped on Twitter: "If you take enough detours to hack around with a side project, over a few years you'll eventually get to 2.0 on a project you're not even sure anyone else uses :). https://t.co/6sNw3jv34a" / Twitter](https://twitter.com/ttscoff/status/1462106481814298635)
* [Home · ttscoff/doing Wiki](https://github.com/ttscoff/doing/wiki)

### Misc Info
*Timestamp: 2021-11-23T04:12:53Z*
*WiFi: TopSecretWifiNetworkName*
*Computer: imacpro.local*

* * * * *

Top to bottom, here’s the entry format:

  • An optional one-line summary.
  • An optional multi-line description with as much or as little content as you want.
  • If the Screenshot option is enabled, the app will take a screenshot of each monitor, store those in the attachments directory, and link them inline in the Markdown.
  • You can drag and drop files into the capture window, and they’ll be copied into the attachments folder and linked, too. I’ll often take a small screenshot using CleanShot X, add some annotations, and then capture that image into my journal.
  • If you choose one of the browser tab options, the app will use AppleScript to grab the current tab (or all tabs) and dump those URLs into the entry along with their page titles as proper Markdown links. (There’s a sketch of this trick just after this list.)
  • Finally, to provide added context about when and where I made this entry, the app records the name of my computer as well as the current wifi network (which serves as a poor man’s location, so I know if I was at home, work, or a coffee shop).
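
Here’s that AppleScript round trip, sketched for Safari. NSAppleScript comes straight from Foundation; the first run triggers macOS’s Automation permission prompt, and Chrome wants slightly different syntax (“active tab of front window”).

import Foundation

// Ask Safari for the front tab's title and URL, separated by a linefeed.
let source = """
tell application "Safari"
    return (name of current tab of front window) & linefeed & (URL of current tab of front window)
end tell
"""
var error: NSDictionary?
if let output = NSAppleScript(source: source)?.executeAndReturnError(&error).stringValue {
    let parts = output.split(separator: "\n", maxSplits: 1).map(String.init)
    if parts.count == 2 {
        // Emit the tab as a proper Markdown link for the journal entry.
        print("* [\(parts[0])](\(parts[1]))")
    }
}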

As I go about my day, I capture what I’m currently working on or thinking about. And at the end of the day, I have a full Markdown document of (hopefully) everything I worked on. I can review these journal entries fully rendered in an app like Marked.

Marked preview window

For years I’ve hacked around on tools and projects that I nebulously classify as my “second brain” or “digital memory”. Those terms aren’t unique or invented by me, but I’m fascinated by the possibilities that arise when data storage is so cheap and it’s (trivially?) easy to capture so much of one’s life – privately, for your future reference.

But that’s a much longer blog post.

Let me end with a quick video or two of Capture Thing in action to highlight how keyboard-centric it is.

First, here’s a complete end-to-end capture.

And then, you can see that the three screenshot and web browser capture options are toggleable with ⌘1-3.

Those options are adjustable per entry but default to what you choose in Preferences.

Capture Thing Preferences window

Another nicety that I won’t bother screen recording: if you press the capture hotkey twice (without filling in any details), the app will instantly capture and save a screenshot. I’ve found this very helpful for grabbing chat windows and video presentations super quickly.

Despite its terrible name and even more awful app icon, Capture Thing is available on GitHub if you’d like to try it or take the source code and build your own bespoke capture journaling workflow.

A notarized app download is available here.

That Person Exists

My wife and I were living in California in 2008 – most of our family back in Tennessee. I arrived home after work one day, and she said, “Your mom called me. She told me not to tell you anything, but we need to talk.”

My grandmother was 84 at the time and had been forgetting things for a few years. Those lapses in memory or the occasional odd behavior led to doctor visits. Which led to specialists. And then, a week before my mom’s phone call to my wife, an official diagnosis of Alzheimer’s.

From my mother’s perspective, this diagnosis – this disease – was shameful and needed to be hidden. As much as we eventually fought over her not telling my grandmother, and as much as I don’t think I’ll ever forgive the final decision never to tell, I can understand that it wasn’t just her. That entire generation viewed anything related to mental health as a stigma. And whether or not the affected person was her mother or her son, the stigma remained and was not to be discussed.

My mom’s intentions were to tell no one – not me, my sister, family friends, not even my grandmother herself.

(The doctor didn’t tell my grandmother either. Why a physician would deliver a diagnosis like this to the child of the patient and not the patient themselves is unconscionable and borderline malpractice, in my opinion.)

So why did she call my wife to share the news at all? Because at the time, my wife worked for the Northern California chapter of the Alzheimer’s Association. And when something like this becomes real, it becomes real, really fast. You need help. To cope, to plan, to prepare. You need advice on the next steps and the myriad of medical decisions suddenly just over the horizon.

And that’s exactly what the amazing people at the Alzheimer’s Association are for. In addition to advocating for and advancing the medical science of this disease, they are there to provide support and guidance for the families and caretakers of people living with Alzheimer’s.

Situations like these are never easy on the person’s family – no matter the medical condition. But Alzheimer’s is particularly vile because at some point, the person living with this disease…disappears. And other than the occasional fleeting glimpse of their old selves, the person that family loved and now has to take care of, is gone. It’s fucking cruel to witness the story play out over the remaining years. In my grandmother’s case, it took eleven.

My grandmother passed away in August of 2019 at ninety-five years young. Why am I writing this post now? Two years later? Before I get to that, let me tell you about her.

Her name was Bettye. This is her.

This is also her. (She’s the one climbing the wall.)

This is the first time she met my son.

And this was the last time she met him. And goddam if her eyes didn’t light up when he walked in the room – even if she couldn’t remember his name any longer.

This is a video of her using an iPad for the first time. As she swiped through those photos and intuitively, without anyone teaching her, pinched to zoom in, she out-of-the-blue remarked:

You know that man in California invented this. He made all these things himself. Can you believe that?

And these were her dance moves.

She joyfully and fearlessly lived through more tragedies than I care to think about and had a wicked collection of one-liners that could knock you backward out of your shoes. When Trump was elected, she never once spoke his name — only referring to him as “That Man.”

My brother-in-law was the first of her grandkids to disappear, which made sense because he was the last to join the family. She never forgot my name or my sister’s. But near the end, you could hear the uncertainty and delay around my wife’s.

I don’t know how much she understood about herself, but the most challenging part wasn’t the end for me. It was the middle years when it was evident in her eyes that she still knew enough to know something was wrong inside her brain but didn’t know what or why. I can only imagine the panic she must have felt.

I said goodbye to her three years before her death. I had emotionally accepted that the person I loved was no longer there. That didn’t mean I stopped caring, or visiting, or checking in. Or letting my kids, her great-grandchildren, develop their own relationships and memories with her while they could in their short time together.

But it did mean that when I woke up at 6 am to the news of her passing in a text message from my mother – after a weeks-long agonizing final decline – I was at peace as much as I hoped she was.

She lived a difficult life, and yet met it with such apparent gratitude. She deserved much better than the way it ended.

But back to the reason for this blog post. I write about my grandmother this evening because my wife is working for the Alzheimer’s Association again – an organization that she loves and is dear to our family. This time, here in Tennessee.

She, her coworkers, and everyone at the association work tirelessly to one day see a world without this disease. It’s not an impossible goal at all. The first survivor of Alzheimer’s is likely out there right now – growing up.

Next week our families are walking in memory of my grandmother to raise awareness and maybe even a little money to support Alzheimer’s care, research, and advocacy. It’s a worthy cause, and I would be incredibly grateful if you’d consider donating.

Love you, Memaw. This much.

The 4:15 to San Francisco

Trains are awesome. When I lived in San Francisco, commuting down the peninsula every morning via Caltrain was (when on time) a delight compared to my Nashville commute today.

The train cars were this weird microcosm of Silicon Valley tech workers. Young college grads commuting to Palo Alto. Graybeards hopping off in Mountain View or one of the San Jose stops. And middle managers dressed up to look important.

One day in 2008, the passenger sitting next to me pulled a strange, angled, white slab of plastic with a screen out of their backpack. It was the first time I saw a Kindle in real life.

For forty-five minutes each way, to work and then back home every day, I’d plug my Verizon USB dongle into my Mac, get just enough bandwidth to browse the web, and – as a kid who grew up in the suburbs – legit feel like I was living in the future.

Most commutes had a few regulars I’d recognize, but always something or someone new. One day there was a fight that ended in the conductor physically throwing two men off the car. Or the time a woman a row ahead of me began sobbing during an unexpected stop when word reached us that a suicide on the track was the cause of the delay.

But the story I’m reminded of today is the one about the only actual Apple leak I ever personally heard. I hadn’t thought of it in years until Jason Snell and Gruber briefly discussed how a hardware rumor could begin from a supply chain leak.

Listen:

I was commuting home on the train in August 2009. The section of the car I was sitting in was a pod of four seats – two rows of two facing each other. The seat next to me was empty, but across from me sat a young woman and a man, probably in his late 30s, who very clearly didn’t know each other.

It was one of those awkward moments in an otherwise mostly-quiet car where one person will just. not. stop. talking. And this guy was going on and on about the electronics company he worked for and all the patents they had.

I was on my laptop – probably working on some failed app idea. The woman alternated between reading a book and doing anything else to not listen to him.

But what caught my ear was when he started talking about audio processing and microphones. It was something along the lines of…

Yeah, we made a new way to cancel out noise you don’t want to hear using extra microphones. In fact, you know what? Apple is using our stuff in the next iPhone. They’re gonna put a second mic on the back of the phone to get rid of background noise when you’re talking to someone.

It turns out the guy wasn’t lying. From Apple’s press release the following June announcing iPhone 4:

iPhone 4 features a second microphone and advanced software to suppress unwanted background noise for improved call quality when in loud places.

Like I said above, I hadn’t thought about this conversation in years. But a quick search shows that the guy must have worked for Audience – based out of Mountain View – which was my Caltrain station every day.

From Wikipedia:

Audience was the first company to have reverse-engineered the human hearing system and model its processes onto a chip, enabling computers and mobile devices to use the same kind of “auditory intelligence” that humans employ. By using this technology in conjunction with two or more microphones, background noise is suppressed, improving the quality of the remaining voice and reducing distraction for the listener.

So that’s my one leak story. It’s not as exciting as finding a prototype in a bar, but I did manage to overhear how the new iPhone was going to listen.

Six

There aren’t enough adjectives in the world to describe her giant-sized personality. So I’ll just say happy number six to the strongest girl I know.

Augmented Reality Ducks

It’s an open secret that Apple is working on an augmented reality headset (or something). All the recent advances in ARKit and adding LiDAR to the pro iPhones and iPads are certainly leading up to a larger goal.

I have no idea what that next leap is.

But, I do know that fourteen years into iOS, people still ducking hate autocorrect. Especially when you find your ducking text messages littered with ducks. There’s just no ducking way around it. Short of adding a fake contact to your address book named Dr. Duck Ducking McDucker, autocorrect seems ducking incapable of learning everyone’s favorite bit of profanity.

That got me thinking earlier today. What’s going to happen when Apple finally leads us into that next frontier of human / computer interaction? What happens when our day-to-day reality becomes augmented with live information and our physical and digital worlds merge even closer together?

What happens if Apple takes autocorrect’s prudish vocabulary into AR? If they dared to try and censor the real world, how would that look?

I decided to try and find out. And I have the Xcode project to prove it.

Watch.


So what did you just see in that video?

It’s an iOS app that analyzes video streaming from the camera and attempts to detect human hands. If it finds any, it then tries to distinguish each individual finger and, specifically, whether the middle finger is raised. If it detects that, it takes the location of the offending finger and censors it with a 🦆.
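
Vision does nearly all of the heavy lifting. Here’s a condensed sketch of the detection step – not a copy-paste from the repo – assuming you’re feeding it frames from an AVCaptureSession. My “is the middle finger up?” heuristic here is deliberately crude.

import Vision
import CoreVideo
import CoreGraphics

// Given a camera frame, decide whether a middle finger is raised and,
// if so, return its (normalized) location for duck placement.
func middleFingerLocation(in pixelBuffer: CVPixelBuffer) -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])

    guard let hand = request.results?.first,
          let wrist = try? hand.recognizedPoint(.wrist),
          let middle = try? hand.recognizedPoint(.middleTip),
          let index = try? hand.recognizedPoint(.indexTip),
          let ring = try? hand.recognizedPoint(.ringTip),
          middle.confidence > 0.3 else { return nil }

    func distance(_ a: VNRecognizedPoint, _ b: VNRecognizedPoint) -> CGFloat {
        hypot(a.location.x - b.location.x, a.location.y - b.location.y)
    }

    // Crude heuristic: the middle fingertip sticks out well past its
    // neighbors, relative to the wrist.
    let extended = distance(middle, wrist) > 1.3 * max(distance(index, wrist),
                                                       distance(ring, wrist))
    // Vision's coordinates are normalized with the origin at the lower left –
    // the caller still has to convert into view coordinates for the 🦆.
    return extended ? middle.location : nil
}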

This whole post, of course, really is a joke. But the silly idea originated from an actual conversation with my wife and then a friend egging me on to build it.

But, more importantly, it shows just how fantastic software is these days. How spoiled we are to carry supercomputers in our pockets. In less than 200 lines of code, I used these incredible frameworks developed by Apple to get this idiotic idea working in an evening.

I know I complain a lot, but I also want to credit the many talented people working hard to put software like this out into the world for other developers to build upon.


As usual, the sample code for this project is on GitHub.

Conversations in the Dark with a Six-Year-Old

I love our new world of always-on technology because it can provide empirical, digital evidence that validates the squishy, fuzzy experience of being human. It can prove you’re not crazy. That there is a reason for feeling the way you’re feeling.

I don’t have a good segue into this next paragraph, but one of the things I hate most about myself is I don’t think fast enough on my feet. It’s why I prepare so meticulously when I know in advance that I need to make an important point or defend my position. When an argument becomes heated, when I’m unexpectedly challenged without supporting evidence ready and within reach, I go full Costanza.

Given an hour to craft my argument, I can eviscerate the other side in a scathing email that makes my case. But on the fly, I fumble and back down in the face of a more aggressive opponent.

That doesn’t happen often, but it did today. And I could feel my body reacting, fumbling, and shutting down in real-time as it always does. Fight or flight? I noped right out of there. I packed up my bag and went home for the day.

The difference between this time and the last occurrence is that I was wearing my watch.

Were the stress and my body’s reaction real? Or imagined? (Does it matter?)

A 161 bpm heart rate during a contentious conference call sure seems to validate how I felt.

To me, that’s fascinating. The direct connection between a piece of aluminum strapped to your wrist and the emotions overwhelming your brain. A tenuous but verifiable link between digital and analog. An opportunity to recognize something tangible and make a change.

So when I arrived home today, I silenced and shut down every device and scheduled some vacation time for next week. In my out-of-office auto-reply, I even lied and said that I would be away without access to the internet or a cell phone.

Later, after sunset, I held my daughter’s hand and walked around our backyard in the pitch black for an hour. Me in my house shoes. Her with bare feet. We talked about the usual things that are top-of-mind to a six-year-old.

Why is it cold at nighttime but not really when it’s Summer?

Can wolves hear me with their long ears if I howl quiet?

What was that noise?

Where do bees go when it’s dark?

Where do ants live?

If I get a flashlight, will that scare away the lightning bugs or make them come closer?