Happy #4 to my amazing, kind, funny, crazy, most favorite boy in the world. Love you forever.
Inspired by this article on doing a year-end review of your indie business, I took a few minutes and calculated some stats and compiled my major accomplishments from 2014. The result was eye-opening and made me feel more than a little bit proud.
I’ve made a conscious effort to be more transparent about my business dealings this year because I truly believe getting more people to share these kinds of numbers will benefit the community as a whole. So, here’s my 2014 review of Click On Tyler…
- Earned $61,500 in sales ($168/day)
- Grew revenue by 6%
- Gained 1,714 new customers (down 40% from 2013)
- Sold 1,961 copies of my software
- $31.36 average sale price
- Launched new, responsive clickontyler.com website design
- Launched VirtualHostX 5.0 with folders, SSL support, custom directive templates
- Launched VirtualHostX 6.0 and reset the app’s development cycle to coincide with Apple
- Built and launched Minion
- Built and beta launched Hobo
- Built Upshot
- Built and launched Shutterbox
- Retired Incoming!
- Retired Traffic Advisor
- Migrated off Rackspace to Linode
- Integrated a FAQ/support website into clickontyler.com
- Introduced commercial licenses with priority support
- Began writing regularly at tyler.io
- Wrote 56 blog posts (35,900 words)
- Wrote half a book on my Dropbox photography workflow (11,251 words)
- Gained 883 newsletter subscribers (42% growth)
- Sent 11 newsletter campaigns (10,789 emails)
- 68% newsletter open rate (19,447 opens, 7,406 unique opens) (18% industry average)
- 15% newsletter click-through rate (2,217 clicks, 1,666 unique clicks) (2.5% industry average)
All of that while caring for (dealing with) our first son, who was born at the start of the year, and while working a full-time job.
A sincere thank-you to all of my customers and everyone who reads this blog. I can’t wait for 2015.
I hate using the word “quit”. Because it’s not quitting. It’s not even “giving up”.
Today, the prolific Manton Reece wrote a blog post announcing that he is sunsetting his Twitter apps. This, after a recent announcement that Twitter’s (amazing!) new fully searchable tweet archives won’t be made available to third-party developers.
Twitter’s fully-searchable index is an announcement I’ve been waiting on for years. Back when I worked at Yahoo!, I co-created Sideline, a cross-platform desktop client for searching Twitter. A year later, after Sideline was sunsetted, I rebuilt it as a native Mac desktop app called Incoming!. Both Sideline and Incoming! were fantastic ways to discover what’s currently happening on Twitter and to follow the latest trends – whether that be an Egyptian uprising or whatever the hell Justin Bieber has gotten himself into lately.
But those apps were always crippled by the lack of history in Twitter’s search API.
If Twitter hadn’t already turned into such a giant dick towards third-party devs, their full-search announcement would have filled me with wonder and hope at the amazing apps that could be built on top of such a corpus. Sadly, I knew it was too good to be true, and sure enough, a link from Marco sealed the deal.
It’s only a matter of time until all third-party apps are either pushed out of business or users leave for features we simply can’t compete with.
As I said at the top of this post, Manton isn’t quitting. He’s one of the most thoughtful developers I’ve ever had the pleasure of speaking with. He, like many of us, is simply recognizing that a window of opportunity has closed. I applaud him for the care with which he’s shutting down Tweet Library and Watermark.
Not to jump on his bandwagon, but I’ve also decided to remove Incoming! from the Mac App Store. Its ability to dive into breaking news and social trends by aggregating tweets never caught on the way I had hoped. I kept it alive for the last few years hoping that, eventually, we’d get access to Twitter’s entire history. But now that we know that isn’t going to happen, I see no reason to keep it going.
Incoming! will of course continue working for users who already have a copy. I plan on making a free version available later this year on my archives page for anyone who still has a use for it.
I often find that constraints, real or artificial, can be a huge motivation and productivity boost when I find myself stalled on a project or piece of work. Forcing yourself to work within a specific limitation can cause you to find a creative solution in a direction you might otherwise never consider.
In my own work, I find time constraints the most useful. When I don’t know where to start developing or writing, I’ll often force myself to do something – no matter how small or tangential – to move the work along for thirty minutes or an hour. The artificial time limit frees my mind to attempt starting points I might not typically choose because I know that even in a worst-case scenario, I’ll only lose an hour’s worth of work if my chosen path is unsuccessful. Similarly, when I find myself lacking motivation and energy to work, I’ll use the Pomodoro Technique. That gives me the kick in the pants I need to break the ice and get moving, which can often turn into real energy and sometimes even that magical “flow” state.
A few weeks ago on a Friday night I found myself burnt out from my day job. It had been weeks since I had done any recreational programming or development on my side projects. I desperately wanted to move them forward, but simply couldn’t find the energy.
So I tried an experiment.
I decided to give myself thirty minutes to brainstorm and come up with a product idea that I could then build to a shippable state within the next two hours. Further constraining myself, I decided to forgo my two 27″ monitors and do everything exclusively on my 11″ MacBook Air.
I knew the easiest way to come up with a new product idea was to examine my typical day and find a small pain point that I could solve with software. After walking through my usual day in my head, I began to focus on my calendar. I realized I was checking it multiple times throughout the day to see how much time I had until my next appointment, which would influence which task I could take on before I had to be somewhere. (As many developers will know, an hour-long programming task often takes longer, as there’s a definite lead-up time before you really begin coding while you work to get yourself into the zone.)
I thought “Wouldn’t it be easier if I could just glance up and see how much time I had remaining?”
With that insight, I had my product idea.
Over the next ninety minutes I built a simple Mac menubar app that looks at your calendar and simply displays how long you have until your next appointment. I call it Up Next. I’ve been using it as part of my daily routine for the last few weeks and have found it works great. Without losing place in my current work, I can glance at my menu bar and see “Oh, I’ve got two hours until I need to be somewhere” or “Crap, I’ve only got twenty minutes to wrap this up”.
That little bit of awareness doesn’t distract me from getting in the flow, yet keeps me persistently time-boxed and motivated.
If you find yourself similarly lacking motivation throughout the day, I encourage you to try placing a constraint on your work. And if you think Up Next might be useful to you, you can give the app a try.
(How could I not title this post after one of my favorite David Byrne songs?)
Gus has a terrific post on his blog about what he calls “the wilderness” – a period of time between major software releases “where I’m pretty lost, and I don’t know what to do.” His working theory on the matter is that it’s basically a forced period of rest put upon you by an overworked brain. How do you get past it? The same way you walk out of any wilderness – “One foot in front of the other…You can walk for miles and maybe months in the wilderness. Just keep on going.”
I take comfort in knowing that I’m not the only one who experiences periods of low energy, slogging through work like this. Unlike Gus, however, I don’t blame it on a stressed or overtired mind. For me, as someone who suffers from depression, I know that these periods aren’t just a symptom of working too hard – they’re one part of a cycle between times of depression and times of whatever the opposite of depression happens to be.
I’ve been tracking my mental health through copious notes and journaling for three years as a way to better understand what triggers my depression, how long it lasts, and what I can do to work my way out of it. The biggest lesson I’ve learned is that these periods always have an end. They’ll pass. If you have faith that that’s true, then it becomes easier to listen to your mind and not try to force things when the timing and mood aren’t right. In the same way that I take advantage of my mentally high states and code like a crazy person, I know to divert my focus to other, more doable tasks when I’m at a low point. Rather than coding, I’ll update my FAQ articles, record screencasts, or do a deep dive into my marketing analytics, safe in the knowledge that I’ll come through to the other side and return to coding in a better state of mind.
Gus equates escaping these periods with putting one foot in front of the other and continuing to walk until you find your way out. For me, it’s more like riding waves on a boat. There are going to be low points just as there are high ones. The key is to keep sailing.
Yesterday, Daniel Jalkut tweeted
The only things better than shipping an app are the thankless months of hard work that go into making it halfway presentable. Yep, it’s fun.
— Daniel Jalkut (@danielpunkass) April 4, 2014
That’s an apt way of describing the strange mix of joy and awe programmers feel when, after hours, days, maybe even weeks or months of work, the computer does what you tell it to do.
I’ve been working on my new app, Minion, for the last four weeks. And, today, I finally had my first successful end-to-end test run of the app. Everything on the client side worked beautifully and then successfully talked to the web service and delivered a notification to my phone. It’s a thrill to see everything come together. A high that I’m not sure I could achieve so frequently if I were in any other profession.
We have a three-month-old kid. That means we take a lot of photos. I’ve done the math, and in the last three months we’ve taken 1,202 photos of him. As I’ll write about in my upcoming book on Dropbox photography, all of those photos are stored and sorted in a shared Dropbox folder that both my wife and I have access to. For other family members and friends, we share the best of those photos via a shared iOS Photostream. I’m really a big fan of this feature. With just a few taps I can share as many photos as I want and have them near-instantly delivered to the twelve people who subscribe to our photostream. Everyone in our family and circle of friends has an iOS device, so no one’s left out. I never have to fumble with emailing attachments, or posting links to Flickr. All the photos are available in the native iOS Photos app. Best of all, I can post comments with the photos I add, other people can like and add their own comments, and they can even share their own photos, too. It all works splendidly.
The only downside is for our grandparents. They don’t have iDevices. Sure, our parents are always showing photos to them on their phones and iPads, but our grandparents miss out on the personal connection they’d get from having their own collection of photos. To try and fix this, I’ve started physically printing and mailing batches of photos to them every ten days or so. They love getting photos they can touch and display in the mail. It really is like their own real-world photostream.
The problem is that all of this is labor-intensive. Printing batches of photos, keeping up with ink and photo paper, finding envelopes sturdy enough to handle twenty photos at a time, and then dealing with postage all slow the whole process down. So I’ve been experimenting with three online photo delivery services to handle all of this for me.
Over the last few months I’ve tested Shutterfly, iPhoto, and PicPlum. Ideally, I’m looking for a service that lets me quickly upload the latest photos – from both my wife’s iPhone and mine, and from our good camera – and have them sent to multiple addresses without having to re-type the addresses each time. The photos need to be delivered fairly quickly and, most importantly, arrive in good condition.
I’ve given each of the above services multiple tries, and they all have their good and bad points. For those of you who like to skip ahead, the winner was Shutterfly, followed by PicPlum, and then iPhoto.
I really wanted to like iPhoto, as it’s Apple’s recommended service. But there are a few negatives that keep me from going this route. First of all, the iPhoto iPhone app is nearly impossible to figure out. I’m an app developer by trade, so I like to think I can understand most apps without much instruction, but the iPhoto UI baffles me. Selecting multiple photos and preparing an order for delivery was a beast of a process. Add to that extraordinarily long ship times and they were a clear no-go. The actual photos were of middle-of-the-road quality and arrived in a plain white cardboard envelope that seemed to protect them well enough.
PicPlum is an interesting service. Unlike iPhoto and Shutterfly, which are really designed for printing and delivering photos to yourself, PicPlum bills itself as a service designed for printing and mailing photos for other people. Everything is done through their lovely web interface. You can drag and drop your photos directly into the web browser. Then it’s just a matter of choosing the recipients from your previously saved addresses.
PicPlum loses points for not having an iOS app. Typically, I want to send only my best photos to be printed. All of the best ones are already handily organized and available in our shared photostream. If they had an iOS app, I could choose them directly from that album. But, as they only support desktop uploading, I have to find and gather them from the various photo albums in my Dropbox. This isn’t a huge deal-breaker, but it is slightly less convenient.
The biggest downside to PicPlum, and ultimately the reason I no longer use them, is the photos arrive in a flimsy paper envelope. The kind of thing you’d mail a birthday card in. I used PicPlum to send twenty photos three times. Twice, the envelope arrived torn with the photos sticking out. In one case, the adhesive sealing the envelope was barely affixed and everything was in danger of spilling out. And while I didn’t encounter this problem in my testing, with such a flimsy delivery method, there’s absolutely no protection against water damage.
I’m actually quite sad that I can’t use PicPlum. They make it easy to send to multiple recipients and their photos were by far the highest quality of the three services.
As I said above, Shutterfly is who I decided to go with. Their iOS app is a little long in the tooth, but it’s serviceable and easy enough to use. I’m able to choose photos from my Photostream and upload them quickly. I can pick from a list of previously saved addresses. The price is the cheapest of the three services, and the photo quality is good. Unlike PicPlum, Shutterfly’s prints arrive in a sturdy cardboard envelope inside an even larger cardboard sleeve. I’ve mailed six batches of photos so far and none have arrived damaged. The double envelopes even protected the photos against our rain soaked mailbox.
The only problem I’ve encountered with Shutterfly is their shipping time. Using their default shipping option, which is about three dollars, the photos arrive anywhere from five to twelve days later. For three bucks, I’m not sure what I expect, but two weeks is way too long to wait for a delivery. So I usually just pony up the extra cash and pay for the $10 two-day delivery method instead. It’s faster, and comes with a tracking number, too.
Overall, our grandparents have been thrilled with the service. They absolutely love getting their bi-weekly photo surprise in the mail. The physicality of holding real photos in your hands makes them feel connected to our son in a way that FaceTime and flipping through photos on an iPad just can’t. I highly recommend keeping your non-technical friends and family in the loop this way.
I’ve been a happy paying customer of GitHub since early 2009. But yesterday, for a few different reasons, I deleted all of my private repositories and moved them over to a self-hosted installation of GitLab. I didn’t make that decision lightly, as I’ve been very happy with GitHub for the last five years, but here’s why…
First, I’ve started working on a new Mac app. Every time I start a new project, unless it’s open source, I create a new private repo for it on GitHub. This project happened to be my 21st private repository on GitHub. If you’re familiar with their pricing structure, you’ll know they charge based on how many private projects you have. $22 a month will get you twenty repos. But as soon as you create that twenty-first one, you graduate onto the $50 a month plan. Maybe if I were actually hosting 50 repositories with GitHub I’d be willing to pay that much, but for the foreseeable future I’m going to be in the low twenties, and $50 a month is just too much. It’s a shame they don’t just outright charge you a dollar per month per project.
The second reason is an issue I’ve been mulling over for quite a while. I love the cloud. I love having my data in the cloud. But some of it is so precious, in this case my code, that I want to know exactly how it’s being taken care of and looked after. While I have no reason to doubt GitHub has plenty of backups in place, I have no way of really knowing for sure how safe my code is. Hosting it myself has its inherent risks, too, but at least I can have full ownership of my data and be certain of the backup strategies in place. This also dovetails nicely with the pleasure nerds like myself get in doing a job themselves. Whether that’s hosting your own email (which I’m not crazy enough to do), managing your own web server (yes, please), or automating your own digital backups, there’s a sick pleasure to be had in doing a job yourself and doing it well.
A final reason for switching away from GitHub was the uneasy feeling I got watching the story of Julie Ann Horvath unfold last week. I didn’t like the idea of my money going to a company that seemed so fundamentally broken. Since then, GitHub has taken forceful, actionable steps to correct the issue, but it still worried me.
So those are my three and a half reasons for moving my private repos away from GitHub. If you agree with me, or if you have your own reasons for wanting to move away, what follows is a brain dump of the steps I took towards getting moved over and situated happily on a GitLab installation.
First off, if you’ve never heard of GitLab, go take a look through their website. It’s a Rails app that is almost comical in how shamelessly it copies the look, feel, and functionality of GitHub. Everything from the activity timeline, to pull requests, to user and team access roles, to issue tracking, to shareable git-backed gists – it’s all very nicely implemented. Many open source projects start off strong and later falter when their creators get bored, but I feel fairly confident in GitLab because their community open source version is based on an enterprise product they sell and support. Quite a few businesses are using GitLab as a GitHub replacement in situations where their code needs to remain on site.
So, where are we going to host it? My initial thought was to boot up a new virtual server with Rackspace, which is where I host all of my business servers. Rackspace is great. A little expensive, but the customer support makes up for it. Their minimum monthly price for a 512MB server, which is all we’ll need, is around $10 a month. I was just about to create the server when I decided to finally take a look at DigitalOcean. They’re the new hotness in cloud hosting and have a reputation for being extremely inexpensive. (Bonus points: they offer two-factor authentication on their user accounts, which is something Rackspace still lacks.) Poking around, I found I could get a comparable 512MB server with DigitalOcean for a flat $5 a month. But what really sealed the deal is that they offer one-click installs of various server apps – WordPress, etc. I wasn’t looking forward to the fairly intensive setup that GitLab requires, but amazingly, GitLab is one of DigitalOcean’s one-click installs.
True to their word, I had a ready-to-go GitLab server up and running less than a minute after clicking the “create” button. All that remained was fine-tuning everything to my needs.
The first step upon getting a new cloud server is to secure it. I always follow the steps outlined in this guide. It does a good job of locking everything down and only takes about five minutes to follow.
Of note, when you get to the section about enabling ufw (the firewall), DigitalOcean boxes don’t come with everything you need installed. I had to run the following command before setting up ufw…
sudo apt-get install linux-image-$(uname -r)
Another note, and this is just personal preference, I also modify my ssh port to be something non-standard. That can be changed in…
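For reference, that means editing the SSH daemon’s config file; the port number below is only an example, so pick your own:

```shell
# Choose a non-standard port (2222 is just an example) in /etc/ssh/sshd_config
sudo sed -i 's/^#\?Port 22$/Port 2222/' /etc/ssh/sshd_config
sudo service ssh restart

# Allow the new port through ufw BEFORE you close your current session
sudo ufw allow 2222/tcp
```

Do this from an existing session and verify you can log in on the new port from a second terminal before logging out of the first.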
Also, while the user-facing side of GitLab is great, I have no idea how security-conscious they are. I’d hate for an unpatched security hole in their web app to expose any of my private code. One way to mitigate that chance is to lock down web traffic to the specific IP addresses you’ll be accessing it from. Your home, your office, etc. With ufw it’s just a quick…
sudo ufw allow from your-ip-address to any port 80
for each of your IPs.
Once you’ve gotten the security taken care of, you can move on to configuring GitLab. Most of the hard work is already done for you by DigitalOcean. You’ll just need to fill in the appropriate values in…
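In my case the relevant file was GitLab’s main YAML config; the path and example values below are from memory of a standard source install, so double-check them against your own setup:

```shell
# Edit GitLab's main config (path assumes the standard source install)
sudo -u git -H nano /home/git/gitlab/config/gitlab.yml

# Under the gitlab: section, set at least:
#   host: git.yourdomain.com           # hostname users will hit
#   email_from: gitlab@yourdomain.com  # From: address for notification mail
```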
Then restart GitLab with…
sudo service gitlab restart
With all that done, the next step is moving your repositories from GitHub to GitLab. (I’m sure there is a better direct git-to-git way of doing what follows, but this was the simplest solution for my needs.) For each of your repos, do a clean mirror clone to your Desktop to make sure you’ve got everything.
git clone --mirror git@github.com:username/repo-name.git
Then, cd into the repo directory and…
git remote add gitlab ssh://git@your-gitlab-host:22/username/repo.git
git push -f --tags gitlab refs/heads/*:refs/heads/*
That final git push with all the refs will push every branch and all of your tags making sure nothing is left behind.
Once done, you can safely delete your repo from GitHub.
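If you’re migrating more than a couple of repos, the same mirror-and-push dance is easy to script. Here’s the pattern demonstrated end-to-end with local stand-in repositories – in real use you’d swap the file paths for your git@github.com and GitLab SSH URLs:

```shell
set -e
tmp=$(mktemp -d)

# Stand-ins for the old (GitHub) and new (GitLab) remotes
git init --bare -q "$tmp/old-host.git"
git init --bare -q "$tmp/new-host.git"

# Seed the old host with one commit and one tag
git init -q "$tmp/work"
cd "$tmp/work"
git -c user.name=demo -c user.email=demo@example.com commit --allow-empty -m "initial" -q
git tag v1.0
git push -q --mirror "$tmp/old-host.git"

# The actual migration: mirror-clone the old host, push every ref to the new one
cd "$tmp"
git clone --mirror -q "$tmp/old-host.git" mirror.git
cd mirror.git
git push -q --mirror "$tmp/new-host.git"

# The default branch and the v1.0 tag now both exist on the new host
git --git-dir="$tmp/new-host.git" tag   # prints v1.0
```

`git push --mirror` is the one-shot equivalent of the explicit refspec push above: it sends every branch and tag so nothing is left behind.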
The last step is making sure you have rolling backups of your GitLab installation and repositories in place. I looked into piecing together my own backup script until I realized GitLab already has a rake backup task available that stores everything in a single tar file. Perfect. I can then just upload that to S3 for safekeeping. To do that, we’ll be using s3cmd to handle the uploads.
sudo apt-get install s3cmd
Configure it with…
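If I remember right, that’s s3cmd’s interactive setup wizard, which prompts for your AWS access and secret keys and writes them to ~/.s3cfg:

```shell
s3cmd --configure
```

Run it as the git user so the credentials land in the same home directory the backup script will run from.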
Then, create a script in your git user’s home directory called backup.sh containing…
cd /home/git/gitlab && PATH=/usr/local/bin:/usr/bin:/bin bundle exec rake gitlab:backup:create RAILS_ENV=production
s3cmd put tmp/backups/`ls tmp/backups/ | grep -i -E '\.tar$' | tail -1` s3://bucket-name/git/
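The backtick soup in the middle is just picking out the newest backup file. GitLab names its backup tars with a leading Unix timestamp, so a plain alphabetical `ls` puts the most recent one last. Here’s the idea in isolation, with made-up filenames following that pattern:

```shell
# Backup files start with a Unix timestamp, so lexical order matches
# chronological order and `tail -1` grabs the newest one.
dir=$(mktemp -d)
touch "$dir/1420000000_gitlab_backup.tar" "$dir/1420086400_gitlab_backup.tar"
latest=$(ls "$dir" | grep -i -E '\.tar$' | tail -1)
echo "$latest"   # prints 1420086400_gitlab_backup.tar
```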
Set up cron to run that script once a day and you’re good.
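My crontab entry for the git user looks something like this (the 3 a.m. run time is arbitrary):

```shell
# sudo crontab -u git -e
# m  h  dom mon dow  command
0  3  *   *   *     /bin/sh /home/git/backup.sh
```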