I spend about ten hours a day staring at two 27-inch Apple Cinema Displays. It makes coding great. But, with that much screen real estate, I keep losing my mouse cursor. I’ll have to jiggle it around for half a minute trying to find where it’s disappeared to.
No more!
Yesterday I discovered OS X has an option in the Universal Access preference pane that lets you adjust the size of the cursor from normal all the way up to holy-gigantic. I have mine set to a comfortable 33% — which is just big enough to keep from getting lost, but not so large that I can’t tell where I’m clicking.
For the last twelve months I’ve been keeping detailed records regarding the number of users pirating my Mac apps and toying with different ways of converting those users into paying customers. I’m not foolish enough to ever think I could actually eliminate the piracy — especially since I believe there are a few legitimate reasons for pirating software — but I was genuinely curious as to what the actual numbers were, the motivations behind doing so, and if there were any way I could make a small dent in those numbers.
A quick summary for those who don’t want to read the full post (tl;dr)
Software developers are foolish if they think they can prevent piracy. The only goals worth pursuing are:
Make it incredibly easy for honest customers to purchase your software.
Find simple, non-intrusive ways of encouraging pirates to become paying customers.
Retire to a sandy location with the tens of hundreds of dollars you’re sure to make.
A Bit of History and Harsh Reality
VirtualHostX 2.0 was released on July 19, 2009. Fake serial numbers first appeared (to my knowledge) on August 3, 2009. That’s fifteen days. Fifteen days for someone to take the time to engineer a fake serial number for a relatively unknown, niche app.
Nottingham was released on November 28, 2009. It took just eight days for the first fake serial number to appear.
Admittedly, the serial number scheme I used was incredibly simple, so it was no surprise that it was easy to crack. But seriously? Eight days? I doubt it took whoever did it more than an hour of actual work. I was just flabbergasted they cared enough to even take the time. A little honored, truth be told.
So I did what any software developer would do. With each new software update I released, I made sure to ban the infringing serial numbers. Now, I fully realized the futility of what I was doing, but still — I thought that if I at least made it inconvenient for the pirating users to have to seek out and find a new serial number each time, maybe I’d win a few of them over.
Nope.
Rather than posting new serials, CORE (that’s one of the “teams” that release pirated software and serial numbers) simply put out “Click On Tyler MultiGen” — an actual app you could download to your Mac and use to generate your own working serial numbers for all of my products. Here’s a screenshot:
(It even plays music.)
So, with that out in the open (you can download it here), there was no point in banning serial numbers any longer.
Instead, I turned my attention towards measuring the extent of the piracy. I wanted to establish a baseline of how many users were stealing my app, so I could then tell if any of my attempts to counteract it worked.
I won’t go into the technical details of how I measured the number of pirated apps in use, but after a two-month period I can say with high confidence that 83% of my users were running one of my apps with a fake serial number. Let that sink in.
Eighty-three percent.
Fuck.
Experiment #1 – The Guilt Trip
My first attempt at converting pirates was appealing to their sense of right and wrong. (I’ll pause while you finish laughing.) I released an update that popped up this error message when it detected you were using a fake serial number:
Two things worth noticing:
I looked up the user’s first name (if available) from Address Book and addressed the message to them personally.
The only way to dismiss the message was the guilt-trip-tastic “Sorry, Tyler!” button.
Sure, those things were cheesy — the folks on the piracy websites actually mocked me for it — but I thought adding a little humanity (and humor) might make a difference. And it did.
Over the next three months I saw a 4% decrease in the number of users pirating my apps. Now, was that decrease because of my silly message? Possibly, but I can’t be certain. Nonetheless, I thought it was a strategy worth continuing.
Experiment #2 – The Guilt Trip and a Carrot
At the beginning of this year I decided to be a bit more proactive and actually offer users a reason to pay other than simply “doing the right thing”. So, I began showing this error message instead:
And I was serious. I presented the pirates with a choice: a one-time, limited offer that was only good right there and then. They could either click the “No thanks, I’d rather just keep pirating this software” button, or they could be taken directly to my store’s checkout page along with a hefty discount.
(I was wary of doing this because I didn’t want to offend my real, paying customers who had been kind enough to part with their money at full price. I realize it’s not fair that honest users might pay more than the pirates. To those customers: I hope you’ll understand that I was simply trying to convert, and make at least a little money from, users who weren’t paying anything to begin with. Hopefully the full price you paid was worth it at the time and still is today.)
Did it work?
I was very careful to measure the number of times the discount dialog was displayed and the number of discounted sales that came through. The result? 11% of users shown the dialog purchased the app. I suspect the true conversion rate might be a little higher, since I’m sure some users saw the dialog more than once.
Despite 11% being a small number compared to the overall 83% piracy rate, I was thrilled. Most online advertisers would kill for an 11% conversion rate. I considered the experiment a success and let it continue for a number of months, until the numbers dwindled to 5%, which brings us to today.
The Big Switch
Last month (April 2011) I released Nottingham 2.0 — and with it, a new serial number scheme that requires a one-time online activation. I’ve always been adamantly opposed to registration checks like this, both as a developer and as a user. But now that everyone is (almost) always connected, these checks don’t bother me as much as a user any longer, especially when they’re unobtrusive and one-time only. Also, after seeing the raw numbers, the developer in me is now more concerned with buying food than with lofty ideals.
I hope I’m not stirring up a hornet’s nest by saying this, but so far sales of Nottingham 2.0 are going well and piracy is virtually non-existent. Is that bound to change? Of course. I fully expect my scheme to be cracked at some point. But now that activation is involved, I have a much better view of when and how often it’s happening. Another benefit is that it’s no longer sufficient to pass around a serial number or even a key generator. Pirates will now need to patch the actual application binary (totally doable) and distribute that instead.
With those promising results in mind, I made the decision to convert my existing VirtualHostX 2.0 users to the new serial scheme as well. My goal — as always — wasn’t to stop the piracy but at least make a small dent in it.
My foremost concern was to make things simple for my existing customers. Under no circumstances did I want to annoy them (or piss them off). I couldn’t just invalidate all of their old serial numbers and send everyone an email with their new one. That would surely prevent someone from using the app right when they needed it most. I had to make sure the switch was as frictionless as possible.
So, I toyed with different upgrade processes for a few weeks and finally settled on a system that I deployed with the 2.7 update a few days ago. Here’s how it works.
The first time the user launches VirtualHostX after getting the automatic upgrade to 2.7, they’re shown this window:
I explained the situation as plainly as possible, was upfront about the fact that this was an inconvenience for them (not me), and offered the requisite apology. I also made it simple: one button to click, no other steps.
So, click the button, wait about five seconds and:
The app automatically connects to my server, validates their old serial number, generates a new one, and registers the app without any other user intervention. It’s all automatic.
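For the curious, here’s a rough sketch of what the server side of a swap like this might look like. To be clear, this is a simplified, hypothetical version: the endpoint, the table layout, and the generate_new_serial() helper are stand-ins rather than my actual implementation. But the basic flow is easy to show: accept the old serial, verify it against the customer database, and hand back a freshly generated replacement.

<?php
// Hypothetical sketch of a one-time serial-swap endpoint. Table names and the
// generate_new_serial() helper are placeholders, not the real implementation.
header('Content-Type: application/json');

$old_serial = isset($_POST['serial']) ? trim($_POST['serial']) : '';

$db = new PDO('mysql:host=localhost;dbname=store', 'store_user', 'secret');

// Look up the old serial number in the customer database.
$stmt = $db->prepare('SELECT id, email FROM licenses WHERE serial = ?');
$stmt->execute(array($old_serial));
$license = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$license)
{
    echo json_encode(array('status' => 'invalid'));
    exit;
}

// Generate a new serial for the same customer and record the migration.
$new_serial = generate_new_serial($license['email']); // hypothetical helper
$update = $db->prepare('UPDATE licenses SET serial = ?, migrated_at = NOW() WHERE id = ?');
$update->execute(array($new_serial, $license['id']));

echo json_encode(array('status' => 'ok', 'serial' => $new_serial));

On the client side, all the app has to do is POST its old serial, read the JSON response, and save the new serial into its preferences, which is why the whole thing feels like a single click to the customer.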
So far the switch has gone well. I’ve seen about 30% of my registered users go through the update and have received exactly two emails — not complaints, just confusion about what was going on. One customer even wrote in to say:
That was so painless. Great job on the messaging and single-click process. Very well done.
So that makes me feel good. Even though I wish I could have avoided the process, I’m glad it appears to be going smoothly. If any other developers ever find themselves in a similar situation, I can highly recommend this approach.
So That’s It
Many of the points I’ve written about are hardly new or exciting to anyone who’s written software or pirated it. So I’m not posting this as some sort of revelatory treatise. Rather, I just wanted to document the experiences I’ve gone through as a one-man software company who’s trying to earn a little money while keeping his users happy.
In the end, the most important thing you can do is be respectful of your users’ time by writing software they’ll love so much they can’t wait to pay for. Once you’ve got that down, then you can try and encourage the rest to pay up 🙂
Two years ago I posted some quick instructions on how I keep my Adium chat logs synced between Macs using Dropbox. I’ve tweaked my setup slightly since then. Here’s my new approach.
First, if you already have Adium on multiple machines, copy all your logs over to a single Mac. You can merge the folders easily with an app like Changes. Once you’ve got a canonical folder of all your combined chat logs, place it somewhere in your Dropbox. Then…
If lipo’s output lists anything other than i386 or x86_64, your app will get rejected.
This was particularly painful for me because it appears this check is only run when submitting a new version of your app — PPC framework binaries don’t cause a rejection during the original app submission process. I thought I was going crazy, since I had made no project changes since the first submission and running lipo on the app binary didn’t return anything unusual. Hopefully this will save someone else the hour of head-scratching I just went through.
Earlier this week, the Chromium Blog announced an official extension API for Chrome’s omnibox (search bar). I’ve always loved keyboard driven interfaces — the command line, [Quicksilver](http://en.wikipedia.org/wiki/Quicksilver_(software)), Alfred, etc — so, I immediately started thinking about what I could build with it.
My first idea was a documentation browser for Apple’s Mac and iOS libraries. I’m always googling for class and framework names as a way to quickly jump to Apple’s documentation site. The problem is that many times the developer.apple.com link is buried down the page, which means I waste time scanning for the link rather than just hitting return for the first search result.
This extension solves that problem by allowing you to type “ios” or “mac” followed by a keyword. It then presents an auto-completed dropdown of matching search results that take you directly to the relevant page on Apple’s documentation site. Here’s a screenshot after typing “ios UIImage”:
For those among you wondering how I’m searching the Apple docs, I caught a lucky break. Apple’s Mac and iOS reference site includes a small search box that autocompletes your queries. I tried sniffing the network traffic to see what web service they were using for suggestions (hoping to hook into that myself) but found they were showing search results without sending any data over the wire. A little more digging and I realized they were pre-fetching a dictionary of results as a giant JSON file on page load. With that data — and a sample Chrome extension courtesy of Google — it took no time at all to connect all the pieces and get the extension working.
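For anyone who wants to experiment with the same idea outside of a Chrome extension, the approach boils down to a few lines in just about any language. Here’s a rough PHP sketch. Keep in mind the index URL and the JSON field names below are placeholders (Apple doesn’t document this file and its format can change at any time), so treat it purely as an illustration of the technique: download the dictionary once, then filter it locally.

<?php
// Illustrative sketch only: fetch Apple's prefetched documentation index and
// filter it by keyword. The URL and the JSON keys are placeholder guesses.
$index_url = 'https://developer.apple.com/library/ios/navigation/library.json'; // placeholder
$keyword   = isset($argv[1]) ? strtolower($argv[1]) : 'uiimage';

$docs = json_decode(file_get_contents($index_url), true);

foreach ($docs['documents'] as $doc) // 'documents' is an assumed key
{
    if (strpos(strtolower($doc['title']), $keyword) !== false)
    {
        echo $doc['title'] . "\t" . $doc['url'] . "\n";
    }
}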
If you’d like to install the extension, just click here for Mac and here for iOS. You’re also welcome to download and improve the code yourself from the GitHub project page.
Every holiday, between the food and family, I always seem to find time for a quick project. Last year I built the first version of Nottingham over the Thanksgiving break. This year was no exception, and I found myself putting the final touches on Sosumi for Mac after an eighteen-hour coding streak this weekend.
Sosumi for Mac builds on the original Sosumi project I started last summer — a PHP script that returned your iPhone’s location by scraping MobileMe’s website, and which eventually evolved to use Apple’s “official” API once that was released.
Last week, Apple pushed a rather large update to the Find My iPhone service and made it free to all users. Along with that came some API changes, which broke Sosumi. With help from Andy Blyler and Michael Greb, we managed to get it working again. I took the opportunity to go all out and write a native Cocoa implementation of Sosumi as well. And, with that done, I went one step further and built a full-fledged desktop app for tracking all of your iDevices.
Now that it’s complete, it’s much easier to simply open up Sosumi for Mac, rather than having to log back in to Apple’s website or the iPhone client each time. The desktop app also opens up some fun possibilities. A future version could notify you when your spouse leaves work in the afternoon so you know when to begin preparing dinner. Or alert you if your child strays from their normal route on the way home from school. Or, since Sosumi provides your device’s battery level, it could even alert you when your phone needs to be charged soon.
Admittedly, this kind of always-on location tracking can certainly be creepy. But that’s almost always the case with these types of applications. Whether Fire Eagle, Foursquare, or Google Latitude — it’s always a matter of striking a reasonable balance between convenience and privacy. I trust you’ll use Sosumi for good rather than evil.
Back in June I wrote a detailed post describing how I back up my data. One of the key components of my backup strategy was using Backblaze for continuous, offsite recovery in the event of a disaster.
Well, disaster struck.
Last week, the hard drive in my father’s MacBook died. In the past, I’d set up a networked Time Machine drive to back up my parents’ laptops, but for whatever reason it never worked reliably. OS X would often become unable to mount the drive — even when connected to an Apple AirPort. Fortunately, I gave up on Time Machine a few months ago and installed Backblaze on everybody’s Mac. Ponying up the $50/year per machine seemed like a great deal. Definitely worth the peace of mind of knowing I don’t have to waste time fighting with Time Machine or manually backing up their data whenever I visit.
This past week, with my father’s hard drive verifiably dead, I’m happy to report that Backblaze performed flawlessly.
My father isn’t a heavy computer user, but he still had 20GB of data stored in Backblaze’s cloud. Once we verified that his data really was lost, I signed into Backblaze’s website and requested a full zip file backup of all his files. Twenty minutes later they emailed to say a 20GB zip file (!!!) was ready to download. Over my 30 Mbit Comcast connection it only took about an hour to download, another ten minutes to unzip, and bam! All of his music, photos, documents, everything right back as it was just hours earlier.
It’s so, so, so important to keep good backups of your data. In my father’s case, he had fifteen years’ worth of genealogy research on his Mac. I can’t even imagine that data being lost. And while I’ve done a few small restore tests with Backblaze, this was the first time I’ve truly needed it for a full recovery. And, like I said above, it worked just as advertised. Kudos to the Backblaze team on an outstanding product.
I’m totally obsessed with web site performance. It’s one of those nerd niches that really appeal to me. I’ve blogged a few times previously on the topic. Two years ago (has it really been that long?) I talked about my experiences rebuilding this site following the best practices of YSlow. A few days later I went into detail about how to host and optimize your static content using Amazon S3 as a content delivery network. Later, I took all the techniques I had learned and automated them with a command line tool called s3up. It’s the easiest way to intelligently store your static content in Amazon’s cloud. It sets all the appropriate headers, gzips your data when possible, and even runs your images through Yahoo!’s Smush.it service.
Today I’m pleased to release another part of my deployment tool chain called Autosmush. Think of it as a reverse s3up. Instead of taking local images, smushing them, and then uploading to Amazon, Autosmush scans your S3 bucket, runs each file through Smush.it, and replaces your images with their compressed versions.
This might sound a little bizarre (useless?) at first, but it has done wonders for my workflow and for one of my freelance clients. This particular client runs a network of very image-heavy sites. Compressing their images has a huge impact on their page load speed and bandwidth costs. The majority of their content comes from a small army of freelance bloggers who submit images along with their posts via WordPress, which then stores them in S3. It would be great if the writers had the technical know-how to optimize their images beforehand, but that’s not reasonable. To fix this, Autosmush scans all the content in their S3 account every night, looking for new, un-smushed images and compressing them.
Autosmush also allowed me to compress the huge backlog of existing images in my Amazon account that I had uploaded prior to using Smush.it.
If you’re interested in giving Autosmush a try, the full source is available on GitHub. You can even run it in a dry-run mode if you’d just like to see a summary of the space you could be saving.
Also, for those of you with giant S3 image libraries, I should point out that Autosmush appends an x-amz-smushed HTTP header to every image it compresses (or images that can’t be compressed further). This lets the script scan extremely quickly through your files, only sending new images to Smush.it and skipping ones it has already processed.
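To make that concrete, here’s a simplified sketch of what the nightly scan loop looks like. The S3 client class and the smush_image() helper below are hypothetical stand-ins (the real script has its own S3 plumbing), but the important part is the logic: read each object’s headers, skip anything already marked, and only send new images through the optimizer.

<?php
// Simplified, hypothetical sketch of an Autosmush-style scan. The S3 client
// class and smush_image() helper are stand-ins for real S3/Smush.it plumbing.
$s3     = new HypotheticalS3Client('ACCESS_KEY', 'SECRET_KEY');
$bucket = 'my-image-bucket';

foreach ($s3->listObjects($bucket) as $key)
{
    $headers = $s3->getObjectHeaders($bucket, $key);

    // Already processed on a previous run? Skip it without re-downloading.
    if (isset($headers['x-amz-smushed']))
        continue;

    // Compress the image, then re-upload it with the marker header set so
    // the next nightly scan leaves it alone.
    $smushed = smush_image($s3->getObjectUrl($bucket, $key)); // hypothetical helper
    $s3->putObject($bucket, $key, $smushed, array('x-amz-smushed' => 'true'));
}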
I’ve noticed a resurgence on the web of people talking about OmniFocus and how they use the app to manage their task lists. Despite being a user for nearly three years — since the first public beta — for some reason I’ve never gotten around to writing about why I find it so useful and how it fits into my own workflow. So that’s what this post will attempt to do.
(Oh, and if you’re one of those readers who likes to skip straight to the end, allow me to save you the trouble: OmniFocus wipes the floor with every other Mac task manager because of Perspectives.)
Everything I say obviously only applies to my own, odd way of getting things done — but hopefully there will be a few points others might find useful, too. Particularly since my work straddles two different worlds: during the day I’m at a large corporation with responsibilities to multiple teams and relying on tasks delegated to other co-workers. At night, in addition to my personal commitments, I freelance and run my own, small software company.
OmniFocus helps keep me sane. Here’s how.
Three Buckets
All of the tasks I do throughout my day fit into one of three buckets. Some people call these “categories” or “areas of responsibility”. OmniFocus represents them as folders.
My first folder, “Personal”, holds all the tasks and projects that fall under my, well, personal life. This includes everything from the mundane “buy toilet paper” and “rent car for NY trip” to beefier tasks like “get three estimates for new backyard fence”.
Within this folder I have a few single-action lists. These are a special category of task lists in OmniFocus that aren’t actual projects. Meaning, they’re not something you can ever fully complete — they’re ongoing. As you can see in the screenshot to the right, these lists cover topics like “Financial”, “Home Repairs”, “Shopping”, and my “Someday” list. I’ve found that most of my day-to-day tasks fit nicely into one of these lists. And if they don’t, it’s no big deal. I just create a new project as needed.
My next bucket (folder) is “Click On Ideas”, the LLC I do my freelance work through and use to sell my Mac apps. Within that are folders for each of my apps and projects for any freelance work on my plate.
In the screenshot, I’ve opened up the folder for Nottingham. Inside you’ll see I’m actually using OmniFocus for tracking bugs and new product features. I’d never recommend this for larger, team-based work, but as a single developer it works fairly well. The majority of my bug reports and feature requests come from user feedback via email. Because OmniFocus integrates so nicely with Mail.app, it’s practically frictionless to convert an email from a user into a task.
The folder I use to organize my day job at Yahoo! has a different structure. Each project gets its own list which sits inside one of two folders. “Projects” is for items that have a hard delivery or launch date. These are lists that contain concrete steps towards launching whatever it is I’m working on. They’re larger commitments that will eventually be completed and go away. The “Ongoing” folder is for projects that are complete from a development standpoint but still need to be maintained. It also contains other, more generic areas of responsibility that have occasional tasks.
The one thing all three buckets have in common is that I’ve structured their folder and task hierarchy uniquely to match the way I naturally focus on my work. In my freelance world, I rarely multitask. I’m focused on a single project for days at a time. Selecting one of my app folders lets me quickly see everything related to that product and nothing else. But at work, I’m constantly shifting my focus as priorities and other external variables change. Being able to focus on the projects that have due dates makes it easy to evaluate what needs to get done now and what can wait till later in the day or tomorrow.
Starting the Day
The organization explained above is how I make plans and keep an eye on the bigger picture. But when it comes to actually doing the work and knocking down my todo list, I have a rigid routine in place.
Each morning after I wake up, I get a drink (not that kind of drink), freshen up, and sit down on the couch with my laptop and zone out for 30 – 45 minutes checking news, Google Reader, Twitter, etc. I’ve found that getting all of my “soft news” and social updates out of the way first thing in the morning helps suppress the urge to check in constantly throughout the day. I take note of anything worth reading later, filing items into OmniFocus and Instapaper as needed.
With that out of the way I move on to my email. Like many people I’m sure, this is usually my largest source of stuff to do. Despite being two timezones ahead of my co-workers, I’ve always got 20+ action emails waiting for me when I wake up. And while I don’t subscribe to Inbox Zero or whatever, I do process my emails immediately, in a way that makes sense to me.
Each email gets scanned and categorized. No excuses. Every message is either:
Something I can immediately delete or archive
Something that I can create a task out of and then archive
Or something that needs a reply. If that’s the case, it stays in my Inbox until I do so.
The benefit of this system is that no email gets left behind. Everything is guaranteed to be acted upon or at least seen and acknowledged, quickly. Perhaps more importantly, it means that each message remaining in my inbox is either unread or awaiting a reply — anything urgent is caught before it becomes a problem.
With my email processed and new tasks delivered to OmniFocus, I can turn to what actually needs to get done today. This is where OmniFocus really sets itself apart from the competition.
Perspectives
OmniFocus has the notion of Perspectives. These are saved settings that you can switch between with a single click. When deciding on the day’s work, I switch to the Due perspective. This gives me an instant look at all the tasks that are overdue or need to be completed in the next day or so.
I’ve customized OmniFocus’s built-in Due perspective to group my tasks by context. This gives me a clear division between what I have to do for Yahoo! and everything else. That’s important to me because, after all, Yahoo! pays the bills and those tasks take priority over most of my other commitments.
This first-thing-in-the-morning review provides a good foundation for the day. It’s great starting out with a clear sense of what needs to be accomplished so when your boss emails with a fire drill you know immediately what can be shifted or dropped without wasting time gathering your notes.
With a clear mind, it’s easy to get started and fully concentrate on the work at hand knowing everything else is accounted for and ready when you are.
But OmniFocus’s Due perspective is just the beginning. Here’s my toolbar with the perspectives I flip between most frequently.
There are four perspectives on the right that I’ve created.
Y! Available – Shows all of the work related tasks that I can choose from to do. This is more powerful than it might initially seem. Because OmniFocus lets you make certain tasks dependent on others — Task A has to be completed before Task B — you only see those items you can actually do. It filters out everything else so you don’t get distracted when picking what’s next.
Y! Next – This is similar to the Available view, except it further refines the tasks it displays. Rather than showing everything you can do, it simply gives you the next item available in each project. This is great when you’re in the zone, cranking through your work, and trying to stay focused.
People – I love this view. It generates a list of all the items I’m waiting for other people to finish. Here’s a screenshot. I accomplish this by creating a context named after each person who owes me something. Then, whenever I need to delegate a task, I just assign it that person’s context and forget about it.
Two tricks that make this work well:
Group people by company. This lets me see not only who at Yahoo! I’m waiting on, but also any 3rd party vendors.
Make sure you mark each context as “on hold”. This way, the tasks don’t show up in your Available or Due perspectives. Since the tasks are assigned to other people, there’s no need for you to worry about or even see them.
Weekly Report – Finally, this perspective generates a list of everything I’ve accomplished in the last week — grouped by project and ordered by date completed. This is a great tool to have at your disposal during stand-ups or review meetings with your boss.
It’s hard to describe how incredibly powerful Perspectives are until you actually spend a few days with them in your workflow. Other task managers have smart folders or dedicated “Today” lists, but they absolutely pale in comparison to the flexibility that Perspectives afford.
Let’s Be Clear
I don’t want to end up with an inbox full of hate mail tomorrow morning, so there’s one thing I want to clarify (because I know how insanely zealous the web’s productivity cult can be). This is my system. Not yours. I’ve timed my daily activity futzing around in OmniFocus, and it has never broken twenty minutes. That’s less than half an hour out of my day in exchange for a clear mind and less stress.
But if you’re one of the many on the web, clamoring for “simplicity”, who work best in plain text files, edited in Notational Velocity, synced via Simplenote to your laptop, where you publish them to your GitHub hosted website using Sinatra, and then review them on your iPad at Starbucks, before transcribing them into your Moleskine using a Blackwing pencil — more power to you. The important thing is that you find a system uniquely fitted to your needs. Which leads me to my final point…
The Takeaway
First off, I apologize for my use of the word “takeaway”. It’s something that was beaten into me during my assimilation into the culture of a corporate Marketing department. (action items!)
The takeaway from this blog post is this: Don’t let your desire to Get Shit Done™ get in the way of you getting shit done.
More specifically, find a system for managing your commitments that works for you and stick to it. Use any tool you want as long as it fits your workflow and keeps you sane and efficient. It’s ok to tweak things down the road, but don’t go jumping ship each time the new task manager du jour gets a favorable Lifehacker review.
I’m not going to call out anyone specifically, but there are a number of well-known bloggers who I follow and respect very much, yet week after week it seems like they discover a new iPhone task tracking app or some holy grail full-screen text editor that promises to revolutionize hipster productivity. And that’s fine — whatever works for them. But my fear is that their clout, if you can call it that, is creating an online community of zombie productivity wanks who put their tools in front of their work — who spend more time figuring out how to get stuff done than actually doing it.
My advice to you? Ignore those posts. (Hell, ignore this one!) But hurry up and find a system that works for you so you can get back to doing what you do best — making awesome stuff.
For an upcoming project, I needed a quick PHP function that would generate strong passwords. It’s an easy problem on the surface, but it has some quirky nuances that appear if you spend any length of time thinking about it.
For example, it’s not enough to merely pick characters at random — you have to include at least one character from each set (lowercase, uppercase, digits, symbols) to minimize the chances of someone guessing the password.
Another problem is ambiguous characters. Many times, users won’t (or can’t) cut and paste the password you generate for them. This happens quite often on mobile devices. They’ll manually transcribe the password from an email or even read it aloud. Differentiating between I, l, 1, 0, and O can be a nightmare — for them, and for you when they email support because their password won’t work.
Complicating matters further, long, random passwords are often mistyped because users have to look back once or twice while typing the password. For long strings, it’s easy to lose your place and duplicate or leave out a character or two.
The method I finally decided on solves each of these problems. It generates a strong password of a specified length, without any ambiguous characters, and can optionally include dashes between groups of characters to help users retain their place. You can also customize which sets of characters the password will contain, e.g. alphanumeric, no uppercase letters, etc.
<?php
// Generates a strong password of length N containing at least one lowercase letter,
// one uppercase letter, one digit, and one special character. The remaining characters
// in the password are chosen at random from those four sets.
//
// The available characters in each set are user friendly - there are no ambiguous
// characters such as i, l, 1, o, 0, etc. This, coupled with the $add_dashes option,
// makes it much easier for users to manually type or speak their passwords.
//
// Note: the $add_dashes option inserts a dash after every floor(sqrt(N)) characters,
// which makes the final string slightly longer than N.
function generateStrongPassword($length = 9, $add_dashes = false, $available_sets = 'luds')
{
    // Build the list of character sets to draw from, based on $available_sets:
    // 'l' = lowercase, 'u' = uppercase, 'd' = digits, 's' = symbols.
    $sets = array();
    if(strpos($available_sets, 'l') !== false)
        $sets[] = 'abcdefghjkmnpqrstuvwxyz';
    if(strpos($available_sets, 'u') !== false)
        $sets[] = 'ABCDEFGHJKMNPQRSTUVWXYZ';
    if(strpos($available_sets, 'd') !== false)
        $sets[] = '23456789';
    if(strpos($available_sets, 's') !== false)
        $sets[] = '!@#$%&*?';

    $all = '';
    $password = '';

    // Guarantee at least one character from each requested set.
    foreach($sets as $set)
    {
        $password .= $set[array_rand(str_split($set))];
        $all .= $set;
    }

    // Fill out the rest of the password from the combined pool, then shuffle
    // so the guaranteed characters aren't always at the front.
    $all = str_split($all);
    for($i = 0; $i < $length - count($sets); $i++)
        $password .= $all[array_rand($all)];

    $password = str_shuffle($password);

    if(!$add_dashes)
        return $password;

    // Break the password into dash-separated groups of floor(sqrt(N)) characters.
    $dash_len = floor(sqrt($length));
    $dash_str = '';
    while(strlen($password) > $dash_len)
    {
        $dash_str .= substr($password, 0, $dash_len) . '-';
        $password = substr($password, $dash_len);
    }
    $dash_str .= $password;

    return $dash_str;
}
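Using it is as simple as calling the function with the length and options you want. A couple of examples (your output will differ, of course, since the characters are random):

<?php
// A 12-character password drawn from all four character sets:
echo generateStrongPassword(12) . "\n";              // e.g. k7R%vmQ2p!Wz

// A 16-character alphanumeric password with dashes added for readability:
echo generateStrongPassword(16, true, 'lud') . "\n"; // e.g. mk7r-Tq2w-8vPn-3zXe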