One of the biggest misconceptions in the world of programming is that "eating your own dogfood" is a magic cure-all. Sure, developers should use their own apps; doing so brings the team up to speed and gives an organization the context to judge the app's worthiness. But dogfooding is no substitute for user-based design and testing.
Dogfooding is particularly effective in a few development scenarios, but it can also be a trap. Here are the scenarios in which almost any dev team can benefit substantially from eating their own dogfood, and an example of how it can go wrong.
- When you have to learn a new platform. You can't build a great app experience for a platform that you’re unfamiliar with. Are you mainly an Android gal or an iPhone guy developing an app for a Microsoft phone? To do your job well, you need to become a real user of the target platform. You need a minimum of four weeks with that platform as your primary platform, and much longer for bigger or more heavily used platforms. This is true not just for developers, but for managers, designers, QA testers, and the like. Anyone who is part of the creative process, development, or decision-making team for the app had better be fluent with the platform.
- When you're testing pre-release software. Eating your own dogfood is vital when you're working on experimental features, which carry the greatest risk of developing major problems or breaking another part of your app. Your dev team needs to catch problems long before the public gets their grubby little mitts on it--not sit back and wait for the complaints to roll in. Remember, if no one in your company uses your app on a given platform, the folks finding bugs in the wild will be customers, and those customers might be justifiably upset. And let's face it: No matter which app store you're selling through, reviews have the power to totally sink even the best piece of software. You want your app to be as close to 5-star-worthy as possible before anyone with the power to review it gets a look.
- When you want to create accountability. Dogfooding gives you a way to solicit immediate and brutally honest feedback that would be almost impossible to obtain anywhere else. Focus groups? People are too consensus-driven. User diaries? People are too kind, knowing that you and your team probably worked hard on these features. If you want the truth, then you want coworkers to give you shit if you’re doing something stupid or just not useful. Fail early. Fail often. But fail internally.
All that said, however, it’s important to remember that dogfooding is not a replacement for traditional (or non-traditional) user testing. You still need good feedback loops with real customers outside your company so you don't succumb to hive-mind.
It’s neither kind nor entirely fair for the industry to pick on Facebook Home any more than it already has; the potential of Home is as yet unrealized, but that doesn't mean it always will be. Still, Home's tepid rates of adoption hold important lessons about how dogfooding does and doesn't work.
For one thing, dogfooding a pseudo-OS layer like Home would take much, much longer than dogfooding an app, and it seems Facebook didn't respect this formidable adjustment period. This is doubly true when employees are building such a complex layer over an already-foreign, heavily segmented platform. According to TechCrunch, Facebook has been "Droidfooding" since late last year, encouraging employees to switch to Android so that a larger proportion of the engineers have a firsthand grasp of the overall Android experience. (In fact, the company went so far as to install various pieces of propaganda around the Facebook campus highlighting Android's explosive growth rate as an impetus to start using it.) But the fact remains: Facebook is an iOS-dominant shop. Says TechCrunch:
While the default choice for what phone employees got used to be an iPhone, a Facebook spokesperson tells me that now “We don’t encourage one device over another. We let employees choose.” When I asked what the breakdown of iOS to Android users is in the company, Facebook’s spokesperson admitted, “I don’t have a ratio but with the early focus on our iPhone app and the multi-year cycle of carrier contracts we do have more iPhones deployed.”
Android's heavily segmented device pool means that users aren't true platform natives until they've owned more than one device on the platform, over the course of several post-paid contract periods. Using several Android phones gives one an idea of the range of usability, and provides an opportunity to see how each OEM handles core tasks and design issues slightly differently. Rather than "Droidfooding" for two to four years, Facebook did so for only about six months before launching Home in May. Adoption of the Home app has been tepid, and the HTC First, the flagship device for Facebook Home, was almost immediately discontinued due to poor sales.
Another problem with Home dogfooding: secrecy. Now that Facebook is a public company, leaks about black-box projects can have very real impacts on its market cap, which was likely the impetus for majority-internal testing. In our informal tests, Facebook Home has been fairly stable, which tells me they found enough internal users to squash any technical bugs that showed up--this obviously wasn't a failure of testing writ large. In fact, we know from this FastCo.Labs interview with Facebook UX Researcher Marco De Sa that Facebook tested extensively with about 60 individuals--a huge group for in-depth UX testing by most companies' measure--but with few exceptions, all the testers were Facebook employees.
The problem is presumably that Facebook employees love Facebook, and, well, love is blind. No doubt these employees are heavier-than-average users, possibly even by mandate. Normal people love Facebook, sure, but they also want their weather and calendar and other critical data. It's possible the Home team didn’t realize that a device’s “home screen” experience shouldn’t focus on one app, but should straddle all commonly used apps and interactions. In our interview with De Sa, we asked about this very issue. Here was his reply:
Co.Labs: So all these studies were with Facebook employees? Is it always a good idea to test on your most expert users?
De Sa: Oh, we definitely try to test with external users as much as we can and for Facebook Home, at the beginning of the study, we actually showed some external users; but for some of these interactions Facebook employees are users just like any other. Trying to see how they use gestures and things like that, or how they learn or discover new gestures is something that you can actually do with Facebook employees as you would do with another user. Of course, Facebook employees seem to be more proficient with using Facebook, but again these interactions aren't necessarily something that is Facebook dependent. They're just navigating content and things like that.
Co.Labs: Is there really adequate diversity of opinion and experience within the Facebook corporation for testing something that will be used by people all over?
De Sa: We try to talk to Facebook employees who were not involved in design, were not engineers, we try to get a broad sample of people with different levels of expertise, recent employees, older employees. So, we try to recruit considering all of those differences. Yes, we're designing for a billion people.
There is a dogfooding lesson here, though. Does Mark Zuckerberg carry an HTC First, or any other Android phone with Facebook Home installed? Does Mike Matas? (Doesn’t look like it, judging by the “via Twitter for iPhone” metadata on his recent tweets.) Why not? Turn Facebook Home into an interface that Facebook designers and engineers want to use, not merely feel obligated to use, and then they’ll have something. But if it remains something that even Facebook’s own designers and engineers do not prefer over the iPhone (or stock Android, or any other platform), if it remains something that the company needs propaganda posters to promote even among its own employees, then Facebook Home will remain what it is now: a dud.
Is it any wonder they didn’t anticipate the lukewarm reception from everyday users, and backlash from Android fans?
[Image: Flickr user Derek Gavey]