Beacon: Is Global Opt-Out Enough?

December 5, 2007 at 4:37 PM

Facebook just added a global opt-out for Beacon, their "revolutionary" advertising system.

Prior to this, Facebook claimed to offer users an opt-out privacy scheme. In practice, users who didn't want to participate had to disable Beacon separately for each partner site, and the switch for a given partner only appeared after that partner had already sent its first story; there was no way to prevent that first Beacon story from going out to all of one's friends. That flagrantly flies in the face of accepted privacy standards, and it is certainly not how "opt-out" is defined. The backlash of negative press and user feedback caused Facebook to embrace the accepted definition of opt-out: once you turn it off, it's off everywhere.
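
To make the flaw concrete, here is a minimal sketch of that original flow. Every name in it is hypothetical, since Facebook never published Beacon's internals; the point is only the default:

```python
# A minimal sketch of Beacon's original per-partner "opt-out" flow.
# Every name here is hypothetical; Facebook never published Beacon's API.

class User:
    def __init__(self, name):
        self.name = name
        # Per-partner switches; a switch only exists after that partner
        # has already sent a story, so this starts empty.
        self.disabled_partners = set()

def publish_story(user, partner, story):
    """Broadcast a purchase story from a partner site to the user's friends."""
    if partner in user.disabled_partners:
        return  # honored, but only for stories after the first
    print(f"To {user.name}'s friends: {story}")

alice = User("Alice")
publish_story(alice, "travel-site.example", "Alice booked a trip")  # goes out
# Only now can Alice disable Beacon for this one partner:
alice.disabled_partners.add("travel-site.example")
publish_story(alice, "travel-site.example", "Alice booked a hotel")  # suppressed
```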

The problem is, this still isn't an acceptable privacy stance. "Opt-out" means that unless the user takes action, Beacon partner sites will advertise. Facebook assumes that they have a user's permission to inform people about what that user is spending money on — that's simply not the case.

Most users won't disable Beacon, simply because they are apathetic or uninformed. Facebook is relying on that inertia, the effort of first understanding the problem and then taking action to fix it, to make Beacon a success. Mark Zuckerberg's blog post about the privacy improvements does include an opt-out link, to be sure, but it's as hidden as possible.

Of course, the reasons for this are obvious: Facebook knows that Beacon would fail as an opt-in program. No one is going to volunteer to be advertising fodder.

The ethical standards required when conducting academic research are incredibly high: I was taught that "opt-out" is only an option when no personally identifiable data are being collected. Everything else requires some form of consent, and consent can never be granted implicitly.

It's disturbing to me that research institutions are held to significantly higher standards than corporations. I'm not simply talking about what study participants or web site users are willing to accept; people nowadays will forgo many levels of privacy for hardly any benefit. I'm talking about what each group's peers enforce. Academics hold each other to standards: if your research methodology isn't up to snuff, it doesn't really matter what you uncovered. Companies, on the other hand, are more than happy to ride the inertia inherent in an opt-out system, and blithely look the other way when others do the same.

My opinion is that users must be in control of what data are sent to others, and for the system to be considered ethical in privacy terms, they must agree before the first Beacon story is ever published. The only conceivable way to do this is an opt-in approach.
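
For contrast with the sketch above, here is what a hypothetical opt-in version of the same flow might look like, again with invented names:

```python
# A hypothetical opt-in version of the same flow: no story is broadcast
# until the user has explicitly consented, and silence publishes nothing.

class User:
    def __init__(self, name):
        self.name = name
        self.consented_partners = set()  # empty until the user acts

def publish_story(user, partner, story):
    """Broadcast a story only if the user has opted in for this partner."""
    if partner not in user.consented_partners:
        # Silence is not consent: prompt, and drop the story if unanswered.
        print(f"Asking {user.name} for permission to share via {partner}")
        return
    print(f"To {user.name}'s friends: {story}")

alice = User("Alice")
publish_story(alice, "travel-site.example", "Alice booked a trip")   # prompt only
alice.consented_partners.add("travel-site.example")  # explicit yes from Alice
publish_story(alice, "travel-site.example", "Alice booked a hotel")  # now shared
```

The entire difference is a single default: an empty set means nothing is shared, rather than everything.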