
Why do we need the kind of privacy you speak of? Because while people nod in approval to your rhetoric, their behavior shows that they feel free to give up the "privacy" of what they're eating for breakfast en masse.


I get the point you are trying to make, but I think it would only be valid if people fully understood the transaction they are making and chose to make it anyhow. I do not think we are there. Nor do I think this is because people are "stupid"; these issues are complicated. Over time, many of these people have discovered the issues that can arise and regret their decisions in hindsight; many more will come to the same realization in the future.

It should also be pointed out that privacy advocates in general are not saying these transactions should be impossible; what they are saying is that it should be possible to not conduct these transactions. There's basically no option to participate on Facebook and, say, pay them money to leave you the hell alone.


it would only be valid if people fully understood the transaction they are making and chose to make it anyhow

It's really not that hard: if you're trying to keep something from your ex-husband, don't tweet or Buzz it publicly.

it should be possible to not conduct these transactions

But that's the case now - I have the option to exert complete control over who gets to see my facebook profile, my tweets, and my buzz-es.


"I have the option to exert complete control over who gets to see my facebook profile, my tweets, and my buzz-es."

To my understanding, those buttons aren't anywhere near as powerful as you may think; I'm pretty sure Facebook affiliates see right through them if you do much more than glance at an app. How many randomly-selected people will be able to correctly answer that question?

Pointing out that there are parts of privacy that people do get ("Gosh, maybe I shouldn't make a post about how much fun my mistress and I had last night") doesn't negate the point that there's a huge chunk that they don't understand at all, and therefore we can't look at their actions as endorsement of the current way private information is being handled.

(Both of you who have replied do not seem to be getting my point here. I'm not making the general purpose argument that "we should have privacy". I'm making the argument that we can't read people's apparent acceptance of the current privacy regime as real acceptance when they don't actually know what the current regime is. Which is why my post here focuses on the issue of what an average person understands is going on, rather than whether what is going on is intrinsically good or bad.)


Do people ever fully understand what they're getting into? I'd say not.

As for your second point: you can take Facebook as they present it, or you can choose not to use it. It's not for you to decide what they can and cannot do.


If you can say both of those things at once, you missed my point. You can't deliberately "take it or leave it" (your second paragraph) if you don't know what "it" is (your first paragraph).

The point is not that I demand that Facebook be legally constrained to provide me a high-privacy option. I would fight such a law. The point is that nearly nobody would even understand what such a choice is, and in an environment where nobody fully understands the implications of their actions, it is not a legitimate argument to then cite people's actions as supporting the current system. You could argue that people would still make the same choices even if they did fully understand the transactions they are making (and I'd counterargue after that), but you can't base an argument on current people making conscious choices when they aren't in fact making conscious choices.


People should just assume that everything on the internet that is not encrypted is public. It's safer that way. Heck, people think email is private. I think people need to realize this is something new and that there might be negative consequences they cannot foresee. They should ask themselves: if this data is public, what is the worst that can happen? If this data is correlated with other data I have online, what can happen?


I believe the reason is freedom of choice. Clearly not everyone wants to be part of this system. Do you not know of people who lock their profiles on facebook, or people who strictly avoid social networking sites in the first place? If that is the case, I would be very surprised. Do we remove their ability to choose?


Did you actually read the article we're discussing? There's a great example of why privacy is important right there.

As for people being willing to give up their privacy, I think I already addressed that in my previous post: most of those people don't understand the implications and just go along with it because of peer pressure, but when they discover things like the Buzz problem or Facebook's Beacon mess, they are actually quite upset.


Of course I read it. IMO, it's the-sky-is-falling linkbait. Just one example - a woman not realizing the privacy settings of the Buzz account she opted into is not comparable to publishing Eric Schmidt's home address. If I were a Google shareholder, I would either expect him to move, or expect a security detail at his house for a year after such a leak. The cost of his being killed by a disgruntled employee/shareholder/user/random person is just too high.

What percentage of facebook users would you guess know what "Facebook Beacon mess" you're referring to? In particular, more or less than 0.1%?


> Just one example - a woman not realizing the privacy settings of the Buzz account she opted into is not comparable to publishing Eric Schmidt's home address.

No, it's not, because in her case there was probably far more sensitive information than just her address, and it was being routed directly to someone who was known to be physically abusive and to that person's friends. Schmidt's address, on the other hand, is presumably a matter of public record already as he's a director of a corporation, and as far as I'm aware he has not personally been physically abused for a prolonged period at any point in his past.

Also, I'm not familiar with exactly how the abused ex opted into Google's new service and thus attracted the unwanted attention. She certainly seems to feel that the information was shared more widely than previously, automatically and without her knowledge or consent. Could you explain her misunderstanding and how she opted into the extra service without knowing it?

> If I were a Google shareholder, I would expect a security detail at Eric's house for a year after such a leak.

You don't think one of the most high profile and wealthiest men in the US makes any security arrangements normally?

> What percentage of facebook users would you guess know what "Facebook Beacon mess" you're referring to?

I don't know, but it was enough that Facebook themselves pulled a fast U-turn when they got the feedback, and they haven't gone anywhere near the same idea since. Presumably they have a pretty good idea of how strong the opposition was to pull out like that.

To answer your other question, I find it unlikely that if 99.9% of Facebook users were happy with a change, they would work so hard and so fast to revert it.


Google doesn't know which of her most-emailed contacts are likely to abuse her. She does. It's her responsibility to set her privacy settings and buzz accordingly. I'd support laws that would prohibit Google intentionally misleading people about what is public and what stays private, but Google buzz is not a good example of that.

Schmidt's address is certainly not a matter of public record. Read the original source [1]. The town and his wife's name were found through a local newspaper, and his home address was found through her political donations, which she unwisely entered as originating from her home address, without realizing that those are a matter of public record and would be published by the FEC. The only role Google played here is indexing the newspaper article and providing the software The Huffington Post used to visualise FEC data. Google did not "collect" this in any sense appropriate here.

1. http://news.cnet.com/Google-balances-privacy,-reach/2100-103...



