
What are these bad actors doing that requires top-down censorship and marginalization?


If you have to ask that question, then you've never been a moderator or run a platform before.

There is a lot of content that is invisible to you, the user, because moderators filter it out: overt racism, calls to violence, outright trolling, stalking, and more. Even as a moderator of a small private platform, I dealt with a user who was stalked by someone who would repeatedly re-register accounts solely to harass them and dig up any personal details they could find.

Eventually you have to establish a certain level of moderation for your platform or those bad actors can and will chase off all of the other users for one reason or another. This gets worse as a platform scales and the level of malicious content your platform is exposed to grows exponentially.


None of the problems you've raised are related to YouTube's proposed changes in the linked articles. They already have rules against most of the things you mentioned. The proposed changes go far beyond them.

https://www.youtube.com/yt/about/policies/#community-guideli...

> Starting in 2012, YouTube rebuilt its service and business model around “watch time,’’ a measure of how much time users spent viewing footage. A spokeswoman said the change was made to reduce deceptive “clickbait” clips. Critics inside and outside the company said the focus on “watch time” rewarded outlandish and offensive videos.

YouTube's engine is currently ranking content based on the amount of time other users are spending actively watching that content. Unwanted videos are already de-prioritized by the current algorithm. The proposed changes are explicitly intended to de-prioritize videos that people are actively watching.

It remains to be seen how they will measure "quality". If they find a bias-free way to measure it, I'm all for it. Most likely though, it will be driven by top-down notions of "outlandishness" and "offensiveness", as opposed to bottom-up user engagement.


You asked what kind of bad actors an open platform might have to deal with and I gave you an example with a very small platform.

Now scale that up to match YouTube and it becomes a nightmare. YouTube is being pressured to take action by its users because what's happening is exactly what I was talking about: bad actors are taking advantage of the platform. That's where Elsagate came from, as well as the more recent revelation of pedophiles using the platform to groom kids.

There is no unbiased way of solving this problem, because you have to establish certain things as being bad for your platform, which means you are going to be biased against them. YouTube's reaction, in turn, is an attempt to solve these issues at scale when they clearly didn't consider how the platform would scale in the first place.


The "bad actors" that you're referring to on YouTube aren't harassing anyone or inciting violence or breaking the YouTube community guidelines in any way. They are simply espousing opinions that you dislike and disagree with.

Sure, YouTube is allowed to do anything they want. But suppressing an open marketplace of ideas isn't in society's best interests. If unpopular opinions were suppressed in the past, the movements for women's rights and gay rights and civil rights would have faced a massive setback. And let's not even get into the question of whether it's in society's best interests for a handful of corporations to arbitrarily decide how to censor the marketplace of ideas.


I'd like you to take a minute and go visit sites like Voat or Gab. Just browse around there for a bit if you haven't. Those are sites that are exactly what you want: a pure and open marketplace of ideas without any sort of censorship or rules. This is to prove a point.

Which is that an open marketplace of ideas has no value in itself, because the marketplace can be very easily taken over by bad actors if you don't exert some control over your userbase.


Clearly YouTube has been extremely valuable even before they felt the public pressure to penalize "low quality content".

Your comment about an "open marketplace of ideas having no value in itself" is very curious. The very idea of freedom of speech being a good thing is predicated on the idea that an open marketplace of ideas is a good thing. If it isn't, you may as well lobby the government to ban any and all speech which you consider corrosive.

"When men have realized that time has upset many fighting faiths, they may come to believe even more than they believe the very foundations of their own conduct that the ultimate good desired is better reached by free trade in ideas--that the best test of truth is the power of the thought to get itself accepted in the competition of the market, and that truth is the only ground upon which their wishes can be carried out. That, at any rate, is the theory of our Constitution. It is an experiment, as all life is an experiment."

― Oliver Wendell Holmes, Jr.


Did you actually visit Gab or Voat? I'm asking because if you haven't, then you can't actually argue in favor of the 'marketplace of ideas' very well. Either that, or you're not willing to argue in defense of such sites. Or you're possibly being disingenuous, considering we've switched from corporate curation/censorship of their own content to government censorship, which are two very different topics.

As I mentioned, places with zero censorship and zero moderation can be, and are, very quickly overtaken by malicious actors. Either way, if you're willing to defend Gab and Voat on their merits as an 'open marketplace of ideas' after having gone there, then we can continue our argument.


The concept of "marketplace of ideas" goes far beyond Gab and Voat. Two bad apples and cherry picked examples do not make for a counter argument.


Ask @dang and/or your favorite forum admins why any open platform needs moderation.


Posting videos of death, gore, kiddie porn, targeted bullying, fascist propaganda, and incitement to genocide, rape, and other violence.

The idea that we should allow sites that are marketed to the general public to be unmoderated is completely ridiculous.


The only things on your list that should be prohibited by law from being shown are kiddie porn, rape, and targeted bullying. People should be able to police themselves as they see fit regarding the rest, but there's nothing wrong with posting it online.


Yeah, those should be prohibited by law. I think that incitement to genocide should also be on that list. And YouTube can (and IMO should) continue to prevent the rest. Death, gore, genocide, fascism, violence, etc. do not belong in general public discourse. There's always some other website to post snuff porn.

I don't agree with the way that YouTube has applied their guidelines for removing content in general, but I do generally believe that they should moderate their content.


> Yeah, those should be prohibited by law

Then try to get it passed. Write to your representative. If it's so obvious that it should be prohibited by law, then you should have no trouble finding support.

I suspect what you'll find instead is that people don't think it should be prohibited by law, and prefer to instead threaten and pressure a more vulnerable and less democratic entity like YouTube into carrying out their unpopular censorship desires.


I think the things I've outlined are pretty popular TBH.


Evidently not popular enough to become prohibited by law. Granted, I'm not as familiar with non-US laws.



