The entire stunt appeals to people's sense of moral outrage over businesses buying influence in the form of political donations. The reason people find it morally outrageous is because it corrupts the political process: politicians are supposed to represent their constituents, not the whims of the highest corporate bidder. Politicians who engage in this kind of quid pro quo behavior put selfish gain ahead of the good of the community.
Which is why I found it particularly galling that the PR firm relied on people's moral outrage about paying for influence to peddle their message ("tell them you like our initiative and are TIRED of politicians taking legal bribes") -- while doing exactly the same thing: paying for influence, in the form of purchased Reddit upvotes, which corrupts the upvote process and puts selfish gain ahead of the good of the Reddit community.
Normally, when PR firms use "hacking" to describe their techniques, they're talking about novel approaches to getting coverage, sort of like how "life hacks" are novel solutions to life's problems. But in this case, the firm is using "hacking" very literally -- infiltrating and taking control of a system by illicit means. They are black hats, and we should view them not only as morally bankrupt but also very dangerous.
I'm expecting that any day now they'll run a follow-up post, "How we hacked the U.S. media to help an anonymous powerful Russian client sway the presidential election."
Almost every modern campaign takes advantage of our hair-trigger tendency to get outraged. Hell, this entire thread is people getting outraged about "cheating" using millennia-old social tendencies.
The solution isn't getting outraged about outrage. It's designing systems that compensate for this tendency. Unfortunately, that often means slowing down sensitive discourse.
I think a good solution would be to get up early in the morning, assess what you'd like to influence or draw attention away from, and then craft a message that takes advantage of this tendency.
I think the way to deal with both justified and unjustified outrage is to just face it. Discourse that excludes any sort of indignation or deep love isn't more sensitive than one that includes such things.
Sometimes anger, sadness and other emotions are simply strong signals that something is wrong and needs to be changed. Changing the emotion isn't useful when it's the situation that is at fault; that's just what the abuser would like to see. If the outrage led to these firms being eradicated from the face of the Earth, it would turn into a celebration real quick. Why not push for that?
Just because there's a whole lot of huffing and puffing going on too -- a lot of emotion "produced," if you will, out of thin air, with no grounding or purpose -- doesn't mean you can throw it all together and judge it by the "hype".
Normal people have emotions. Normal intelligent people have emotions and can hold their own in discourse. People without a connection to their core want it to be solely about some supposedly objective and external set of rules, but that's like computing very nice colorful patterns on a computer without a monitor connected. Getting more RAM or a faster CPU is not the upgrade you need in that case.
Friedrich Nietzsche saw this a long, long time ago.. or maybe he didn't, what do I know what he's talking about, but I recognize the timid people walking carefully to protect something that they already lost by doing that erry day. They're so protective of what they lost that they won't even lift the lid.
> "I say unto you: one must still have chaos in oneself to be able to give birth to a dancing star. I say unto you: you still have chaos in yourselves.
> Alas, the time is coming when man will no longer give birth to a star. Alas, the time of the most despicable man is coming, he that is no longer able to despise himself. Behold, I show you the last man.
> 'What is love? What is creation? What is longing? What is a star?' thus asks the last man, and blinks.
> The earth has become small, and on it hops the last man, who makes everything small. His race is as ineradicable as the flea; the last man lives longest.
> 'We have invented happiness,' say the last men, and they blink. They have left the regions where it was hard to live, for one needs warmth. One still loves one's neighbor and rubs against him, for one needs warmth...
> One still works, for work is a form of entertainment. But one is careful lest the entertainment be too harrowing. One no longer becomes poor or rich: both require too much exertion. Who still wants to rule? Who obey? Both require too much exertion.
> No shepherd and one herd! Everybody wants the same, everybody is the same: whoever feels different goes voluntarily into a madhouse.
> 'Formerly, all the world was mad,' say the most refined, and they blink...
> One has one's little pleasure for the day and one's little pleasure for the night: but one has a regard for health.
> 'We have invented happiness,' say the last men, and they blink."
You left out the fact that the proposed legislation was fake too. This whole thing was simply to draw attention to a politician that nobody knew. It's scummy to the core, but the dumbest thing they did was write a post about how they did it. That kinda goes against what their client wanted and probably against their own best interest as well.
Sometimes you have to operate in the system that exists. The way reddit is set up allows companies to blast their Coke ads and Bill Gates PR pieces to the frontpage. A campaign trying to limit a money-for-influence exchange in one context while using a money-for-influence exchange in another is just doing exactly what its opposition will. This seems more like a case of "Don't hate the player, hate the game."
I'm digressing a little here, but I'd really like to comment on this point.
I have always detested the "don't hate the player, hate the game" ethos. It's a total shirking of responsibility that inadvertently praises being able to game any existing system, regardless of the purpose for which the system exists in the first place. That is, if somebody has or discovers the opportunity to exploit a social contract of any scale for pure personal gain, the onus isn't on that person to maintain any sort of ethical footing; it's on the system to have processes built in to deal with such a situation, and if it doesn't, it deserves to be exploited.
Maybe somebody would have some valuable input here, but I see no value in that sort of ethos. It tends to end in waste. Or at the very least, the returns are diminishing.
I guess my end point is I don't understand, and as such dislike, how nonchalantly the platitude is thrown around as if it's to just be accepted.
Agreed. To put on my white hat for a moment, vulnerabilities in a system should be documented and publicized so that they can be fixed. Exploiting those vulnerabilities for personal gain (to the detriment of society) is immoral, and in many cases, illegal.
And what do you do when the creator is not fixing the system? Astroturfed ads and PR pieces have been a problem on reddit for quite a while, with no fix in sight.
If it just kept happening, I think I might take a screwdriver and show it to him at the next cookout. Maybe complain that his inaction was bringing bad characters to the neighborhood. Now that's obviously not the intention of this PR piece, but it is the effect, so I don't really see any issue.
Agreed, the saying to me comes across as a cheap way to try to absolve oneself of any personal (moral/ethical/etc.) responsibility, as if the mere presence of flaws in a system necessarily means people aren't responsible for their own actions. A borderline childish line of reasoning, in my opinion.
I agree with your sentiment, though I do think there are situations where this saying applies. You have to look at how much pressure the overall system puts on the participants.
It's hard for me to give a specific rule for that; it's more a case of "I'll know it when I see it". One rule of thumb I seem to follow is whether the behaviour is more about explicitly seeking gains, or avoiding expensive loss. Like I described in another comment, there's a meaningful difference between a typical person doing bad things to keep their only job (jobs are hard to come by and vital to survival), vs. an entrepreneur who chooses to profit from hurting other people, even though there are alternative strategies for making money.
I see what you're saying (regarding your other comment). In those cases I think the platitude is more closely "Don't shoot the messenger", since gain isn't their motive.
The saying in question I've always taken to be one where the 'player' is the motivated party, out not merely to prevent loss but to distinctly gain. Not the employee just trying to keep their job, but the one playing shady politics to slither their way to the top of the heap and grab the largest possible portion of the take -- those who see team efforts as zero-sum games where the teams are `I vs. Everybody else`.
So while we're sorting out the rhetoric, I agree with your sentiment as well -- and it bears saying that I wouldn't condemn somebody in such a situation. Of course, I generally see morality as a gradient.
> I have always detested the "don't hate the player, hate the game" ethos. It's a total shirking of responsibility that inadvertently praises being able to game any existing system regardless of the purpose for which the system exists in the first place
Hm, I've always taken that phrase to mean, "don't be prejudiced against a whole person: people are complex and you can probably find something to agree on with anyone."
So, I often try to say I don't like a certain thing about a person rather than saying I hate them. Hating everything about a person just feels a bit too simple and wrong. Perhaps that is too PC for these days, but that's how I take it. Even Hitler had a mom -- that sort of thing.
I don't think it's too PC for these days and I actually strive to always have the same approach - never hating people, but instead hating particular aspects / behaviours.
That said, I've always seen the "don't hate the player, hate the game" saying the way GP does -- i.e. shifting moral responsibility away from participants. IMO it does make sense in some cases, but not in others. I see it as a spectrum of pressure put on individuals.
For instance, I wouldn't blame a customer service employee who lies to me because their boss ordered them to, and they'll lose their job if they don't. For most of the population, losing a job is as close to a life-threatening situation as you can get without an actual medical condition. OTOH, I will blame the boss who ordered the lying. The boss has many more options to choose from, and telling employees to lie to people shows a preference for profiting by fucking other people over.
>It's a total shirking of responsibility that inadvertently praises being able to game any existing system regardless of the purpose for which the system exists in the first place
I disagree. It is simply an acknowledgement that the system has failed the purpose for which it exists. When that system fails, attacking the people who found the flaw isn't a viable long term solution. The only real solution is to fix the system that failed in the first place.
I have yet to encounter a system that doesn't have some dramatic set of failures. I also have yet to encounter a system whose failures were not made dramatically more painful by 'players'.
The intention is encoded in the terminology. Sometimes it's not a game.
Instead of upvotes, I prefer forum-style threads. New replies = bump to top. If those replies are spammy or not contributing, moderation can flag/ban those accounts.
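The bump-to-top ordering described above is trivial to express -- here's a minimal Python sketch (the `Thread` type and field names are mine, not from any real forum codebase):

```python
from dataclasses import dataclass

@dataclass
class Thread:
    title: str
    last_reply_at: float  # unix timestamp of the newest reply

def bump_order(threads):
    # Newest activity first: any reply "bumps" its thread to the top.
    # Moderation handles abuse out of band, by banning spammy repliers.
    return sorted(threads, key=lambda t: t.last_reply_at, reverse=True)

threads = [
    Thread("old sticky", 100.0),
    Thread("fresh reply", 300.0),
    Thread("midday post", 200.0),
]
print([t.title for t in bump_order(threads)])
# → ['fresh reply', 'midday post', 'old sticky']
```

Note the trade-off: unlike vote ranking, this scheme lets a single reply resurface a thread, which is exactly why it shifts the anti-abuse burden onto moderators.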
>Which is why I found it particularly galling that the PR firm relied on people's moral outrage [...] while doing exactly the same thing [...] which corrupts the upvote process and puts selfish gain ahead of the good of the Reddit community.
I (truly) mean no disrespect, but surely you see the irony of succumbing to moral outrage over Reddit upvotes? Surely you also see the absurdity of comparing Reddit upvotes to democratic elections.
This tendency to turn everything into a scandal is partly responsible for the very thing you're decrying.
Is there really a qualitative difference? The abomination that is r/the_donald certainly had some effect on the election's outcome, and reddit is somewhere in the top-10 sites worldwide.
It's hard to figure out the exact ratio for the conversion of upvotes to real votes, but as long as it's positive, buying upvotes is just a roundabout way of buying votes.
Reddit is not a democracy, it's a private forum for discussion. Confusing this with a media outlet or a public forum (to say nothing of a democracy!) is as patently insane as confusing an infomercial with a scientific paper.
Feeding into the outrage only makes the beast stronger. Just roll your eyes and focus on the things that matter, like democracies.
> Surely you also see the absurdity of comparing Reddit upvotes to democratic elections.
He may, but I don't. American elections are decided by very thin margins, and I think even small streams of influence can have outsized outcomes. Swiftboating and Rathergate come to mind. I specifically think the rise of /r/The_Donald, just before the election, which carpet-bombed the ever-living daylights out of the front page, had a not-inconsequential effect on the vote. It's why Reddit changed their long-standing front-page algorithm. I'm truly surprised they didn't do so the month before the election. Utterly astonished, actually.
People get extremely defensive on Reddit if you insinuate that this is common. But it really doesn't take a whole lot of skepticism to see through the more blatant ones.
Reddit is still a really great site when you unsubscribe from all default subs and any sub that has gone "critical shill" at about 100k or more subs.
> Reddit is still a really great site when you unsubscribe from ... any sub that has gone "critical shill" at about 100k or more subs.
AskHistorians is still a great sub, at 620k subscribers. It's (intentionally) not on the default list, and has very heavy handed, strict moderation, which is probably why the quality is so good.
One regret I have about AskHistorians is that I don't get to see all the hissy fits thrown by people who don't like the strict moderation and believe their speech should be unimpeded.
I am a moderator on a major subreddit of serious nature. It's pretty boring. Here are the typical culprits.
Probably the most common who have something to say to the mod team are those who blame us for pushing an agenda. We welcome and encourage debate. We want people to call bullshit on something bunk, but we stand by that it has to be something that can be substantiated. We just want you to provide some kind of proof, source, etc. other than, "This totally happened to my sister's boyfriend's cousin!"
Some people take those removals very personally, because the comment is about something heavy that has greatly impacted that person's life. They feel like we are snubbing them when they took the time to reveal something personal. A non-real example would be a study that shows that cancer survival rate increases when you drink at least 60 oz of water daily. Someone will inevitably reply with, "My mom died from cancer, and all she drank was water with the occasional cup of tea." I mean, we're sorry for your loss, but your single anecdotal point doesn't really refute the claim. The comment gets removed, and then we're called monsters. If you really think it's bullshit, find another study that claims otherwise or poke holes in the study itself.
Another common reason is low-effort comments or comments that don't add to the discussion. "Jeez, why is this a comment graveyard?" is a pretty common removal. There usually aren't hissy fits over these, but jokes, puns, lyric chains, and comments like, "Yeah, and water is wet," are also common. Two other low-effort staples are, "Someone give me a TL;DR please," and, "I want to believe this, but some comment is going to blow this claim out of the water."
This one annoys me when I see it somewhere like /r/Science. It's like, the mods make it super clear that they remove low-effort / off-topic comments etc. It's not that hard to understand.
Yeah, I don't experience a lot of obvious shilling on Reddit, but I've unsubscribed from most of the default subs and mostly just read small subs relevant to my interests.
> Yeah, I don't experience a lot of obvious shilling on Reddit
When the same talking points are repeated over and over again on large subs and dissident opinions are automatically downvoted to hell, it's obvious there are organized efforts to push specific narratives or point of view. r/politics, r/news, r/worldnews are full of these, with sometimes the moderation itself in bed with the political astro-turfers, banning users that don't fall in line.
Regarding the article, the news that was astro-turfed wasn't clashing with r/politics narrative. And since journalists and bloggers themselves now source their news and stories from Reddit, it helped spread that news all over the internet.
Reddit is an extremely efficient viral marketing tool, both brands and politicians understand it now. When journalists start quoting reddit in their news that then get posed on reddit as news, we've come full circle.
A sibling to this comment mentions /r/Bitcoin, which I agree has biased moderators. But I don't see such blatant bias in the large political subreddits.
The main thing about /r/politics that I have only recently noticed (since the 2016 election) is that they have become extremely orthodox democrats. Before that election, a lot of the time they were a mixture of libertarian and farther left leaning (but certainly not orthodox) people. Now they focus all of their time on Trump bashing, which isn't bad per se, but is extremely annoying since it's pretty much the only thing the sub is about these days. Thing is, there are a lot more political things going on than Trump, and outrage culture has resulted in at least two WaPo posts over each and every one of his tweets.
And then in the comments, you'll be crucified by the posters for anything other than the pro corporatist democratic views that the posters espouse. Move on from bashing Trump to say, bashing Hillary because she was an utterly unlikable candidate and arguably the biggest political failure in modern history, and you draw the ire of dozens of toe-the-liners who see the world just as black-and-white as the idiot /r/the_donald posters.
There's also an excessive amount of hyperbole that just gets annoying as time goes on. Clearly Trump will be around for quite a while longer (until and unless Mueller finds a smoking gun or in the event of a massive Democrat success in 2018), but reading /r/politics you would think he'll be impeached by next week. Adding fuel to the flames are websites like WaPo and the Independent who publish articles with these terribly misleading and hyperbolic headlines that make it seem like Trump is going down, and you get a massive echo chamber fueled and funded by shitty media websites.
I would say world_news isn't as bad as politics. Specifically, world_news seems to be full of more the_donald people jumping on any and every thread.
They are still there in politics as well, but they are usually way at the bottom of the thread. It is sad though, even if you assume that it isn't as "controlled" as it is -- pretty much the front page is all anti-Donald Trump / Republican stuff these days. While some of that is fair and is news, what is worse is that at least in the old days, there would be nice constructive discussion/counterpoints, usually as the top-voted post in the comments. Now it's mostly just circlejerking there.
Yeah, I think to summarize what I most dislike about what reddit has become is that it's too partisan. Places like /r/conspiracy have become very pro-republican/trump, ridiculous considering the purpose of the sub, and there are also now like 25 different anti-trump subreddits that constantly get to the front page (and I believe many of these, especially the ones that spring up seemingly out of nowhere to front-page status in weeks, are the toys of astroturfing marketing campaigns).
It's a pretty toxic environment and by constantly focusing on Trump they're crowding out a lot of content that could actually make users happier.
Just the nature of a growing community. Generally, the larger the population gets, the lower the average discourse gets, to the point where reddit in many places is no better than a Facebook comment thread. It is a bipartisan issue for sure.
Anything remotely right-leaning. /r/politics could more accurately be named /r/The_Left. Often the top comments of any thread are just whining about Trump in the same way that /r/T_D is just whining about MSM and Hillary. While their bot still posts that they won't tolerate useless comments, I've seen nothing but useless comments there these days.
I wouldn't even care about their political leanings if they were just a tiny bit more diverse. But when the whole first page is filled with exactly the same story just posted from different news outlets, you know that the moderators are actively promoting an agenda and don't care about news.
That's just wrong. The moderators on both subs are actively curating the front stories and removing 'duplicates'. I don't care about TheDonald as they are not a pre-set sub nor proclaiming neutrality. But you can't be seriously claiming that the 16 out of 25 front page stories of some variation of 'DT colluded with Russia' on /r/politics are unique newsworthy stories.
> That's just wrong. The moderators on both subs are actively curating the front stories and removing 'duplicates'. I don't care about TheDonald as they are not a pre-set sub nor proclaiming neutrality.
I'm unsure what neutrality means at this point. I don't expect anyone to be "neutral", but I make a distinction between debatable news and troll memes.
> But you can't be seriously claiming that the 16 out of 25 front page stories of some variation of 'DT colluded with Russia' on /r/politics are unique newsworthy stories.
No, I'm certainly not, and while moderators remove some obvious duplicates (query string differences etc), the same story from different outlets are still left and often upvoted. I don't think that's a grand conspiracy but merely user habit.
Whatever you think of this is up to you, and I think there are a lot of issues worth discussing that get drowned out by the clown-in-chief stories, but I've seen no evidence of this being anything other than crowd selection.
And it's news (insignificant, duplicate or big) on one sub, and 80% memes on the other. That's the contrast I was commenting on.
> I don't experience a lot of obvious shilling on Reddit
It was more obvious before the election, when the shilling was so prevalent that different accounts were copy/pasting the exact same phrases into their comments. And you would see a new point brought up, then an hour or two delay, then a response would suddenly show up in multiple places, copy/pasted into many comments, in multiple accounts.
But watch /r/all sometimes... you'll see a topic just suddenly start appearing. As an example, you think Rick & Morty grew popular on reddit organically? No, that started over the course of about an hour one morning, as a clearly orchestrated effort, and it worked.
It doesn't happen all the time... quite a bit of reddit is still honest and organic. But watch it carefully, and enough of it is manipulated that I don't put much trust in what I see there.
The very first serious internet campaign I ever noticed was Britney Freakin' Spears. When her first album came out, she, in her little schoolgirl outfit, was in literally half the banner ads you'd see in an average day. TV, magazines, etc. She was EVERYWHERE. I got the same impression with this Rick and Morty thing. After months of it, I finally watched about 5 minutes of it, and proved to myself that there was no way it was even remotely organic. I mean, come on! It's not even in the same league as Bob's Burgers! ;-)
Some subjects touched anywhere on Reddit are huge, instant bot/PR magnets.
Off the top of my head I can think of: Fracking, Glyphosate (or anything related to Monsanto/Endocrine Disruptors), Turkish Politics.
Definitely fracking and Monsanto. I've also noticed some software and hardware products seem to have a shill presence.
I think we all remember that 6-month period where Microsoft decided to shove VSCode down everyone's throat. It was impossible to get away from. New vim released? "Yeah vim is good but now I use VSCode TM". New Sublime? New Atom? Same thing. They were aggressive with that one.
> I think we all remember that 6 month period where Microsoft decided to shove VSCode down every ones throat.
Not me. I don't remember hearing about VSCode until a few months back. I had no idea it was first released two years ago.
Maybe I just wasn't reading the right places. But I subscribe to /r/programming, and I think I would expect that to get hit, if they're going to be astroturfing on reddit?
I'm not saying this definitely didn't happen, but I'm skeptical.
I saw loads of accusations of that, but very little evidence to suggest that these were shills, rather than real people who like VS Code. We all get very passionate about our editors, and lots of people use and like VS Code so it's not surprising people comment about it when the discussion is about editors. I like VS Code. I can't remember if I've ever commented to this effect. I'm not a MS shill.
Yes, Fracking! The number of shills that were in many subs was off the charts, a few years ago.
I'll add to the list: Windows 8 shills. I remember right after it came out, I had lots of issues and posted in some sub. Wow, the negative responses to my post were amazing.
Eh, you might as well spin up a few alts over Tor and occasionally use them to build karma, Reddit allows Tor users and registration over Tor.
Not saying this isn't a good thing, if Reddit banned Tor a few of my favorite community members would disappear, and I'd likely drift away from the subreddits I do still follow.
The PR firm isn't at all ashamed of what they've done, in fact they are publishing this as an example of what a great PR firm they are. After all, it shows their "hustle". It's odd, but not at all surprising, that someone in the PR/marketing world can fail to even see how something might be morally wrong with their scummy methods.
Is that hustle? His last ditch effort was to use Reddit and was lucky it worked. No experience or expertise, and now potential clients know how little it cost.
have you read the article?
@hn_throwaway_99 was referring to the wording in the article: "This gave the campaign the boost we needed and it was all the direct result of one thing: hustle"
Sometimes I wonder what it says about our business culture that "hustle" describes both earnest effort and scamming people. Maybe it's nothing, but it seems a little odd.
Do you know that nice little HN poster child, AirBnB? They bootstrapped themselves using exactly the same strategy, with Craigslist in lieu of Reddit (they also used to call it "hustling"). You might now boycott them, too.
I do. And I argued my dislike of that since the moment I learned about it. I generally disapprove of astroturfing. AirBnB is not one of my favourite companies.
I prefer to avoid the service itself if I find it resorting to such methods. And I don't care "everyone" does that, I want to punish them, and reward those who don't do such things.
Maybe I'm going to have a hard time, sure. But "be the change you want to see in the world" and all that. In the past, my objections have prevented one company from engaging in marketing spam, and made a third party dump their SEO provider because of shady practices. I'll continue doing as much as I can, and I encourage others to do the same.
I get it, they need to promote their services, but if this is what they think is the right way to do it, that doesn't fill me with confidence that they have any ability to do PR beyond buying fake upvotes on social media.
On a topic where one would expect citizens pursuing the public good, we instead find marketers and advertisers working for a wealthy businessman, running a convictionless campaign to make him famous!
And the advertisers are so proud of it, they give all the details of their Reddit cheating, and worse, all the details of the absence of political conviction of their wanna-be-politician client.
Maybe the story is real, but I cannot believe the advertisers are dumb enough to be the ones writing this article.
I'd sooner suspect someone related to Fiverr.com is behind it... [edit: or an enemy/competitor of the politician]
>I cannot believe the advertisers are dumb enough to be the ones writing this article
If it were any other industry aside from PR/advertising I would give them enough credit not to do something like this, but in this case, I could totally see them doing something this stupid.
Look, I give them credit for coming clean to the public. And a lot of people use Reddit to promote their business, band or other brand (though they do it honestly, not by purchasing a boost). But the more well-known the technique of buying upvotes becomes, the worse the site will be for myself and other users.
Early paid upvotes are the seed for later organic upvotes. You don't even need to spend $200 to get them.
The onus is on Reddit to detect and punish voting manipulation.
When someone proposed a similar voting manipulation trick on Hacker News (https://news.ycombinator.com/item?id=13676362), dang explicitly noted that such techniques are a good data point for voting manipulation detection algorithms.
While Reddit seems less than competent to prevent this kind of manipulation, it does apparently violate Reddit's terms of service.
"You agree not to interrupt the serving of reddit, introduce malicious code onto reddit, make it difficult for anyone else to use reddit due to your actions, attempt to manipulate votes or reddit’s systems, or assist anyone in misusing reddit in any way. It takes a lot of work to maintain reddit. Be cool."
I am not sure it is even a question of competency. I think it is a question of motivation. It is the same reason Twitter doesn't crack down on their bots. Both sites have an inherent interest in appearing popular. Having 50k upvotes or 50m followers looks more impressive than 25k and 25m. It is possible that those companies feel the value of those increased numbers is greater than the potential damage bots do to the community. HN doesn't have the same motivation, since HN is more of a vanity project with a bigger focus on the good of the community.
On the other hand, Digg died because they were effectively letting publishers buy front-page slots ahead of organic submissions. If Reddit becomes known for manipulation, it could also suffer (though its scale, heterogeneity and skepticism of shilling are significant antibodies).
It's hard for me to believe that you can't just pay Reddit, Inc., directly, for whatever manipulation you want. It probably just costs more, and needs to be done with more discretion. But then, I lean cynical and conspiratorial that way.
I remember researching this back in 2014. You could buy 50 upvotes for about $350. On most mid-sized subs, an initial hit of 50-100 upvotes would get you to the 'hot' section.
Once you get to the 'hot' section, however, your content quality matters. If the content is good and/or has viral potential, it will attract votes naturally.
I was marketing for a travel site at that time. I learned that rather than paying others to push your content, you can get the same effect by posting at the right time and creating content that would naturally appeal to that subreddit.
I do know some people who've bought established Reddit accounts for $50-200 and use that to get upvotes via proxies.
This was an interesting read. I'm not sure it's the best idea as a blog post, because I'm sure Reddit staff will get onto it and then keep a much closer eye on this firm. I feel like journalists will be the same. If I received 10 emails about these guys I'd be a bit skeptical that there is any actual interest.
As an aside, I wonder if they're using the same tactics here.
The amount of spam, manipulation of news and opinion, and fake accounts on reddit is insane, and reddit's staff (actual paid staff, not moderators) do almost nothing about it. This story is only the tip of the iceberg.
Look at the recent US presidential election, look at the ongoing, long running censorship on r/bitcoin, among many other examples. I moderate a small subreddit related to a niche sport and deal with a massive amount of spam and fake accounts, and reddit's spam policing tool (Automoderator) is somewhat difficult to set up, only partially effective, and could be a lot easier to use (e.g. one-click bans for accounts and domains) if reddit's staff were serious about the problem.
> I'm sure Reddit staff will get onto it then keep a much closer eye on this firm
How can they connect this firm to random submissions and/or upvotes? I think even targeting fiverr would be difficult, because sellers aren't going to advertise which usernames they're going to use to vote.
This is a very difficult problem to properly solve. Even if they can somehow guarantee each user account is one unique individual, what's to stop a company paying 200 actual, real people to go and vote on something on reddit?
They said they were using fake email addresses. I'm also not sure if there's any meaningful way that Reddit can fix this sort of issue either. Sure, they can keep an eye on some suspicious accounts but yeah...
Usually you don't have to buy that many upvotes/likes/whatever either - usually just a few in the beginning helps seed a post/tweet well enough that it starts to bring in the organic traffic.
"Hustle", like "passion" and "wanting it", is a popular imaginary virtue to claim when your success was either down to luck, or something obvious and unimpressive (e.g. you paid a bunch of people to astroturf for you).
These guys didn't do anything remotely new or skillful. All that's special about them is that they're brazen enough to gloat openly about cheating.
Not at all. Hustle means going out and making things happen by talking to people and promoting yourself. It does not imply shady behavior at all. All entrepreneurs hustle.
It's why virtually any /r/all'd subreddit is a shitbed and has been for years. And to be honest, there's absolutely no chance it's just Reddit. People are cheating everywhere. Anyone with weak morals and either excess money or enough desperation is buying fake social engagement in hopes of finding more real social engagement. And in doing so, they ensure the playing field is not level for going viral naturally.
Not that any of this isn't obvious, but blaming Hack-PR for one of the older internet tricks is a bit unfair. They're just one of the few shameless enough to simply say it with zero guilt.
> but blaming Hack-PR for one of the older internet tricks is a bit unfair
I know it's not solely their fault. Just like people don't really blame Obama when they say 'Thanks Obama', I wasn't putting all the blame on Hack-PR when I said 'Thanks Hack-PR'.
Why would you consider this isolated to /r/politics? This happens on every subreddit that is big enough to have value in reaching the targeted audience. That is one of the many reasons why the best subreddits are usually the ones with the smallest communities.
> Why would you consider this isolated to /r/politics?
That was the one mentioned in the article and that is a subreddit that has been particularly awful since basically the entire last US election campaign.
I've not used /r/politics, but I imagine it'd be pretty shitty even without bots; politics is too polarizing unless you have an echo chamber, which defeats the point of it being a general politics subreddit.
I'm not a big participator on /r/politics but I do follow it to keep up on the news and overall it's not that bad surprisingly. For being politics related and on reddit the mods do a decent job at keeping it civil, which I'm sure is no easy task.
There's a one time $5 fee at signup and a 1 week waiting period after to post/comment. Seems to do a good job preventing this type of thing because the costs quickly get prohibitive if your dummy accounts keep getting deactivated.
It's still an ugly echo chamber on politics though, it is extremely far US-left on everything.
Nominal fees don't work because someone will just pay the $5 per account knowing they can sell them later for more. Even Reddit accounts aren't free when you factor in the cost/time to set them up and build a bit of karma.
It all depends on how aggressive the site is on cracking down and banning dummy accounts and the cost of the "upvote packages" (which I can't seem to find). On Metafilter they seem quite aggressive so the accounts have very limited value/lifespans.
A dummy account on reddit may be allowed to vote hundreds of times before being banned so the price per vote becomes minimal.
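The economics here are easy to put rough numbers on. A back-of-envelope sketch, where both figures are illustrative assumptions (a MetaFilter-style $5 signup fee, and a free-to-create account that survives a couple hundred votes before being banned):

```python
def cost_per_vote(account_cost, votes_before_ban):
    """Amortized cost of one fake vote over an account's lifespan."""
    return account_cost / votes_before_ban

# $5 account burned after 200 votes vs. after 5 votes (aggressive banning).
print(f"${cost_per_vote(5.00, 200):.3f} per vote")  # → $0.025 per vote
print(f"${cost_per_vote(5.00, 5):.3f} per vote")    # → $1.000 per vote
```

The point of the sketch: the fee itself matters far less than how quickly dummy accounts get deactivated, since that lifespan is the denominator.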
>Does anyone know a forum similar to this or Reddit where it's ALL verified accounts?
I'd personally head in exactly the opposite direction - somewhere where there are no accounts at all, in the tradition of anonymous imageboards since the founding of 2chan. Granted the selection isn't so wide any more.
Dear Reddit, Maybe this idea would help slow down this type of abuse:
It seems like it would be easy enough and cheap enough to build a honeypot to identify accounts used for the purchased Reddit upvotes.
For example, Reddit could set up some honeypot posts to track paid upvote accounts.
They then go and pay these upvoters to upvote the honeypot post and identify the accounts used. (It would be helpful if the post was hidden so other people don't find it accidentally. In fact, it is possible to just use a tracking redirect page given only to the paid upvoters and use any post as the upvote "job" so it would be hard to identify by the upvoters.)
Then Reddit could ghost those identified accounts. Simply ignore their votes in the system, but don't tell the account owners, so the owners continue using the accounts without realizing the problem.
This would make it very difficult for the account owners to know which of their accounts were compromised.
Then on any new posts where these upvoter accounts are being used in majority, other accounts can be found. The other accounts that also similarly upvoted on this article could represent other paid upvote accounts.
Track those other accounts and how often they appear beside the ghosted paid accounts, and voila, you have found more paid upvoters.
Keep doing this and it makes the paid upvoters ineffective because although they can work the system, their work is only being used to find other paid upvote accounts and also clients who are paying for paid upvotes.
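The co-occurrence step described above could be sketched roughly like this. Everything here is hypothetical (toy account names, a guessed overlap threshold, and plain dicts standing in for whatever data Reddit actually has):

```python
from collections import Counter

def find_suspects(posts_to_voters, known_ghosted, min_overlap=3):
    """Score accounts by how often they vote alongside known ghosted accounts.

    posts_to_voters: dict mapping post id -> set of upvoting accounts
    known_ghosted:   accounts already identified via honeypot posts
    min_overlap:     how many 'tainted' posts an account must share
                     before we flag it (threshold is a guess)
    """
    co_votes = Counter()
    for voters in posts_to_voters.values():
        if not (voters & known_ghosted):
            continue  # no ghosted accounts voted here; ignore the post
        for account in voters - known_ghosted:
            co_votes[account] += 1  # voted beside ghosted accounts
    return {acct for acct, n in co_votes.items() if n >= min_overlap}

# Toy data: 'bot2' and 'bot3' keep showing up next to honeypotted 'bot1'.
posts = {
    "p1": {"bot1", "bot2", "bot3", "alice"},
    "p2": {"bot1", "bot2", "bot3"},
    "p3": {"bot1", "bot2", "bot3", "bob"},
    "p4": {"alice", "bob"},
}
print(find_suspects(posts, {"bot1"}))  # → {'bot2', 'bot3'}
```

Organic voters like 'alice' and 'bob' overlap with the ghosted account occasionally, so the threshold is what separates a paid ring from normal users who happen to like the same posts.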
After a time period, the clients could be sent a warning:
It has been detected that you are using paid upvote services which are against Reddit TOS. Please contact customer service so we can work together to remedy the problem. Failure to do so may cause your account to be banned and all your posts removed from Reddit. Have a good day.
Of course Reddit doesn't have to do this, and really anyone could do the same process to build a list of paid upvoter accounts and a list of articles and clients that use those services...
So what do you think, would this put a dent in the upvoters' effectiveness?
They have been doing this since the start of Reddit. They are called shadowbans.
The problem is that many of these accounts are only used once or twice before being retired or sold off. So if you pay the "upvoters" to honeypot the accounts, all you'll do is put a very small dent in their accounts and next week they'll be back up to 100%. And if you pay them every week you'll just become their main source of income.
And the accounts are much more "crafty" than you think, often copying high-karma comments in reposts, or posting semi-nonsensical markov-chain style comments for years before being used as "paid" upvotes.
It's really weird. I followed one a few months ago. I saw it was posting those markov-chain style comments, and I watched it until its karma hit about 500; then it deleted all its previous comments and only had one comment talking about how great some app was on /r/android.
IIRC, there was a string of screenshots of Reddit admin/mod chats making the rounds last year where they were bemoaning how the Correct the Record money was going to start drying up.
It's not just accounts that are bought and paid for. Every major social network is making money by shaping conversations around the interests of paying customers, political and otherwise.
This is basically what Reddit was doing before with "shadow bans". IIRC they had to stop because they just couldn't keep up with it and were banning people that shouldn't have been.
The OP is using techniques that used to work on the wild wild web 10-15 years ago. I thought that by now everything had been normalized, or at least that serious people didn't use spamming techniques to launch a business.
If all these bought upvotes come from new accounts, or from the same few IP ranges, or have a lesser ratio of comments to upvotes, or are interacting only between themselves and not with the larger community -> reddit can detect them and turn them into ghost accounts.
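Those signals could be combined into a simple rule-based filter. A toy sketch, where the field names and every threshold are made up for illustration (a real system would learn weights from data rather than hard-code rules):

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int   # account age
    comments: int   # lifetime comments written
    votes: int      # lifetime votes cast
    ip_range: str   # the /24 the account usually votes from

def looks_rented(acct, crowded_ranges):
    """Flag accounts matching the signals above.

    crowded_ranges: IP ranges already shared by many voting accounts
    """
    if acct.age_days < 7:
        return True   # brand-new account
    if acct.ip_range in crowded_ranges:
        return True   # votes from a range packed with other voters
    if acct.votes > 50 and acct.comments / acct.votes < 0.02:
        return True   # votes constantly, never talks to the community
    return False

fresh = Account(age_days=2, comments=0, votes=30, ip_range="203.0.113.0/24")
normal = Account(age_days=900, comments=400, votes=1200,
                 ip_range="198.51.100.0/24")
print(looks_rented(fresh, set()), looks_rented(normal, set()))  # True False
```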
Reddit needs to open up a Kaggle challenge for detecting rented upvotes and other abuses, use the data it has already shared with the AI community (the reddit dataset) to detect such attempts as they happen.
Good vote spamming business will be completely automated. They'll have accounts in reserve pool, with ability to spin up thousands more in a couple minutes. Most likely they make a couple hundred accounts a day and prime them with VA's with real content & votes with the intention of burning them with fake votes once aged appropriately.
If you run a website and you're getting spammed, the worst thing you can do is ban the accounts, as this gives spammers feedback that the account is no longer valid to use and they'll rotate it out. The best thing you can do is /dev/null the account. To the user it will look like the account is working as expected, but on the back end anything they do doesn't actually do anything. That makes it harder for the spammer to know when an account is burned. This way they'll continue to use it, and you can use this data to find related accounts and /dev/null them as well.
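The /dev/null approach amounts to accepting the action at the API layer but discarding it before it affects anything, while keeping it for analysis. A minimal sketch (the store, the flag set, and the account names are all hypothetical):

```python
class VoteStore:
    """Accepts every vote, but silently discards votes from flagged accounts.

    A flagged user still gets a normal 'vote recorded' response, so they
    receive no feedback that the account is burned.
    """
    def __init__(self, nulled_accounts):
        self.nulled = set(nulled_accounts)  # accounts routed to /dev/null
        self.scores = {}                    # post id -> real score
        self.audit_log = []                 # keep nulled votes for ring analysis

    def upvote(self, account, post_id):
        if account in self.nulled:
            self.audit_log.append((account, post_id))  # record, don't count
            return "vote recorded"          # same response the real path gives
        self.scores[post_id] = self.scores.get(post_id, 0) + 1
        return "vote recorded"

store = VoteStore(nulled_accounts={"burned_bot"})
store.upvote("alice", "p1")
store.upvote("burned_bot", "p1")   # looks identical from the outside
print(store.scores["p1"])          # → 1 (the nulled vote never counted)
```

The audit log is the interesting part: it is exactly the data the earlier co-occurrence idea needs to find accounts that keep voting alongside the nulled ones.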
The title of the article is "How we hacked reddit...", this submission currently says "How to go viral by using fake reddit likes", and is more accurate. They didn't hack reddit, they bought upvotes.
I don't get the impression that there's any substantial vote monitoring, and so it surprises me that it even cost money to do this kind of astroturfing. How hard would it be to set up and maintain a dozen Reddit accounts and spread them over a VPN service? 10 minutes of initial setup, and not more than a minute a day of doing innocuous activity on those accounts, occasionally. When a campaign rolls out, have the accounts work in concert.
Sure, it might not be as 100% successful as Fiverr (though I imagine it's fairly easy for Reddit to ad-hoc identify voting blocs if something was known to be bought). But you could employ additional optimization techniques, such as the one used by most high-karma users (e.g. Gallowboob): if a post fails to hit critical upvote mass, then delete and resubmit later in the day.
To give you an idea of how things seem to be relatively unmonitored until users flag it, there's the story of Unidan:
And as a more recent, obscure example, there was the mystery of why the mod of r/evilbuildings had something like 499 of the 500 most upvoted posts in his own subreddit. The math was so laughably in favor of manipulation but a Reddit admin, using whatever shit tools they have to investigate this, acquitted the mod:
The details of how this mod was able to boost his own posts without being called out for vote manipulation are too banal to explain in detail (basically, he would shadowdelete other popular posts so that his would get picked up by the Reddit front page, and then undelete the popular posts before anyone noticed). But the fact that a Reddit admin (i.e. a paid employee) thought that the evilbuildings mod always having the top post in his own forum for 6 months straight was just a coincidence, and/or because that mod was just apparently an amazing content submitter, spoke volumes about how uncreative the Reddit admins might be in detecting fraud.
If this is the kind of effort users put toward imaginary points (though arguably raising karma is part of Gallowboob's professional work), I'm nervous to think about the schemes that PR firms will construct when they realize the easy return on investment offered by Reddit popularity.
You can be into understanding the media around you without actually wanting to produce it yourself. However, if you are interested in adopting some of the takeaways, the strategies can be used tastefully and ethically.
If you don't like the harmful ways that media manipulation is used, you might actually like that the book explicitly warns the industry how vulnerable it is to this practice, which can be used by anybody for any purpose. This warning came four years before the 2016 U.S. presidential election.
A fantastic book about media manipulation. Very helpful in putting any type of media you see (reddit or otherwise) in perspective, and in identifying where it is vulnerable to manipulation (which, on the internet, is virtually everywhere).
It seems that everyone is aware that likes, follows, upvotes, etc can all be bought, and therefore these numbers are manipulated regularly. But does anyone care to see the problem solved?
Reddit should really try to proxy / VPN / Bot detection because I'm willing to bet the people on fiverr are using large proxy networks to achieve this.
What would be the proper way of gaining traction on reddit? Is that even possible anymore? I mean if the game is already rigged what chance do honest businesses stand in this environment?
I don't have an account on reddit (I've been there for what, 7 years now?) and I always wondered how somebody goes viral and gets traction. Now this stuff makes me think everything is basically done and paid for.
I don't know if this is a hack, per se. I work in media and PR, and this is just one of those things you do. Pump up the issue, get eyeballs on the campaign, find a way to jazz the reporters, and off to the races. What may have made this fly is that the idea was already in the minds of the public, and the media. It's a LOT easier when that happens.
So I'm working on a side project that basically has an HN/Reddit interface. One monetization idea I had for down the road is basically a legitimate means to boost certain posts for certain periods of time, giving them prominence on the site in a clearly labeled area for such purposes.
While I understand people finding this distasteful, it's exactly the kind of rule-breaking that they should be doing. Cheating? Airbnb broke Craigslist's rules to good effect, among others.
It's naughty without being outright evil. When did that become a bad thing on HN?
A lot of people don't seem to realize that being the top link on r/politics is a public good that's available to everyone. Just because someone pays $200 to make some politician's publicity stunt that nobody cares about be the top link there (I mean really, nobody cared - the idea of a law forcing politicians to walk around wearing the logos of their top ten donors is beyond silly), doesn't mean that everyone else can't also be the top link there at the same time, with other publicity stunts nobody cares about!
The great thing about being a top link is everyone can do it at the same time. It doesn't corrupt the process at all or waste anyone's time. Everyone can benefit from it and it doesn't make things worse for anyone.
For example imagine if all the top links on hacker news were just corporate advertisements disguised as stories. Would it be a worse place or cause any of us harm? Of course not.
> imagine if all the top links on hacker news were just corporate advertisements disguised as stories. Would it be a worse place or cause any of us harm? Of course not.
Did you leave out the </sarcasm> tag by accident?
How can the front page being topped with "publicity stunts nobody cares about" not make it a worse place?
(I must say I'm a bit blown away that you think I require a sarcasm tag, though - I wrote that "everyone can be the top link", an obvious absurdity since how can everyone be the top link, and that hacker news wouldn't be any worse if every single link were a paid story. I didn't think I came even close to having to mark my comment as sarcastic, which I sometimes do.)
Seriously, right? Can totally see another article like this one with the title, "How I created a viral thread on hackernews and made X dollars on Fiverr."
To be fair, this is probably the only time I've seen the term hacked used correctly. I no longer know whether I should use it in this sense or the "incorrect but used by almost everyone" sense.
I don't think it's used correctly here. They didn't hack together any code, they didn't hack on some multipart kludge. All they did was break rules by buying votes.