> the rule bans reviews and testimonials attributed to people who don’t exist or are generated by artificial intelligence, people who don’t have experience with the business or product/services, or people who misrepresent their experience.
I guess they don't know how people scam Amazon reviews: legit buyers simply purchase the product, leave a five-star review, and later get reimbursed for the purchase by the company, or by whoever the company hired to recruit them.
(From 2022)
Inside the Underground Market for Fake Amazon Reviews
Buying Positive or Negative Reviews: The final rule prohibits businesses from providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative. It clarifies that the conditional nature of the offer of compensation or incentive may be expressly or implicitly conveyed.
I hope this is actively enforced, with real teeth, very soon. I give fake products one star and call them out in reviews, and the result is that the devious vendor is somehow able to send a postcard to my real physical address offering money for five stars. The sham vendor also spams my email weekly. Amazon appears to actively support this process. It needed to be curtailed decades ago.
> The final rule prohibits businesses from providing compensation or other incentives.
Amazon has had this rule in place for a long time, and I still get cards in the boxes of the stuff I buy: "Give us a 5-star review and get 30% off your next purchase!"
Clearly Amazon doesn't know about this or isn't generally enforcing it. I'm wondering how the FTC is going to patrol this, since Amazon has already had this rule in place for a while and it hasn't persuaded sellers to change their habits.
Given that hundreds of people reading this thread have experienced exactly what you’re talking about, I think it’s implausible that Amazon doesn’t know about it.
Amazon is currently providing a LLM-generated summary of these faked customer reviews. To abide by the FTC ruling, Amazon would now have to prove that all of their training data is legitimate customer reviews. Do you think they will actually do that?
It seems the gift of free AWS cloud services absolves Amazon of all the harm it continues to do to customers and employees alike. The government will need to locate its backbone.
The government is not a single entity. Those investigating this type of thing are rewarded for success, and are not in any way related to those who would use the services.
(as pointed out, it is also illegal for AWS to do that)
That conspiracy theory needs work. The federal government pays billions annually for cloud services and it’s “people go to jail” illegal for the government to accept free services which would otherwise cost money (i.e. the government can use the AWS free tier like everyone else but above that they’re paying like everyone else).
It's not a conspiracy theory. It's business as usual for AWS. I'm all for righteousness but that's not applicable to the US Government and DOJ. People are bought and sold all the time.
The people I've met who leave reviews for free products aren't required to express any "particular sentiment". They just rely on tacit laws of reciprocity.
I think the bigger issue Amazon will face is that sellers can edit listings in a big way... it's not just clarifying "Multi-socket extension cord" to "Three socket extension cord", but swapping out products wholesale once a listing has built up a stock of good reviews.
Honestly - Amazon really needs some serious lawsuits to force it to stop being such a bad actor in the online retail space.
> Honestly - Amazon really needs some serious lawsuits to force it to stop being such a bad actor in the online retail space.
I think Amazon would say that they are not a bad actor at all, and in fact are providing a meaningful service to consumers, and are a major driver of the economy, and besides it isn't really a problem because AI[1] yadda yadda yadda.
The truth is:
a. fake reviews make them money, and
b. almost no matter how bad fake reviews get on Amazon, people will continue to dump dopamine into their brains by buying shiny baubles that they might never take out of the box. The "joy" is in purchasing these things, not actually using them.
This is an extremely hard problem to solve. What degree of change makes it a different product? And that doesn't even touch the problem that products can look identical on the outside and use cheap crap on the inside. Amazon is not a bad actor here. They have every incentive to solve this problem. But they won't, not because they don't try, but because this is a problem as old as commerce.
It's not hard at all, it just needs moderation. Amazon is absolutely the bad actor because they allow sellers to edit their listings to utterly unrelated items, rather than having moderators reject those changes. It's not hard to prevent a cheap kitchen utensil with 2,000 positive reviews from being edited into an expensive drone.
And while moderating things like social media at scale has a lot of challenges, moderating product pages does not. There are orders of magnitude less of them, and they don't need to change that often.
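As a rough illustration of how cheap that gating could be, a marketplace could auto-hold any listing edit that looks like a product swap and route it to a human queue. This is a hypothetical sketch, not Amazon's actual pipeline; the threshold and fields are my assumptions:

```python
from difflib import SequenceMatcher

def needs_human_review(old_title: str, new_title: str,
                       old_category: str, new_category: str,
                       threshold: float = 0.6) -> bool:
    """Flag a listing edit for moderation when it looks like a product swap.

    Any category change, or a title rewrite whose similarity to the old
    title falls below `threshold`, is held for a human moderator instead
    of going live automatically. (Hypothetical heuristic.)
    """
    if old_category != new_category:
        return True
    similarity = SequenceMatcher(None, old_title.lower(),
                                 new_title.lower()).ratio()
    return similarity < threshold

# A minor clarification sails through; a wholesale swap is held.
needs_human_review("Multi-socket extension cord",
                   "Three socket extension cord",
                   "Electronics", "Electronics")   # small edit, auto-approved
needs_human_review("Silicone spatula, 2-pack",
                   "4K camera drone with GPS",
                   "Kitchen", "Drones")            # swap, queued for a human
```

Everything above the threshold still goes live instantly, so the human queue only ever sees the suspicious tail, which is exactly why this scales with far less labor than moderating every edit.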
It's a hard problem for a computer to solve, so a computer shouldn't be used to solve it... computers were never used to solve it before Amazon precisely because it's clearly a hard problem for them (and it scales really well with human labor).
Amazon are being a bunch of cheap bastards and skimping on human moderation of product listings - we, as a society, don't need to give them a free pass for trying to make an even more enormous profit. This is only deeply unprofitable to moderate if you have a lot of products listed you're never going to sell any of.
Suddenly we have a ton of "new" issues cropping up everywhere, "suddenly" being the last 20-ish years. These aren't new. They're just difficult to automate with a computer program, and every company is cheapo now and tries to automate everything with a computer program.
This problem doesn't exist at, say, Walmart. Presumably they physically vet products to at least some degree.
Walmart shuffles parts the other way: the barcode will change every year or so, so they can be sure to clearance out the old stock.
Walmart’s online store has some similar problems. But make it $5 to list a product and $10 to change a listing, problem solved. Now you can hire real humans.
Users need more ability to contribute intelligently. I just hit this yesterday: a 2-star review that was actually entirely about the third-party seller that shipped expired stock, not about the product itself. All I could do was flag the review as "other", with no text allowed. (As it was the only review, I also reported it under "something wrong with the product", which does allow text.) Specifically, give us "wrong version", "wrong product", and "seller, not product" flags. And don't reject my review that clearly called out that this isn't the real thing: I didn't simply get a counterfeit, the whole listing was counterfeit.
Abuse problems? Give more weight to squawks from people with a lot of purchases and not a lot of gripes that turn out to be bogus.
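One way that weighting could be implemented, as a hypothetical sketch (the inputs and formula are my assumptions, not any real Amazon signal): scale each report by the reporter's purchase history and by their track record of flags that held up, with smoothing so brand-new accounts start near neutral.

```python
import math

def flag_weight(purchases: int, upheld_flags: int, bogus_flags: int) -> float:
    """Weight an abuse report by the reporter's history (hypothetical).

    - log-scaled purchase count, so heavy buyers count more but not linearly
    - Laplace-smoothed accuracy, so accounts with a history of bogus gripes
      are discounted and accounts with no history start near 0.5
    """
    activity = math.log1p(purchases)
    accuracy = (upheld_flags + 1) / (upheld_flags + bogus_flags + 2)
    return activity * accuracy
```

Under this scheme a long-time customer whose past flags were upheld outweighs a serial bogus-flagger with the same purchase count, and a zero-purchase account contributes nothing, which blunts the obvious abuse vector of mass-flagging from throwaway accounts.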
Just to be clear, Amazon is the bad actor here, assisting worse actors. This problem exists because they don’t want to spend money managing vendors, and it’s not a problem for anyone else. I never go to Walmart and discover that the cereal has been replaced with sawdust because even a huge, cost-obsessed retailer will hold their vendor accountable for that.
If the government started enforcing real penalties, each order would have an easy way to report this, they’d actually accept abuse reports when you get a contact from a vendor buying reviews, and they’d start spending money reviewing products themselves like other retailers do.
If you haven’t heard of it, in addition to traditional inventory there’s an entire profession around “mystery shopping”, where businesses pay normal people to use their services and report their experiences. They’re explicitly trying to catch dishonest middle managers and suppliers who might do things like pull expired food from the shelves when it’s time for the annual inventory, or make a better meal for someone they recognize as a corporate employee. It’d be very, very easy for Amazon to check sellers the same way, and if they actively solicited abuse reports they’d have an easy prioritization mechanism. It’d cut slightly into their profit margins, but I doubt Bezos would even have to reduce the number of spacecraft he buys, because I’ve heard so many people mention not buying things on Amazon after being burned by fraud in the past.
"generated by artificial intelligence"? So if I write "this product sucks" for a review and I use Bing or some other tool to rewrite it as "this product's quality does not live up to the manufacturer's claim" based on my input, does that make it a crime?
I read it as "attributed to people who ... are generated by artificial intelligence."
Insurance against the argument that "This person who wrote the review does exist, just not in a flesh body, they're an AI creation." But that might also be an instant-flop argument legally since I'm sure "personhood" has some definition near-future AI can't hope to approach.
Especially now that literally anything the FTC does could be struck down by a federal judge at any time, unless it is explicitly written-out or delegated legislation.
(From 2022) Inside the Underground Market for Fake Amazon Reviews
https://www.wired.com/story/fake-amazon-reviews-underground-...