Wait, what? AI-generated reviews? Please tell me this is a joke. What the fuck is the point of that? Ideally I want to read about the experience an actual buyer has had with a product. I already expect at least half of the reviews to be fake. Why would you openly admit you shouldn't trust any of them?
Particularly since summarizing text is something LLMs are actually decent at, it makes sense to use them for that. They're unreliable when generating new content, but asking one for a summary of the review text that sits right below it is reasonable.
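Something like this minimal sketch is all it takes (using the OpenAI Python client; the model name, prompt, and sample reviews are placeholders, not anything Amazon is known to use):

```python
# Rough sketch: condensing existing reviews with an off-the-shelf LLM.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_reviews(reviews: list[str]) -> str:
    """Ask the model to summarize the reviews shown below the summary."""
    joined = "\n\n".join(f"- {r}" for r in reviews)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Summarize the following customer reviews. "
                        "Only restate points the reviews actually make."},
            {"role": "user", "content": joined},
        ],
    )
    return response.choices[0].message.content

print(summarize_reviews([
    "Battery lasts two days, but the charger feels flimsy.",
    "Great screen. Charger broke after a week.",
]))
```

The key point is that the model is only rephrasing text it was handed, which is a much safer task than asking it to produce claims about the product from scratch.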
The point is that some VP probably pushed for the use of AI, and a director and a senior manager chasing promotions decided to deliver it, despite there being no clear benefit to customers.
Fake Amazon reviews are a service you can buy to boost your product. Using genAI is an obvious move for these providers: it makes it harder for Amazon to spot the fakes, because they can generate far more variety in the content.
When you run a botnet for such a service, you can't just post 5-star reviews on your clients' products. You want a variety of usage patterns to stay under the radar, and posting reviews on semi-random products is one such technique.