Here's a question about y'all's opinions on gen "AI"...
If I wrote a story, it's my own original idea, my own work of fiction, I'm dictating the events of the fictional story, and I used an "AI" to help turn my flat, often repetitive writing (I tend to say "so-n-so said" ad nauseam) into something more readable, and then proofread it or had another human proofread it, would you consider this acceptable?
Additional clarification - *I* would be creating the events of the story, the "AI" would only be helping with how to word the description of those fictional events.

@hellomiakoda before the LLM "revolution", there was some promising "AI" tech focused on improving writing. As far as I know, it's gone, destroyed by LLMs.

I would not use an LLM for this task.

@hellomiakoda
I intensely dislike the usual LLM slop, but that's definitely not what you're proposing. Using the LLM just to help refine your prose sounds reasonable to me. I'll be curious to hear how that works out for you.
@brouhaha I don't actually plan on running what I write through it. I was merely curious. If I ever do actually write the story that's been sitting in my head for many years, I'd put it out there as is, shitty writing and all. Not sure it'll ever happen though. Executive dysfunction is a bitch!
@hellomiakoda I tried to do just that in the GPT-2 and GPT-3 era, before ChatGPT exploded. It really ruins the writing. Even with state-of-the-art models things are pretty bad: you feed it a masterpiece, you get generic, flavourless slop out of it. If you prompt it harder, you get sparkling slop. There was a great article about an English/Literature college teacher who let students experiment with LLMs, and how generic and flavourless the students' writing became. Let me see if I can find it.
@hellomiakoda I think it would be fine. I'm not a fan of AI writing, but if you just step through the output to see some suggestions of alternative phrasing, then pick and choose what you like, I don't see the problem
@hellomiakoda dunno what acceptable means in this case, but I wouldn't wanna read it
@noiob Acceptable = not wanting to express backlash, or view the author negatively. Simply not wanting to read it yourself is just personal taste.
@hellomiakoda I don't care what you do with your life. I think it'd be a waste of everyone's time.
@noiob I actually agree. If I'm not going to release something as I wrote it, any help I got in writing it would need to be credited - and LLMs aren't really creditable with any level of honesty, unless it was one I ran and trained myself, on training data used consensually - which would then all need to be credited as well.
@hellomiakoda if you let the machine choose your words for you all you're doing is mechanically listing events. Good writing is so much more than describing what happens.
@noiob "Valid", I should have used the word "valid" in my question.
@hellomiakoda I think I would prefer to read your original text, even with repetitions, over the bland average that AI would turn your story into.
@hellomiakoda as an editor myself: AI would be the absolute worst thing you could use for specifically that problem. it is utterly incapable of fixing that, but could certainly make it worse.
@trekbek @hellomiakoda
Having an ¿experienced? editor decide on this is kinda cool.

@curiousicae @hellomiakoda going on like 15 years, so I'd say relatively experienced!

like, I'm sure there's some things AI can or could do well, but writing will never be one of them, in the same way you couldn't cook a meal with only a blender. if you *want* generic writing, maybe? but if you want good writing, the only way to do that is actually write it.