"It's for the audience to answer where their value lies. High skills [...] is a value proposition." I would be generally inclined to agree with you, but the premise of my essay is that the people who need editors are not aware of the deepest problems in their work; this is why they're unaware that AI/LLMs can't adequately do the job.
The bigger problem that flows from this is that, in the research world, when people publish machine-edited work, important errors are apt to fly under the radar and then go on to influence policy and "science" for the rest of the world. It becomes a quality-of-life problem for us all. The danger is similar for technical writers, government publications, or any writing of an informational nature on which quality of life (or law, or basic health and safety) will depend.
TL;DR: my clients are very rarely just average students in need of an Aristotle; they are researchers whose work is poised to shape society in ways that few people outside academia tend to recognize. These are NOT people who should be handing the reins over to AI/LLMs.
I hope that, as the problems with AI/LLMs become more common knowledge, the research world will return to hiring human editors. ;)