Try PowerToys Advanced Paste. Its "Paste as Markdown directly" works fine. I see it has a "Paste with AI" feature that looks cool!
I found an AutoHotKey script that's supposed to do this here (with minor modifications to the pandoc command I copied from the normalize-clipboard script), but couldn't get it to work right. Maybe you'll have better luck.
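If it helps with debugging, here's a rough Python sketch of just the conversion step, assuming pandoc is on your PATH. The flags are generic ones rather than the exact command from the normalize-clipboard script, and getting the HTML off the clipboard is left to AutoHotKey (or whatever else you're using):

```python
import subprocess

def html_to_markdown(html: str) -> str:
    """Convert an HTML fragment to Markdown by shelling out to pandoc."""
    result = subprocess.run(
        # -f/-t pick the input/output formats; --wrap=none avoids hard line wraps
        ["pandoc", "-f", "html", "-t", "markdown_strict", "--wrap=none"],
        input=html,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(html_to_markdown("<p>Hello <strong>world</strong></p>"))
```

If this part works in isolation but the script still misbehaves, the problem is probably in the clipboard handling rather than the pandoc command.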
where did the power go?
I don't think this is a helpful frame, but if you want to use it, the power went to the people – mostly due to the democratization of information, accelerated by the internet, social media, etc. – and they then turned it on each other, wasting most of it and leaving little for the people they elect to govern with.
This ACX post, Your Incentives Are Not The Same, explains it adequately. He is prominent enough that what might be obviously negative-value behavior for you might not be for him.
These are the circumstances in which one engages in kettle logic, so I wouldn't read too much into any of their arguments.
This is the core of the dispute between the USPTO and OpenAI over their (failed) attempt to trademark the term in the US, so citing their papers doesn't help resolve this.
GPTs. Yes, it's still an initialism like "LLM", but it's much easier to pronounce ("jee-pee-tee") and you can call them "jepeats" (rhyming with "repeats") if you want.
From what I understand, they executed a risky maneuver, lost, tried to salvage what they could from the wreckage (by pinning the blame on someone else), but got pushed out anyway. So I can see where you're getting "scheming or manipulative" (them trying this scheme), "less competent" than Altman (losing to him), and "very selfish" (blaming other people). But where are you getting "cowardly" from? From their attempt at minimizing their losses after it became clear their initial plan had become exceedingly unlikely to succeed? If so, I'd say it speaks to how poorly you think of valor if you believe it precludes so sensible an action.
I thought the cases in The Man Who Mistook His Wife for a Hat were obviously as fictionalized as an episode of House: the condition described is real and based on an actual case, but the details were made up to make the story engaging. But I didn't read it in 1985 when it was published. Did people back then take statements like "based on a true story" more seriously?
I believe there's a line from the Art of War[1] that translates loosely to "know victory," as in "be able to identify it when it occurs, and distinguish it from defeat."
Like much of the book, the advice seems so obvious that it's silly to state, until you see someone making precisely the mistake it warns against. If your goal is raising awareness of the risk of AI causing human extinction (or some comparable catastrophe), having a quarter (!!) of the US public list it in their top three reasons means that goal is just about accomplished.
[1] This is from memory. It's possible the quotation is spurious.