Thinking Is Formulating

I’ve written probably a thousand texts in my life. Not all of them were good, of course, but that’s not the point here. What matters to me is that every text does something to me that I can’t know before I start writing. It was never about the research or the new knowledge I acquired. No, it was the writing itself. The act of writing lets the thought emerge. Not the other way around. Even as a child, writing was my most effective way of learning, for exactly this reason.

Today there’s AI, and it’s commonly described as a “writing assistant”. It helps with “phrasing, structuring and especially grammar”. That sounds like a harmless division of labor: the author thinks while the AI formulates. I have a thought, and the AI finds the right words.

Unfortunately, thinking doesn’t work that way. Thinking isn’t something that happens in your head, gets put into words, and is then written down. Thinking happens while you write. And it’s not just me: I’ve attended writing workshops and engaged seriously with this question. The fact is, the sentence you start doesn’t take you where you planned, because at the beginning of a sentence you don’t always have a plan. The word you choose opens a direction you hadn’t seen before. And the structure a text takes on is usually not the result of a finished thought. It’s the birth of one. Many non-fiction texts may work differently as a whole, but at the level of individual sentences and paragraphs it applies to non-fiction too.

And many people who write know this well enough. You sit down with an initial thesis, and three paragraphs later you realize it doesn’t hold up, or isn’t that important anymore. There’s no dialogue; nobody contradicts you while you write. It’s the act of formulating itself that forces you to be precise about what is developing as a thought in your head. And here’s the thing: that precision is allowed to have gaps.

When an AI takes over the formulating, you skip exactly this moment. You enter a vague thought and get back a text that is very precisely formulated. The vagueness isn’t resolved; it’s overwritten by the assumption that the thought was clear from the start. But perhaps it was never thought through to the end. It was only formulated to the end. By an algorithm that has no way of noticing whether that would have been your actual thought.

There’s a crucial reason philosophers write. They don’t dictate, and they don’t have texts summarized. Wittgenstein rearranged sentences until they were right. Not because he was pedantic, but because the order was the thought. Because the difference between “The world is everything that is the case” and “Everything that is the case is the world” isn’t a question of style. It’s philosophy.

Whoever has AI write their formulations takes the power of thought too lightly. You act as if a text had content and form, and as if the form could be delegated. But this separation doesn’t exist in the real process of thinking and writing. Content arises in form. Whoever gives up the formulation gives up the entire thinking process.

I’ve tested this myself often enough, fascinated by the idea that I supposedly no longer had to think because the AI would handle it. I asked ChatGPT to flesh out one of my thoughts, which I’d formulated as a raw prompt. The result was a text that read like something I might have written. But it still didn’t say what I meant. It said something similar, certainly. It was plausible, and it pointed in the right direction. But it didn’t really hit the point. Hitting it would have meant trying two wrong formulations and then realizing why they were wrong. AI can’t do that.

Structuring is a separate topic, and AI is certainly a brilliant tool for organizing content logically. But structure isn’t an outline. Structure is the logic of an argument. The order in which thoughts appear determines what they mean. The same argument in a different order is a different argument. When you delegate the structure, you also delegate the entire argumentation, and with it the decision about the effect of the text.

For me, this is the point where the much-praised efficiency of AI makes a complete mockery of the responsibility of intellectual work. If a consultant has AI formulate their recommendation, whose recommendation is it?

Many see AI as a writing assistant. Few see writing as the most important place where humans still think for themselves. AI only delivers output that looks like thinking. And even if the texts read as though someone had reflected, nobody did.
