>in a couple decades there won't be many people who can write
seems quite wrong to me.
If anything the trend seems to go the other way - when I was younger, pre-internet, most communication was face to face or voice over the phone.
Now the predominant thing seems to be text - sms, whatsapp, this text box I'm typing into now. I saw a stat the other day that online / app dating had gone from a minority to over 50% of how couples meet. And that is mostly a combination of some photos and text. Be able to write text or fade from the gene pool!
That said, long-form text may be different, but those who write novels and the like were always a minority.
10-15 years ago I think this take was correct, the Internet was about writing.
It isn't anymore, not for newer generations - e.g. Gen Z spending most of their time on Tiktok and phones, and not knowing how to use a word processor.
In the span of ~30 years pg is talking about I can absolutely imagine some job where you speak to the AI and it writes the documents for you and you never learned how to write one yourself. It will not be a good job but millions of people will hold it. They will not be able to write with much sophistication themselves ergo they will not be able to think with much sophistication either.
Online dating is not about writing. It was before Tinder, but it's not anymore. Like Instagram, it's about being skilled with photo filters and/or hiring a professional photographer. No one bothers to hire a profile writer - because no one reads the profile.
If the other person's photos are hawt you will click a button and the AI will send some funny jokes and if you're hawt too you'll share locations and shag. Idiocracy or some Eloi/Morlocks world will be real
Curious about your hypothesis, I went on Tiktok and clicked trending. The top thing (clips from Superstore, I think) had 16,535 text comments on it, though mostly stuff like
>"Don't you hate Tuesdays?" "AHHHHHH"
so not really long-form essays. Maybe the future is that stuff?
The present is already non-written communication with Emojis. This supports TFA's thesis, as these forms of communication are about emotions, not thinking. Maybe it is a bit unfair to pick a medium that focuses on video/image, though.
TikTok videos are embedded in text -- titles, usernames, descriptions, comments. It's very different from TV, and if anything, if it replaces TV, it will make newer generations more literate, not less.
> It isn't anymore, not for newer generations - e.g. Gen Z spending most of their time on Tiktok and phones, and not knowing how to use a word processor.
This is a classic fallacy as old as society. “Whatever the hoi polloi are doing is by definition not the good stuff”. But long-term whatever the masses are doing always wins.
You know Shakespeare? He was the rube who thought plays could be entertaining to the masses. How quaint and silly, who would expect a commoner to appreciate a play. pfft.
Mozart? Taylor Swift of his day.
Printing press? Don’t even get me started, ew the commoners think they can just, like, write things down? How rude.
I’m as much an anti-fan of the short video communication trend as anyone, but it works. When bandwidth is cheap and video recording ubiquitous, video is a great medium. Who cares what you say, show me.
edit to add an uncomfortable truth: The in-crowd talks to develop ideas. What you see in writing is weeks, months, or even years behind the bleeding edge.
There should be a term for the fallacy on display here. The Spurious Generalization Fallacy perhaps?
At no point did you address whether the shifting habits of younger generations will be bad for their literacy, instead making a general point that new trends in society are routinely panned by older members of such a society.
As a counterpoint, before radio and the phonograph, musical ability was quite widespread. Now, it's much rarer.
You haven't even attempted to address whether various developments in society and technology might do this to literacy the way earlier trends did to musical skills. I think that result is quite likely, by the way.
> You haven't even attempted to address whether various developments in society and technology might do this to literacy the way earlier trends did to musical skills. I think that result is quite likely, by the way.
Fair, I was making a different point. Yes literacy might be reduced, my argument is that this isn’t necessarily a problem. Our abilities shift to take advantage of technology.
A lot like how we got really bad at memorizing long epics because we can just write them down instead.
That said, I don’t think writing/literacy will go away as much as we might fear. The new technologies are not a good enough replacement (yet?)
That minority only flourished for a brief period in the 20th century, when universal literacy was achieved through schooling and the proliferation of white-collar work.
For most of history, writers were a tiny minority. It exploded 100x in the last few decades. If it goes down 10x, it's still way above where we were in the 1800s.
That's a fascinating point of view, thanks for sharing.
I know that the concept of dark ages is overblown, but still - something about relying on AI like this makes me think of the end of classical antiquity.
The issue with the modern internet is that it's primarily used for communication (via short writing), often resulting in poorly written messages. Animals communicate through sounds, but this doesn't mean they can talk.
I agree with PG on this point and have noticed that people around me are often surprised when they receive well-written WhatsApp/SMS messages that include proper punctuation and other linguistic markers. Additionally, many people rarely engage in handwriting today, and handwriting is known to improve clear thinking and literacy skills.
I expected much more from the article.
It is, especially, poorly argued.
I feel the author wrote it hastily.
To begin with, the following assumption is false:
>To write well you have to think clearly, and thinking clearly is hard.
For most people, most life situations which require clear thinking have nothing to do with writing.
>This is why eminent professors often turn out to have resorted to plagiarism.
What's the percentage of such professors? In the university where I studied, there has been no case of plagiarism to this day. And plagiarism is not committed because professors can't write, but due to other professional factors.
>If you're thinking without writing, you only think you're thinking.
As if writing is the only way to think well/correctly/effectively. My father never wrote a word: still, some of the most thoughtful statements I ever heard in my life were told to me by him during our conversations.
When you face a situation of danger, such as a wolf running towards you: will you start to write your thoughts about what you should do, or will you just run right away and decide on the safest paths to follow while you are escaping?
> For most people, most life situations which require clear thinking have nothing to do with writing.
The problem with "clear thinking" is that it is subjective. I think Paul Graham and Leslie Lamport have experienced something like this: when they sit down to write about a certain topic, they realize that their initial thoughts were not nearly clear enough, and after a number of iterations they become clearer and clearer. Most of us don't write essays, so we simply don't recognize this feeling.
I think he is hinting at what I said: "For most people, most life situations which require clear thinking have nothing to do with writing."
I meant: since most life situations where we need clear thinking do not involve writing, we are obviously well equipped to think clearly.
And if thinking clearly is not that problematic for most people, then the author can't say we can't write because thinking clearly is hard, or because we can't think clearly.
You're still both missing PG's point, and getting your logic wrong for the point you are on. About the latter:
> "I meant: since most life situation where we need clear thinking do not involve writing, then we are obviously well equipped to think clearly."
That's not the QED you seem to think it is. The statement that "most life situation where we need clear thinking do not involve writing" doesn't give any reason to think that most people are good at clear thinking most of the time, nor whether people find clear thinking easier with the help of writing or if writing has no benefit to the goal of clear thinking. You're just putting two opinions you have next to each other and acting like one confirms the other.
And a friendly tip, "have I explained better what I meant before?" would come off as a lot more polite than "got it?", which to anyone who agrees with the rest of your comment could easily read as snide/patronising, while anyone who thinks you're still wrong will see it as smug and wrongly confident. (Apologies if English isn't your first language, in which case you're very good at it, and apologies if you didn't want unsolicited opinions on how your choice of language makes you seem in my view!)
edit to give an analogy: I feel your argument is like if somebody said "control of body movement is key to being a great athlete", and you replied "everyone is always controlling their body movement, clearly therefore it's not relevant to how good an athlete is".
That writing is the only way to do deep, clear, thinking simply isn't true.
Stephen Hawking is the first example that comes to mind.
He developed a remarkable ability to perform complex calculations and visualize intricate mathematical concepts entirely in his mind. He once mentioned that his ALS diagnosis, which limited his physical abilities, led him to focus intensely on theoretical physics, as it required more intellectual than physical effort.
But sure, writing (and drawing) is a great tool to aid in deep thinking. So are AI tools.
I think you have understood "writing" in a very narrow sense. As mentioned in other replies, Stephen Hawking was a very prolific author. He did not write much, but he sure knew how to write.
PG is obviously talking about the mental process of writing, i.e. of organizing a complex network of thoughts in a linear hierarchy that others can grasp, not the physical one.
> That writing is the only way to do deep, clear, thinking simply isn't true.
You're correct here.
> Stephen Hawking is the first example that comes to mind.
The post is obviously speaking of the general population or at best average professional, and in my opinion choosing one of the most brilliant exceptional scientific minds of our lifetimes is not a good counterargument for a piece that speaks of a potential problem with society at large.
As someone who teaches PhD students who are quite far beyond "average professional", I concur completely with PG on this one. Writing forces you to make very clear and concrete ideas that feel like they make sense but are still fuzzy. It's certainly not the only way, but it's the most common and easy way.
To use an overextended computer metaphor: serializing data structures to a wire format forces evaluation, turning up any errors that were previously hidden by laziness.
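To stretch the metaphor one step further, here is a toy Python sketch of my own (the pipeline and numbers are invented for illustration): a lazy generator can carry a latent error that only surfaces when serialization forces every element to be evaluated.

```python
import json

# A lazy pipeline: nothing is evaluated when it's built, so the
# division by zero below stays hidden -- like a fuzzy idea.
def ratios(pairs):
    return ((a / b) for a, b in pairs)

pipeline = ratios([(6, 3), (4, 2), (1, 0)])  # no error raised here

try:
    json.dumps(list(pipeline))  # serializing forces evaluation of every item
except ZeroDivisionError:
    print("the hidden error surfaces only at serialization time")
```

Writing an idea down plays the role of `json.dumps` here: it won't let you leave any element unevaluated.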
I don't disagree, just want to mention that as someone married to someone who supervises PhD students, they're not by any means "far beyond average professional"... but perhaps you're on an exceptionally highly regarded faculty where that may be the case.
Reading and writing are essential for the transfer and percolation of knowledge across society.
Stephen Hawking's thinking and imagination wouldn't have meant much had he not finally penned them down for others to read, and neither would his ideas have been taken seriously had he chosen to make tiktoks or podcasts to explain them instead.
I think what he's getting at is that while you CAN use an AI to assist with "ideation," we will inevitably create new, low paying jobs where there is no ideation and the employee just operates an AI, because economics. That will in turn create a large cohort within society who are functionally illiterate. Literacy profoundly alters the brain for the better, and this won't happen to those people.
It's useful for ideation: suggesting ideas and concepts that you might not think of. A bit like a conceptual thesaurus. But it doesn't replace the hard work of thinking for yourself.
a) No / little data: Whenever you are starting to think about a subject, you can ask it to give you a structure / categories.
b) Existing data: What I do very often is to give it a lot of "raw data", like unstructured thoughts or an unstructured article, then ask it to find suitable top categories.
I see, I don't want to shame this kind of use. It's kind of almost like talking about something briefly with an educated person.
Until it's not.
I'm not the type who'd say "don't use AI". Use whatever works. Myself I became really fascinated by transformer LLMs / GPTs in winter 2019, then again when ChatGPT was published and a good few months after that.
It's just that my interest and enthusiasm have almost vanished by now. Surely it will reemerge at some point.
Very good point. I often use AI to see things from multiple points of view. It is a good tool to check if you have included obvious things in your argumentation. Spell checking is just one of those obvious things.
> But the middle ground between those who are good at writing and those who can't write at all will disappear.
This observation of Paul Graham's may generalize beyond writing: modern technology appears to turn populations into bi-modally distributed ones - for example, those that write/consume human-written prose and those that produce/consume AI-generated prose; those that can afford human medical doctors and those that can only afford to consult ChatMedicGPT or Wikipedia; those that can afford human teachers for their children and those that let EduGPT train them, etc. Generally speaking, I expect a trend where more affluent people use higher quality human services and the rest have to live with automation output.
I follow Paul's argument about the consequences of the churning of low quality output by AI, but I think there's a second order effect that's more concerning. The ability to judge other people's knowledge of a subject area will become exceedingly rare and priceless.
Unlike younger generations, who are growing up surrounded by AI-generated content, many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
Now that anyone can use AI to generate seemingly competent text, videos, and more, and as in-person interactions decline, the conditions that once allowed us to gauge competence are fading. I worry that in the future, no one—including AI trained on our outputs—will be adept at making these assessments.
Those of us who take time to carefully compose arguments and revise them, as Paul suggests, will have a better handle on this, so that's a helpful consideration.
I worry strongly about a future like that in Idiocracy, where nobody has a clue how to actually judge competence, and instead goes with the best sound bites.
The one path out that I can see, and it's unlikely, is to teach the skill of explicitly tracking history, and reviewing how well someone predicted the future, over time.
The explicit generation and curation of a reputation is part of that priceless nexus that they'll all be seeking in future generations, and yet it'll pale in comparison with the ability to size someone up in a few minutes of interaction.
Judging other's competence will only be a problem if brain-computer interfaces become widespread before AI largely replaces the competent workforce, and my money is on AI replacement coming first.
- Only a brain chip could make AI usage undetectable in practice. Without that you can tell if the person is checking his phone etc. Though you're right that an in-person interaction will be needed, otherwise there's no way of knowing what the other person is doing or if he's a real person at all... And since the latter problem (dead internet) will only grow, perhaps beyond the rectifiable, in-person communication will surely be in business again.
- Once AI replacement of competent humans has reached a certain threshold, what do you stand to gain from testing a human's level thereof? Are you interviewing for "above AI" positions? If not, relying on AI will be as normal as relying on a calculator.
> many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
I think I have a bit of this knack, in some areas, tempered by an awareness of some of my blind spots, but most people don't even claim to have this knack...
As evidence from our own field: before the explosion of LLM cheating, we had the explosion of Leetcode hazing.
Because, supposedly, good experienced software developers couldn't plausibly recognize each other just by talking with each other.
So instead we whip out these douchetastic did-you-prep-for-this rituals. And afterwards you still have no idea what the other person would be like to work with (except that you now know both of you are willing to play fratbro nonsense games).
Another (albeit optimistic) possibility: today we have an informal oral culture contrasted with a formal literary culture, so Graham perceives the latter as synonymous with thinking. However, before literacy was widespread there was, on top of the informal oral culture, also a formal oral culture, so maybe with the popularity of short video clips we might see a resurgence in structured speech?
If you observe the trend, speech has actually devolved since the arrival of popular internet networks. Today's elites talk in a fashion that would embarrass even their immediate parents, let alone their ancestors.
Just look at the quality of presidential debates and political discourse we've been having for the past decade. Not just in the US, but all over the world. The situation is perilous.
Political speech probably chases not even the median, but the marginal, voter, so you'll see that US SOTU addresses have been going regularly "Dick and Jane"-wards for the last couple of centuries.
Others in this thread are likely using their own metrics; my responses were all based on declining Flesch-Kincaid score of State of the Union addresses (from "20th" grade to 8th grade over 200+ years): https://news.ycombinator.com/item?id=41961710
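For anyone who wants to reproduce that kind of metric on their own text, the Flesch-Kincaid grade level is easy to approximate. A rough Python sketch (the syllable counter is a crude vowel-group heuristic of my own, so scores will differ slightly from published figures):

```python
import re

def syllables(word):
    # Crude heuristic: count runs of vowels, minus a likely-silent final 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # FK grade = 0.39*(words/sentences) + 11.8*(syllables/word) - 15.59
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syl / len(words)) - 15.59

print(round(flesch_kincaid_grade("See Jane run. See Dick run."), 1))  # prints -2.6
```

Short sentences of one-syllable words score below grade zero, which is why "Dick and Jane"-ward drift shows up so clearly in this metric.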
The total IQ on the planet remains constant even as the number of people increases, I guess. Although I really wasn't expecting the average to go down this fast--consider the idealistic and rationally thorough political speeches of the 70s and 80s, and the way things went quickly downhill from there...
As someone said, it's no wonder The Matrix chose the 90s as the peak of human civilization.
The recent political speeches at the economic clubs of Detroit and Chicago were a little deeper than Dick and Jane! Also 21 Nobel Prize economists found the content to be deep enough to disagree!
EDIT: keep in mind that the expected "general" audience for the SOTU has also expanded dramatically due to technological change in between 1790 and 2018...
Oral cultures often had other physical representations of the thoughts in their stories. The physical objects that held the stories could be woven cloth or carved wood.
It may not be actual writing so much as coherent cultural transfer through objects that PG is referring to.
> I'm usually reluctant to make predictions about technology, but I feel fairly confident about this one
It's interesting to be usually cautious but then predict something so radical, and yet with no real argument other than "AI is gonna replace us".
Painting should have been replaced by photography, but it hasn't been. In my opinion, there are still plenty of people who want to write, so there will still be plenty of people who know how to write.
And maybe my opinion is wrong, because it's an opinion. But to have to transform it to a certainty, I'd have to see much, much more data than a feeling and a conviction.
I'm not a photographer, but a musician. I won't say that musicians have been replaced by automation, but the ability of someone to earn a living from their abilities as a musician has been eroded considerably over the past century. The preservation and advancement of many musical styles is occurring primarily in living rooms, or is performed for tiny audiences of enthusiasts. I'm happy to help keep jazz alive in that way.
Writing may become the same thing. In the workplace, if someone is writing, they're probably doing it for their own entertainment. Some people write at home, writing journals, blogs, etc. Nobody will know that you're writing, unless it affects your thinking, and your thinking affects your work.
I think we already reached the stage where people stopped writing, before AI entered the picture. I rarely see anybody write a lengthy report any more. Reports have been replaced by PowerPoint, chat, e-mail, etc. One consequence is that knowledge is quickly lost. Or, it's developed by writing, but is communicated verbally.
I think of writing as similar to a linear extension of a partial order. Your brain doesn't think a single letter at a time, instead, all of your neurons are doing neuron things all at the same time. But writing is linear. This forces order and I think is partially responsible for the "clear thinking" ascribed to writing!
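Running with that analogy (my own sketch, not anything from the article): Kahn's topological sort is exactly the algorithm that turns a partial order into one linear extension, the way drafting turns a web of interdependent ideas into a single sequence.

```python
from collections import deque

def linear_extension(nodes, edges):
    # Kahn's algorithm: produce one linear extension of the partial
    # order given by `edges`, a list of (before, after) pairs.
    indeg = {n: 0 for n in nodes}
    succ = {n: [] for n in nodes}
    for a, b in edges:
        succ[a].append(b)
        indeg[b] += 1
    ready = deque(n for n in nodes if indeg[n] == 0)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in succ[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    if len(order) != len(nodes):
        raise ValueError("cycle detected: not a partial order")
    return order

order = linear_extension(
    ["thesis", "evidence", "caveat", "conclusion"],
    [("thesis", "evidence"), ("evidence", "conclusion"), ("caveat", "conclusion")],
)
print(order)  # one valid ordering, e.g. ['thesis', 'caveat', 'evidence', 'conclusion']
```

Note that many linear extensions usually exist: the constraints force order only where ideas actually depend on each other, and the writer chooses among the rest.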
Hopefully I'll live the couple of decades to find out if PG's prediction is correct, I would bet against it.
Here however, I do agree with his articulation -- "writing is thinking" -- and like you, I've thought a bit about the linear nature of writing.
My view is that the "jumble" of ideas/concepts/perspectives is just that -- a jumbled mess -- and the process of linearizing that mess requires certain cognitive aspects that we (humans) generally consider as constituting intelligence. IMO, the rapid generation of grammatically-correct + coherent linear sequences by LLMs is one reason some folks ascribe "intelligence" to them.
I liked his analogy about how the disappearance of widespread physical work meant that one now had to intentionally invest Time and Effort (at the gym) to maintain physical health. The facile nature of LLMs' "spitting out a linear sequence of words" will mean fewer and fewer people will continue to exercise the mental muscles to do that linearization on their own (unassisted by AI), and consequently, will experience widespread atrophy thereof.
As someone working on linear extensions of partial orders (some of the time), I found your observation very insightful, a perspective I haven't considered before.
To add to this, when I think of ordering I’m reminded of the NP complete traveling salesman problem. It’s easy to make a program to visit all locations, but optimal order is so much harder.
I suspect thinking is similar, which brings up questions about LLMs as well. We all can now quickly write hundreds of generic business plans, but knowing what to focus on first is still the hard part.
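To make that contrast concrete, here's a brute-force sketch (toy distance matrix of my own invention): producing *a* tour is one line, while finding the *optimal* order means enumerating factorially many permutations.

```python
from itertools import permutations

def any_tour(n):
    # Visiting every city in some order: trivial.
    return list(range(n)) + [0]

def shortest_tour(dist):
    # Finding the best order: brute force over (n-1)! permutations.
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

dist = [[0, 1, 4, 3],
        [1, 0, 2, 5],
        [4, 2, 0, 1],
        [3, 5, 1, 0]]
print(shortest_tour(dist))  # best cost is 7 for this toy matrix
```

The brute force stops being feasible around a dozen cities, which is roughly the point of the analogy: generating candidates is cheap, choosing the right order is the expensive part.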
The idea that there will be a "middle class" that relies on AI to churn out missives without much thought scares me a lot more than having people who don't write a lot, honestly.
I'm seeing that happen today with corporate documents (there's always that one enthusiast in each team who says "oh, let me improve that with [LLM]", and it's a slog to go through pages and pages of things that could be a bullet point). Quality has been trumped by mediocre quantity, and the cluelessness of the people who do this willingly baffles me.
As someone who's been writing pretty much constantly for over 30 years and both uses AI code completion to speed up first drafts (of code) but switches off everything except macOS's (pretty good) word completion for prose--and absolutely refuses to use AI to generate work documents or even my blog post drafts--this post was a bit of a "oh, so this would be the ultimate consequence of keeping all of it on" moment.
Accelerating writing (with simple word completion) is fine. But letting the AI generate entire sentences or paragraphs (or even expand your draft's bullet points) will either have you stray from your original intent in writing or generate overly verbose banalities that only waste people's time.
I use iA Writer to draft a lot of my stuff, and its style checks have helped a lot to remove redundancies, clichés and filler, making my prose a bit more cohesive and to the point. That's been around for ages and it's not LLM-style AI (more of a traditional grammar checker), but that sort of assistance seems to be missing from pretty much every AI "writing aid"--they just generate verbose slop, and until that is fixed in a way that truly helps a writer LLMs are just a party trick.
I read an interesting remark once, that current growth of bureaucracy, NIMBYism et al. was greatly assisted by Ctrl-C, Ctrl-V.
During the typewriter era, anyone's ability to produce pages and pages of text was limited by their ability to type. Nowadays, you can copy/paste large blocks of text and thus inflate documents to enormous sizes. Which works just like sand in the gearbox of the decision-making process.
I wonder if the future of ESG, DEI and such is that one AI will produce endless reports and another AI will check if the correct buzzwords are used in correct frequency. And instead of yearly reports, they could easily become daily or hourly reports...
It would be a way to tout "allyship" on the social networks without actually doing anything substantial.
I like and believe too the notion that writing IS thinking and that cultivating this skill helps you form clearer thoughts.
One question about this: I'm trying to engage more and more in writing for this reason, but I seem often not to know what to write about. For example, as PG has a lot of experience when it comes to startups, I feel I don't have as much knowledge to share, or if I have some opinion, I don't feel as confident in sharing it. Any tips on how to cultivate writing as an exercise in thinking in this situation?
Just curious, but do you think my collective 20 years of random posts across 10+ social media platforms, often with thousands of posts per platform, has been worth my time?
I have found journalling to be pretty effective about anything. Most times when I meet a block, I write about it. Doesn't require tons of experience to journal about stuff.
He doesn't get it. It is not "those that can write"; it will be those that can communicate, really communicate, versus those that cannot. It's amazing now how few people can explain themselves without thinking that the act of explaining themselves is somehow a punishment or a lead-up to a punishment. Beyond that, how few developers can explain what they do without the use of empty acronyms and a host of gobbledygook, which they probably only know as a pile of gobbledygook because they don't work in that stuff, they just use it tangentially.
But the real issue is people that are not already engaged and knowledgeable about what one another are doing, the key moment when a non-tech needs to discuss a tech need with someone from the tech developer sphere: can they even communicate, and I'm not talking through a salesperson, but actually discuss what one needs and what one provides without resorting to empty jargon? Real communications needs no jargon and does not use jargon, it modifies itself to be understood by the audience, using the audience's terms.
This is critical in the coming decades: learn to communicate, professionally communicate, and I'm not talking about being a media talking head, I'm talking about learning how to speak to anyone anywhere from any stature. It's a critical skill and it is damn well needed now as well as tenfold in our fast approaching future.
I think this is a bit too simple. Some people might be good at thinking (e.g. in math) but bad at writing. AI can just help them to do their job better.
I am, for example, very good at math and reasoning etc. But when I write something I tend to construct long complicated sentences (probably because I think that way^^) and the result would often be considered badly written.
Now of course you can feel superior for your better writing style. If it makes you happy ;)
For what it’s worth, I don’t think “reasoning” and “conveying that reasoning via writing” are quite as siloed as you suggest. Many people write long and complicated sentences in a first draft, but it is the subsequent drafts where that initial block of mental marble is carved and honed into a compelling argument. I believe it is in fact that very act of redrafting — seeing one’s work through the eyes of others — that leads to one’s own greater understanding. Skipping that step with AI limits that potential to whatever ended up in the first draft, and Hemingway’s words apply to far more than fiction.
But I can also achieve the same (act of redrafting) by refactoring my source code, working on a math formula or you know just sit there and think in my mind.
All of those are related to language, because our thinking (and also math and logic) is based on language.
But just because we think in language does not imply that writing is the only form of reasoning. It is fine if it is your preference, and it certainly has value - like other things.
My imminent concern is that it seems like increasing numbers of people can’t read anything more complicated than a tweet. Do we end up with a tiny group of people who can seriously read and write, just communicating with each other? I think we had something like that several centuries ago? Is this whole “universal literacy” concept just doomed?
Lots of comments in this thread are reacting against the idea that writing is required for clear thinking. I think this is true, but a little more nuanced.
Writing and clear thinking are related in at least two ways:
1. Both good writing and good thinking require structured thought, which is the ability to organize and categorize ideas in relation to each other.
2. Both good writing and good thinking require meeting other minds: expressing your ideas in such a way that they are also comprehensible to someone who doesn't share your brain. This is important because it's entirely possible to _think_ you have a good thought without actually having a good thought. The ability to articulate a thought is an effective discriminator between the two scenarios.
It's probably possible to think clearly without writing, if these two elements are still present. But writing well is an effective forcing function, which is why they're so closely related.
Can this be put in analogy to arithmetic and calculators? People had to be a lot better at mental math, and calculators removed the pressure. You could imagine making similar arguments that losing the ability would be disastrous. The reasons it wasn't were: 1) people still learn and get tested in school without a calculator, and 2) they still need to do enough mental math in day-to-day life without a calculator at hand, so the ability hasn't just gone to zero. Net result is people are mostly fine, and there is a net improvement in arithmetic correctness across the economy. Other similar aids: spellcheck, Google Translate, …. I do hesitate to assume these concerns are time-invariant, though. The argument is probably that writing is different.
Yes. The argument is that writing is different and I don't think the analogy holds. Calculators are not tools for mathematical thinking or communication. Writing is a form of processing thoughts and communication.
The "people are mostly fine" part is true. People were mostly fine at the time when only the elites could write, but society will not be the same. We are moving back towards those old times.
Maybe you deal with the intellectual elite from the best schools and haven't noticed[1], but adult literacy, even among the college educated, has been on a downward trend for some time, and we can see some results. Normal interactions in the corporate world are more difficult. People in middle-level management can't explain and articulate things as well as they used to. There is more communication, but it's not as efficient. One well-written report every two weeks used to be enough. Now you need to be on Zoom calls 2-3 hours weekly for the same thing.
Mediocre or low-quality writing is mentally taxing to read. If what you read is grammatically correct AI slop, it really kills all interest in reading and communicating in writing.
[1] Only about 10% of adults have PIAAC adult literacy level 4 or 5.
The difference is that with arithmetic and calculators I am simply playing the rules of a formal game without regard to the semantics of those numerical symbols. I don't care so much what the numbers mean as much as I care that I have executed the rules correctly and arrive at the "right number."
With language however we are attempting to refer to some kind of underlying meaning or reality, and an LLM will not give you the understanding of the things being referred to - only the outward representations themselves. If you are indeed interested in meaningless exchange of symbols a la Searle's Chinese Room then perhaps there is some utility in this; otherwise it's the act of digesting and comprehending meaning, and the internal cognitive processes that go into the production of the written artifact that matter, not just the written artifact itself.
Calculators don't really help you think. You need to have a concept of the calculation beforehand and you only run the technical steps on the calculator.
Writing is different. It is more akin to the left leg, when the right leg is your thinking. I am a weird combination of an author and a mathematician-turned-programmer, and both these skills are very useful in either activity; I wouldn't be able to program half as well if I didn't dump my messy ideas to a doc first, especially when the model is far from straightforward.
Writing things down is a huge feedback loop for thinking. Plenty of edge cases stick their heads out of the doc once you write everything down.
I wonder if some people would argue that it _was_ disastrous though?
We seem to struggle with things like putting people on the moon and keeping airplanes from failing, compared to 50 years ago.
Maybe there was some critical mass of mental maths skills that engineers had in the 60s that we've lost now? Are we still inventing things at the same rate as before?
Arithmetic tends to be a one-shot thing to perform a step in a daily activity. Writing tends to be about planning or communicating more complex things.
Not that, say, doing long division in your head isn't a valuable skill, mind.
This would be better differentiated by one’s ability to express rather than to write. Of course, a person who knows how to read as well as how to express an idea will do fine with writing as a means of expression but one can also express an idea orally and use that oration to refine their idea.
I agree with Paul. Our creative and intellectual abilities are so powerful, yet there’s a real danger in handing them over to AI. That said, I think AI is still extremely useful for writing, if it complements one’s skills.
I recently published a [major philosophical work][1] that is the result of decades of thinking and three months of writing. I’m not a native English speaker, and although I know what I want to say, I often don’t know how to write it. I may not know or can’t find the right terms or phrasing, or I might make grammar mistakes. Sometimes, I can describe my ideas in a clumsy way, and I need help refining my sentences.
So, I use AI. I think, write my thoughts in my own way, and then work with AI to bring them closer to what I want. It's hard work. Although AI can be an amazingly good writing partner, it often alters my text in ways that change the meaning completely. Even replacing a single word with a synonym or adding a comma can turn a sentence into something totally unintended. It can be a lot of back-and-forth work to find the right paragraphs. Still, AI is a tremendous help, and my work would have been more immature and unpolished without it, even if it sometimes feels a little artificial.
Of course, it’s much more ideal to master English fully and to practice writing until it feels natural. But AI helps with that, too.
> So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots. I know which half I want to be in, and I bet you do too.
I'd disagree with "half" here because I can't imagine it being anywhere close to 50/50. I expect a power law distribution: most won't be able to write well. The ones who do will have a massive advantage, in the same way that those who can concentrate in our age of distractions have a significant advantage over those who can't.
Look at the situation now: I don't think 25% of the population can write a single paragraph on some non-banter, non-gossip topic that doesn't appear as if it were written by a 4th grader. Look at the business communications actual people write: it's jargon and very simple directives like "did x happen?" Look at the logical flow of the sentences in technical white papers and wish the authors had more practice in basic writing and basic communication, because their sentences do not flow.
We're already in a poverty of quality communicators. All that nonsense with Bitcoin was fast talking that sounded plausible. This is what happens when real communication breaks down: fraudulent technical products surrounded by a word salad of abused language, and people afraid of looking stupid, so they never ask for clarification of the gobbledygook.
writing offers relatively high information density in a low bandwidth medium. but video offers high information density in a high bandwidth medium. the challenge in both cases is coming up with something to say that requires a high information density. for those who do manage to find something to say that requires a high information density format, speaking will be more effective (newspapers/books were replaced with radio which was replaced with tv). https://x.com/kenwarner/status/1840060518461010225
the infrastructure of the internet has matured enough now that we don't have to talk to each other in ASCII characters any longer. being online increasingly will mean using your voice and face (or suitable synthetic alternatives) to talk to the rest of humanity. not like TikTok though with its algorithmically driven mental corruption. and not like YouTube with its copyright oriented business model. More like early days twitter. Just everyday people talking to each other. But video. And realtime.
I write software that solves hard problems and makes money. I am very good at it. Computers do what I tell them, and people are happy.
The biggest obstacle to my job is when I need to convince other people of a course of action. People do not do what I ask of them, because people need a nice narrative. Indeed the reaction of others to my writing is so negative I can get into disciplinary proceedings just for writing facts. Yet, when the shit hits the fan, and people need a leader and a plan, suddenly people do what I ask and are successful, presumably looking past my communication issues because of the imminent and larger threat.
I am Autistic. I look forward to the day when my email and slack client, my JIRA client, automatically take my statements of fact and turn them into something neurotypical people won’t have a social signaling reaction to and respond to as a threat.
Also, thinking can also be doing.
And while I’m here, there’s plenty of people who can write and not think. And more who cannot think critically. Electrolytes: it’s got what plants crave.
I agree very much with Paul here. Many a time as I'm thinking, I realize none of those are legitimate thoughts until I can put them in words. Describing your intuition is fundamentally a hard task, because it requires deciphering the vague strands of reflection. Clarity emerges only when I focus harder.
I understand his point and think it's an interesting observation, but honestly I disagree. What AI is great at is handling all the bullshit writing tasks that you have to do that no one reads or cares about. For example I got asked to write a blurb for a company newsletter recently. Told an LLM the things I needed to talk about and how long it should be, and the tone I was shooting for. Done in less than a few minutes. Previously that would have taken me at least an hour.
I am a professional writer (in the sense that I have published short stories that were paid for, even though it is a hobby). Recently, my wife was participating in an event where she had to portray an historical figure and she had a fact sheet and a couple of articles about the person (she wasn’t super famous). I used Perplexity with ChatGPT-4o and prompted it with all the materials we had and asked it to generate a 5 minute monologue in first person for the event. First draft was excellent, I touched up a few lines and printed it out. Done.
People will start writing more and getting better at it with the help of LLMs. This will create a positive feedback loop that encourages them to write more and better. An LLM should be used as a tool to improve productivity and quality of output. Just as we use a computer interface to write faster and to move and edit text, instead of using a pencil and an eraser, today you can use an LLM to improve your writing. This will help people get better at organizing their thoughts and think more clearly, instead of replacing the thinking.
I largely agree with the point he's making, but would say it's less about writing than it is about one's attention span (people are always checking phones and devices to avoid long stretches of thought) and willingness to avoid conformity (our society has changed to reward being part of a group). Thinking and writing requires both in order to be good. It feels to me that those activities are being discouraged and even punished in the US in 2024.
I too wonder if the writing-is-thinking thesis holds. Does Han Kang (Nobel Prize in Literature, 2024) think through writing? Have I read something thrilling, funny, mysterious, spiritual, or revealing of human nature written by PG? No. He may well be correct about a particular subset of writing, but I do not take his views as authoritative about writing in general (nor am I convinced that hackers are just like painters).
If an AI can barf up a first draft before I need to actually engage my brain with the underlying material, is that a bug or a feature? Probably the former.
If I were teaching at a university, I think I could not assign essay grades until I sat down with every student for a quick Q&A on the subject of their paper. That would reveal the plagiarists and charlatans and lame non-thinkers.
Interesting, I just finished the manuscript of a sci-fi novel where one of the narratives is set in an idyllic small town with no scarcity, with AI systems making life easy for everyone, and where learning how to read and write is frowned upon.
Everyone uses a voice interface, so writing is looked on as an antiquated habit.
New and non-plagiarized text is essential for AI models to train on. Without it, the models will reach a saturation point, or worse, the output will not be as effective as it could be.
If AI trains on generated text, it is the equivalent of an incestuous relationship, and I don't have to elaborate on the analogy further.
There seems to be a modern fallacy which says that everything is knowable, and what is knowable is expressible. It seems all thoughts should have a lead weight tied to them so that they stay nicely on the ground where they can be admired, discussed and replicated. No wonder our age is so pedestrian.
I also do not agree with this thinking/writing duo. Most of my thoughts are free-floating, so to say, and I (and possibly anyone) can derive/enhance/combine ideas in my head without any writing necessary, and even come to a conclusion about how to act on those thoughts.
As opposed to writing at one end, these thoughts are rooted in an experience, an emotion, and this association never disappears. Add to that the act of acting on them, and it is like writing.
Maybe the author is referring strictly to communication or to the art of writing.
Writing done entirely by AIs is currently pretty awful, but I'm sure that it will get better over time. There was a prof on Twitter (can't find the tweet now) who assigned his class to generate and then critique an essay written by ChatGPT and it caused all of them to stop using it.
If the point of writing is to share your precise thoughts, there is no use for an LLM. Your prompt would become equivalent to the writing you wanted to share in the first place.
If, on the other hand, you need to generate a bunch of content for a school paper, blog, etc., then it's immensely helpful.
All the more important that everyone goes to school and is required to write (and think) there. Not everyone will become a great writer or thinker, but everyone gets the basics and the opportunity.
Chess has seen rating inflation because of YouTube, Stockfish and Leela Zero, and Chess.com. Basketball is seeing a boom in 3-point accuracy. More people than ever know about squats and deadlifts. I would learn to program much faster today with ChatGPT than I did without it. Legalese and flowery prose and even rhymes were really unnecessary for pure communication and thought. We know thought, search, and computation are infinite, so there will be no end to progress.
We're also on the edge of the read, and read-nots. There is an abundance of content to consume and AI lets us consume it in any format easily.
Why read a book when we can have an idea distilled into a quick infographic, a shortform video, or a pithy tweet? I love a deep-dive book that lets you immerse yourself in an idea and study it from multiple angles, done masterfully in Dune or Thinking, Fast and Slow.
But are we losing that chance to really contemplate given the speed at which more information is being thrown at us across every form factor?
On the contrary (though it may not be everyone's case), my reading is better off with the presence of these shortform formats. By just going through these "trailers" I can now easily filter out the books that I should actually be reading, while still retaining the core ideas of those that I'd rather not read fully. It is doing wonders for discoverability in the right hands.
the world of greek philosophy is marked by two eras, with socrates as the inflection point. as far as we know, he wrote nothing. we only know of his ideas from his students, and they are pretty good ideas. socrates sharpened his wits in conversations. plato’s dialogues are a homage to this fact. it appears that productive thinking happens in conversations. only when you’re engaged in a mere manipulation of symbols within some grammar (as happens in mathematics) can it be accepted that solo thinking via writing is a great aid. this is leslie lamport’s domain, and it should influence how one understands his statement on thinking and writing. with complex, irregular ideas, solo thinking only gets you so far. the sort of solo thinking you do when you write is what has misled paulg to write this incomplete think piece. if he had discussed this idea in a conversation, someone would have pointed out to him that socrates, jesus, and muhammad wrote nothing but have left behind some huge edifices of thought.
maybe someone would have even pointed out to him from what activity the peripatetics derived their name. but alas!
He didn't need to write that article. He is already rich. So why did he do it?
Every day HN is full of articles where people have done some amazingly complex thing, entirely for fun. Then they have written a blog about it, entirely for fun.
Then we get one of the familiar detachments from reality
> In preindustrial times most people's jobs made them strong. Now if you want to be strong, you work out. So there are still strong people, but only those who choose to be.
Except for all of the people whose jobs still make them strong. Scaffolders, tree surgeons, bricklayers, carpenters, et al.
I need to write a document next week. I have begun to analyse a complex system that ChatGPT will not be aware of. I need to apply my specialism to it, to decide what to do with lots of steps. Writing it will test my understanding of the system, and encourage completeness. It will allow others to know what they are expected to do. It will allow a constructive discussion of the choices and reasons. ChatGPT won't help me, except perhaps in layout, rephrasing a sentence, something like that. My job will keep my writing muscles strong. Paul Graham has lost touch with reality.
Just write - but in seriousness, On Writing Well: The Classic Guide to Writing Nonfiction by William Zinsser and Dennett's Intuition Pumps and Other Tools for Thinking
> And yet writing pervades many jobs, and the more prestigious the job, the more writing it tends to require.
I would say that it's quite the opposite! The more prestigious the job, the more likely the person will have one or many assistants to help them write.
Think of presidents, governors and CEOs. They must *read* much more than they write. Their response can fit in a post-it attached to the paperwork.
The next level also reads more than they write. Instead of a post-it, they will probably come up with bullet points which will be fleshed out by people below them.
The people who *really* have to write stuff is the people at the *bottom* of the hierarchy.
Writing well could be a way to go up the ladder. But it is definitely not required at the top.
What will change, in the future, is that *everyone* will have assistants.
The point was that it was difficult to get into such a position without having learnt to write. I doubt that people in such positions, having already learnt to write, just start dictating to staff and having them completely rewrite it. They would still use those writing skills to communicate with the staff and directly with other correspondents.
One of the best indications / talks on writing and thinking is Jordan Peterson - the best advice he gives anyone is to write - 5 minute video https://www.youtube.com/watch?v=Kwjw2J6ByJo
The threat to individual (and overall) development posed by easy access to shortcuts like ChatGPT provides is not benign. We are seeing very real cases where it's being overused at the expense of quality and understanding in not just academics and resumes, but also in corporate environments (consulting especially). Most people in offices are already refusing to compose their own emails and messages, and soon this will percolate into the lives of children. It is genuinely scary to me what this could bring on in the future.
Sorry: while I have a lot of the Graham essays, hopefully all of them, now also including this one, here I nearly fully disagree with Graham:
(1) Due to computer-based word processing and spelling and grammar correction, writing, and good writing, are much easier now than before personal computers (PC). Indeed, a quip is that the typewriters killed off the ink pens, and the PCs killed off the typewriters; writing got easier and likely better, not less common. People got a lot more practice.
(2) Email, Internet blog posts, and other communications generate more writing. Can we find some data on total US email volume and compare that with old USPS mail volume, letters to the editor of newspapers, etc.?
(3) Now there is a lot of competition for good writing: At Hacker News, bad writing, especially from bad thinking, gets down voted. On Web sites using Disqus, part of getting voted up is clear, short, maybe just one sentence, maybe sarcastic, clever, and humorous, say, succinct, on the point, and maybe fun, and that means in some respects better writing. Maybe Disqus could tell us how Internet blog post writing volume has increased? A lot?
(4) For the media, via the Internet and Web sites, that is now much cheaper to produce than old newspapers, magazines, TV news, and I'd guess that the total of media as writing or as oral reading of what was written is much greater than before. For the future, I anticipate many more words per day, i.e., more writing. Uh, the writing at Hacker News has been going down, up, or staying the same? There is Facebook, X, Reddit, Wikipedia. There are sites for narrow interests. Sounds like a lot more writing.
(5) People have smart phones with them nearly all the time; net, that can mean more communications, and writing is less intrusive on the receiver than in-person voice. From good STEM field communications or just mature socialization, to avoid being misunderstood or offensive, good writing is important.
(6) Sure, now some Google searches result in AI answers, and for some simple questions the AI answers can be a little okay. But, I don't take the AI answers seriously, and the old Google search facility works fine and, also, of high importance, gives the URLs for the search results.
(7) Looking back at my writing, from personal letters to academics, on-line political discussions, etc., I see no way AI could help -- the AI writing is worse, not better.
(8) Since supposedly Taylor Swift is now worth $1.6 billion, there can be increased interest in guitar playing and the claim that such music is based almost entirely on the four chords I, IV, V, and VI. So, my niece wanted to know, and I wrote her an essay, 6,200 words with several YouTube URLs with pictures and sound. For some music with a lot of chords, I included a URL.
Sorry, but for my niece I found nothing nearly as good as what I wrote and believe that AI would be a poor substitute.
(9) The world is changing, especially related to writing, at likely a uniquely high rate, and no AI training data can report today what is new tomorrow and needs good writing.
Yet again, Paul Graham states his view that people other than him are incapable of thought. I'm shocked at how frequently he does this. I would be embarrassed to make such an assertion in public even once, let alone repeatedly. While the notion that AI will replace writing is somewhat prescient, the idea that it will replace thought most certainly isn't. Paul Graham is someone who writes to think, therefore he assumes writing is the only way to think. Entirely consistent with his massive and unrelenting ego and simultaneous total lack of empathy or understanding of the differences between people. I shall leave you with this quote of his—the most sneering text I've ever read:
“It's not surprising that conventional-minded people would dislike inequality if independent-mindedness is one of the biggest drivers of it. But it's not simply that they don't want anyone to have what they can't. The conventional-minded literally can't imagine what it's like to have novel ideas. So the whole phenomenon of great variation in performance seems unnatural to them, and when they encounter it they assume it must be due to cheating or to some malign external influence.” - https://paulgraham.com/superlinear.html#f12n
There you have it folks. The genius Paul Graham is one of a select few people with the ability to have ideas, something which those who disagree with him are simply incapable of comprehending.
How is that your takeaway? Please quote the sections from his essay that led you to your conclusion.
My takeaway: people that know how to write, that have trained that muscle, are better at thinking in a structured way and articulating their thoughts. The number of people that know how to write is declining, at least in part due to the advent of GenAI. The number of people who know how to write is still non zero and is not limited to only Paul Graham.
The people who like to write will still write no matter how many ChatGPTs there are. New people will start writing because they enjoy it and out of curiosity.
But there are a lot of others who never liked to write and do not need it for their jobs, so why shouldn't they use GPT as a tool, like the promised land of AI / robots?
The same will happen with cooking: people who like to cook will cook traditionally even after our incoming household robots are able to.
> Instead of good writers, ok writers, and people who can't write, there will just be good writers and people who can't write.
> writing is thinking. In fact there's a kind of thinking that can only be done by writing
> So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots. I know which half I want to be in, and I bet you do too.
PG states, clear as day, that he expects the world to be divided into people who can think (him) and people who can't (almost everyone else). When I say Paul Graham imagines only he can think, this is hyperbole. I'm sure there's a small group of people with views very similar to his to whom he would also attribute the ability of thought. I am commenting on the clear and undeniable pattern of PG writing that huge swathes of the population are incapable of thinking.
>in a couple decades there won't be many people who can write
seems quite wrong to me.
If anything the trend seems to go the other way - when I was younger pre internet most communication was face to face or voice over the phone.
Now the predominant thing seems to be text: SMS, WhatsApp, this text box I'm typing into now. I saw a stat the other day that online / app dating had gone from a minority to over 50% of how couples meet. And that is mostly a combination of some photos and text. Be able to write text or fade from the gene pool!
That said long form text may be different but those who write novels and the like were always a minority.
(source for the dating thing - not sure how accurate but kind of scary https://www.reddit.com/r/interestingasfuck/comments/1fzqgvk/...)
10-15 years ago I think this take was correct, the Internet was about writing.
It isn't anymore, not for newer generations - e.g. Gen Z spending most of their time on Tiktok and phones, and not knowing how to use a word processor.
In the span of ~30 years pg is talking about I can absolutely imagine some job where you speak to the AI and it writes the documents for you and you never learned how to write one yourself. It will not be a good job but millions of people will hold it. They will not be able to write with much sophistication themselves ergo they will not be able to think with much sophistication either.
Online dating is not about writing. It was before Tinder, but it's not anymore. Like Instagram, it's about being skilled with photo filters and/or hiring a professional photographer. No one bothers to hire a profile writer - because no one reads the profile.
If the other person's photos are hawt you will click a button and the AI will send some funny jokes and if you're hawt too you'll share locations and shag. Idiocracy or some Eloi/Morlocks world will be real
Curious about your hypothesis, I went on Tiktok and clicked trending. The top thing (clips from Superstore I think) had 16535 text comments on it though mostly stuff like
>"Don't you hate Tuesdays?" "AHHHHHH"
so not really long-form essays. Maybe the future is that stuff?
The present is already non-written communication with emojis. This supports TFA's thesis, as these forms of communication are about emotions, not thinking. Maybe it is a bit unfair to pick a medium that focuses on video/images, though.
TikTok videos are embedded in text -- titles, usernames, descriptions, comments. It's very different from TV, and if anything if it replaces TV it will make newer generations more literate, not less.
> It isn't anymore, not for newer generations - e.g. Gen Z spending most of their time on Tiktok and phones, and not knowing how to use a word processor.
This is a classic fallacy as old as society. “Whatever the hoi polloi are doing is by definition not the good stuff”. But long-term whatever the masses are doing always wins.
You know Shakespeare? He was the rube who thought plays could be entertaining to the masses. How quaint and silly, who would expect a commoner to appreciate a play. pfft.
Mozart? Taylor Swift of his day.
Printing press? Don’t even get me started, ew the commoners think they can just, like, write things down? How rude.
I’m as much an anti-fan of the short video communication trend as anyone, but it works. When bandwidth is cheap and video recording ubiquitous, video is a great medium. Who cares what you say, show me.
edit to add an uncomfortable truth: The in-crowd talks to develop ideas. What you see in writing is weeks, months, or even years behind the bleeding edge.
There should be a term for the fallacy on display here. The Spurious Generalization Fallacy perhaps?
At no point did you address whether the shifting habits of younger generations will be bad for their literacy, instead making a general point that new trends in society are routinely panned by older members of such a society.
As a counterpoint, before radio and the phonograph, musical ability was quite widespread. Now, it's much rarer.
You haven't even attempted to address whether various developments in society and technology might do this to literacy the way earlier trends did to musical skills. I think that result is quite likely, by the way.
> You haven't even attempted to address whether various developments in society and technology might do this to literacy the way earlier trends did to musical skills. I think that result is quite likely, by the way.
Fair, I was making a different point. Yes literacy might be reduced, my argument is that this isn’t necessarily a problem. Our abilities shift to take advantage of technology.
A lot like how we got really bad at memorizing long epics because we can just write them down instead.
That said, I don’t think writing/literacy will go away as much as we might fear. The new technologies are not a good enough replacement (yet?)
Less internet for you.
More going outside for you.
> That said long form text may be different but those who write novels and the like were always a minority.
I think this is the article’s point - that this minority is going to shrink even more.
That minority only flourished for a brief period in the 20th century, when universal literacy was achieved through schooling and the proliferation of white-collar work.
For most of history, writers were a tiny minority. It exploded 100x in the last few decades. If it goes down 10x, it's still way above where we were in the 1800s.
That's a fascinating point of view, thanks for sharing.
I know that the concept of dark ages is overblown, but still - something about relying on AI like this makes me think of the end of classical antiquity.
The issue with the modern internet is that it's primarily used for communication (via short writing), often resulting in poorly written messages. Animals communicate through sounds, but this doesn't mean they can talk.
I agree with PG on this point and have noticed that people around me are often surprised when they receive well-written WhatsApp/SMS messages that include proper punctuation and other linguistic markers. Additionally, many people rarely engage in handwriting today, and handwriting is known to improve clear thinking and literacy skills.
Which is still massively more then amount of writing people did pre-internet.
> Which is still massively more then amount of writing people did pre-internet.
I wonder if your answer is intentionally badly written or if that's a sign of the problem already affecting even us on HN :D
I expected much more from the article. It is, in particular, poorly argued. I feel the author wrote it hastily.
To begin with, the following assumption is false:
>To write well you have to think clearly, and thinking clearly is hard.
For most people, most life situations which require clear thinking have nothing to do with writing.
>This is why eminent professors often turn out to have resorted to plagiarism.
What's the percentage of such professors? At the university where I studied, there has been no case of plagiarism to this day. And plagiarism is not done because professors can't write, but for other professional reasons.
>If you're thinking without writing, you only think you're thinking.
As if writing were the only way to think well/correctly/effectively. My father never wrote a word: still, some of the most thoughtful statements I ever heard in my life were told to me by him during our conversations.
When you face a situation of danger, such as a wolf running towards you: will you start to write down your thoughts about what you should do, or will you just run right away and decide on the safest paths to follow while you are escaping?
> For most people, most life situations which require clear thinking have nothing to do with writing.
The problem with "clear thinking" is that it is subjective. I think Paul Graham and Leslie Lamport have experienced something like this: when they sit down to write about a certain topic, they realize that their initial thoughts were not nearly clear enough, and after a number of iterations they become clearer and clearer. Most of us don't write essays, so we simply don't recognize this feeling.
The author: A requires B
You: what nonsense. Clearly, B does not necessarily require A, and yet he says it does, how poorly argued.
Yup, as long as you ignore this quote “If you're thinking without writing, you only think you're thinking.”
Exactly. Isn't it ironic?!
I think he is hinting at what I said: "For most people, most life situations which require clear thinking have nothing to do with writing."
I meant: since most life situations where we need clear thinking do not involve writing, we are evidently already well equipped to think clearly.
And if thinking clearly is not that problematic for most people, then the author can't say we can't write because thinking clearly is hard, or because we can't think clearly.
Got it?
You're still both missing PG's point, and getting your logic wrong for the point you are on. About the latter:
> "I meant: since most life situation where we need clear thinking do not involve writing, then we are obviously well equipped to think clearly."
That's not the QED you seem to think it is. The statement that "most life situation where we need clear thinking do not involve writing" doesn't give any reason to think that most people are good at clear thinking most of the time, nor whether people find clear thinking easier with the help of writing or if writing has no benefit to the goal of clear thinking. You're just putting two opinions you have next to each other and acting like one confirms the other.
And a friendly tip, "have I explained better what I meant before?" would come off as a lot more polite than "got it?", which to anyone who agrees with the rest of your comment could easily read as snide/patronising, while anyone who thinks you're still wrong will see it as smug and wrongly confident. (Apologies if English isn't your first language, in which case you're very good at it, and apologies if you didn't want unsolicited opinions on how your choice of language makes you seem in my view!)
edit to give an analogy: I feel your argument is like if somebody said "control of body movement is key to being a great athlete", and you replied "everyone is always controlling their body movement, clearly therefore it's not relevant to how good an athlete is".
I upvoted for:
> "have I explained better what I meant before?" would come off as a lot more polite than "got it?"
Thank you very much.
PS. English is not my native language.
That writing is the only way to do deep, clear, thinking simply isn't true.
Stephen Hawking is the first example that comes to mind.
He developed a remarkable ability to perform complex calculations and visualize intricate mathematical concepts entirely in his mind. He once mentioned that his ALS diagnosis, which limited his physical abilities, led him to focus intensely on theoretical physics, as it required more intellectual than physical effort.
But sure, writing (and drawing) is a great tool to aid in deep thinking. So are AI tools.
I think you have understood "writing" in a very narrow sense. As mentioned in other replies, Stephen Hawking was a very prolific author. He did not write much, but he sure knew how to write.
PG is obviously talking about the mental process of writing, i.e. of organizing a complex network of thoughts in a linear hierarchy that others can grasp, not the physical one.
> That writing is the only way to do deep, clear, thinking simply isn't true.
You're correct here.
> Stephen Hawking is the first example that comes to mind.
The post is obviously speaking of the general population or at best average professional, and in my opinion choosing one of the most brilliant exceptional scientific minds of our lifetimes is not a good counterargument for a piece that speaks of a potential problem with society at large.
As someone who teaches PhD students who are quite far beyond "average professional", I concur completely with PG on this one. Writing forces you to make clear and concrete the ideas that feel like they make sense but are still fuzzy. It's certainly not the only way, but it's the most common and easiest way.
To use an overextended computer metaphor: serializing data structures to a wire format forces lazy evaluation, turning up any errors that were previously hidden by laziness.
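To stretch that metaphor into something runnable: here's a minimal Python sketch (hypothetical record and field names) where a latent bug hides inside a deferred computation until serialization walks every field and forces it:

```python
import json

def serialize(record):
    """Force every deferred field (callables stand in for thunks) and emit JSON."""
    return json.dumps({k: v() if callable(v) else v for k, v in record.items()})

# Hypothetical record: one field is computed lazily and contains a latent bug.
lazy_record = {
    "name": "experiment-1",
    "score": lambda: 1 / 0,  # hidden error: never triggered by shallow access
}

# Shallow use never notices the bug:
name = lazy_record["name"]

# Serializing forces the thunk and surfaces the error:
try:
    serialize(lazy_record)
    surfaced = False
except ZeroDivisionError:
    surfaced = True
```

Writing plays the role of `serialize` here: nothing stays comfortably unevaluated.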
I don't disagree, just want to mention that as someone married to someone who supervises PhD students, they're not by any means "far beyond average professional"... but perhaps you're on an exceptionally highly regarded faculty where that may be the case.
That is probably the case.
One of the most exceptional scientific minds of the time, who, I might add, despite not picking up a pen, nevertheless wrote books!
Strange example to pick as someone who did not write.
Best-selling author Stephen Hawking?
A Brief History Of Time?
Reading and writing are essential for the transfer and percolation of knowledge across society.
Stephen Hawking's thinking and imagination wouldn't have meant much had he not finally penned them down for others to read, and neither would his ideas have been taken seriously had he chosen to make tiktoks or podcasts to explain them instead.
> That writing is the only way to do deep, clear, thinking simply isn't true.
You have committed the Fallacy of the Inverse.
But for most of the rest of us in practice I suspect that it is more true than false.
Most of us have neither the intellect of Hawking nor his situation.
It is weird that he doesn't think of AI as a deep-thinking tool at all.
Sure some will thoughtlessly copy and paste but for many AI helps to structure their thoughts and they think clearer as a result.
I think what he's getting at is that while you CAN use an AI to assist with "ideation," we will inevitably create new, low paying jobs where there is no ideation and the employee just operates an AI, because economics. That will in turn create a large cohort within society who are functionally illiterate. Literacy profoundly alters the brain for the better, and this won't happen to those people.
Can you expand on that? I can't see any sense in which an LLM improves the structure of the user's thought process
It's useful for ideation: suggesting ideas and concepts that you might not think of. A bit like a conceptual thesaurus. But it doesn't replace the hard work of thinking for yourself.
In the same way it can detour your thoughts to the mainstream and lead you away from a line of thought that might have ended up somewhere else.
Here are two examples:
a) No / little data: Whenever you are starting to think about a subject, you can ask it to give you a structure / categories.
b) Existing data: What I do very often is give it a lot of "raw data", like unstructured thoughts or an unstructured article, then ask it to find suitable top categories.
a) Doesn't that mean limiting oneself to bias, mediocrity, and preconceived judgements instead of actually thinking?
Well it helps me to get started. Not more but also not less.
For me it’s very important to emphasize that AI is a tool. You have to use it responsibly. But there is no reason not to use it.
I see, I don't want to shame this kind of use. It's almost like briefly talking something over with an educated person.
Until it's not.
I'm not the type who'd say "don't use AI". Use whatever works. Myself I became really fascinated by transformer LLMs / GPTs in winter 2019, then again when ChatGPT was published and a good few months after that.
It's just that my interest and enthusiasm have almost vanished by now. Surely they will reemerge at some point.
Very good point. I often use AI to see things from multiple points of view. It is a good tool to check if you have included obvious things in your argumentation. Spell checking is just one of those obvious things.
> But the middle ground between those who are good at writing and those who can't write at all will disappear.
This observation of Paul Graham's may generalize beyond writing: modern technology appears to split populations into bi-modal distributions - for example, those who write/consume human-written prose and those who produce/consume AI-generated prose; those who can afford human medical doctors and those who can only afford to consult ChatMedicGPT or Wikipedia; those who can afford human teachers for their children and those who let EduGPT train them, etc. Generally speaking, I expect a trend where more affluent people use higher-quality human services and the rest have to live with automation output.
That's actually a super interesting idea that I've never considered.
It's interesting to think of humans as being like a premium service where AI's are a sort of knock-off/budget human service.
I follow Paul's argument about the consequences of the churning of low quality output by AI, but I think there's a second order effect that's more concerning. The ability to judge other people's knowledge of a subject area will become exceedingly rare and priceless.
Unlike younger generations, who are growing up surrounded by AI-generated content, many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
Now that anyone can use AI to generate seemingly competent text, videos, and more, and as in-person interactions decline, the conditions that once allowed us to gauge competence are fading. I worry that in the future, no one—including AI trained on our outputs—will be adept at making these assessments.
Those of us who take time to carefully compose arguments and revise them, as Paul suggests, will have a better handle on this, so that's a helpful consideration.
I worry strongly about a future like that in Idiocracy[1], where nobody has a clue about how to actually judge competence, and instead goes with the best sound bites.
The one path out that I can see, and it's unlikely, is to teach the skill of explicitly tracking history, and reviewing how well someone predicted the future, over time.
The explicit generation and curation of a reputation is part of that priceless nexus that they'll all be seeking in future generations, and yet it'll pale in comparison with the ability to size someone up in a few minutes of interaction.
[1] https://www.imdb.com/title/tt0387808/
Judging other's competence will only be a problem if brain-computer interfaces become widespread before AI largely replaces the competent workforce, and my money is on AI replacement coming first.
- Only a brain chip could make AI usage undetectable in practice. Without that you can tell if the person is checking his phone etc. Though you're right that an in-person interaction will be needed, otherwise there's no way of knowing what the other person is doing or if he's a real person at all... And since the latter problem (dead internet) will only grow, perhaps beyond the rectifiable, in-person communication will surely be in business again.
- Once AI replacement of competent humans has reached a certain threshold, what do you stand to gain from testing a human's level thereof? Are you interviewing for "above AI" positions? If not, relying on AI will be as normal as relying on a calculator.
> many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
I think I have a bit of this knack, in some areas, tempered by an awareness of some of my blind spots, but most people don't even claim to have this knack...
As evidence from our own field: before the explosion of LLM cheating, we had the explosion of Leetcode hazing.
Because, supposedly, good experienced software developers couldn't plausibly recognize each other just by talking with each other.
So instead we whip out these douchetastic did-you-prep-for-this rituals. And afterwards you still have no idea what the other person would be like to work with (except that you now know both of you are willing to play fratbro nonsense games).
Yes
I been intentionally changing up my candor so that people
Who get caught up in the structure, lose the message
If u know you know
Another (albeit optimistic) possibility: today we have an informal oral culture contrasted with a formal literary culture, so Graham perceives the latter as synonymous with thinking. However, before literacy was widespread there was, on top of the informal oral culture, also a formal oral culture, so maybe with the popularity of short video clips we might see a resurgence in structured speech?
If you observe the trend, speech has actually devolved with the arrival of popular internet networks. Today's elites talk in a fashion that would embarrass even their immediate parents, let alone their ancestors.
Just look at the quality of presidential debates and political discourse we've been having for the past decade. Not just in the US, but all over the world. The situation is perilous.
Political speech probably chases not even the median but the marginal voter, so you'll see that US SOTU addresses have been heading regularly "Dick and Jane"-wards for the last couple of centuries.
Lagniappe: https://www.youtube.com/watch?v=EwPnJXXX5Ic
Devolved by what metric?
Others in this thread are likely using their own metrics; my responses were all based on declining Flesch-Kincaid score of State of the Union addresses (from "20th" grade to 8th grade over 200+ years): https://news.ycombinator.com/item?id=41961710
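For anyone curious, the Flesch-Kincaid grade formula itself is simple enough to sketch in a few lines of Python; note that the vowel-group syllable counter below is a crude approximation, not the official syllabification:

```python
import re

def naive_syllables(word):
    # Approximate syllables by counting vowel groups; crude but usable for comparison.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # FK grade = 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(naive_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

simple = "The cat sat. The dog ran."
dense = ("Notwithstanding considerable institutional impediments, "
         "comprehensive legislative initiatives necessitate deliberation.")
```

Running both strings through the function puts the dense sentence far above the simple one, which is all the metric really measures: words per sentence and syllables per word.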
Devolved in the ability to speak on-point without ad hominems.
The usual one: young people are using it!
The simplification of speech and writing is a trend that had been going on long before the advent of the Internet.
The total IQ on the planet remains constant even as the number of people increases, I guess. Although I really wasn't expecting the average to go down this fast--consider the idealistic and rationally thorough political speeches of the 70s and 80s, and the way things went quickly downhill from there...
As someone said, it's no wonder The Matrix chose the 90s as the peak of human civilization.
I'll be generous and pretend you said intelligence, not IQ.
What is the mechanism you propose, by which the birth of a child makes every person now living incrementally dumber?
Personal anecdote only, but I'd suggest lack of sleep. Every time a kid is born, the total amount of quality sleep extant adults get drops.
(Kids, if you ever see this, I'm not saying it wasn't worth it. But seriously, 2AM every night for months?)
> The total IQ on the planet remains constant even as the number of people increase
IQ, (from “intelligence quotient”), is a number used to express the relative intelligence of a person.
So for the whole population it is constant by definition :)
The recent political speeches at the economic clubs of Detroit and Chicago were a little deeper than Dick and Jane! Also, 21 Nobel Prize-winning economists found the content deep enough to disagree with!
Yes, I picked SOTU because that's a general audience, unlike economic clubs.
And heading in the direction of Dick and Jane doesn't mean that we ever reached it; according to https://community.jmp.com/t5/image/serverpage/image-id/8926i... recent SOTUs should be comprehensible to high school freshmen.
(full discussion at https://community.jmp.com/t5/JMPer-Cable/Regression-to-model... ; through 2018 — anyone have 2022?)
EDIT: keep in mind that the expected "general" audience for the SOTU has also expanded dramatically due to technological change in between 1790 and 2018...
Oral cultures often had other physical representations of thoughts in stories. The physical objects that held the stories could be woven cloth or carved wood. It may not be actual writing so much as coherent cultural transfer through objects that PG is referring to.
> I'm usually reluctant to make predictions about technology, but I feel fairly confident about this one
It's interesting to be usually cautious but then predict something so radical, and yet with no real argument other than "AI is gonna replace us".
Painting should have been replaced by photography, but it hasn't been. In my opinion, there are still plenty of people who want to write, so there will still be plenty of people who know how to write.
And maybe my opinion is wrong, because it's an opinion. But to turn it into a certainty, I'd have to see much, much more data than a feeling and a conviction.
I'm not a photographer, but a musician. I won't say that musicians have been replaced by automation, but the ability of someone to earn a living from their abilities as a musician has been eroded considerably over the past century. The preservation and advancement of many musical styles is occurring primarily in living rooms, or is performed for tiny audiences of enthusiasts. I'm happy to help keep jazz alive in that way.
Writing may become the same thing. In the workplace, if someone is writing, they're probably doing it for their own entertainment. Some people write at home, writing journals, blogs, etc. Nobody will know that you're writing, unless it affects your thinking, and your thinking affects your work.
I think we already reached the stage where people stopped writing, before AI entered the picture. I rarely see anybody write a lengthy report any more. Reports have been replaced by PowerPoint, chat, e-mail, etc. One consequence is that knowledge is quickly lost. Or, it's developed by writing, but is communicated verbally.
I think of writing as similar to a linear extension of a partial order. Your brain doesn't think a single letter at a time, instead, all of your neurons are doing neuron things all at the same time. But writing is linear. This forces order and I think is partially responsible for the "clear thinking" ascribed to writing!
Hopefully I'll live the couple of decades to find out if PG's prediction is correct, I would bet against it.
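The analogy can be made concrete: writing ideas down so that each appears only after the ideas it depends on is exactly computing one linear extension of the partial order, i.e. a topological sort. A small Python sketch with made-up idea names:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# A "jumble" of ideas with dependencies: each idea maps to the ideas
# it rests on, forming a partial order.
ideas = {
    "conclusion": {"evidence", "counterargument"},
    "evidence": {"claim"},
    "counterargument": {"claim"},
    "claim": set(),
}

# Writing them out in a valid order is one linear extension of that order.
order = list(TopologicalSorter(ideas).static_order())
```

Several valid orders exist (evidence and counterargument can swap), which mirrors how the same jumble of thoughts can be linearized into different but equally coherent essays.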
I don't always agree with everything PG writes.
Here however, I do agree with his articulation -- "writing is thinking" -- and like you, I've thought a bit about the linear nature of writing.
My view is that the "jumble" of ideas/concepts/perspectives is just that -- a jumbled mess -- and the process of linearizing that mess requires certain cognitive aspects that we (humans) generally consider as constituting intelligence. IMO, the rapid generation of grammatically-correct + coherent linear sequences by LLMs is one reason some folks ascribe "intelligence" to them.
I liked his analogy about how the disappearance of widespread physical work meant that one now had to intentionally invest Time and Effort (at the gym) to maintain physical health. The facile nature of LLMs' "spitting out a linear sequence of words" will mean fewer and fewer people will continue to exercise the mental muscles to do that linearization on their own (unassisted by AI), and consequently, will experience widespread atrophy thereof.
As someone working on linear extensions of partial orders (some of the time), I found your observation very insightful, a perspective I haven't considered before.
To add to this, when I think of ordering I'm reminded of the NP-complete traveling salesman problem. It's easy to write a program that visits all locations, but finding the optimal order is so much harder.
I suspect thinking is similar, which brings up questions about LLMs as well. We all can now quickly write hundreds of generic business plans, but knowing what to focus on first is still the hard part.
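As a tiny illustration of that gap (made-up coordinates): producing some order that covers everything is trivial, while the best order already requires a factorial search even at toy scale:

```python
from itertools import permutations
from math import dist

# Four hypothetical "locations" (or topics to cover), deliberately shuffled.
points = [(0, 0), (3, 4), (3, 0), (0, 4)]

def tour_length(order):
    # Closed tour: return to the starting point at the end.
    return sum(dist(order[i], order[(i + 1) % len(order)])
               for i in range(len(order)))

any_order = tour_length(points)                           # easy: just visit in the given order
best = min(tour_length(p) for p in permutations(points))  # hard: check all n! tours
```

With 4 points that's 24 permutations; with 20 it's already about 2.4 * 10^18, which is why knowing what to focus on first remains the hard part.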
Reading is linear. Writing is not.
The idea that there will be a "middle class" that relies on AI to churn out missives without much thought scares me a lot more than having people who don't write a lot, honestly.
I'm seeing that happen today with corporate documents (there's always that one enthusiast in each team who says "oh, let me improve that with [LLM]", and it's a slog to go through pages and pages of things that could be a bullet point). Quality has been trumped by mediocre quantity, and the cluelessness of the people who do this willingly baffles me.
As someone who's been writing pretty much constantly for over 30 years and both uses AI code completion to speed up first drafts (of code) but switches off everything except macOS's (pretty good) word completion for prose--and absolutely refuses to use AI to generate work documents or even my blog post drafts--this post was a bit of a "oh, so this would be the ultimate consequence of keeping all of it on" moment.
Accelerating writing (with simple word completion) is fine. But letting the AI generate entire sentences or paragraphs (or even expand your draft's bullet points) will either have you stray from your original intent in writing or generate overly verbose banalities that only waste people's time.
I use iA Writer to draft a lot of my stuff, and its style checks have helped a lot to remove redundancies, clichés and filler, making my prose a bit more cohesive and to the point. That's been around for ages and it's not LLM-style AI (more of a traditional grammar checker), but that sort of assistance seems to be missing from pretty much every AI "writing aid"--they just generate verbose slop, and until that is fixed in a way that truly helps a writer LLMs are just a party trick.
Edit: I just realised that I wrote about this at length in February - https://taoofmac.com/space/blog/2024/02/24/1600#the-enshitti...
I read an interesting remark once, that current growth of bureaucracy, NIMBYism et al. was greatly assisted by Ctrl-C, Ctrl-V.
During the typewriter era, anyone's ability to produce pages and pages of text was limited by their ability to type. Nowadays, you can copy/paste large blocks of text and thus inflate documents to enormous sizes. Which works just like sand in the gearbox of the decision-making process.
I wonder if the future of ESG, DEI and such is that one AI will produce endless reports and another AI will check if the correct buzzwords are used in correct frequency. And instead of yearly reports, they could easily become daily or hourly reports...
It would be a way to tout "allyship" on the social networks without actually doing anything substantial.
I agree. See also my comment in a sibling thread about using LLMs to write tests: https://news.ycombinator.com/item?id=41961214
I, too, like and believe the notion that writing IS thinking and that cultivating this skill helps you form clearer thoughts. One question about this: I'm trying to engage more and more in writing for this reason, but I often don't seem to know what to write about. For example, since PG has a lot of experience with startups, I feel I don't have as much knowledge to share, or if I have some opinion, I don't feel as confident sharing it. Any tips on how to cultivate writing as an exercise in thinking in this situation?
Just get going. Quoting https://randsinrepose.com/archives/your-writing/ :
* Less than 1% of your writing will be life-changing.
* 3% will be trivial to write.
* 4% will strongly resonate with others in a way you didn’t expect.
* 5% will be quite good.
* 15% probably should’ve never been published.
* 26% will elicit a reaction you did not expect. Positive or negative.
* 28% will become vastly better because you chose to edit.
* 30% will start as one piece but finish as another.
* 40% will be good solid writing.
* 45% will do much worse than you expect when published.
* 60% of your writing will never be finished. Be ok with that.
* 100% of your writing is worth your time.
And when numbers are just rhetorical devices, you can pull them out of thin air.
>* 100% of your writing is worth your time.
Just curious, but do you think my collective 20 years of random posts across 10+ social media platforms, often with thousands of posts per platform, has been worth my time?
What's "worth" in this question?
May be hard to answer. My view from the same behavior: yes, worth it.
Was everything productive for a career? For a relationship with someone else? Sometimes yes. Sometimes no.
Would love to hear your thoughts on my first question!
Well, I think mine have. There were some weird tweets, but on average… yeah.
> some weird tweets
Sounds like some of the 15% or some of the 28%. :)
I have found journalling to be pretty effective for almost anything. Most times when I hit a block, I write about it. It doesn't require tons of experience to journal about stuff.
He doesn't get it. It is not "those that can write"; it will be those that can communicate, really communicate, versus those that cannot. It's amazing now how few people can explain themselves without thinking that the act of explaining themselves is somehow a punishment or a lead-up to a punishment. Beyond that, how few developers can explain what they do without the use of empty acronyms and a host of gobbledygook, which they probably only know as a pile of gobbledygook because they don't work in that stuff, they just use it, tangentially.
But the real issue is people who are not already engaged with and knowledgeable about what one another is doing: the key moment when a non-tech person needs to discuss a tech need with someone from the developer sphere. Can they even communicate, and I'm not talking through a salesperson, but actually discuss what one needs and what one provides without resorting to empty jargon? Real communication needs no jargon and does not use jargon; it modifies itself to be understood by the audience, using the audience's terms.
This is critical in the coming decades: learn to communicate, professionally communicate, and I'm not talking about being a media talking head, I'm talking about learning how to speak to anyone anywhere from any stature. It's a critical skill and it is damn well needed now as well as tenfold in our fast approaching future.
I think this is a bit too simple. Some people might be good at thinking (e.g. in math) but bad at writing. AI can just help them to do their job better.
I am, for example, very good at math and reasoning etc. But when I write something I tend to construct long, complicated sentences (probably because I think that way^^), and the result would often be considered badly written.
Now of course you can feel superior about your better writing style. If it makes you happy ;)
For what it’s worth, I don’t think “reasoning” and “conveying that reasoning via writing” are quite as siloed as you suggest. Many people write long and complicated sentences in a first draft, but it is the subsequent drafts where that initial block of mental marble is carved and honed into a compelling argument. I believe it is in fact that very act of redrafting — seeing one’s work through the eyes of others — that leads to one’s own greater understanding. Skipping that step with AI limits that potential to whatever ended up in the first draft, and Hemingway’s words apply to far more than fiction.
But I can also achieve the same thing (the act of redrafting) by refactoring my source code, working on a math formula, or, you know, just sitting there and thinking.
All of those are related to language, because our thinking (and also math and logic) is based on language.
But just because we think in language does not imply that writing is the only form of reasoning. It is fine if it is your preference, and it certainly has value, like other things.
Imo, they just don't lead to more actual clarity inside my brain. They are just a lot of work on top of it.
My more immediate concern is that it seems like increasing numbers of people can't read anything more complicated than a tweet. Do we end up with a tiny group of people who can seriously read and write, just communicating with each other? I think we had something like that several centuries ago. Is this whole "universal literacy" concept just doomed?
Lots of comments in this thread are reacting against the idea that writing is required for clear thinking. I think the idea is true, but a little more nuanced.
Writing and clear thinking are related in at least two ways:
1. Both good writing and good thinking require structured thought, which is the ability to organize and categorize ideas in relation to each other.
2. Both good writing and good thinking require meeting other minds: expressing your ideas in such a way that they are also comprehensible to someone who doesn't share your own brain. This is important because it's entirely possible to _think_ you have a good thought without actually having one. The ability to articulate a thought is an effective discriminator between the two scenarios.
It's probably possible to think clearly without writing, if these two elements are still present. But writing well is an effective forcing function, which is why they're so closely related.
Can this be put in analogy to arithmetic and calculators? People had to be a lot better at mental math and calculator removed the pressure. You could imagine making similar arguments that losing the ability would be disastrous. The reasons it wasn’t was 1) people still learn and get tested in school without calculator 2) they still need to do enough mental math in day to day life without a calculator at hand so the ability hasn’t just gone to zero. Net result is people are mostly fine and there is a net improvement in arithmetic correctness across economy. Other similar aids: spellcheck, google translate, …. I feel some hesitation that these concerns are time invariant. The argument is probably that writing is different.
Yes. The argument is that writing is different and I don't think the analogy holds. Calculators are not tools for mathematical thinking or communication. Writing is a form of processing thoughts and communication.
The "people are mostly fine" part is true. People were mostly fine back when only the elites could write, but society will not be the same. We are moving towards those old times.
Maybe you deal with intellectual elite from the best schools and haven't noticed[1], but adult literacy even among college educated has been on a downward trend for some time and we can see some results. Normal interactions in the corporate world are more difficult. People in middle-level management can't explain and articulate things as well as they used to. There is more communication, but it's not as efficient. One well-written report in every two weeks used to be enough. You need to be on a Zoom call 2-3 hours weekly for the same thing.
Mediocre or low-quality writing is mentally taxing to read. If what you read is grammatically correct AI slop, it really kills all interest in reading and communicating in writing.
[1] Only about 10% of adults have PIAAC adult literacy level 4 or 5.
The difference is that with arithmetic and calculators I am simply playing by the rules of a formal game without regard to the semantics of those numerical symbols. I don't care so much what the numbers mean as I care that I have executed the rules correctly and arrived at the "right number."
With language however we are attempting to refer to some kind of underlying meaning or reality, and an LLM will not give you the understanding of the things being referred to - only the outward representations themselves. If you are indeed interested in meaningless exchange of symbols a la Searle's Chinese Room then perhaps there is some utility in this; otherwise it's the act of digesting and comprehending meaning, and the internal cognitive processes that go into the production of the written artifact that matter, not just the written artifact itself.
Calculators don't really help you think. You need to have a concept of the calculation beforehand and you only run the technical steps on the calculator.
Writing is different. It is more akin to the left leg, when the right leg is your thinking. I am a weird combination of an author and a mathematician-turned-programmer, and both these skills are very useful in either activity; I wouldn't be able to program half as well if I didn't dump my messy ideas to a doc first, especially when the model is far from straightforward.
Writing things down is a huge feedback loop for thinking. Plenty of edge cases stick their head out of the doc once you write everything down.
I think this is a good comparison.
I wonder if some people would argue that it _was_ disastrous though?
We seem to struggle with things like putting people on the moon and keeping airplanes from failing, compared to 50 years ago.
Maybe there was some critical mass of mental maths skills that engineers had in the 60s that we've lost now? Are we still inventing things at the same rate as before?
Arithmetic tends to be a one-shot thing to perform a step in a daily activity. Writing tends to be about planning or communicating more complex things.
Not that, say, doing long division in your head isn't a valuable skill, mind.
This would be better framed in terms of one's ability to express rather than to write. Of course, a person who knows how to read as well as how to express an idea will do fine with writing as a means of expression, but one can also express an idea orally and use that oration to refine it.
https://en.wikipedia.org/wiki/Rubber_duck_debugging
As a person who frequently uses writing to focus my thoughts, I don’t see why one would see writing as the only way to focus their thoughts.
I agree with Paul. Our creative and intellectual abilities are so powerful, yet there’s a real danger in handing them over to AI. That said, I think AI is still extremely useful for writing, if it complements one’s skills.
I recently published a [major philosophical work][1] that is the result of decades of thinking and three months of writing. I'm not a native English speaker, and although I know what I want to say, I often don't know how to write it. I may not know, or may be unable to find, the right terms or phrasing, or I might make grammar mistakes. Sometimes I can only describe my ideas in a clumsy way, and I need help refining my sentences.
So, I use AI. I think, write my thoughts in my own way, and then work with AI to bring them closer to what I want. It's hard work. Although AI can be an amazingly good writing partner, it often alters my text in ways that change the meaning completely. Even replacing a single word with a synonym or adding a comma can turn a sentence into something totally unintended. It can be a lot of back-and-forth work to get the paragraphs right. Still, AI is a tremendous help, and my work would have been more immature and unpolished without it, even if it sometimes feels a little artificial.
Of course, it’s much more ideal to master English fully and to practice writing until it feels natural. But AI helps with that, too.
[1]: https://news.ycombinator.com/item?id=41954302
> So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots. I know which half I want to be in, and I bet you do too.
I'd disagree with "half" here because I can't imagine it being anywhere close to 50/50. I expect a power law distribution: most won't be able to write well. The ones who do will have a massive advantage, in the same way that those who can concentrate in our age of distractions have a significant advantage over those who can't.
Look at the situation now: I don't think 25% of the population can write a single paragraph on some non-banter, non-gossip topic that doesn't read as if it were written by a 4th grader. Look at the business communications actual people write: it's jargon and very simple directives like "did x happen?" Look at the logical flow of the sentences in technical white papers and wish the authors had more basic writing and communication practice, because their sentences do not flow.
We're already in a poverty of quality communicators. All that nonsense with Bitcoin was fast-talking nonsense that sounded plausible. This is what happens when real communication breaks down: fraudulent technical products surrounded by a word salad of abused language, and people afraid of looking stupid, so they never ask for clarification of the gobbledygook.
Writing offers relatively high information density in a low-bandwidth medium, but video offers high information density in a high-bandwidth medium. The challenge in both cases is coming up with something to say that requires high information density. For those who do manage to find something to say that requires a high-information-density format, speaking will be more effective (newspapers/books were replaced with radio, which was replaced with TV). https://x.com/kenwarner/status/1840060518461010225
The infrastructure of the internet has matured enough now that we don't have to talk to each other in ASCII characters any longer. Being online will increasingly mean using your voice and face (or suitable synthetic alternatives) to talk to the rest of humanity. Not like TikTok, though, with its algorithmically driven mental corruption, and not like YouTube with its copyright-oriented business model. More like early-days Twitter. Just everyday people talking to each other. But video. And realtime.
> Just everyday people talking to each other. But video. And realtime.
Doesn't video add a dimension of friction over sending text? For one, you can't scan a video the same way you can scan text.
Yet
I write software that solves hard problems and makes money. I am very good at it. Computers do what I tell them, and people are happy.
The biggest obstacle to my job is when I need to convince other people of a course of action. People do not do what I ask of them, because people need a nice narrative. Indeed the reaction of others to my writing is so negative I can get into disciplinary proceedings just for writing facts. Yet, when the shit hits the fan, and people need a leader and a plan, suddenly people do what I ask and are successful, presumably looking past my communication issues because of the imminent and larger threat.
I am Autistic. I look forward to the day when my email and slack client, my JIRA client, automatically take my statements of fact and turn them into something neurotypical people won’t have a social signaling reaction to and respond to as a threat.
Also, thinking can also be doing.
And while I’m here, there’s plenty of people who can write and not think. And more who cannot think critically. Electrolytes: it’s got what plants crave.
I agree very much with Paul here. Many a time, as I'm thinking, I realize none of those are legitimate thoughts until I can put them into words. Describing your intuition is fundamentally a hard task, because it means trying to decipher the vague strands of reflection. Clarity emerges only when you focus harder.
And often when you have to implement what you've written in code you notice the written thought wasn't so legitimate after all.
I understand his point and think it's an interesting observation, but honestly I disagree. What AI is great at is handling all the bullshit writing tasks that you have to do that no one reads or cares about. For example I got asked to write a blurb for a company newsletter recently. Told an LLM the things I needed to talk about and how long it should be, and the tone I was shooting for. Done in less than a few minutes. Previously that would have taken me at least an hour.
I am a professional writer (in the sense that I have published short stories that were paid for, even though it is a hobby). Recently, my wife was participating in an event where she had to portray an historical figure and she had a fact sheet and a couple of articles about the person (she wasn’t super famous). I used Perplexity with ChatGPT-4o and prompted it with all the materials we had and asked it to generate a 5 minute monologue in first person for the event. First draft was excellent, I touched up a few lines and printed it out. Done.
I don't think this example refutes the article. LLMs can obviate bullshit writing tasks and remove the appeal of writing altogether for some people.
Can LLM/AI have the opposite effect?
People would start writing more and getting better at it with the help of LLMs. This would create a positive feedback loop that encourages them to write more and better. An LLM should be used as a tool to improve productivity and quality of output. Just as we use a computer interface to write faster and to move and edit text instead of using a pencil and an eraser, today you can use an LLM to improve your writing. This will help people get better at organizing their thoughts and think more clearly, instead of replacing the thinking.
I largely agree with the point he's making, but would say it's less about writing than it is about one's attention span (people are always checking phones and devices to avoid long stretches of thought) and willingness to avoid conformity (our society has changed to reward being part of a group). Thinking and writing require both in order to be good. It feels to me that those activities are being discouraged and even punished in the US in 2024.
I too wonder if the writing-is-thinking thesis holds. Does Han Kang (Nobel Prize in Literature, 2024) think through writing? Have I read something thrilling, funny, mysterious, spiritual, or revealing of human nature written by PG? No. He may well be correct about a particular subset of writing, but I do not take his views as authoritative about writing in general (nor am I convinced that hackers are just like painters).
If an AI can barf up a first draft before I need to actually engage my brain with the underlying material, is that a bug or a feature? Probably the former.
If I were teaching at a university, I think I could not assign essay grades until I sat down with every student for a quick Q&A on the subject of their paper. That would reveal the plagiarists, charlatans, and lame non-thinkers.
The trouble with that is that it doesn't scale time-wise (not with 50+ seat classes)
And yet, isn't face-to-face going to be the only reliable way to outfox and do an end run around AI?
There will also be many more people who can't read. Sure, they can read simple sentences, but not an article.
I'm stuck on the title.
What's this article about? Why should I read it?
(I read it)
This post's title does not echo some of the points in the article. Good human-sourced writing will be hard to do/find?
Well, don't perpetuate the problem with a title that doesn't tell me why to pay attention.
Interesting, I just finished the manuscript of a sci-fi novel where one of the narratives is set in an idyllic small town with no scarcity, with AI systems making life easy for everyone, and where learning how to read and write is frowned upon.
Everyone uses a voice interface, so writing is looked on as an antiquated habit.
New, non-plagiarized text is essential for all the AI models to train on. Without it, the models will reach a saturation point, or worse, the output will not be as effective as it could be.
If AI trains on generated text, it is the equivalent of an incestuous relationship, and I don't have to elaborate on the analogy further.
There seems to be a modern fallacy which says that everything is knowable, and what is knowable is expressible. It seems all thoughts should have a lead weight tied to them so that they stay nicely on the ground, where they can be admired, discussed, and replicated. No wonder our age is so pedestrian.
I also do not agree with this thinking/writing duo. Most of my thoughts are free-floating, so to say, and I (and possibly anyone) can derive/enhance/combine ideas in my head without any writing necessary, and even come to a conclusion about how to act on those thoughts.
As opposed to writing at the other end, these thoughts are rooted in an experience, an emotion, and this association never disappears. Add the act of acting on them, as above, and it is like writing.
Maybe the author is referring strictly to communication or to the art of writing.
Writing done entirely by AIs is currently pretty awful, but I'm sure that it will get better over time. There was a prof on Twitter (can't find the tweet now) who assigned his class to generate and then critique an essay written by ChatGPT and it caused all of them to stop using it.
For a while, I'd guess there will be requirements to work through a thinking-and-writing task by oneself.
But already there's a large number of people cheating at those tasks.
So, we could also say we'll have the Thinkers and the Cheaters.
Sadly, I don't expect the Thinkers to be the Haves.
If the point of writing is to share your precise thoughts, there is no use for an LLM. Your prompt would become equivalent to the writing you wanted to share in the first place.
If, on the other hand, you need to generate a bunch of content for a school paper, blog, etc., then it's immensely helpful.
Idiocracy indeed.
All the more important that everyone goes to school and is required to write (and think) there. Not everyone will become a great writer or thinker, but everyone gets the basics and the opportunity.
Chess has seen rating inflation because of YouTube, Stockfish, Leela Zero, and chess.com. Basketball is seeing a boom in 3-point accuracy. More people than ever know about squats and deadlifts. I would learn to program much faster today with ChatGPT than I did without it. Legalese, flowery prose, and even rhymes were really unnecessary for pure communication and thought. We know thought, search, and computation are infinite, so there will be no end to progress.
We're also on the edge of the reads and read-nots. There is an abundance of content to consume, and AI lets us consume it in any format easily.
Why read a book when we can have an idea distilled into a quick infographic, a short-form video, or a pithy tweet? I love a deep-dive book that lets you immerse yourself in an idea and study it from multiple angles, done masterfully in Dune or Thinking, Fast and Slow.
But are we losing that chance to really contemplate given the speed at which more information is being thrown at us across every form factor?
On the contrary (though it may not be everyone's case), my reading is better off for the presence of these short-form formats. Just by going through these "trailers," I can now easily filter out the books I should actually be reading, while still retaining the core ideas of those I'd rather not read fully. It does wonders for discoverability in the right hands.
Is the ability to convince or prove a thesis more important than the thesis itself?
Can it be a thesis if it is not a convincing thesis?
If you haven’t read William Zinsser’s “Writing to Learn,” you’re doing yourself a disservice.
Very scary.
The world of Greek philosophy is marked by two eras, with Socrates as the inflection point. As far as we know, he wrote nothing. We only know of his ideas from his students, and they are pretty good ideas. Socrates sharpened his wits in conversations. Plato's dialogues are a homage to this fact. It appears that productive thinking happens in conversations. Only when you're engaged in a mere manipulation of symbols within some grammar (as happens in mathematics) can it be accepted that solo thinking via writing is a great aid. This is Leslie Lamport's domain, and it should influence how one understands his statement on thinking and writing. With complex, irregular ideas, solo thinking only gets you so far. The sort of solo thinking you do when you write is what has misled pg into writing this incomplete think piece. If he had discussed this idea in a conversation, someone would have pointed out to him that Socrates, Jesus, and Muhammad wrote nothing but left behind huge edifices of thought.
Maybe someone would even have pointed out to him from what activity the Peripatetics derived their name. But alas!
Paul Graham proving himself wrong....
He didn't need to write that article. He is already rich. So why did he do it?
Every day HN is full of articles where people have done some amazingly complex thing, entirely for fun. Then they have written a blog post about it, entirely for fun.
Then we get one of the familiar detachments from reality
> In preindustrial times most people's jobs made them strong. Now if you want to be strong, you work out. So there are still strong people, but only those who choose to be.
Except for all of the people whose jobs still make them strong. Scaffolders, tree surgeons, bricklayers, carpenters, et al.
I need to write a document next week. I have begun to analyse a complex system that ChatGPT will not be aware of. I need to apply my specialism to it, to decide what to do with lots of steps. Writing it will test my understanding of the system and encourage completeness. It will allow others to know what they are expected to do. It will allow a constructive discussion of the choices and reasons. ChatGPT won't help me, except perhaps in layout, rephrasing a sentence, something like that. My job will keep my writing muscles strong. Paul Graham has lost touch with reality.
Which books are best for learning good writing?
Just write - but in seriousness, On Writing Well: The Classic Guide to Writing Nonfiction by William Zinsser and Dennett's Intuition Pumps and Other Tools for Thinking
> And yet writing pervades many jobs, and the more prestigious the job, the more writing it tends to require.
I would say that it's quite the opposite! The more prestigious the job, the more likely the person will have one or more assistants to help them write.
Think of presidents, governors and CEOs. They must *read* much more than they write. Their response can fit in a post-it attached to the paperwork.
The next level also reads more than they write. Instead of a post-it, they will probably come up with bullet points which will be fleshed out by people below them.
The people who *really* have to write stuff are the people at the *bottom* of the hierarchy.
Writing well could be a way to go up the ladder. But it is definitely not required at the top.
What will change, in the future, is that *everyone* will have assistants.
The point was that it was difficult to get into such a position without having learnt to write. I doubt that people in such positions, having already learnt to write, just start dictating to staff and having them completely rewrite it. They would still use those writing skills to communicate with the staff and directly with other correspondents.
One of the best talks on writing and thinking is by Jordan Peterson - the best advice he gives anyone is to write - 5 minute video https://www.youtube.com/watch?v=Kwjw2J6ByJo
Use it, or lose it.
Even smart people completely over-rate “AI” it seems
The threat to individual (and overall) development posed by easy access to shortcuts like ChatGPT provides is not benign. We are seeing very real cases where it's being overused at the expense of quality and understanding in not just academics and resumes, but also in corporate environments (consulting especially). Most people in offices are already refusing to compose their own emails and messages, and soon this will percolate into the lives of children. It is genuinely scary to me what this could bring on in the future.
is there a way to block any PG posts from my feed?
so true
Sorry: while I have a lot of the Graham essays, hopefully all of them, now also including this one, here I nearly fully disagree with Graham:
(1) Due to computer-based word processing and spelling and grammar correction, writing, and good writing, are much easier now than before personal computers (PCs). Indeed, a quip is that typewriters killed off the ink pens, and PCs killed off the typewriters; writing got easier and likely better, not less common. People got a lot more practice.
(2) Email, Internet blog posts, and other communications generate more writing. Can we find some data on total US email volume and compare that with old USPS mail volume, letters to the editor of newspapers, etc.?
(3) Now there is a lot of competition for good writing: At Hacker News, bad writing, especially from bad thinking, gets down voted. On Web sites using Disqus, part of getting voted up is clear, short, maybe just one sentence, maybe sarcastic, clever, and humorous, say, succinct, on the point, and maybe fun, and that means in some respects better writing. Maybe Disqus could tell us how Internet blog post writing volume has increased? A lot?
(4) The media, via the Internet and Web sites, are now much cheaper to produce than old newspapers, magazines, and TV news, and I'd guess that the total of media as writing, or as oral reading of what was written, is much greater than before. For the future, I anticipate many more words per day, i.e., more writing. Uh, has the writing at Hacker News been going down, up, or staying the same? There are Facebook, X, Reddit, Wikipedia. There are sites for narrow interests. Sounds like a lot more writing.
(5) People have smartphones with them nearly all the time; net, that can mean more communication, and writing is less intrusive on the receiver than in-person voice. Whether for good STEM-field communication or just mature socialization, to avoid being misunderstood or offensive, good writing is important.
(6) Sure, now some Google searches result in AI answers, and for some simple questions the AI answers can be a little okay. But, I don't take the AI answers seriously, and the old Google search facility works fine and, also, of high importance, gives the URLs for the search results.
(7) Looking back at my writing, from personal letters to academics, on-line political discussions, etc., I see no way AI could help -- the AI writing is worse, not better.
(8) Since supposedly Taylor Swift is now worth $1.6 billion, there can be increased interest in guitar playing and in the claim that such music is based almost entirely on the four chords I, IV, V, and vi. So, my niece wanted to know, and I wrote her an essay, 6,200 words with several YouTube URLs with pictures and sound. For some music with a lot of chords, I included the URL
https://www.youtube.com/watch?v=UZsqnHhyub0
Sorry, but for my niece I found nothing nearly as good as what I wrote and believe that AI would be a poor substitute.
(9) The world is changing, especially related to writing, at likely a uniquely high rate, and no AI training data can report today what is new tomorrow and needs good writing.
Yet again, Paul Graham states his view that people other than him are incapable of thought. I'm shocked at how frequently he does this. I would be embarrassed to make such an assertion in public even once, let alone repeatedly. While the notion that AI will replace writing is somewhat prescient, the idea that it will replace thought most certainly isn't. Paul Graham is someone who writes to think, therefore he assumes writing is the only way to think. Entirely consistent with his massive and unrelenting ego and simultaneous total lack of empathy or understanding of the differences between people. I shall leave you with this quote of his—the most sneering text I've ever read:
“It's not surprising that conventional-minded people would dislike inequality if independent-mindedness is one of the biggest drivers of it. But it's not simply that they don't want anyone to have what they can't. The conventional-minded literally can't imagine what it's like to have novel ideas. So the whole phenomenon of great variation in performance seems unnatural to them, and when they encounter it they assume it must be due to cheating or to some malign external influence.” - https://paulgraham.com/superlinear.html#f12n
There you have it folks. The genius Paul Graham is one of a select few people with the ability to have ideas, something which those who disagree with him are simply incapable of comprehending.
How is that your takeaway? Please quote the sections from his essay that led you to your conclusion.
My takeaway: people who know how to write, who have trained that muscle, are better at thinking in a structured way and articulating their thoughts. The number of people who know how to write is declining, at least in part due to the advent of GenAI. The number of people who know how to write is still nonzero and is not limited to only Paul Graham.
The people who like to write will still write no matter how many ChatGPTs there are. New people will start writing because they enjoy it and out of curiosity.
But there are a lot of others who never liked to write; they don't need it for their jobs, so why shouldn't they use GPT as a tool, like the promised land of AI / robots?
The same will happen with cooking: people who like to cook will keep cooking traditionally even after our incoming household robots are able to.
From the essay:
> Instead of good writers, ok writers, and people who can't write, there will just be good writers and people who can't write.
> writing is thinking. In fact there's a kind of thinking that can only be done by writing
> So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots. I know which half I want to be in, and I bet you do too.
PG states, clear as day, that he expects the world to be divided into people who can think (him) and people who can't (almost everyone else). When I say Paul Graham imagines only he can think, this is hyperbole. I'm sure there's a small group of people with views very similar to his to whom he would also attribute the ability of thought. I am commenting on the clear and undeniable pattern of PG writing that huge swathes of the population are incapable of thinking.
https://xkcd.com/610/ about sums up my views on his attitude.
You said the same exact thing as parent
Tf