A wonderful meditation on the role of tools.
In the article, my favorite part is where Holly writes:
Tools mediate our ability to approach the rawness of the world as it is and turn it to our intentions. Genesis 2:15 admonishes us, “The LORD God took the man and put him in the garden of Eden to work it and keep it.” Tools to work — to cultivate — the land are part of our mandate. Many tasks cannot be performed without them. I cannot prune my apple trees without a pruning saw. I cannot sew an outfit for one of my children without needles. Tools, rightly used, are one of the original fruits of obedience.
In another passage, she says:
Using a good tool has the effect of training us at the same time. Repetition teaches our hands, minds, and hearts to focus; to learn more deeply than memory alone. Muscle and sinew record the actions required for the task and gradually become able to repeat it without our conscious effort.
AI as a Cognitive Tool
Reading this piece, I couldn’t help but think about AI tools like ChatGPT and Claude. Even though that’s not what Holly is writing about, AI chatbots are, at the end of the day, tools—and, as things stand, they can extend our capabilities, help us learn, and teach us to become better at what we do.
But at the same time, there’s the temptation to outsource everything. In her essay, Holly uses the example of a bread machine at her house. She explains how the machine took away the need for her to be involved—all she had to do was dump the ingredients in, and bread would magically appear. She hated it, because it stripped away all the relational elements between the tool and its user.
What’s the Cost?
That point hit me deeply, because it connects to how I think about ChatGPT. One of my default mental models for using it is to constantly ask myself: what’s the cost?
Unlike garden shears, a knife, or a pair of scissors, ChatGPT is a cognitive tool. For people in knowledge work, there’s nothing wrong with using it—I completely disagree with the doomers. But perhaps the most important question we can ask ourselves is still: what’s the cost? Because there are no pure choices, only trade-offs, and that applies to generative AI tools as well.
If we don’t ask that question, there’s a real risk of our minds becoming dull contraptions—just like a knife that grows blunt and rusty when left unused.
That’s the tangent Holly’s essay sent me on.
The Future Risk
I keep saying “as things stand today” because, if the progress of AI continues at its current pace—and I’m saying if, because no one knows, though it’s a very real possibility given the hundreds of billions being spent to create this digital god—then there’s a real risk that these tools will get better at doing not just what we do, but the very things that give us meaning.
If that happens, there are two major risks:
First, we’ll not only become redundant, but we’ll forget how to do the things machines now do better.
Second, we’ll be left with a giant, gaping hole where an activity once gave meaning to our lives.
I don’t know how that future will play out, but it’s something I’ve been thinking about deeply.
Hat tip to Hadden Turner