I now doubt I chose a good book for learning about Thomism. It's not that Edward Feser's book isn't good—it may very well be. Rather, when reading Feser's commentary, I feel like I've intruded on a heated exchange between a Red Sox fan and a Yankees fan: an argument that's been going on a lot longer than I've been alive and is about something I don't care much about. In this case, the argument is between philosophers and is about things philosophers care about—non-falsifiable claims science expelled as irrelevant a long time ago. But I've intruded on the exchange between the two fans, and now one of them is telling me I'm mistaken for not caring about the Red Sox and Yankees and I ought to take a side.
Thomists and Catholics, like most people in the world, wish more people listened to them and took them seriously. But to be taken seriously you must first take other viewpoints seriously, if only to understand how to phrase your explanations to people who see things differently. I want a book about Thomism that takes my viewpoint seriously and then explains Thomism's relevance.
The book about Thomism I'd like to read would be written for people who aren't impressed with metaphysical systems just because they possess internal logical consistency—i.e., claims ultimately backed with "you can't prove me wrong." The book would assume a reader who takes a skeptical view towards uses of the word "natural," would start with Gödel and Chaitin and logical incompleteness, and would go from there to show that Thomism still matters even after knowing these things. Does such a book, however unmarketable, exist?
I doubt it, and I won't be the one to write it because I don't understand Thomism. Instead, today I'm posting my ignorance in the form of a CAQ—Craig-Asked Questions—questions I derived from the notes I jotted in the margins beside my main notes from Feser's book. These notes pertain only to the chapter on metaphysics.
Note: Unlike FAQs, CAQs don't have answers, just more befuddlement.
How does one know which potentialities of an object are natural and which are unnatural?
Using Feser's example, rubber balls don't bounce from here to the moon, nor do they move by themselves and follow people menacingly, because they lack the potential to do so—i.e., such potentialities are unnatural. What, then, is the definition of "natural"?
Forms are abstractions, but is matter not an abstraction, too? If it's not an abstraction, what then is matter made of?
This reminds me of my philosophical position on atoms: atoms don't actually exist, but they're useful constructs to keep in mind when reading a chemistry textbook. Ditto for circles with respect to math textbooks.
Are substantial and accidental forms relative?
Using Feser's example, painting a ball a different color causes the ball to lose one accidental form (i.e., non-essential form) and take on another accidental form, but the ball's substantial form of being a ball remains. But perhaps instead of saying we started with a ball that happened to be, say, red, we could say we started with a red thing that happened to be in the shape of a ball. In such a case, the accidental and substantial forms would be flipped. Is this a valid way to think about the metaphysical truth of the universe? If so, are there limits to how relativistic accidental and substantial forms are? Without limits, there exist infinitely many combinations of accidental and substantial forms that may be applied to any one thing.
If substantial and accidental forms aren't relative or are limited relativistically, then what criteria ought we use for determining them?
And how do we justify the criteria themselves? And how do we justify the justification of the criteria? And so on.
What are the final causes of stochastic radiation?
Is "chance event" a valid final cause? If so, how do we know when "chance event" isn't the final cause of something?
If causes happen simultaneously with their effects, then how does motion occur at all?
I suspect I missed something here. According to Feser's example of a brick smashing through a window, the brick pushing into the glass and the glass giving way are simultaneous events—indeed, the same event considered under different descriptions. But cause and effect are used to explain change, and saying that a cause and its effect are simultaneous implies a sort of Zeno's paradox whereby change cannot occur. What did I miss?
Are final causes and privations relative?
This deserves a story. I once remarked to my former coworker Shafik how it bugged me that electrons in electrical circuits flow from negative to positive, all due to Benjamin Franklin's arbitrary 18th-century terminology. Because of Franklin, a positive potential signifies a negative concept: a lack of electrons.
Shafik put me at ease with an idea so simple it frustrates me that I didn't think of it myself: Craig, if the terminology bugs you, then think of a positive potential not as a lack of electrons but rather as a positive desire to obtain electrons.
Only because of my own mental feebleness does this reframing make electrical flow easier for me to understand. [1]
Shafik's advice follows from a relativism heuristic that aids in understanding a lot of math and science: use whichever terminology makes the most sense of what you see. How does this heuristic apply to final causes and privations? For example, maybe the final cause of an eye is to see and cataracts are the result of a privation that hinders the final cause of sight. But maybe instead the final cause of an eye is the development of cataracts and all our early decades of clear sight are a privation of cataracts? Is any one system of terminology more valid than another? If so, what are the criteria for judging the validity of one final cause theory over another?
Why are final causes not tautologies?
(obligatory xkcd reference here)
Feser explicitly claims the notion of final causes is non-tautological, but he doesn't explain why. To Feser, the two statements:
Opium causes sleep because it causes sleep.
and
Opium causes sleep because it has the power to cause sleep.
are inequivalent.
Why are they inequivalent? From the book:
[The second statement says] that opium has a power to cause sleep; that is to say, it tells us that the fact that sleep tends to follow the taking of opium is not an accidental feature of this or that sample of opium, but belongs to the nature of opium as such.
That leads us back to the question of the relativism of accidental and substantial forms and how we judge one accidental-substantial pair as more valid than another. It seems Thomism hinges on a preformed notion of "natural."
What's insufficient about the distinction between context-free and context-specific that makes final causes necessary for understanding the significance of a given causal chain?
Feser gives the example of how bear DNA causes bears to be big and furry but bear DNA doesn't cause bears to be good mascots for football teams. Feser's point is that this implies there's a final cause at work with DNA, and the final cause includes size and furriness but not mascot-worthiness.
But bear DNA does cause bears to be good mascots, just not directly. The issue here is the distinction between context-free and context-specific causalities, not final causes. Bear DNA causes bears to be big and furry regardless of whether humans exist, but whether bear DNA causes them to be good mascots also depends upon (1) humans existing; (2) humans playing football; and (3) humans choosing mascots that are big, furry animals. Biologists don't study mascot-worthiness genes in DNA because such genes require context that transcends the scope of biology—but the genes exist nevertheless.
What's the final cause of a final cause?
And what's the final cause of a final cause's final cause? And so on. How do final causes work at all without leading to an infinite regress? Or are we allowing for infinite regresses?
By the way—and this isn't a question—a decrease in entropic order is an increase in information.
From the book:
…would contradict the second law of thermodynamics, which tells us that order (and thus information content) tends invariably to decrease, not increase, within a closed system.
Not to pick on Feser here, because this is a common misconception: order is a lack of information, and the amount of information in a closed system increases with time as order decreases. As with electrons and the flow of electrical current, many people see this as a backwards way of looking at things. If you're such a person, try thinking of order as a reduction in complexity or a kind of data compression. [2] For example, if you sort your books according to the Dewey Decimal System, then you need only a simple, concise card catalog to describe where any book is; without sorting your books according to some such system, you need more information to describe where any book is.
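To make the card-catalog point concrete, here's a quick sketch of my own (not from Feser's book), using compressed size as a rough stand-in for how much information it takes to describe an arrangement; zlib and the particular byte values are just illustrative choices:

```python
# A rough sketch: compressed size as a crude stand-in for information content.
# An ordered arrangement needs fewer bytes to describe than a disordered one.
import random
import zlib

random.seed(0)
library = bytes(random.randrange(256) for _ in range(10_000))  # "books" in no particular order

shuffled = library                # the unsorted shelves
ordered = bytes(sorted(library))  # the same books shelved by "call number"

print("ordered:  ", len(zlib.compress(ordered)), "bytes to describe")
print("shuffled: ", len(zlib.compress(shuffled)), "bytes to describe")
# The ordered copy compresses to a small fraction of the shuffled one:
# imposing order shrinks the description needed to specify the arrangement,
# which is the sense in which increasing disorder means increasing information.
```

The exact numbers don't matter; the point is only that the ordered copy needs a much shorter description, just as the sorted library needs only a card catalog.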
[1] The "backwards" terms for electrical charges are useful for pointing out that the direction of electrical current is a matter of convention. Inside a battery, for instance, the current is carried by ions (in acid electrolytes, by protons) rather than by electrons, so which way "the" current flows depends on which charge carrier you track.
[2] But don't think of order as lossless data compression if you want to be exact about it, because lossless compression doesn't eliminate information; instead, it squeezes a fixed amount of information into a smaller space.
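To illustrate that caveat, here's another small sketch of my own, with zlib standing in for any lossless compressor: the compressed form is smaller, but decompressing recovers every byte, so no information was eliminated.

```python
# Lossless compression shrinks the representation but discards nothing.
import zlib

original = b"the quick brown fox jumps over the lazy dog " * 200
packed = zlib.compress(original)

assert zlib.decompress(packed) == original  # every byte comes back
print(len(original), "bytes ->", len(packed), "bytes, information intact")
```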