Thursday, September 28, 2006

 

Vagueness in the Nineties


Something I'd like to know a lot more about is the initial reaction to Williamson's defense of epistemicism. It's hard not to get the feeling that, even in the years leading up to the publication of Vagueness in 1994, the idea that bivalence could be seriously defended for vague discourse was regarded as pretty laughable - and this despite well-known attempts to get the view on the table by James Cargile and Roy Sorensen, as early as 1969 and 1988 respectively (not to mention Williamson's own flirtation with the view in his first book, Identity and Discrimination).

Here's a quote that highlights the kind of attitude that seems to have been around at the time, taken from the introduction to Wright's Realism, Meaning & Truth, first published in 1987:

'To suggest that Bivalence is, or should be, the hallmark of realism everywhere is accordingly to be committed to claiming either that there is no such thing as realism about vague discourse, or that the vagueness of a statement, whatever it is held to consist in, is a feature consistent with its possession of a determinate truth-value. Neither suggestion is remotely plausible.' (p4)

I don't think Wright would find the view much more plausible now - but he'd certainly recognise that more work would have to be done to dismiss it.

We find Sainsbury writing in 'Concepts Without Boundaries' (1990):

'Sets have sharp boundaries, or, if you prefer, are sharp objects: for any set, and any object, either the object quite definitely belongs to the set or else it quite definitely does not. Suppose there were a set of things of which "red" is true: it would be the set of red things. However, "red" is vague: there are objects of which it is neither the case that "red" is (definitely) true nor the case that "red" is (definitely) not true. Such an object would neither definitely belong to the set of red things nor definitely fail to belong to this set. But this is impossible, by the very nature of sets. Hence there is no set of red things.

This seems to me as certain as anything in philosophy...'
(p252 in the Keefe and Smith reader. My italics)

The first edition of his book Paradoxes, published in 1987, also defined the phenomenon of vagueness in a way that left no room for epistemicism. By the second edition in 1995, things had shifted:

'I found my earlier discussion of vagueness very unsatisfactory, in the main because it defined vagueness in such a way as to exclude the epistemic theory. I do not accept this theory, but Timothy Williamson has shown me that I am not able to refute a skilful and determined opponent.' (ix)

What happened in the five years between Sainsbury's two quotes? First of all, Williamson published his Joint Session paper 'Vagueness and Ignorance' in 1992, arguing that the most common objections to epistemicism aren't in fact nearly as powerful as people have thought, and that was followed by Vagueness in 1994.

How did people feel when they first realised they were going to have to take this view seriously? It really seems like it must have come as a complete shock to a lot of philosophers working on vagueness at the time.



Comments:
So it's certainly true that Williamson's book provided a huge boost to the debate. But the topic was already receiving a lot of attention in the early nineties - that's when much of the debate about higher-order vagueness (between, for example, Wright, Sainsbury, Heck and Edgington) took place. So a lot of the momentum had already built up - it doesn't all post-date Williamson's contribution.
 


