PROJECT WEB DIRECTORY
DOCUMENTS WRITTEN IN THE YEARS 1962-1979 (APPROXIMATELY)
(Some of the documents were inserted under the year they were added to
this directory. See
http://www.cs.bham.ac.uk/research/projects/cogaff/#contents
PAPERS 1962-80 CONTENTS LIST
RETURN TO MAIN COGAFF INDEX FILE
This file is
http://www.cs.bham.ac.uk/research/projects/cogaff/62-80.html
Maintained by Aaron Sloman -- who does
not respond to Facebook requests.
It contains an index to files relevant to the Cognition and
Affect Project's FTP/Web directory produced or published in the years
1962-1980. Some of the papers published in this period were produced
earlier and are included in one of the lists for an earlier period. Some
older papers recently digitised have also been included.
http://www.cs.bham.ac.uk/research/cogaff/0-INDEX.html#contents
A list of PhD and MPhil theses was added in June 2003
This file last updated: 7 Jul 2012 (previously 10 Jun 2012)
JUMP TO DETAILED LIST (After Contents)
Where published:
In: Open Peer Commentary on Shimon Ullman: `Against Direct Perception',
Behavioral and Brain Sciences Journal (BBS) (1980) 3, pp. 401-404.
The whole publication, including commentaries, is:
S. Ullman, Against direct perception,
The Behavioral and Brain Sciences (1980) 3, 373-415
http://dx.doi.org/10.1017/S0140525X0000546X
Abstract:
No abstract in paper. Will add a summary here later.
Compare my more recent discussion of Gibson:
http://tinyurl.com/BhamCog/talks/#talk93
Aaron Sloman, What's vision for, and how does it work? From Marr (and earlier) to Gibson and Beyond,
Online tutorial presentation, Sep, 2011. Also at
http://www.slideshare.net/asloman/
Where published:
Commentary on 'Minds, brains, and programs' by John R. Searle
in The Behavioral and Brain Sciences Journal (BBS) (1980) 3, 417-457
http://dx.doi.org/10.1017/S0140525X00005756
Also http://www.cnbc.cmu.edu/~plaut/MindBrainComputer/papers/Searle80BBS.mindsBrainsPrograms.pdf
This commentary: pages 447-448
Abstract:
Searle's delightfully clear and provocative essay contains a subtle mistake, which is also often made by AI researchers who use familiar mentalistic language to describe their programs. The mistake is a failure to distinguish form from function. That some mechanism or process has properties that would, in a suitable context, enable it to perform some function, does not imply that it already performs that function. For a process to be understanding, or thinking, or whatever, it is not enough that it replicate some of the structure of the processes of understanding, thinking, and so on. It must also fulfil the functions of those processes. This requires it to be causally linked to a larger system in which other states and processes exist. Searle is therefore right to stress causal powers. However, it is not the causal powers of brain cells that we need to consider, but the causal powers of computational processes. The reason the processes he describes do not amount to understanding is not that they are not produced by things with the right causal powers, but that they do not have the right causal powers, since they are not integrated with the right sort of total system.
Title: The primacy of non-communicative language
Author: Aaron Sloman
In The Analysis of Meaning, Proceedings 5,
(Invited talk for ASLIB Informatics Conference, Oxford, March 1979)
Eds M. MacCafferty and K. Gray, pages 1--15,
ASLIB and British Computer Society, London, 1979.
Date: Originally published 1979. Added here 2 Dec 2000
Abstract:
How is it possible for symbols to be used to refer to or describe things? I shall approach this question indirectly by criticising a collection of widely held views of which the central one is that meaning is essentially concerned with communication. A consequence of this view is that anything which could be reasonably described as a language is essentially concerned with communication. I shall try to show that widely known facts, for instance facts about the behaviour of animals, and facts about human language learning and use, suggest that this belief, and closely related assumptions (see A1 to A3, in the paper) are false. Support for an alternative framework of assumptions is beginning to emerge from work in Artificial Intelligence, work concerned not only with language but also with perception, learning, problem-solving and other mental processes. The subject has not yet matured sufficiently for the new paradigm to be clearly articulated. The aim of this paper is to help to formulate a new framework of assumptions, synthesising ideas from Artificial Intelligence and Philosophy of Science and Mathematics.
Title: THE COMPUTER REVOLUTION IN PHILOSOPHY:
Philosophy, science and models of mind.
Author: Aaron Sloman
Originally published by Harvester Press and Humanities Press in 1978, but long since out of print. Now available online free of charge.
Also an e-book at Amazon.
Abstract:
See the contents list.
I have discovered that a review of 'The Computer Revolution in Philosophy' by Douglas Hofstadter is available online here.
BULLETIN (New Series) OF THE AMERICAN MATHEMATICAL SOCIETY
Volume 2, Number 2, March 1980
Copyright 1980 American Mathematical Society
The computer revolution in philosophy: Philosophy, science and models of mind
by Aaron Sloman, Harvester Studies in Cognitive Science, Humanities Press, Atlantic Highlands, N. J., 1978, xvi + 304 pp., cloth, $22.50.
Reviewed by Douglas R. Hofstadter
(The review rightly criticises some of the unnecessarily aggressive tone and throw-away remarks, but also gives the most thorough assessment of the main ideas of the book that I have seen.
Like many in AI he regards the philosophy of science in the first part of the book, e.g. Chapter 2, as relatively uninteresting, whereas I think understanding those issues is central to understanding how human minds work as they learn more about the world and themselves.)
Abstract:
Commentary on three articles published in Behavioral and Brain Sciences Journal 1978, 1 (4):
1. Premack, D., Woodruff, G. Does the chimpanzee have a theory of mind? BBS 1978 1 (4): 515.
2. Griffin, D.R. Prospects for a cognitive ethology. BBS 1978 1 (4): 527.
3. Savage-Rumbaugh, E.S., Rumbaugh, D.R., Boysen, S. Linguistically-mediated tool use and exchange by chimpanzees (Pan troglodytes). BBS 1978 1 (4): 539.
Despite the virtues of the target articles, I find something sadly lacking: an awareness of deep problems and a search for deep explanations.
Are the authors of these papers merely concerned to collect facts? Clearly not: they are also deeply concerned to learn the extent of man's uniqueness in the animal world, to refute behaviourism, and to replace anecdote with experimental rigour. But what do they have to say to someone who doesn't care whether humans are unique, who believes that behaviourism is either an irrefutable collection of tautologies or a dead horse, and who already is deeply impressed by the abilities of cats, dogs, chimps, and other animals, but who constantly wonders: HOW DO THEY DO IT?
My answer is that the papers do not have much to say about that: for that, investigation of designs for working systems is required, rather than endless collection of empirical facts, interesting as those may be.
Where published:
in Proceedings AISB/GI Conference, 18-20th July 1978,
Hamburg, Germany
Programme Chair: Derek Sleeman
Program Committee: Alan Bundy (Edinburgh), Steve Hardy (Sussex), H.-H. Nagel (Hamburg), Jacques Pitrat (Paris), Derek Sleeman (Leeds), Yorick Wilks (Essex)
General chair: K.-H. Nagel
Published by: SSAISB and GI
Abstract:
(Extract from text)
Vision work in AI has made progress with relatively small problems. We are not aware of any system in which many different kinds of knowledge co-operate. Often there is essentially one kind of structure, e.g. a network of lines or regions, and the problem is simply to segment it, and/or to label parts of it. Sometimes models of known objects are used to guide the analysis and interpretation of an image, as in the work of Roberts (1965), but usually there are few such models, and there isn't a very deep hierarchy of objects composed of objects composed of objects....
By contrast, recent speech understanding systems, like HEARSAY (Lesser 1977, Hayes-Roth 1977), deal with more complex kinds of interactions between different sorts of knowledge. They are still not very impressive compared with people, but there are some solid achievements. Is the lack of similar success in vision due to inherently more difficult problems?
Some vision work has explored interactions between different kinds of knowledge, including the Essex coding-sheet project (Brady, Bornat 1976) based on the assumption that provision for multiple co-existing processes would make the tasks much easier. However, more concrete and specific ideas are required for sensible control of a complex system, and a great deal of domain-specific descriptive know-how has to be explicitly provided for many different sub-domains.
The POPEYE project is an attempt to study ways of putting different kinds of visual knowledge together in one system.
NOTE:
Chapter 9 of The Computer Revolution in Philosophy provides further information about the Popeye system.
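The extract above describes an organisation in which several kinds of visual knowledge co-operate on a shared interpretation. The Python sketch below is purely illustrative (the levels, names and control regime are invented here, not taken from POPEYE or HEARSAY): a few "knowledge sources" post hypotheses to a shared pool, each building on what the others have contributed.

```python
# Minimal, hypothetical sketch of a blackboard-style arrangement in which
# several knowledge sources contribute hypotheses at different levels.
# Not taken from the POPEYE code; purely for illustration.

from dataclasses import dataclass, field

@dataclass
class Blackboard:
    """Shared pool of hypotheses, keyed by level ('edges', 'lines', 'letters', ...)."""
    hypotheses: dict = field(default_factory=dict)

    def post(self, level, item):
        self.hypotheses.setdefault(level, []).append(item)

def edge_source(bb):
    # Lowest level: pretend some edge fragments have been found in the image.
    for fragment in ["e1", "e2", "e3"]:
        bb.post("edges", fragment)

def line_source(bb):
    # Groups edge fragments into candidate line segments.
    edges = bb.hypotheses.get("edges", [])
    if len(edges) >= 2:
        bb.post("lines", tuple(edges[:2]))

def letter_source(bb):
    # Uses stored models (here, a trivial one) to interpret line groups as letters.
    if bb.hypotheses.get("lines"):
        bb.post("letters", "A?")

if __name__ == "__main__":
    bb = Blackboard()
    for source in (edge_source, line_source, letter_source):
        source(bb)   # in a real system these would run under far richer control,
                     # with knowledge deciding what to try next and when
    print(bb.hypotheses)
```

In a serious system the interesting problems are exactly the ones the sketch omits: deciding which knowledge source to invoke next, and supplying the domain-specific descriptive know-how each level needs.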
Commentary on Z. Pylyshyn:
Computational models and empirical constraints
Behavioral and Brain Sciences Vol 1 Issue 1 March 1978, pp 91-99.
This commentary: pp 115-116
Abstract:
Title: Physicalism and the Bogey of Determinism
Abstract:
This paper rehearses some relatively old arguments about how
any coherent notion of free will is not only compatible with
but depends on determinism.
Some of the ideas that were in the paper and in my responses to
commentators were also presented in
The
Computer Revolution in Philosophy, including a version of
this diagram (originally pages
344-345, in the discussion section below),
discussed in more detail in
Chapter 6 of the book, and later elaborated as an architectural
theory assuming concurrent reactive, deliberative and metamanagement
processes, e.g. as explained in this 1999 paper
Architecture-Based Conceptions of Mind, and later papers.
A slightly revised version (with clearer diagrams) was published as
Chapter 8 of the 1978 book:
The Computer Revolution in Philosophy
Date:
Published/Presented 1974, installed here 3 Jan 2010.
Abstract:
(Extracts from paper)
In order to close a loophole in Shorter's argument I
describe a possible situation in which both physical continuity
and bodily identity are clearly separated from personal identity.
Moreover, the example does not, as Shorter's apparently does, assume the
falsity of current physical theory.
It will be a long time before engineers make a machine which will
not merely copy a tape recording of a symphony, but also correct
poor intonation, wrong notes, or unmusical phrasing. An entirely new
dimension of understanding of what is being copied is required for
this. Similarly, it may take a further thousand years, or more,
before the transcriptor is modified so that when a human body is
copied the cancerous or other diseased cells are left out and
replaced with normal healthy cells. If, by then, the survival rate
for bodies made by this modified machine were much greater than for
bodies from which tumours had been removed surgically, or treated
with drugs, then I should have little hesitation, after being
diagnosed as having incurable cancer, in agreeing to have my old
body replaced by a new healthy one, and the old one destroyed before
recovering from the anaesthetic. This would be no suicide, nor
murder.
Title: Interactions between Philosophy and Artificial Intelligence:
The role of intuition and non-logical reasoning in intelligence,
Originally published in Proceedings IJCAI 1971; reprinted in Artificial Intelligence, vol 2, 1971 (see the full entry under 1971 below).
This was later revised as
Chapter 7
of
The Computer Revolution in Philosophy
(1978)
Abstract:
There were several sequels to this paper including
the Afterthoughts paper
written in 1975, some further developments regarding ontologies and
criteria for adequacy
in a 1984-5 paper and several other papers
mentioned in
the section on
diagrammatic/visual reasoning here.
Related recent work includes these presentations:
Title: Tarski, Frege and the Liar Paradox
Abstract:
The paper suggests that this view of paradoxes, including the paradox of
the Liar, is superior to Tarski's analysis which required postulating a
hierarchy of meta-languages. We do not need such a hierarchy to explain
what is going on or to deal with the fact that such paradoxes exist.
Moreover, the hierarchy would not necessarily be useful for an
intelligent agent, compared with languages that contain their own
meta-language, like the one I am now using.
Abstract:
This is a sequel to the 1969 paper on
"How to derive 'Better' from 'Is'"
also online at this web site. It presupposes the analysis of 'better' in
the earlier paper, and argues that statements using the word 'ought' say
something about which of a collection of alternatives is better than the
others, in contrast with statements using 'must' or referring to
'obligations', or what is 'obligatory'. The underlying commonality
between superficially different statements like 'You should take an
umbrella with you' and 'The sun should come out soon' is explained,
along with some other philosophical puzzles, e.g. concerning why
'ought' does not imply 'can', contrary to what some philosophers have
claimed.
Curiously, the 'Ought' and 'Better' paper is mentioned at
http://semantics-online.org/blog/2005/08/
in the section on David Lodge's novel "Thinks...", which includes a
reference to
this paper
'What to Do If You Want to Go to Harlem:
Anankastic Conditionals and Related Matters' by
Kai von Fintel and Sabine Iatridou (MIT), which includes a discussion of
the paper on 'Ought' and 'Better'.
Abstract:
(extracts from paper)
In his book Speech Acts (Cambridge University Press, 1969),
Searle discusses what he calls 'the speech act fallacy' (pp. 136 ff.),
namely the fallacy of inferring, from the fact that (1) in simple
indicative sentences, the word W is used to perform some speech-act A
(e.g. 'good' is used to commend, 'true' is used to endorse or concede,
etc.), the conclusion that (2) a complete philosophical explication of
the concept W is given when we say 'W is used to perform A'.
The paper argues
that even if conclusion (2) is false, Searle's argument against it is
inadequate because he does not consider all the possible ways in which a
speech-act might account for non-indicative occurrences. In
particular, there are other things we can do with speech acts besides
performing them and predicating their performance, e.g. besides
promising and expressing the proposition that one is promising. E.g. you
can indicate that you are considering performing act F but are not yet
prepared to perform it, as in 'I don't promise to come'.
So the analysis proposed can be summarised thus:
If F and G are speech acts, and p and q propositional contents or other
suitable objects, then:
Abstract:
There are well-known objections to both approaches, and the aim of this
paper is to suggest an alternative which has apparently never previously
been considered, for the very good reason that at first sight it looks
so unpromising, namely the alternative of defining the problematic words
as logical constants.
This should not be confused with the programme of treating them as
undefined symbols in a formal system, which is not new. In this essay an
attempt will be made to define a logical constant "Better"
which has surprisingly many of the features of the ordinary word
"better" in a large number of contexts. It can then be shown
that other important uses of "better" may be thought of as
derived from this use of the word as a logical constant.
The new symbol is a logical constant in that its definition (i.e., the
specification of formation rules and truth-conditions for statements
using it) makes use only of such concepts as "entailment,"
"satisfying a condition," "relation," "set of
properties," which would generally be regarded as purely logical
concepts. In particular, the definition makes no reference to wants,
desires, purposes, interests, prescriptions, choice, non-descriptive
uses
of language, and the other paraphernalia of non-naturalistic (and some
naturalistic) analyses of evaluative words.
(However, some of those 'paraphernalia' can be included in
arguments/subjects to which the complex relational predicate
'better' is applied.)
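As a generic illustration of what "defining a logical constant by formation rules and truth-conditions" amounts to (a standard textbook example, not the paper's definition of "Better"), consider how a familiar connective can be introduced:

```latex
% Standard illustration, not taken from the paper.
% Formation rule: if $p$ and $q$ are well-formed statements, so is $(p \land q)$.
% Truth-condition:
\[
  (p \land q)\ \text{is true} \iff p\ \text{is true and } q\ \text{is true}.
\]
```

On the paper's account, "Better" is introduced in the same spirit, but its formation rules and truth-conditions are stated in terms of entailment, satisfying a condition, relations and sets of properties.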
Abstract: (From the introductory section)
I. An adequate theory of meaning and truth must account for the following
facts, whose explanation is the topic, though not the aim, of the paper.
(i) Different signs (e.g., in different languages) may express the same
proposition.
(ii) The syntactic and semantic rules in virtue of which sentences are
able to express contingent propositions also permit the expression of
necessary propositions and generate necessary
relations between contingent propositions.
E.g. although 'It snows in Sydney or it does not snow in Sydney' can be
verified empirically (since showing one disjunct to be true would be an
empirical
verification, just as a proposition of the form 'p and not-p'
can be falsified empirically), the empirical enquiry can be
short-circuited by showing what the result must be.
(iii) At least some such restrictions on truth-values, or combinations
of truth-values (e.g., when two or more contingent propositions are
logically equivalent, or inconsistent, or
when one follows from others),
result from purely formal, or logical, or topic-neutral features of the
construction of the relevant propositions, features which have nothing
to do with precisely which concepts occur, or which objects are referred
to. Hence we call some propositions logically true, or logically
false,
and say some inferences are
valid in virtue of their logical form, which
prevents simultaneous truth of premisses and falsity of conclusion.
(iv) The truth-value-restricting logical forms are systematically
inter-related so that the whole infinite class of such forms can be
recursively generated from a relatively small subset, as illustrated in
axiomatisations of logic.
Subsequent discussion will show these statements to be over-simple.
Nevertheless, they will serve to draw attention to the range of facts
whose need of explanation is the starting point of this paper. They have
deliberately been formulated to allow that there may be cases of
non-logical necessity.
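Point (iv) can be illustrated with a standard axiomatisation (not drawn from the paper): Łukasiewicz's three axiom schemata, together with modus ponens, generate every logically true form of propositional logic built from negation and the conditional.

```latex
% Standard Lukasiewicz axiom schemata for propositional logic (illustrative only):
\[
\begin{aligned}
&\text{A1: } p \to (q \to p)\\
&\text{A2: } (p \to (q \to r)) \to ((p \to q) \to (p \to r))\\
&\text{A3: } (\neg p \to \neg q) \to (q \to p)
\end{aligned}
\]
% Rule of inference (modus ponens): from $p$ and $p \to q$, infer $q$.
```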
Available in two formats:
A summary of the meeting
by E. J. Lemmon, M. A. E. Dummett, and J. N. Crossley
with abstracts of papers
presented, including this one, was published in
The Journal of Symbolic Logic,
Vol. 28, No. 3. (Sep., 1963), pp. 262-272.
accessible
online here.
The full paper was published in the conference proceedings:
This paper extends Frege's concept of a function to "rogators",
which are like functions in that they take arguments and produce
results, but are unlike functions in that their results can depend
on the state of the world, in addition to which arguments they are
applied to.
Abstract (actually the opening paragraph of the paper):
Date Installed: 6 Jan 2010; Published 1964
Where published:
Abstract: (Opening paragraph)
The chapters have been copied here for
ease of
access, along with more detailed information about the contents.
The PDF files can also be obtained via this 'permanent ID'
Abstract:
Some of the ideas developed here were expanded in
(Via LaTeX: derived from a scanned version)
Title: Afterthoughts on Analogical Representations (1975)
Originally Published in
Theoretical Issues in Natural Language Processing (TINLAP-1),
Eds. R. Schank & B. Nash-Webber,
pp. 431--439,
MIT,
Author: Aaron Sloman
Now available online
http://acl.ldc.upenn.edu/T/T75/
Reprinted in
Readings in knowledge representation,
Eds. R.J. Brachman & H.J. Levesque,
Morgan Kaufmann,
1985.
Date installed: 28 Mar 2005
In 1971 I wrote
a paper
attempting to relate some old philosophical
issues about representation and reasoning to problems in Artificial
Intelligence. A major theme of the paper was the importance of
distinguishing ``analogical'' from ``Fregean'' representations. I still
think the distinction is important, though perhaps not as important for
current problems in A.I. as I used to think. In this paper I'll try to
explain why.
1974
(A more complete, PDF version, derived from the html version.)
Author: Aaron Sloman
Date: Published 1974, installed here 29 Dec 2005
Presented at an interdisciplinary conference on Philosophy of
Psychology at the University of Kent in 1971. Published in
the proceedings, as
A. Sloman,
'Physicalism and the Bogey of Determinism'
(along with Reply by G. Mandler and W. Kessen, and additional comments
by Alan R. White, Philippa Foot and others, and
replies to criticisms)
in
Philosophy of Psychology,
Ed S.C.Brown,
London: Macmillan, 1974, pages 293--304.
(Published by Barnes & Noble in USA.)
Commentary and discussion followed on
pages 305--348.
However the mind-brain identity theory is attacked on the grounds
that what makes a physical event an intended action A is that the
agent interprets the physical phenomena as doing A. The paper
should have referred to the monograph
Intention (1957) by Elizabeth Anscombe
(summarised
here by Jeff Speaks),
which discusses in detail the fact that the same physical event
can have multiple (true) descriptions, using different ontologies.
My point is partly analogous to
Dennett's
appeal to the 'intentional stance', though that
involves an external observer attributing rationality along
with beliefs and desires to the agent. I am adopting the
design stance not the intentional stance, for I do not assume
rationality in agents with semantic competence (e.g. insects), and
I attempt to explain
how an agent has to be designed in order to perform intentional
actions; the design must allow the agent to interpret physical
events (including events in its brain) in a way that is not just
perceiving their physical properties. That presupposes semantic
competence which is to be explained in terms of how the machine
or organism works, i.e. using the design stance, not
by simply postulating rationality and assuming beliefs and desires
on the basis of external evidence.
The html paper preserves original page divisions.
(I may later add further notes
and comments to this HTML version.)
Note added 3 May 2006
An online review of the whole book is available
here.
by Marius Schneider, O. F. M.,
The Catholic University of America, Washington, D. C., apparently
written in 1975.
Title: On learning about numbers: Some problems and speculations
In
Proceedings AISB Conference 1974, University of Sussex,
pp. 173--185,
Author: Aaron Sloman
The aim of this paper is methodological and tutorial.
It uses elementary number competence to show how reflection on the
fine structure of familiar human abilities generates requirements
exposing the inadequacy of initially plausible explanations.
We have to learn how to organise our common sense knowledge and
make it explicit, and we don't need experimental data as much as
we need to extend our model-building know-how.
1973
1972
1971
Title: New Bodies for Sick Persons: Personal Identity Without Physical Continuity
Author: Aaron Sloman
First published in Analysis, vol 32, No 2, December 1971, pages 52--55.
Date Installed:
9 Jan 2007 (Originally Published 1971)
In his recent Aristotelian Society paper ('Personal identity, personal
relationships, and criteria' in
Proceedings of the Aristotelian Society,
1970-71, pp. 165--186), J. M. Shorter argues that the connexion
between physical identity and personal identity is much less tight than
some philosophers have supposed, and, in order to drive a wedge between
the two sorts of identity, he discusses logically possible situations
in which there would be strong moral and practical reasons for treating
physically discontinuous individuals as the same person. I am sure his
main points are correct: the concept of a person serves a certain sort
of purpose and in changed circumstances it might be able to serve
that purpose only if very different, or partially different, criteria
for identity were employed. Moreover, in really bizarre, but "logically"
possible, situations there may be no way of altering the
identity-criteria, nor any other feature of the concept of
person, so as to enable the concept to have the same moral, legal,
political and other functions as before: the concept may simply
disintegrate, so that the question 'Is X really the same person as Y or
not?' has no answer at all. For instance, this might be the case if
bodily discontinuities and reduplications occurred very frequently.
To suppose that the "essence" of the
concept of a person, or some set of
general logical principles, ensures that questions of identity always
have answers in all possible circumstances, is quite unjustified.
(with full list of references -- added June 2006)
Author:
Aaron Sloman
Proceedings IJCAI 1971
Date added: 12 May 2004
(Proceedings also available here),
then reprinted in
Artificial Intelligence, vol 2, 1971,
http://dx.doi.org/10.1016/0004-3702(71)90011-7
then in
J.M. Nicholas, ed.
Images, Perception, and Knowledge,
Dordrecht-Holland: Reidel, 1977.
This paper echoes, from a philosophical standpoint, the claim of
McCarthy and Hayes that Philosophy and Artificial Intelligence have
important relations. Philosophical problems about the use of 'intuition'
in reasoning are related, via a concept of analogical representation,
to problems in the simulation of perception, problem-solving and the
generation of useful sets of possibilities in considering how to act.
The requirements for intelligent decision-making proposed by McCarthy
and Hayes in
Some Philosophical
Problems from the Standpoint of Artificial Intelligence (1969)
are criticised as too narrow, because they allowed for the use of only
one formalism, namely logic. Instead general requirements are suggested
showing the usefulness of other forms of representation.
Originally in Philosophy, Vol XLVI, pages 133-147, 1971
Author: Aaron Sloman
Date installed: 16 Oct 2003
The paper attempts to resolve a variety of logical and semantic
paradoxes on the
basis of Frege's ideas about compositional semantics: i.e. complex
expressions have a reference that depends on the references of the
component parts and the mode of composition, which determines a function
from the lowest level components to the value for the whole expression.
The paper attempts to show that it is inevitable within this framework
that some syntactically well formed expressions will fail to have any
reference, even though they may have a well defined sense.
This can be compared with the ways in which syntactically well-formed
programs in programming languages may fail to terminate or in some other
way fail semantically and produce run-time errors.
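A minimal illustration of the programming analogy drawn in the last sentence (the example is mine, not from the paper): the program below is syntactically well formed, yet for one argument the expression it computes has no value.

```python
# Illustrative only: a well-formed expression can lack a value ("reference")
# even though all the construction rules have been respected.

def reciprocal(x):
    """Syntactically fine, but the expression 1/x has no value for x = 0."""
    return 1 / x

print(reciprocal(4))          # 0.25: here the expression does have a value
try:
    print(reciprocal(0))      # well formed, but evaluation fails at run time
except ZeroDivisionError:
    print("no value: the expression fails to refer")
```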
1970
Title: 'Ought' and 'Better'
Author: Aaron Sloman
Date Installed: 19 Sep 2005
Originally published as
Aaron Sloman, 'Ought and Better'
Mind, vol LXXIX, No 315, July 1970, pp 385--394.
1969
Title: Transformations of Illocutionary Acts (1969)
Author: Aaron Sloman
First published in
Analysis Vol 30 No 2, December 1969 pages 56-59
Date Installed:
10 Jan 2007
This paper discusses varieties of negation and other logical
operators when applied to speech acts, in response to an argument by
John Searle, who discusses 'the speech act fallacy': the fallacy of inferring from the fact that
(1) in simple indicative sentences, the word W is used to perform
some speech-act A (e.g. 'good' is used to commend, 'true' is used to
endorse or concede, etc.)
the conclusion that
(2) a complete philosophical explication of the concept W is given
when we say 'W is used to perform A'.
He argues that as far as the words 'good', 'true', 'know' and 'probably'
are concerned, the conclusion is false because the speech-act analysis
fails to explain how the words can occur with the same meaning in
various grammatically different contexts, such as interrogatives ('Is it
good?'), conditionals ('If it is good it will last long'), imperatives
('Make it good'), negations, disjunctions, etc.
o Utterances of the structure
'If F(p) then G(q)' express provisional commitment to performing G on q,
pending the performance of F on p.
o Utterances of the form 'F(p) or G(q)' would express a commitment to
performing (eventually) one or other or both of the two acts, though
neither is performed as yet.
o The question mark, in utterances of the form 'F(p)?', instead of
expressing some new and completely unrelated kind of speech act, would
merely express indecision concerning whether to perform F on p, together
with an attempt to get advice or help in resolving the indecision.
o The imperative form 'Bring it about that . .' followed by a suitable
grammatical transformation of F(p) would express the act of trying to
get (not cause) the hearer to bring about that particular state of
affairs in which the speaker would perform the act F on p (which is not
the same as simply bringing it about that the speaker performs the act).
It is not claimed that 'not', 'if', etc., always are actually used in
accordance with the above analyses, merely that this is a possible type
of analysis which (a) allows a word which in simple indicative sentences
expresses a speech act to contribute in a uniform way to the meanings of
other types of sentences and (b) allows signs like 'not', 'if', the
question construction, and the imperative construction, to have uniform
effects on signs for speech acts. This type of analysis differs from the
two considered and rejected by Searle. Further, if one puts either
assertion or commendation or endorsement
in place of the speech acts F and G in the above schemata, then the
results seem to correspond moderately well with some (though not
all) actual uses of the words and constructions in question. With
other speech acts, the result does not seem to correspond to
anything in ordinary usage: for instance, there is nothing in
ordinary English which corresponds to applying the imperative
construction to the speech act of questioning, or even commanding,
even though if this were done in accordance with the above schematic
rules the result would in theory be intelligible.
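A small, purely illustrative sketch of the uniformity being claimed (the representation and glosses below are invented here, not taken from the 1969 paper): the same constructions are applied to different speech acts and contribute to the meaning in the same schematic way.

```python
# Illustrative only: compound constructions applied uniformly to speech acts,
# roughly following the schemata summarised above.

from dataclasses import dataclass

@dataclass
class Act:
    force: str       # e.g. "promise", "commend", "assert"
    content: str     # the propositional content p

@dataclass
class Conditional:
    antecedent: Act  # F(p)
    consequent: Act  # G(q)

    def gloss(self):
        return (f"provisional commitment to {self.consequent.force} "
                f"'{self.consequent.content}', pending the performance of "
                f"{self.antecedent.force} on '{self.antecedent.content}'")

@dataclass
class Question:
    act: Act         # F(p)?

    def gloss(self):
        return (f"indecision about whether to {self.act.force} "
                f"'{self.act.content}', plus a request for help in deciding")

if __name__ == "__main__":
    promise = Act("promise", "I will come to the party")
    commend = Act("commend", "the host's cooking")
    print(Conditional(promise, commend).gloss())
    print(Question(promise).gloss())
```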
Title: How to derive "better" from "is",
Author: Aaron Sloman
Originally Published as:
A. Sloman
How to derive "better" from "is"
American Philosophical Quarterly,
Vol 6, Number 1, Jan 1969,
pp 43--52.
Date Installed: 23 Oct 2002
One type of naturalistic analysis of words like "good,"
"ought," and "better" defines them in terms of
criteria for applicability which vary from one context to another (as in
"good men," "good typewriter," "good method of
proof"), so that their meanings vary with context. Dissatisfaction
with this "crude" naturalism leads some philosophers to
suggest that the words have a context-independent non-descriptive
meaning defined in terms of such things as expressing emotions,
commanding, persuading, or guiding actions.
1968
Title: Explaining Logical Necessity
Author: Aaron Sloman
Date Installed: 4 Dec 2007 (Published originally in 1968);
Updated 19 Dec 2009
in
Proceedings of the Aristotelian Society,
1968/9,
Volume 69,
pp 33--50.
Summary:
I: Some facts about logical necessity stated.
II: Not all necessity is logical.
III: The need for an explanation.
IV: Formalists attempt unsuccessfully to reduce logic to syntax.
V: The no-sense theory of Wittgenstein's Tractatus merely reformulates
the problem.
VI: Crude conventionalism is circular.
VII: Extreme conventionalism is more sophisticated.
VIII: It yields some important insights.
IX: But it ignores the variety of kinds of proof.
X: Proofs show why things must be so, but different proofs show
different things. Hence there can be no
general explanation of necessity.
1967
1966
1965
Title: Functions and Rogators (1965)
Author: Aaron Sloman
Date Installed: 23 Dec 2007
This paper was originally presented at a meeting of the Association for
Symbolic Logic held in St. Anne's College, Oxford, England from 15-19
July 1963 as a NATO Advanced Study Institute with a Symposium on
Recursive Functions sponsored by the Division of Logic, Methodology and
Philosophy of Science of the International Union of the History and
Philosophy of Science.
Aaron Sloman, 'Functions and Rogators', in
Formal Systems and Recursive Functions:
Proceedings of the Eighth Logic Colloquium, Oxford, July 1963,
Eds J. N. Crossley and M. A. E. Dummett,
North-Holland Publishing Co (1965), pp. 156--175.
It was scanned in and digitised in December 2007.
(This paper was described by David Wiggins as 'neglected but valuable' in his
'Sameness and Substance Renewed' (2001).)
(A summary of the meeting, by E. J. Lemmon, M. A. E. Dummett, and J. N. Crossley,
was published in 1963; see the entry under 1963 below.)
Abstract:
Frege, and others, have made extensive use of the notion of a
function, for example in analysing the role of quantification, the
notion of a function being defined, usually, in the manner familiar to
mathematicians, and illustrated with mathematical examples. On this view
functions satisfy extensional criteria for identity. It is not usually
noticed that in non-mathematical contexts the things which are thought
of as analogous to functions are, in certain respects, unlike the
functions of mathematics. These differences provide a reason for saying
that there are entities, analogous to functions, but which do not
satisfy extensional criteria for identity. For example, if we take the
supposed function 'x is red' and consider its value (truth or falsity)
for some such argument as the lamp post nearest my front door, then we
see that what the value is depends not only on which object is taken as
argument, and the 'function', but also on contingent facts about the
object, in particular, what colour it happens to have. Even if the lamp
post is red (and the value is truth), the same lamp post might have been
green, if it had been painted differently. So it looks as if we need
something like a function, but not extensional, of which we can say that
it might have had a value different from that which it does have. We
cannot say this of a function considered simply as a set of ordered
pairs, for if the same argument had had a different value it would not
have been the same function. These non-extensional entities are
described as 'rogators', and the paper is concerned to explain what the
function-rogator distinction is, how it differs from certain other
distinctions, and to illustrate its importance in logic, from the
philosophical point of view.
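A small sketch of the distinction in modern terms (the code is illustrative only; 'rogator' is the paper's term, the representation is mine): a mathematical function is fixed by its argument-value pairs, whereas a rogator-like entity also depends on contingent facts about the world, so the same argument can yield different values in different possible worlds.

```python
# Illustrative only: an extensional function versus a rogator-like entity
# whose value depends on how the world happens to be.

def successor(n):
    # Extensional: the value depends only on the argument.
    return n + 1

def is_red(obj, world):
    # Rogator-like: the value depends on the argument AND on contingent facts.
    return world.get(obj) == "red"

world_as_it_is = {"lamp_post": "red"}
world_as_it_might_have_been = {"lamp_post": "green"}

print(successor(3))                                      # 4 in every world
print(is_red("lamp_post", world_as_it_is))               # True
print(is_red("lamp_post", world_as_it_might_have_been))  # False: same argument, different value
```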
Title: 'NECESSARY', 'A PRIORI' AND 'ANALYTIC'
Author:
Aaron Sloman
Date Installed:
9 Jan 2007 (Published 1965)
First published in Analysis vol 26, No 1, pp 12-16
1965.
Abstract (actually the opening paragraph of the paper):
It is frequently taken for granted, both by people discussing logical
distinctions and by people using them, that the terms 'necessary',
'a priori', and 'analytic' are equivalent, that they mark not three
distinctions, but one. Occasionally an attempt is made to establish
that two or more of these terms are equivalent. However, it seems to me far
from obvious that they are or can be shown to be equivalent, that
they cannot be given definitions which enable them to mark important and
different distinctions. Whether these different distinctions happen to
coincide or not is, as I shall show, a further question, requiring
detailed investigation. In this paper, an attempt will be made to show
in a brief and schematic way that there is an open problem here and
that it is extremely misleading to talk as if there were only one
distinction.
1964
Title: Rules of inference, or suppressed premisses? (1964)
Author:
Aaron Sloman
Date Installed:
31 Dec 2006
First published in Mind Volume LXXIII, Number 289 Pp. 84-96,
1964.
In ordinary discourse we often use or accept as valid, arguments of the
form "P, so Q", or "P, therefore Q", or "Q, because P" where the
validity of the inference from P to Q is not merely logical: the
statement of the form "If P then Q" is not a logical truth, even if it
is true. Inductive inferences and inferences made in the course of moral
arguments provide illustrations of this. Philosophers, concerned about
the justification for such reasoning, have recently debated whether the
validity of these inferences depends on special rules of inference which
are not merely logical rules, or on suppressed premisses which, when
added to the explicit premisses, yield an argument in which the
inference is logically, that is deductively, valid. In a contribution to
MIND ("Rules of Inference in Moral Reasoning", July 1961), Nelson Pike
describes such a debate concerning the nature of moral reasoning. Hare
claims that certain moral arguments involve suppressed deductive
premisses, whereas Toulmin analyses them in terms of special rules
of
inference, peculiar to the discourse of morality. Pike concludes that
the main points so far made on either side of the dispute are "quite
ineffective" (p. 391), and suggests that the problem itself is to blame,
since the reasoning of the "ordinary moralist" is too rough and ready
for fine logical distinctions to apply (pp. 398-399). In this paper an
attempt will be made to take his discussion still further and explain in
more detail why arguments in favour of either rules of inference or
suppressed premisses must be ineffective. It appears that the root of
the trouble has nothing to do with moral reasoning specifically, but
arises out of a general temptation to apply to meaningful discourse a
distinction which makes sense only in connection with purely formal
calculi.
Title: Colour Incompatibilities and Analyticity
Author: Aaron Sloman
Analysis, Vol. 24, Supplement 2. (Jan., 1964), pp. 104-119.
The debate about the possibility of synthetic necessary truths is an
old and familiar one. The question may be discussed either in a
general way, or with reference to specific examples. This essay is
concerned with the specific controversy concerning the
incompatibility of colours, or colour concepts, or colour words. The
essay is mainly negative: I shall neither assume, nor try to prove,
that colours are incompatible, or that their incompatibility is
either analytic or synthetic, but only that certain more or less
familiar arguments intended to show that incompatibility relations
between colours are analytic fail to do so. It will follow from
this that attempts to generalise these arguments to show that no
necessary truths can be synthetic will be unsuccessful, unless they
bring in quite new sorts of considerations. The essay does, however,
have a positive purpose, namely the partial clarification of some of
the concepts employed by philosophers who discuss this sort of
question, concepts such as 'analytic' and 'true in virtue of
linguistic rules'. Such clarification is desirable since it is often
not at all clear what such philosophers think that they have
established, since the usage of these terms by philosophers is often
so loose and divergent that disagreements may be based on partial
misunderstanding. The trouble has a three-fold source: the meaning
of 'analytic' is unclear, the meaning of 'necessary' is unclear, and
it is not always clear what these terms are supposed to be applied
to. (E.g. are they sentences, statements, propositions, truths,
knowledge, ways of knowing, or what?) Not all of these confusions
can be eliminated here, but an attempt will be made to clear some of
them away by giving a definition of 'analytic' which avoids some of
the confused and confusing features of Kant's exposition without
altering the spirit of his definition.
1963
Author: Aaron Sloman
Date Installed: 23 Dec 2007
A summary of the 1963 Logic Colloquium, by E. J. Lemmon, M. A. E. Dummett,
and J. N. Crossley, with abstracts of papers
presented, including
my 'Functions and Rogators', was published in
The Journal of Symbolic Logic,
Vol. 28, No. 3. (Sep., 1963), pp. 262-272,
accessible
online here.
1962
Title: Oxford DPhil Thesis (1962): Knowing and Understanding
Relations between meaning and truth, meaning and necessary truth,
meaning and synthetic necessary truth
Author: Aaron Sloman
This thesis was scanned in and made generally available by Oxford
University Research Archive in the form of PDF versions of the
chapters, in 2007. The text is only in image form and is viewable
and printable, but not searchable.
Date Installed: 2 May 2007
http://ora.ox.ac.uk/objects/uuid:cda7c325-e49f-485a-aa1d-7ea8ae692877
Old link (broken?)
http://ora.ouls.ox.ac.uk:8081/10030/928
(which is often extremely slow to respond.)
The avowed aim of the thesis is to show that there are some
synthetic necessary truths, or that synthetic apriori knowledge is
possible. This is really a pretext for an investigation into the
general connection between meaning and truth, or between
understanding and knowing, which, as pointed out in the preface, is
really the first stage in a more general enquiry concerning meaning.
(Not all kinds of meaning are concerned with truth.) After the
preliminaries (chapter one), in which the problem is stated and some
methodological remarks made, the investigation proceeds in two
stages. First there is a detailed inquiry into the manner in which
the meanings or functions of words occurring in a statement help to
determine the conditions in which that statement would be true (or
false). This prepares the way for the second stage, which is an
inquiry concerning the connection between meaning and necessary
truth (between understanding and knowing apriori). The first stage
occupies Part Two of the thesis, the second stage Part Three. In all
this, only a restricted class of statements is discussed, namely
those which contain nothing but logical words and descriptive words,
such as "Not all round tables are scarlet" and "Every three-sided
figure is three-angled". (The reasons for not discussing proper
names and other singular definite referring expressions are given in
Appendix I.)
See also the School of Computer Science Web page.
This file is maintained by
Aaron Sloman, and designed to be
lynx-friendly,
and
viewable with any browser.
Email A.Sloman@cs.bham.ac.uk