This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’s philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and discuss prospects for an account of the meanings of modal operators in terms of rules of inference.
In the proof-theoretic semantics approach to meaning, harmony, requiring a balance between introduction rules (I-rules) and elimination rules (E-rules) within a meaning-conferring natural-deduction proof system, is a central notion. In this paper, we consider two notions of harmony that have been proposed in the literature: 1. GE-harmony, requiring a certain form of the E-rules, given the form of the I-rules. 2. Local intrinsic harmony, requiring the existence of certain transformations of derivations, known as reduction and expansion. We propose a construction of the E-rules (in GE-form) from given I-rules, and prove that the constructed rules also satisfy local intrinsic harmony. The construction is based on a classification of I-rules, and constitutes an implementation of Gentzen’s (and Prawitz’s) remark that E-rules can be “read off” I-rules.
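As background for the two harmony notions above, the standard textbook case is conjunction; the following display is a generic illustration of reduction and expansion, not the paper’s own construction:

```latex
% I- and E-rules for conjunction:
\[
\frac{A \quad B}{A \wedge B}\,(\wedge I)
\qquad
\frac{A \wedge B}{A}\,(\wedge E_1)
\qquad
\frac{A \wedge B}{B}\,(\wedge E_2)
\]
% Reduction: a derivation that introduces A ∧ B by (∧I) and immediately
% eliminates it by (∧E_1) reduces to the given subderivation of A alone.
% Expansion: any derivation of A ∧ B expands to one that applies
% (∧E_1) and (∧E_2) and then recombines the two results by (∧I).
```

Local intrinsic harmony, in this generic sense, is the joint availability of both transformations.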
Many prominent writers on the philosophy of logic, including Michael Dummett, Dag Prawitz, and Neil Tennant, have held that the introduction and elimination rules of a logical connective must be ‘in harmony’ if the connective is to possess a sense. This Harmony Thesis has been used to justify the choice of logic: in particular, supposed violations of it by the classical rules for negation have been the basis of arguments for switching from classical to intuitionistic logic. The Thesis has also had an influence on the philosophy of language: some prominent writers in that area, notably Dummett and Robert Brandom, have taken it to be a special case of a more general requirement that the grounds for asserting a statement must cohere with its consequences. This essay considers various ways of making the Harmony Thesis precise and scrutinizes the most influential arguments for it. The verdict is negative: all the extant arguments for the Thesis are weak, and no version of it is remotely plausible.
In this paper, I'll present a general way of "reading off" introduction/elimination rules from elimination/introduction rules, and define notions of harmony and stability on the basis of it.
The paper briefly surveys the sentential proof-theoretic semantics for a fragment of English. Then, appealing to a version of Frege’s context principle (specified to fit type-logical grammar), a method is presented for deriving proof-theoretic meanings for sub-sentential phrases, down to lexical units (words). The sentential meaning is decomposed according to the function-argument structure determined by the type-logical grammar. In doing so, the paper presents a novel proof-theoretic interpretation of simple types, replacing Montague’s model-theoretic type interpretation (in arbitrary Henkin models). The domains are collections of derivations in the associated “dedicated” natural-deduction proof system, and functions over them (with no appeal to models, truth values, or elements of a domain). The compositionality of the semantics is analysed.
The impossibility results in judgement aggregation show a clash between fair aggregation procedures and rational collective outcomes. In this paper, we are interested in analysing the notion of rational outcome by proposing a proof-theoretical understanding of collective rationality. In particular, we use the analysis of proofs and inferences provided by linear logic in order to define a fine-grained notion of group reasoning that allows for studying collective rationality with respect to a number of logics. We analyse the well-known paradoxes in judgement aggregation and pinpoint the reasoning steps that trigger the inconsistencies. Moreover, we extend the map of possibility and impossibility results in judgement aggregation by discussing the case of substructural logics. In particular, we show that there exist fragments of linear logic for which general possibility results can be obtained.
Prawitz conjectured that proof-theoretic validity offers a semantics for intuitionistic logic. This conjecture has recently been proven false by Piecha and Schroeder-Heister. This article resolves one of the questions left open by this recent result by showing the extensional alignment of proof-theoretic validity and general inquisitive logic. General inquisitive logic is a generalisation of inquisitive semantics, a uniform semantics for questions and assertions. The paper further defines a notion of quasi-proof-theoretic validity by restricting proof-theoretic validity to allow double negation elimination for atomic formulas, and proves the extensional alignment of quasi-proof-theoretic validity and inquisitive logic.
In this dissertation, we investigate whether Tennant’s criterion for paradoxicality (TCP) can be a correct criterion for genuine paradoxes and whether the requirement of a normal derivation (RND) can be a proof-theoretic solution to the paradoxes. Tennant’s criterion has two types of counterexamples. One raises the problem of overgeneration, in which TCP renders a non-paradoxical derivation paradoxical. The other generates the problem of undergeneration, in which TCP makes a paradoxical derivation non-paradoxical. Chapter 2 deals with the problem of undergeneration and Chapter 3 with the problem of overgeneration. Chapter 2 argues that Tennant’s diagnosis of the counterexample which applies the CR-rule and causes the undergeneration problem is not correct, and presents a solution to the problem of undergeneration. Chapter 3 argues that Tennant’s diagnosis of the counterexample raising the overgeneration problem is wrong and provides a solution to the problem. Finally, Chapter 4 addresses what must be explicated for RND to be a proof-theoretic solution to the paradoxes.
This paper considers Rumfitt’s bilateral classical logic (BCL), proposed to counter Dummett’s challenge to classical logic. First, agreeing with several authors, we argue that Rumfitt’s notion of harmony, used to justify logical rules in a purely proof-theoretic manner, is not sufficient to justify the coordination rules of BCL purely proof-theoretically. In the central part of this paper, we propose a notion of proof-theoretic validity for BCL, similar to Prawitz’s, and prove that BCL is sound and complete with respect to this notion of validity. The major difficulty in defining validity for BCL is that the validity of a positive formula +A appears to depend on that of the negative −A, and vice versa, so a straightforward inductive definition does not work because of this circular dependence. However, the Knaster–Tarski fixed-point theorem resolves this circularity. Finally, we discuss the philosophical relevance of our work, in particular the impact of the use of the fixed-point theorem and the issue of decidability.
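The circularity described above can be pictured, in a simplified sketch that does not follow the authors’ exact definitions, as the simultaneous definition of two sets of valid signed formulas via a monotone operator:

```latex
% Hypothetical sketch: let V+ and V- be the sets of valid positively and
% negatively signed formulas. The clauses for +A refer to V-, and vice
% versa, so define both at once as a fixed point of an operator F on the
% complete lattice of pairs of sets, ordered by componentwise inclusion:
\[
F(V^{+}, V^{-}) \;=\;
\bigl(\,\{A \mid \text{the clauses for } {+}A \text{ hold relative to } V^{-}\},\;
      \{A \mid \text{the clauses for } {-}A \text{ hold relative to } V^{+}\}\,\bigr)
\]
% If F is monotone, the Knaster–Tarski theorem guarantees a least fixed
% point (V+, V-), which breaks the circular dependence.
```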
Dummett’s justification procedures are revisited. They are used as background for the discussion of some conceptual and technical issues in proof-theoretic semantics, especially the role played by assumptions in proof-theoretic definitions of validity.
This paper considers proof-theoretic semantics for necessity within Dummett’s and Prawitz’s framework. Inspired by a system of Pfenning’s and Davies’s, the language of intuitionist logic is extended by a higher-order operator which captures a notion of validity. A notion of relative necessity is defined in terms of it, expressing a necessary connection between the assumptions and the conclusion of a deduction.
I explore, from a proof-theoretic perspective, the hierarchy of classical and paraconsistent logics introduced by Barrio, Pailos and Szmuc. First, I provide sequent rules and axioms for all the logics in the hierarchy, at all inferential levels, and establish soundness and completeness results. Second, I show how to extend those systems with a corresponding hierarchy of validity predicates, each of which is meant to capture “validity” at a different inferential level. Then, I point out two potential philosophical implications of these results. Since the logics in the hierarchy differ from one another in their rules, I argue that each such logic maintains its own distinct identity. Each validity predicate need not capture “validity” at more than one metainferential level. Hence, there are reasons to deny the thesis that the validity predicate introduced by Beall and Murzi (2013) has to express facts not only about what follows from what, but also about the metarules, etc.
Ian Rumfitt has proposed systems of bilateral logic for primitive speech acts of assertion and denial, with the purpose of ‘exploring the possibility of specifying the classically intended senses for the connectives in terms of their deductive use’ (810f.). Rumfitt formalises two systems of bilateral logic and gives two arguments for their classical nature. I assess both arguments and conclude that only one system satisfies the meaning-theoretical requirements Rumfitt imposes in his arguments. I then formalise an intuitionist system of bilateral logic which also meets those requirements. Thus Rumfitt cannot claim that only classical bilateral rules of inference succeed in imparting a coherent sense on the connectives. My system can be extended to classical logic by adding the intuitionistically unacceptable half of a structural rule Rumfitt uses to codify the relation between assertion and denial. Thus there is a clear sense in which, in the bilateral framework, the difference between classicism and intuitionism is not one of the rules of inference governing negation, but rather one of the relation between assertion and denial.
Which rules for aggregating judgments on logically connected propositions are manipulable and which are not? In this paper, we introduce a preference-free concept of non-manipulability and contrast it with a preference-theoretic concept of strategy-proofness. We characterize all non-manipulable and all strategy-proof judgment aggregation rules and prove an impossibility theorem similar to the Gibbard–Satterthwaite theorem. We also discuss weaker forms of non-manipulability and strategy-proofness. Comparing two frequently discussed aggregation rules, we show that “conclusion-based voting” is less vulnerable to manipulation than “premise-based voting”, which is strategy-proof only for “reason-oriented” individuals. Surprisingly, for “outcome-oriented” individuals, the two rules are strategically equivalent, generating identical judgments in equilibrium. Our results introduce game-theoretic considerations into judgment aggregation and have implications for debates on deliberative democracy.
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He extensively extended Hilbert’s program: he extended Gentzen’s sequent calculus, conjectured that cut-elimination holds for the resulting system (Takeuti’s conjecture), and obtained several stunning results in the 1950s–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese in which he expressed his philosophical thoughts. In particular, he used several keywords, such as “active intuition” and “self-reflection”, from Nishida’s philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti’s philosophy of mathematics. In particular, after briefly reviewing Takeuti’s proof-theoretic results, we describe some key elements in Takeuti’s texts. By explaining these texts, we point out the connection between Takeuti’s proof theory and Nishida’s philosophy and explain the future goals of our project.
This paper deals with a collection of concerns that, over a period of time, led the author away from the Routley–Meyer semantics, and towards proof-theoretic approaches to relevant logics, and indeed to the weak relevant logic MC of meaning containment.
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth-value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti–Titani density rule is answered affirmatively.
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely, because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because impure proofs are generally simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we argue that on the contrary this evidence does not support the claim.
This book tells the story of modern ethics: the story of a discourse that, after the Renaissance, went through a methodological revolution giving birth to Grotius’s and Pufendorf’s new science of natural law, leaving room for two centuries of exploration of the possible developments and implications of this new paradigm, up to the crisis of the 1780s, a crisis that carried a kind of mitosis, the act of birth of both basic paradigms of the two following centuries: Kantian ethics and utilitarianism. The new science of natural law carried a fresh start for ethics, resulting from a mixture of the old and the new. It was, as suggested by Schneewind, an attempt at rescuing the content of Scholastic and Stoic doctrines on a new methodological basis. The former contributed the claim that objective and universal moral laws exist; the latter, the self-aware attempt at justifying a minimal kernel of such laws in the face of sceptical doubt. What Bentham and Kant did was precisely to carry this strategy further, even while restructuring it, each around one of two alternative basic claims. The nineteenth- and twentieth-century critics of the Enlightenment attacked both not for their alleged failure to carry out their own projects, but precisely for having adopted Grotius’s and Pufendorf’s project. What the counter-Enlightenment has been unable to spell out is which alternative project could be carried out facing the modern condition of pluralism, while on the contrary, if we take a closer look at developments in twentieth-century ethics or at ongoing discussions of practical issues, we might feel inclined to believe that Grotius’s and Pufendorf’s project is as up-to-date as ever.
Table of Contents
Preface
I. Fathers of the Reformation and Schoolmen: 1.1 Luther: passive justice and the good deeds; 1.2 Calvin: voluntarism and predestination; 1.3 Baroque Scholasticism; 1.4 Casuistry and Institutiones morales
II. Neo-Platonists, Neo-Stoics, Neo-Sceptics: 2.1 Aristotelian, neo-Platonic, neo-Epicurean and neo-Cynic Humanists; 2.2 Oeconomica and the art of living; 2.3 Neo-Stoics; 2.4 Neo-Sceptics; 2.5 Moralistic literature
III. Neo-Augustinians: 3.1 The Jansenists on natura lapsa, sufficient grace, pure love; 3.2 Nicole on the impossibility of self-knowledge; 3.3 Nicole on self-love and charity; 3.4 Nicole against civic virtue, for Christian civility; 3.5 Malebranche on general laws and necessary evil; 3.6 Malebranche on Neo-Augustinianism and Platonism
IV. Grotius, Pufendorf and the new moral science: 4.1 Grotius against Aristotle and the sceptics; 4.2 Mersenne and Gassendi; 4.3 Descartes on ethics as the last branch of philosophy’s tree; 4.4 Hobbes on scepticism and the new moral science; 4.5 Spinoza on the new moral science as a descriptive science; 4.6 Locke on voluntarism and probabilism; 4.7 Pufendorf on natural law as an exact science; 4.8 Pufendorf on physical and moral entities; 4.9 Pufendorf on self-preservation
V. The empiricist version of the new moral science: from Cumberland to Paley: 5.1 Cumberland against Hobbesian voluntarism; 5.2 Cumberland and theological consequentialism; 5.3 Cumberland on universal benevolence and self-love; 5.4 Shaftesbury on the moral sense; 5.5 Hutcheson on natural law and moral faculties; 5.6 Gay, Brown, Paley and theological consequentialism
VI. The rationalist version of the new moral science: from Cudworth to Price: 6.1 The Cambridge Platonists; 6.2 Shaftesbury on the moral sense; 6.3 Butler and a third way between voluntarism and scepticism; 6.4 Price and the rational character of moral truths
VII. Leibniz’s compromise between the new moral science and Aristotelianism: 7.1 Leibniz against voluntarism; 7.2 Leibniz against the division between the physical and the moral good; 7.3 Leibniz on la place d’autrui and theological consequentialism; 7.4 Thomasius, Wolff, Crusius
VIII. French eighteenth-century philosophers without the new moral science: 8.1 The genealogy of our ideas of virtue and vice; 8.2 Maupertuis and moral arithmetic; 8.3 The philosophes and the harmony of interests; 8.4 Rousseau on corruption, self-love, and virtue; 8.5 Sade on the merits of vice
IX. Experimental moral science: Hume and Adam Smith: 9.1 Mandeville’s paradox; 9.2 Hutcheson on the law of nature and moral faculties; 9.3 Hume on experimental moral philosophy and the intermediate principles; 9.4 Hume’s Law; 9.5 Hume on the fellow-feeling; 9.6 Hume on natural and artificial virtues and disinterested pleasure for utility; 9.7 Adam Smith’s anti-realist metaethics; 9.8 Adam Smith on self-deception and the paradox of happiness; 9.9 Adam Smith on sympathy and the impartial spectator; 9.10 Adam Smith on the twofold criterion for moral judgement and its paradox; 9.11 Reid on the refutation of scepticism and the self-evidence of duty
X. Kantian ethics: 10.1 Kantian metaethics: moral epistemology; 10.2 Kantian metaethics: moral ontology; 10.3 Kantian metaethics: moral psychology; 10.4 Kantian normative ethics; 10.5 Kant on the impracticability of applied ethics; 10.6 Kantian moral anthropology; 10.7 Civilisation and moralisation; 10.8 Theology on a moral basis and the origins of evil; 10.9 Fichte and the transformation of theoretical philosophy into practical philosophy
XI. Bentham and utilitarianism: 11.1 Bentham’s linguistic theory; 11.2 Bentham’s moral ontology, psychology, and theory of action; 11.3 The principle of greatest happiness; 11.4 The critique of religious ethics; 11.5 The new morality; 11.6 Interest and duty; 11.7 Virtues; 11.8 Private ethics and legislation
XII. Followers of the Enlightenment: liberal Judaism and Liberal Theology: 12.1 Mendelssohn; 12.2 Salomon Maimon; 12.3 Haskalā and liberal Judaism; 12.4 Liberal Theology
XIII. Counter-Enlighteners: 13.1 Romanticism and the fulfilment of individuality as the Summum Bonum; 13.2 Hegel on history as the making of liberty; 13.3 Hegel on the unhappy consciousness and the beautiful soul; 13.4 Hegel on Morality and Sittlichkeit; 13.5 Marx on ideology, alienation, and praxis; 13.6 Schopenhauer on compassion; 13.7 Kierkegaard on faith beyond ethics
XIV. Followers of the Enlightenment: intuitionists and utilitarians: 14.1 Whewell’s criticism of utilitarianism; 14.2 Whewell on morality and the philosophy of morality; 14.3 Whewell on the Supreme Norm; 14.4 Whewell on the conflict between duties; 14.5 Mill and the proof of the principle of utility; 14.6 Mill’s eudemonistic utilitarianism; 14.7 Mill on rules
XV. Followers of the Enlightenment: neo-Kantians and positivists: 15.1 French spiritualism; 15.2 Neo-Kantians: the Marburg school; 15.3 Neo-Kantians: the Baden school; 15.4 Comte’s positivism and the invention of altruism; 15.5 Social Darwinism; 15.6 Wundt and an ethic of humankind
XVI. Post-Enlighteners: Sidgwick: 16.1 Criticism of intuitionism; 16.2 On ethical egoism; 16.3 Criticism of utilitarianism
XVII. Post-Enlighteners: Durkheim: 17.1 Sociology as physics of customs; 17.2 Morality as physics of customs and as practical science; 17.3 On Kantian ethics and utilitarianism; 17.4 The variability of moralities; 17.5 Social solidarity as end and justification of morality; 17.6 Secular morality as “sociodicy”
XVIII. Post-Enlighteners: Nietzsche: 18.1 On the Dionysian; 18.2 On the deconstruction of the world of values; 18.3 On the twofold genealogy of moralities; 18.4 On ascetics and nihilism; 18.5 Normative ethics of self-fulfilment
Bibliography / Index of names / Index of concepts
I intend to: a) clarify the origins and de facto meanings of the term relativism; b) reconstruct the reasons for the birth of the thesis named “cultural relativism”; c) reconstruct the ethical implications of the above thesis; d) revisit the recent discussion between universalists and particularists in the light of the idea of cultural relativism.
1. Prescriptive moral relativism: “everybody is justified in acting in the way imposed by criteria accepted by the group he belongs to”. Universalism: there are at least some judgments which are valid inter-culturally. Absolutism: there are at least some particular prescriptions which are valid without exception everywhere and always.
2. The traditional proof of prescriptive moral relativism is the argument from variability: judgments, rules, and shared values are de facto variable in time and space. The traditional counter-proof: examples of variability do not prove what sceptics contend.
3. Pre-history of the doctrine. Ancient sophists: either immoralist or contractualist. Modern moral scepticism (17th c.): variability as a historical and ethnographic fact supports a sceptical conclusion more moderate than sheer immoralism. Voltaire, Kant and Reid counter-attack by pointing at a universally shared moral sense. Romantics and idealists stage an even more moderate reformulation: instead of a universally shared moral sense they point at the Spirit of a People, which is: a) alternative to abstract and universal philosophical systems in so far as it is lived ‘culture’; b) an indivisible unity with an inner harmony and a source of normative standards; c) dynamic, in so far as it is a manifestation of the Spirit through the becoming of national cultures.
4. The birth of cultural relativism and its ethical implications. 4.1 The 18th-c. doctrine was the noble savage (a non-historical doctrine: state of nature vs. social state). 4.2 Edward Tylor (1832–1917) and ethnocentric historicism: savage moral standards are real enough, but they are fewer and weaker than ours. 4.3 Boas and Malinowski and a holistic reaction to ethnocentric historicism. Franz Boas (1858–1942): a) the development of civilizations is not ruled by technical progress, nor does it follow a one-way path; instead there are parallel developments (for example, agriculture does not follow stock-raising); b) racial characters have no relevance to the development of civilization; c) we are not yet in a position to compare externally identical kinds of behaviour until we have understood the beliefs and intentions lying at their roots (for example: “From an ethnological point of view murder cannot be considered as a single phenomenon”); d) we should distinguish among different practices which are only superficially similar (for example, practices traditionally classified under the label “tabù”); e) there is in fact just one normative ethic, constant in its contents but varying in its extension; f) the implication is not that we cannot judge behaviour by members of other groups; it is only a recommendation of caution. Bronislaw Malinowski (1884–1942): a) against Tylor’s and Frazer’s “magpie” methodology, field-work is required; a culture as a whole should be observed from inside, since individual elements are incomprehensible; b) a culture is an organic whole; c) its elements are accounted for by their function (economy), avoiding non-observables (empirio-criticism). Ruth Benedict and Melville Herskovits identify Boas’s approach with “cultural relativism”. Benedict: what is normal and abnormal is to be judged on a culture’s own standards, not on our own (“Anthropology and the Abnormal”). Herskovits: “Boas adumbrates what we have come to call cultural relativism” (The Mind, p. 10); “Judgements are based on experience, and experience is interpreted by each individual in terms of his own enculturation” (Man and his Works).
5. How analytic philosophy understood and misunderstood the discussion. 5.1 At the beginning of the 20th c., the new view in ethics was non-cognitivism (emotivist and subjectivist). Edward Westermarck combines this view with an old-style ethnographic approach in support of the relativity of moralities. Moralities are codes, or systems of emotive ‘disinterested’ reactions, selected by evolution for their usefulness in terms of survival value for the society that is the carrier of such systems or codes. The moral relativity thesis: there are cases of disagreement that cannot be settled even after agreement about facts. 5.2 Anti-realists Brandt, Mackie and Gilbert Harman adopt Westermarck’s approach in a more sophisticated version: a) moralities are codes with an overall function and may be appraised only as wholes; b) variability is an argument for moral subjectivism; c) the apparently legitimate derivation of ought from is is legitimate only within one institution; d) morality should not be described but made, and existing moralities may be improved. Is this ‘real’ relativism? It is clearly subjectivism (a metaethical thesis). The normative thesis is that there are better and worse codes, and survival value is the normative standard. 5.3 Particularists MacIntyre, Sandel, Taylor, Wiggins and McDowell: ‘Wittgensteinian’ perspectivist arguments bent to support weak-relativist claims. MacIntyre: there is ‘incommensurability’ between different theoretical systems in both science and ethics; no argument is possible across different systems; different traditions may coexist for a long time without being able to bring their conflicts to a rational solution.
5.4 Kantian universalists Baier, Gewirth, Rawls, Apel, Habermas. Shared claim: justice concerns the right and is universal in so far as it may be based on minimal assumptions; other virtues are relative to context in so far as they are related to comprehensive views of the good. O’Neill’s criticism: a) this is an assumption shared by both alignments; b) after an alleged crisis brought about by an alleged loss of metaphysical certainties, theories of justice have dropped demanding assumptions and kept universalism, while virtue theories have kept demanding assumptions and dropped universalism; c) the opposition of virtue and justice has arisen in an unjustified way. O’Neill’s positive proposal: ‘constructive’ procedures may be adopted both (i) concerning the whole range of virtues and (ii) across cultures, once we abandon idealization and confine ourselves to abstraction from real-world cases. 5.5 A metaethical relativist and anti-relativist normative ethicist: Bernard Williams. Williams: vulgar relativism may be assumed to claim that: a) ‘just’ means ‘just in a given society’; b) ‘just in a given society’ is to be understood in a functionalist sense; c) it is wrong for one society’s members to condemn another society’s values. It is inconsistent since (c) uses ‘just’ in the non-relative way that was excluded in (a). Williams’s positive proposal: i) keep a number of substantive or thick ethical concepts that will differ in space and time; ii) admit that public choices are to be legitimized through recourse to more abstract procedures relying on thinner ethical concepts.
6. Critical remarks. 6.1 The only real relativism available is ‘vulgar’ relativism (Westermarck?). 6.2 Descriptive universalism (or absolutism) has a long pedigree, from Cicero on, reaching Boas himself, but it is useless as an answer to normative questions. 6.3 Twentieth-century philosophical discussion seems to address an ad hoc doctrine reconstructed by assembling obsolete philosophical ideas while ignoring the real theory of cultural relativism as formulated by anthropologists.
7. A distinction between ethoi and ethical theories as a way out of the confusion: a) there are systems of conventions existing de facto; these may be studied from outside as phenomena or facts. b) There is moral argument, and this, when studied from outside, is a fact, but this does not in any degree affect the possible validity of the claims advanced. c) The difference between the above claims and Mackie’s criticism of Searle’s argument of the promising game is that promises, arguments, etc. are also phenomena, but they are communicative phenomena with a logical and pragmatic structure.
8. Conclusions: a) cultural relativism, as a name for Boas’s methodology, is a valuable discovery, and in this sense we are all relativists; b) ethical relativism, as an alleged implication of cultural relativism, was argued in a philosophically quite unsophisticated way by Benedict and Herskovits; philosophers apparently discussed ethical relativism on the basis of a rather faint impression of what cultural relativism had been; c) a full-fledged ethical relativism has hardly been defended by any philosopher; virtually no modern philosopher really argued for a prescriptive version of the thesis; d) we may accept the grain of truth in ethical relativism by including the relativist critique of ethical absolutism within a universalist normative doctrine that is careful to separate open-textured formulations of universal claims from culturally conditioned particular prescriptions.
Edward Feser defends the ‘Aristotelian proof’ for the existence of God, which reasons that the only adequate explanation of the existence of change is in terms of an unchangeable, purely actual being. His argument, however, relies on the falsity of the Existential Inertia Thesis (EIT), according to which concrete objects tend to persist in existence without requiring an existential sustaining cause. In this article, I first characterize the dialectical context of Feser’s Aristotelian proof, paying special attention to EIT and its rival thesis, the Existential Expiration Thesis. Next, I provide a more precise characterization of EIT, after which I outline two metaphysical accounts of existential inertia. I then develop new lines of reasoning in favor of EIT that appeal to the theoretical virtues of explanatory power and simplicity. Finally, I address the predominant criticisms of EIT in the literature.
Born’s rule, which interprets the square of the wave function as the probability of obtaining a specific value in a measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically using different approaches, such as the frequency operator approach, many-worlds theory, Bayesian probability and envariance, the literature shows that the arguments in each of these methods are circular. In view of the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al., Science, 329, 418 [2010]). But a convincing analytical proof of Born’s rule would make us understand the basic process responsible for the exact square dependency of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience to quantum mechanics, we prove Born’s rule for the statistical interpretation of the wave function.
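For reference, the rule at issue can be stated in the standard textbook notation (this is the familiar formulation, not the paper's own derivation): for a normalized state ψ and a measurement outcome a with eigenstate |a⟩,

```latex
P(a) \;=\; \lvert \langle a \mid \psi \rangle \rvert^{2},
\qquad
\sum_{a} P(a) \;=\; \langle \psi \mid \psi \rangle \;=\; 1 .
```

The multi-order-interference experiments cited above test whether any correction term beyond this exact square dependence is empirically detectable.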
Considered in light of the reader's expectation of a thoroughgoing criticism of the pretensions of the rational psychologist, and of the wealth of discussions available in the broader 18th-century context, which includes a variety of proofs that do not explicitly turn on the identification of the soul as a simple substance, Kant's discussion of immortality in the Paralogisms falls lamentably short. However, outside of the Paralogisms (and the published works generally), Kant had much more to say about the arguments for the soul's immortality, as he devoted considerable time to the topic throughout his career in his lectures on metaphysics. In fact, as I show in this paper, the student lecture notes prove to be an indispensable supplement to the treatment in the Paralogisms, not only for illuminating Kant's criticism of the rational psychologist's views on the immortality of the soul, but also for reconciling this criticism with Kant's own positive claims regarding certain theoretical proofs of immortality.
This paper formulates a bilateral account of harmony, which is an alternative to the one proposed by Francez. It builds on an account of harmony for unilateral logic proposed by Kürbis and the observation that reading some of the rules for the connectives of bilateral logic bottom up gives the grounds and consequences of formulas with the opposite speech act. Thus the consequences of asserting a formula give grounds for denying it, namely if the opposite speech act is applied to the consequences. Similarly, the consequences of denying a formula give grounds for asserting the formula. I formulate a process of inversion, which allows the determination of assertive elimination rules from assertive introduction rules, and rejective elimination rules from rejective introduction rules, and conversely. It corresponds to Francez's notion of vertical harmony. I also formulate a process of conversion, which allows the determination of rejective introduction rules from certain assertive elimination rules and conversely, and the determination of assertive introduction rules from certain rejective elimination rules and conversely. It corresponds to Francez's notion of horizontal harmony.
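The transformations that proof-theoretic harmony standardly requires can be illustrated with the textbook unilateral case of conjunction (a generic example, not specific to the bilateral systems discussed here). A reduction removes the detour of an introduction immediately followed by an elimination; an expansion shows that any derivation of a conjunction can be rewritten to end in an introduction:

```latex
% Reduction: ∧I immediately followed by ∧E is a removable detour.
\dfrac{\dfrac{A \qquad B}{A \wedge B}\;{\scriptstyle \wedge I}}{A}\;{\scriptstyle \wedge E_{1}}
\quad \rightsquigarrow \quad A

% Expansion: any derivation of A ∧ B can be expanded to end with ∧I.
A \wedge B
\quad \rightsquigarrow \quad
\dfrac{\dfrac{A \wedge B}{A}\;{\scriptstyle \wedge E_{1}} \qquad \dfrac{A \wedge B}{B}\;{\scriptstyle \wedge E_{2}}}{A \wedge B}\;{\scriptstyle \wedge I}
```

The existence of both transformations for every connective is what the local intrinsic notion of harmony demands.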
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers' knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, No whale flies is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, No whale flies entails No blue whale flies and No whale flies high. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, Under no circumstances would a whale fly is grammatical, whereas Under some circumstances would a whale fly is not, corresponding to the downward vs. upward monotonic features of the preposed phrases.
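The clauses in (A) and (B) can be sketched in a miniature extensional model; the individuals and denotations below are hypothetical, chosen only to make the truth clause and the downward-monotonic entailment concrete.

```python
# A miniature model: each predicate denotes a set of individuals.
whales = {"moby", "willy"}
blue_whales = {"moby"}           # blue whales are a subset of whales
fliers = {"sparrow", "willy"}    # hypothetically, willy flies

def no(p, q):
    """Truth clause for 'No P Q': true iff P and Q have empty intersection."""
    return len(p & q) == 0

# (A) 'No whale flies' is false in this model: willy is a whale and a flier.
print(no(whales, fliers))        # False

# (B) Downward monotonicity in the first argument: whenever 'No P Q' is
# true and P' is a subset of P, 'No P' Q' is true as well.
assert not no(whales, fliers) or no(blue_whales, fliers)
```

The monotonicity check in (B) holds for any choice of sets, which is exactly what licenses the entailment from No whale flies to No blue whale flies.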
Introduction to the Scientific Proof of the Natural Moral Law. This paper proves that Aquinas has a means of demonstrating and deriving both moral goodness and the natural moral law from human nature alone. Aquinas scientifically proves the existence of the natural moral law as the natural rule of human operations from human nature alone. The distinction between moral goodness and transcendental goodness is affirmed. This provides the intellectual tools to refute G.E. Moore's attack (in Principia Ethica) against the natural law as committing a "naturalistic fallacy". This article proves that instead Moore commits the fallacy of equivocation between moral goodness and transcendental goodness in his very assertion of a "naturalistic fallacy" by the proponents of the natural moral law. In the process, the new deontological/Kantian theory of natural law as articulated by John Finnis, Robert George, and Germain Grisez is shown to be false historically and philosophically. Ethical naturalism is affirmed as a result.
A graph-theoretic account of fibring of logics is developed, capitalizing on the interleaving characteristics of fibring at the linguistic, semantic and proof levels. Fibring of two signatures is seen as a multi-graph (m-graph) where the nodes and the m-edges include the sorts and the constructors of the signatures at hand. Fibring of two models is an m-graph where the nodes and the m-edges are the values and the operations in the models, respectively. Fibring of two deductive systems is an m-graph whose nodes are language expressions and whose m-edges represent the inference rules of the two original systems. The sobriety of the approach is confirmed by proving that all the fibring notions are universal constructions. This graph-theoretic view is general enough to accommodate very different fibrings of propositional-based logics, encompassing logics with non-deterministic semantics, logics with an algebraic semantics, logics with partial semantics and substructural logics, among others. Soundness and weak completeness are proved to be preserved under very general conditions. Strong completeness is also shown to be preserved under tighter conditions. In this setting, the collapsing problem appearing in several combinations of logic systems can be avoided.
Gaisi Takeuti extended Gentzen's work to the higher-order case in the 1950s–1960s and proved the consistency of impredicative subsystems of analysis. He has been chiefly known as a successor of Hilbert's school, but we pointed out in the previous paper that Takeuti aimed to investigate the relationships between "minds" by carrying out his proof-theoretic project, rather than proving the "reliability" of such impredicative subsystems of analysis. Moreover, as briefly explained there, his philosophical ideas can be traced back to Nishida's philosophy in the Kyoto School. For proving the consistency of such systems, it is crucial to prove the well-foundedness of the ordinals called "ordinal diagrams" developed for this purpose. Takeuti presented such arguments several times in order to show that they are admissible from his standpoint. As a starting point for investigating his finitist standpoint, we formulate the system of ordinal notations up to ε0 and reconstruct the well-foundedness arguments for them.
This paper discusses critically what simulation models of the evolution of cooperation can possibly prove by examining Axelrod’s “Evolution of Cooperation” and the modeling tradition it has inspired. Hardly any of the many simulation models of the evolution of cooperation in this tradition have been applicable empirically. Axelrod’s role model suggested a research design that seemingly allowed one to draw general conclusions from simulation models even if the mechanisms that drive the simulation could not be identified empirically. But this research design was fundamentally flawed, because it is not possible to draw general empirical conclusions from theoretical simulations. At best such simulations can claim to prove logical possibilities, i.e. they prove that certain phenomena are possible as the consequence of the modeling assumptions built into the simulation, but not that they are possible or can be expected to occur in reality. I suggest several requirements under which proofs of logical possibilities can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. I contrast this with Schelling’s neighborhood segregation model, the core mechanism of which can be retraced empirically.
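Schelling's core mechanism is simple enough to restate in a few lines. The sketch below is a minimal one-dimensional variant with made-up parameters (grid size, a 0.5 similarity threshold, random relocation); it is intended only to make the mechanism concrete, not to reproduce Schelling's original setup.

```python
import random

def neighbors(grid, i, radius=1):
    """Types of occupied cells within `radius` of position i (excluding i)."""
    lo, hi = max(0, i - radius), min(len(grid), i + radius + 1)
    return [grid[j] for j in range(lo, hi) if j != i and grid[j] is not None]

def unhappy(grid, i, threshold=0.5):
    """An agent is unhappy if fewer than `threshold` of its neighbors share its type."""
    ns = neighbors(grid, i)
    if not ns:
        return False
    return sum(n == grid[i] for n in ns) / len(ns) < threshold

def step(grid, rng):
    """Move one unhappy agent (if any) to a random vacant cell; report whether a move occurred."""
    movers = [i for i, a in enumerate(grid) if a is not None and unhappy(grid, i)]
    vacant = [i for i, a in enumerate(grid) if a is None]
    if not movers or not vacant:
        return False
    i, j = rng.choice(movers), rng.choice(vacant)
    grid[j], grid[i] = grid[i], None
    return True

rng = random.Random(0)
# A 1-D world: two agent types 'A'/'B' plus vacancies (None), well mixed at first.
grid = ['A', 'B'] * 10 + [None] * 4
rng.shuffle(grid)
for _ in range(500):
    if not step(grid, rng):
        break
```

Even this mild preference for like neighbors typically drives the initially mixed line toward clustered, segregated patterns, which is the retraceable mechanism the paper contrasts with Axelrod-style designs.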
Andrew Wiles' analytic proof of Fermat's Last Theorem (FLT), which appeals to geometrical properties of real and complex numbers, leaves two questions unanswered: (i) What technique might Fermat have used that led him to, even if only briefly, believe he had `a truly marvellous demonstration' of FLT? (ii) Why is x^n+y^n=z^n solvable only for n<3? In this inter-disciplinary perspective, we offer insight into, and answers to, both queries; yielding a pre-formal proof of why FLT can be treated as a true arithmetical proposition (one which, moreover, might not be provable formally in the first-order Peano Arithmetic PA), where we admit only elementary (i.e., number-theoretic) reasoning, without appeal to analytic properties of real and complex numbers. We cogently argue, further, that any formal proof of FLT needs---as is implicitly suggested by Wiles' proof---to appeal essentially to formal geometrical properties of formal arithmetical propositions.
This workbook of "études" offers a collection of experimental texts for communal dialogue and discovery that crosses multiple academic disciplines, including: foundations of physics, metaphysics, theoretical biology, semiotics, cognitive science, linguistics, phenomenology, logic & mathematics, poetry and theology. Each étude probes limits, horizons and boundaries by implicitly bringing into relation foundational issues that characterize different academic disciplines or systems of meaning formation. Some formal techniques are deployed in the études. Most notable is the use of the “logic of three” to overcome falsely totalizing images and inexorable dualities. This technique involves a particular kind of attunement to the “betweenness” of mediation that is not common in modern science. The attunement draws on the formal and precise movements of analysis in concert with the metaphoric and singular movements of synthesis.
SYMMETRY IN PHYSICS: FROM PROPORTION AND HARMONY TO A METALANGUAGE TERM. Ruth Castillo, Universidad Central de Venezuela. The revolutionary changes in physics require a careful exploration of the way in which concepts depend on the theoretical structure in which they are immersed. A historical reconstruction allows us to show how the notion of symmetry evolved from its definition as proportion and harmony to its consideration within the language of contemporary physics, as a linguistic meta-theoretical requirement on physical theories. In contemporary terms, symmetry is a fundamental category of research to which the usual categories of the natural sciences can be reduced: space, time, causality, interaction, matter, force, etc. Thus, symmetry is a concept with different meanings: heuristically, symmetric models inspire scientists in the search for solutions to different problems. Methodologically, symmetric structures are used to formulate theories and laws with invariant properties. A description of nature in terms of symmetric structures and symmetry breakings seems to be the proper way to describe the complexity of reality.
We present epistemic multilateral logic, a general logical framework for reasoning involving epistemic modality. Standard bilateral systems use propositional formulae marked with signs for assertion and rejection. Epistemic multilateral logic extends standard bilateral systems with a sign for the speech act of weak assertion (Incurvati and Schlöder 2019) and an operator for epistemic modality. We prove that epistemic multilateral logic is sound and complete with respect to the modal logic S5 modulo an appropriate translation. The logical framework developed provides the basis for a novel, proof-theoretic approach to the study of epistemic modality. To demonstrate the fruitfulness of the approach, we show how the framework allows us to reconcile classical logic with the contradictoriness of so-called Yalcin sentences and to distinguish between various inference patterns on the basis of the epistemic properties they preserve.
This chapter investigates the precise ways in which Antonio Gramsci engaged with ancient philosophy. A brief examination of the longest discussion in the Prison Notebooks of any ancient philosopher or text, Plato’s Republic (Q8, §22), raises many questions about Gramsci’s approach to ancient philosophy. These questions motivate an investigation into Gramsci’s surprisingly minimal discussion of ancient philosophy and philosophers, which is best explained in the light of his theoretical commitments to his distinctive species of historical materialism. Rather than responding to specific critical insights advanced about politics or human nature by ancient philosophers, such as Plato and Aristotle, Gramsci is shown to appropriate ancient philosophy into a broad-spectrum project of the history of philosophy, which interrogates the specific conditions under which historical epochs and the philosophical ideas that characterise them emerge. Tracking the conditions and characteristic ideas is revealed to be the project of the ‘specialist philosopher’, who must grasp both general methodological principles and particular historical examples. Gramsci’s treatment of ancient philosophy reveals more about his universal theories of history, and his engagement with the ideas of his contemporaries (including Benedetto Croce), than a special concern with ancient philosophy itself.
While non-classical theories of truth that take truth to be transparent have some obvious advantages over any classical theory that evidently must take it as non-transparent, several authors have recently argued that there's also a big disadvantage of non-classical theories as compared to their “external” classical counterparts: proof-theoretic strength. While conceding the relevance of this, the paper argues that there is a natural way to beef up extant internal theories so as to remove their proof-theoretic disadvantage. It is suggested that the resulting internal theories should seem preferable to their external counterparts.
The problem analysed in this paper is whether we can gain knowledge by using valid inferences, and how we can explain this process from a model-theoretic perspective. According to the paradox of inference (Cohen & Nagel 1936/1998, 173), it is logically impossible for an inference to be both valid and for its conclusion to possess novelty with respect to the premises. I argue in this paper that valid inference has an epistemic significance, i.e., it can be used by an agent to enlarge his knowledge, and that this significance can be accounted for in model-theoretic terms. I will argue first that the paradox is based on an equivocation, namely, it arises because logical containment, i.e., logical implication, is identified with epistemological containment, i.e., the claim that knowledge of the premises entails knowledge of the conclusion. Second, I will argue that a truth-conditional theory of meaning has the necessary resources to explain the epistemic significance of valid inferences. I will explain this epistemic significance starting from Carnap’s semantic theory of meaning and Tarski’s notion of satisfaction. In this way I will counter Prawitz’s (2012b) claim that a truth-conditional theory of meaning is not able to account for the legitimacy of valid inferences, i.e., their epistemic significance.
Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization—both semantic and proof-theoretical—of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic—formality, strong modal force, generality, topic neutrality, basicness, and (quasi-)apriority, (v) the veridicality of logic and its applicability to science, (vi) the normativity of logic, (vii) error, revision, and expansion in/of logic, and (viii) the relation between logic and mathematics. The high explanatory power of the truth-theoretic approach does not rule out an equal or even higher explanatory power of the proof-theoretic approach. But to the extent that the truth-theoretic approach is shown to be highly explanatory, it sets a standard for other approaches to logic, including the proof-theoretic approach.
We review a rough scheme of quantum mechanics using the Clifford algebra. Following the steps previously published in a paper by another author [31], we demonstrate that quantum interference arises in a Clifford algebraic formulation of quantum mechanics. In 1932 J. von Neumann showed that projection operators and, in particular, quantum density matrices can be interpreted as logical statements. In accord with a result previously obtained by V. F. Orlov, in this paper we invert von Neumann’s result. Instead of constructing logic from quantum mechanics, we construct quantum mechanics from an extended classical logic. It follows that the origins of the two most fundamental quantum phenomena, the indeterminism and the interference of probabilities, lie not in traditional physics itself but in the logical structure, as realized here by the Clifford algebra.
We examine the proof-theoretic verificationist justification procedure proposed by Dummett. After some scrutiny, two distinct interpretations with respect to bases are advanced: the independent and the dependent interpretation. We argue that both are unacceptable as a semantics for propositional intuitionistic logic.
The fundamental assumption of Dummett’s and Prawitz’ proof-theoretic justification of deduction is that ‘if we have a valid argument for a complex statement, we can construct a valid argument for it which finishes with an application of one of the introduction rules governing its principal operator’. I argue that the assumption is flawed in this general version, but should be restricted, not to apply to arguments in general, but only to proofs. I also argue that Dummett’s and Prawitz’ project of providing a logical basis for metaphysics only relies on the restricted assumption.
This paper studies the relationship between labelled and nested calculi for propositional intuitionistic logic, first-order intuitionistic logic with non-constant domains and first-order intuitionistic logic with constant domains. It is shown that Fitting’s nested calculi naturally arise from their corresponding labelled calculi—for each of the aforementioned logics—via the elimination of structural rules in labelled derivations. The translational correspondence between the two types of systems is leveraged to show that the nested calculi inherit proof-theoretic properties from their associated labelled calculi, such as completeness, invertibility of rules and cut admissibility. Since labelled calculi are easily obtained via a logic’s semantics, the method presented in this paper can be seen as one whereby refined versions of labelled calculi (containing nested calculi as fragments) with favourable properties are derived directly from a logic’s semantics.
The focus of this paper is Dummett's meaning-theoretical arguments against classical logic based on considerations about the meaning of negation. Using Dummettian principles, I shall outline three such arguments, of increasing strength, and show that they are unsuccessful by giving responses to each argument on behalf of the classical logician. What is crucial is that in responding to these arguments a classicist need not challenge any of the basic assumptions of Dummett's outlook on the theory of meaning. In particular, I shall grant Dummett his general bias towards verificationism, encapsulated in the slogan 'meaning is use'. The second general assumption I see no need to question is Dummett's particular breed of molecularism. Some of Dummett's assumptions will have to be given up, if classical logic is to be vindicated in his meaning-theoretical framework. A major result of this paper will be that the meaning of negation cannot be defined by rules of inference in the Dummettian framework.
This paper shows how to derive nested calculi from labelled calculi for propositional intuitionistic logic and first-order intuitionistic logic with constant domains, thus connecting the general results for labelled calculi with the more refined formalism of nested sequents. The extraction of nested calculi from labelled calculi obtains via considerations pertaining to the elimination of structural rules in labelled derivations. Each aspect of the extraction process is motivated and detailed, showing that each nested calculus inherits favorable proof-theoretic properties from its associated labelled calculus.
In “Proof-Theoretic Justification of Logic”, building on work by Dummett and Prawitz, I show how to construct use-based meaning-theories for the logical constants. The assertability-conditional meaning-theory takes the meaning of the logical constants to be given by their introduction rules; the consequence-conditional meaning-theory takes the meaning of the logical constants to be given by their elimination rules. I then consider the question: given a set of introduction rules, what are the strongest elimination rules that are validated by an assertability-conditional meaning-theory based on those introduction rules? I prove that the intuitionistic elimination rules are the strongest rules that are validated by the intuitionistic introduction rules. I then prove that intuitionistic logic is the strongest logic that can be given either an assertability-conditional or a consequence-conditional meaning-theory. In “Grounding Grounding” I discuss the notion of grounding. My discussion revolves around the problem of iterated grounding-claims. Suppose that φ grounds ψ; what grounds that φ grounds ψ? I argue that unless we can get a satisfactory answer to this question the notion of grounding will be useless. I discuss and reject some proposed accounts of iterated grounding claims. I then develop a new way of expressing grounding, propose an account of iterated grounding-claims and show how we can develop logics for grounding. In “Is the Vagueness Argument Valid?” I argue that the Vagueness Argument in favor of unrestricted composition isn't valid. However, if the premisses of the argument are true and the conclusion false, mereological facts fail to supervene on non-mereological facts. I argue that this failure of supervenience is an artifact of the interplay between the necessity and determinacy operators and that it does not mean that mereological facts fail to depend on non-mereological facts. I sketch a deflationary view of ontology to establish this.
I argue for a kind of logical pluralism on the basis of a difficulty with defining the meaning of negation in the framework of Dummett's and Prawitz' proof-theoretic semantics.
This is part one of a two-part paper, in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. This allows us to connect theories of partial ground with axiomatic theories of truth. In this part of the paper, we develop an axiomatization of the relation of partial ground over the truths of arithmetic and show that the theory is a proof-theoretically conservative extension of the theory PT of positive truth. We construct models for the theory and draw some conclusions for the semantics of conceptualist ground.
Deductive inference is usually regarded as being “tautological” or “analytical”: the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of “depth” or “informativeness” of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure “intelim logic”, which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is “analytic” in a particularly strict sense, in that it rules out any use of “virtual information”, which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
In recent years there has been a revitalised interest in non-classical solutions to the semantic paradoxes. In this paper I show that a number of logics are susceptible to a strengthened version of Curry's paradox. This can be adapted to provide a proof-theoretic analysis of the ω-inconsistency in Łukasiewicz's continuum-valued logic, allowing us to better evaluate which logics are suitable for a naïve truth theory. On this basis I identify two natural subsystems of Łukasiewicz logic which individually, but not jointly, lack the problematic feature.
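For orientation, Łukasiewicz's continuum-valued semantics assigns truth values in [0, 1] with the standard clauses sketched below; the closing fixed-point calculation illustrates why the naive Curry sentence does not yet force triviality in this setting, which is why a strengthened version of the paradox is needed.

```python
# Standard Łukasiewicz continuum-valued clauses: truth values in [0, 1].
def neg(a):
    return 1 - a

def implies(a, b):
    return min(1, 1 - a + b)

def conj(a, b):
    # Strong conjunction (the Łukasiewicz t-norm).
    return max(0, a + b - 1)

# A simple Curry sentence C = (C -> F), with F wholly false (value 0),
# needs a value v with v = implies(v, 0) = min(1, 1 - v); v = 0.5 works,
# so the naive Curry reasoning does not collapse the logic.
v = 0.5
assert implies(v, 0) == v
```

The availability of such intermediate fixed points is what blocks the simple paradox, while the ω-inconsistency argument discussed in the paper targets the theory by other means.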
I use the Corcoran–Smiley interpretation of Aristotle's syllogistic as my starting point for an examination of the syllogistic from the vantage point of modern proof theory. I aim to show that fresh logical insights are afforded by a proof-theoretically more systematic account of all four figures. First I regiment the syllogisms in the Gentzen–Prawitz system of natural deduction, using the universal and existential quantifiers of standard first-order logic, and the usual formalizations of Aristotle's sentence-forms. I explain how the syllogistic is a fragment of my system of Core Logic. Then I introduce my main innovation: the use of binary quantifiers, governed by introduction and elimination rules. The syllogisms in all four figures are re-proved in the binary system, and are thereby revealed as all on a par with each other. I conclude with some comments and results about grammatical generativity, ecthesis, perfect validity, skeletal validity and Aristotle's chain principle.
This thesis introduces the "method of structural refinement", which serves as a means of transforming the relational semantics of a modal and/or constructive logic into an 'economical' proof system by connecting two proof-theoretic paradigms: labelled and nested sequent calculi. The formalism of labelled sequents has been successful in that cut-free calculi in possession of desirable proof-theoretic properties can be automatically generated for large classes of logics. Despite these qualities, labelled systems make use of a complicated syntax that explicitly incorporates the semantics of the associated logic, and such systems typically violate the subformula property to a high degree. By contrast, nested sequent calculi employ a simpler syntax and adhere to a strict reading of the subformula property, making such systems useful in the design of automated reasoning algorithms. However, the downside of the nested sequent paradigm is that a general theory concerning the automated construction of such calculi (as in the labelled setting) is essentially absent, meaning that the construction of nested systems and the confirmation of their properties is usually done on a case-by-case basis. The refinement method connects both paradigms in a fruitful way, by transforming labelled systems into nested (or, refined labelled) systems with the properties of the former preserved throughout the transformation process. To demonstrate the method of refinement and some of its applications, we consider grammar logics, first-order intuitionistic logics, and deontic STIT logics. The introduced refined labelled calculi will be used to provide the first proof-search algorithms for deontic STIT logics. Furthermore, we employ our refined labelled calculi for grammar logics to show that every logic in the class possesses the effective Lyndon interpolation property.