Philosophy Reading Project

I have no formal degree in Philosophy. My Master of Applied Ethics hardly counts as a solid educational background in the big philosophical questions of our age. Yet, I have a genuine private and professional interest in understanding approaches to finding answers to a variety of philosophical questions. I don’t have time to study them in depth […]

SEP 1-43 – The Analysis of Knowledge


Title: The Analysis of Knowledge

Authors: Jonathan Jenkins Ichikawa, Matthias Steup

Date: Published on February 6th 2001, last revision on March 7th 2017

Expectation: Why this topic?

I approached the topic of epistemology with a motivation to apply the insights as tools for knowledge generation and refinement. Epistemology as a methodology. I am not a skeptic who needs to be convinced by epistemological means that or how knowledge is possible; I take that for granted. I am more interested in the strategies that allow the evaluation and validation of knowledge claims. When a fellow scientist, inferring from collected data, concludes that XYZ, I’d like to be able to scrutinize this claim on reasonable and methodologically plausible grounds. When this scientist – as a committed realist and naturalist – rejects any normativity in science, I’d like to have convincing reasons at hand that argue somehow along the lines of constructivism, virtue epistemology, the impact of meaning construction on truth, the role of language, culture and society, and the cognitive faculties that are at play in scientific reasoning. Everything learned so far, from contextualism, foundationalism and coherentism, reliabilism and evidentialism, externalism and internalism, rationalism and empiricism, subjectivism and objectivism, logic and truth, to various approaches in epistemology (VE, NE, FE, EE, …), should ultimately be applicable in a process labelled analysis of knowledge.


The article starts with a quick introduction of the JTB=knowledge condition and the problems pointed out by Gettier (chapters 1-3). It concludes this introduction with the outlook that many contemporary endeavours in the analysis of knowledge are directed at finding out which 4th condition is necessary in addition to (i) p is true, (ii) S believes that p, and (iii) S is justified in believing that p. What is the X in JTB+X?

The suggestion of ‘no false lemmas’ – S’s belief that p is not inferred from any falsehood – doesn’t do the job. Some other suggestions build on modal conditions. Sensitivity is the requirement that, in the nearest possible worlds in which not-p, the subject does not believe that p: S’s belief that p is sensitive if and only if, if p were false, S would not believe that p. The chief motivation against a sensitivity condition is that, given plausible assumptions, it leads to unacceptable implications called abominable conjunctions. Another modal condition, proposed by Sosa, is safety: If S were to believe that p, p would not be false (or, in modal terms: in all nearby worlds where S believes that p, p is not false). A third approach to modal conditions on knowledge worthy of mention is the requirement that for a subject to know that p, she must rule out all relevant alternatives to p.

The motivation for including a justification condition in an analysis of knowledge is to rule out cases of luck. An alternative to adding more conditions (JTB+X) in response to Gettier cases is to reject the notion of justification altogether and include other conditions instead (XTB). Approaches in this direction are reliabilism (stating that (iii) should be ‘S’s belief that p was produced by a reliable cognitive process’) and causal theories of knowledge ((iii) = S’s belief that p is caused by the fact that p).

Some epistemologists claim that knowledge is, after all, unanalyzable, because the Gettier problem is inescapable (for example, Zagzebski). A way to circumvent this dead end is to include an explicit anti-luck condition ((iv) = S’s belief is not true merely by luck) that entails the truth and belief conditions. Some epistemologists pursue other methodological strategies that do not rely on testing intuitions against constructed cases. A wider philosophical trend away from conceptual analysis more broadly contributed to this change. Some of the more recent attempts to analyse knowledge have been motivated in part by broader considerations about the role of knowledge, or of discourse about knowledge.

The virtue-theoretic approach to knowledge is in some respects similar to the safety and anti-luck approaches. Sosa, prominent VE endorser, made use of an analogy of a skilled archer shooting at a target:

  1. Was the shot successful? Did it hit its target? → Accuracy: A belief is accurate if and only if it is true.
  2. Did the shot’s execution manifest the archer’s skill? Was it produced in a way that makes it likely to succeed? → Adroitness: A belief is adroit if and only if it is produced skillfully.
  3. Did the shot’s success manifest the archer’s skill? → Aptness: A belief is apt if and only if it is true in a way manifesting, or attributable to, the believer’s (epistemic) skill.

Something that all of the potential conditions on knowledge seem to have in common is that they have some sort of intimate connection with the truth of the relevant belief. Some have argued that a focus on truth-relevant factors leaves important pragmatic factors out of our picture of knowledge and have proposed pragmatic encroachment instead: A difference in pragmatic circumstances can constitute a difference in knowledge. This is not an analysis of knowledge; it is merely the claim that pragmatic factors are relevant for determining whether a subject’s belief constitutes knowledge. Some, but not all, pragmatic encroachment theorists will endorse a necessary biconditional that might be interpreted as an analysis of knowledge. For example, a pragmatic encroachment theorist might claim that ‘S knows that p if and only if no epistemic weakness vis-à-vis p prevents S from properly using p as a reason for action’.

One final topic is contextualism about knowledge attributions, according to which the word “knows” and its cognates are context-sensitive. The relationship between contextualism and the analysis of knowledge is not at all straightforward. Arguably, they have different subject matters (the former a word, and the latter a mental state). Nevertheless, the methodology of theorizing about knowledge may be helpfully informed by semantic considerations about the language in which such theorizing takes place. And if contextualism is correct, then a theorist of knowledge must attend carefully to the potential for ambiguity.

Conclusion and insights

Reading this SEP entry did not reveal any fundamentally new insights. It serves well as a confirming reiteration of previously studied material, a summary that is easy to understand after having gone into the various topics in depth before. The one astonishing point I came across here is the introduction of reliabilism as an alternative to justification, not as one approach to justification (next to evidentialism). The article certainly helped organize the concepts and set them into a larger matrix of knowledge assessment (or ‘analysis’). Yet, it remains on a superficial level compared to the in-depth essays on each particular concept and, thus, doesn’t deliver new insight.

Evaluation and comments on the entry

This entry by Ichikawa and Steup complements and is in line with Steup’s general Epistemology overview in this encyclopedia. The writing style is clear and easy to understand. It aims at informing a non-expert reader about what the important concepts are and how they are connected, but leaves it to the reader to decide in which direction to go into detail. As mentioned above, if Epistemology is a perfect introduction to this section of the reading project, then The Analysis of Knowledge is a perfect conclusion or summary.

SEP 1-42 – Information


Title: Information

Author: Pieter Adriaans

Date: Published on October 26th 2012, last revision on December 14th 2018

Expectation: Why this topic?

This SEP entry is one of the longest of this reading project (55 PDF pages). Why do I want to read all that? In the context of my research, both as a chemist years ago and as an S&T ethicist more recently, I frequently come across the distinction between information and knowledge, usually with the notion that mere information is something inferior to knowledge. My rough understanding is that data become information through categorization and classification (the ascription of a label or meaning), and that information becomes knowledge through contextualization and application. Knowledge, then, as orientation for action, leads to insight, as in “This is where we are, that’s where we want to get.” Wisdom means identifying the best way to get there.


This implies a sort of hierarchy. One way to put it is that from data to wisdom we need to engage in one or more extra steps of growing difficulty and more demanding complexity: Data are obtained by mere observation and recording; information needs an additional thought (a referencing or interpretation); knowledge requires a justification and verification process as well as a contextualization and validation methodology; insight is an alignment of this knowledge with epistemic demands in specific affairs; wisdom requires reflection and evaluation as a transition towards decision, attitude and action (real, sometimes physical, implications). Models of this hierarchy have been suggested often in the past decades. While Cleveland (1982) and Ackoff (1988) call the operations from one level to the next simply ‘process’ (with Ackoff being more sophisticated and detailed about the levels), Bellinger (1997) proposes that the identification of relations connects data to yield information, the formulation of patterns in the information yields knowledge, and principles within clusters of knowledge allow for wisdom. The widely acknowledged model of Carpenter and Cannady (2004) starts from environment. Data are indexed bits of this environment (for example, a pixel of an image). Information is data clustered along certain rules (for example, all pixels of an image in a proper arrangement showing a symbol or icon or thing). Modelling, the ascription of meaning or an appropriate explanation, leads to knowledge (for example, the recognition that the image shows the tower of Pisa). Knowledge given a goal is applied as wisdom (for example, that the photo could be used to ask a local for the way to the tower of Pisa). Wisdom paired with one’s values creates vision (for example, “once in my lifetime I want to go and visit the famous tower of Pisa”).


In many professional (but also in personal) contexts, a smooth transition from data to information to knowledge to action is of utmost importance for the success of an operation. The scientist collects data, assigns meaning to it and sets it into an interpretation framework. The business operator or manager attempts to gain an advantage by efficient knowledge management. It might be noteworthy that the transition works both ways: Data is collected, classified, contextualized, and applied as knowledge for proper decision-making. This is operative knowledge management: We collect as much data as we can, so that we can build a productive knowledge pool on a solid information basis. Or, knowledge is collected from the organizational intelligence (what people do, how they do it, tacit knowledge) and documented. For this purpose, the knowledge must be operationalized (or externalized), so that the information can be captured, stored and communicated in the form of data. This is strategic knowledge management.


How does this relate to knowledge in the sense of all the previous articles related to epistemology? Is there a relation between belief and information that is analogous to the belief-knowledge relation? Does information have to be justified? Is false information necessarily non-information? Is information something neutral (whereas knowledge can be normatively assessed)? Is information something that is at play in the justification process that yields JTB-kind of knowledge? Maybe Prof. Adriaans has some answers.


Philosophy of Information deals with the philosophical analysis of the notion of information both from a historical and a systematic perspective. With the emergence of the empiricist theory of knowledge in early modern philosophy, the development of various mathematical theories of information in the twentieth century and the rise of information technology, the concept of information has conquered a central place in society, in the sciences and in a broad range of philosophical disciplines varying from logic, epistemology, ethics and aesthetics to ontology.

The term information in colloquial speech is currently predominantly used as an abstract mass-noun to denote any amount of data, code or text that is stored, sent, received or manipulated in any medium. The exact meaning of the term information varies in different philosophical traditions throughout history, and its colloquial use varies geographically and over different pragmatic contexts. Although an analysis of the notion of information has been a theme in Western philosophy from its early inception, the explicit analysis of information as a philosophical concept is recent, and dates back to the second half of the twentieth century. For example, in classical philosophy, information was a technical notion associated with a theory of knowledge and ontology that originated in Plato’s theory of forms and in Aristotle’s doctrine of the four causes:

  • Material Cause: that as the result of whose presence something comes into being—e.g., the bronze of a statue and the silver of a cup, and the classes which contain these (today: matter);
  • Formal Cause: the form or pattern; that is, the essential formula and the classes which contain it—e.g., numbers and ratios are the cause of a shape (today: geometric form in space);
  • Efficient Cause: the source of the first beginning of change or rest; e.g., the man who plans is a cause, and the father is the cause of the child, and in general that which produces is the cause of that which is produced, and that which changes of that which is changed (early scientific era: mechanical interaction between material bodies, today: more complex and holistic conceptions of interactions);
  • Final Cause: the same as end; e.g. the end of walking is health (early science era: dismissed as unscientific, today: normative judgments as expression of will or result of deliberative discourse involving communicative rationality).

After a start as a technical term in classical and medieval texts, the term information almost vanished from the philosophical discourse in modern philosophy, but gained popularity in colloquial speech. Gradually the term obtained the status of an abstract mass-noun, a meaning that is orthogonal to the classical process-oriented meaning. In this form it was picked up by several researchers in the twentieth century who introduced formal methods to measure information. This, in its turn, led to a revival of the philosophical interest in the concept of information. This complex history seems to be one of the main reasons for the difficulties in formulating a definition of a unified concept of information that satisfies all our intuitions. At least three different meanings of the word information are historically relevant:

  • Information as the process of being informed. When I recognize a horse as such, then the form of a horse is planted in my mind. This process is my information of the nature of the horse. Also the act of teaching could be referred to as the information of a pupil. In the same sense one could say that a sculptor creates a sculpture by informing a piece of marble. The task of the sculptor is the information of the statue.
  • Information as the state of the agent. If one teaches a pupil the theorem of Pythagoras then, after this process is completed, the student can be said to have the information about the theorem of Pythagoras. (Note the difficulties, as Fromm would agree, that arise from the idea of having information as the result of being informed.)
  • Information as the disposition to inform. A text in which Pythagoras’ theorem is explained contains this information. The text has the capacity to inform me when I read it. In the same sense, when I have received information from a teacher, I am capable of transmitting this information to another student. Thus, information becomes something that can be stored and measured.

Everything we know about the world is based on information we received or gathered and every science in principle deals with information. There is a network of related concepts of information, with roots in various disciplines like physics, mathematics, logic, biology, economy and epistemology. All these notions cluster around two central properties:

Information is extensive and additive. The notion of extensiveness emerges naturally in our interactions with the world around us when we count and measure objects and structures. Basic conceptions of more abstract mathematical entities, like sets, multisets and sequences, were developed early in history on the basis of structural rules for the manipulation of symbols. The mathematical formalisation of extensiveness in terms of the log function took place in the context of research into thermodynamics in the nineteenth (Boltzmann 1866) and early twentieth century (Gibbs 1906).

S = k log W

This formula describes the entropy S of a system in terms of the logarithm of the number of possible microstates W, consistent with the observable macroscopic states of the system, where k is the well-known Boltzmann constant. In all its simplicity the value of this formula for modern science can hardly be overestimated. The expression log W can, from the perspective of information theory, be interpreted in various ways:

  • As the amount of entropy in the system.
  • As the length of the number needed to count all possible microstates consistent with macroscopic observations.
  • As the length of an optimal index we need to identify the specific current unknown microstate of the system, i.e., it is a measure of our “lack of information”.
  • As a measure for the probability of any typical specific microstate of the system consistent with macroscopic observations.

Thus it connects the additive nature of the logarithm with the extensive qualities of entropy, probability, typicality and information, and it is a fundamental step in the use of mathematics to analyze nature.
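The counting behind S = k log W can be made concrete with a toy system of my own choosing (a sketch, not from the article): N coins whose macrostate is the number of heads, so that W is simply a binomial coefficient.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(n_coins: int, n_heads: int) -> float:
    """Entropy S = k log W of the macrostate 'n_heads out of n_coins',
    where W = C(n_coins, n_heads) counts the compatible microstates."""
    W = math.comb(n_coins, n_heads)  # number of microstates
    return k_B * math.log(W)         # natural logarithm, as in Boltzmann's formula

# The 50/50 macrostate is realized by the most microstates, hence has the
# highest entropy; a macrostate with W = 1 (all tails) has entropy zero.
print(boltzmann_entropy(100, 50) > boltzmann_entropy(100, 10))  # True
print(boltzmann_entropy(100, 0))                                # 0.0
```

The same log W, read in bits rather than nats, is exactly the “length of an optimal index” mentioned in the list above.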

When coded in terms of more advanced multi-dimensional number systems (complex numbers, quaternions, octonions), the concept of extensiveness generalizes into more subtle notions of additivity that do not meet our everyday intuitions. Yet, they play an important role in recent developments of information theory based on quantum physics.

Information reduces uncertainty. The amount of information we get grows linearly with the amount by which it reduces our uncertainty until the moment that we have received all possible information and the amount of uncertainty is zero. The relation between uncertainty and information was probably first formulated by the empiricists. Hume explicitly observes that a choice from a larger selection of possibilities gives more information. This observation reached its canonical mathematical formulation in the function proposed by Hartley (1928) that defines the amount of information we get when we select an element from a finite set. The only mathematical function that unifies these two intuitions about extensiveness and probability is Shannon’s proposal that defines the information in terms of the negative log of the probability: I(x) = −log P(x).

It implies:

  • A message x has a certain probability P(x) between 0 and 1 of occurring.
  • If P(x) = 1 then I(x) = 0. If we are certain to get a message it literally contains no news at all. The lower the probability of the message is, the more information it contains. A message like “The sun will rise tomorrow” seems to contain less information than the message “Jesus was Caesar” exactly because the second statement is much less likely to be defended by anyone.
  • If two messages x and y are unrelated then I(x and y) = I(x) + I(y). Information is extensive. The amount of information in two combined messages is equal to the sum of the amount of information in the individual messages.
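Both implications are easy to check numerically. A minimal sketch of my own (not from the entry), using base-2 logarithms so that information is measured in bits:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information I(x) = -log2 P(x), in bits."""
    return math.log2(1 / p)  # equivalent to -log2(p)

# A certain message (P(x) = 1) carries no information:
print(self_information(1.0))  # 0.0

# For independent messages, probabilities multiply and information adds:
p_x, p_y = 0.5, 0.25
combined = self_information(p_x * p_y)
summed = self_information(p_x) + self_information(p_y)
print(combined, summed)  # 3.0 3.0
```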

The elegance of this formula, however, does not shield us from the conceptual problems it harbors. In the twentieth century, various proposals for formalization of concepts of information were made:

Qualitative Theories of Information:

  1. Semantic Information: Floridi defines semantic information as well-formed, meaningful and truthful data (see his SEP entry on Semantic conceptions of information, not part of this reading project). Semantic information is close to our everyday naive notion of information as something that is conveyed by true statements about the world.
  2. Information as a state of an agent: The formal logical treatment of notions like knowledge and belief was initiated by Hintikka. Dretske, van Benthem and van Rooij studied these notions in the context of information theory. Also Dunn seems to have this notion in mind when he defines information as what is left of knowledge when one takes away belief, justification and truth. Vigo proposed a Structure-Sensitive Theory of Information based on the complexity of concept acquisition by agents.

Quantitative Theories of Information

  1. Nyquist’s function: Nyquist (1924) was probably the first to express the amount of “intelligence” that could be transmitted given a certain line speed of a telegraph system in terms of a log function: W = k log m, where W is the speed of transmission, k is a constant, and m is the number of different voltage levels one can choose from.
  2. Fisher information: the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends (Fisher 1925).
  3. The Hartley function: The amount of information we get when we select an element from a finite set S under uniform distribution is the logarithm of the cardinality of that set.
  4. Shannon information: the entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X (Shannon 1948; Shannon & Weaver 1949).
  5. Kolmogorov complexity: the information in a binary string x is the length of the shortest program p that produces x on a reference universal Turing machine U.
  6. Entropy measures in Physics (Boltzmann, Gibbs, Shannon, Tsallis, Rényi): Although they are not in all cases strictly measures of information, the different notions of entropy defined in physics are closely related to corresponding concepts of information.
  7. Quantum Information: The qubit is a generalization of the classical bit and is described by a quantum state in a two-state quantum-mechanical system, which is formally equivalent to a two-dimensional vector space over the complex numbers.

Until recently the possibility of a unification of these theories was generally doubted, but after two decades of research, perspectives for unification seem better. The contours of a unified concept of information emerge along the following lines:

  • Philosophy of information is a sub-discipline of philosophy, intricately related to the philosophy of logic and mathematics. Philosophy of semantic information, again, is a sub-discipline of philosophy of information. From this perspective, philosophy of information is interested in the investigation of the subject at the most general level: data, well-formed data, environmental data, etc. Philosophy of semantic information adds the dimensions of meaning and truthfulness. It is possible to interpret quantitative theories of information in the framework of a philosophy of semantic information.
  • Various quantitative concepts of information are associated with different narratives (counting, receiving messages, gathering information, computing) rooted in the same basic mathematical framework. Many problems in philosophy of information center around related problems in philosophy of mathematics. Conversions and reductions between various formal models have been studied. The situation that seems to emerge is not unlike the concept of energy: there are various formal sub-theories about energy (kinetic, potential, electrical, chemical, nuclear) with well-defined transformations between them. Apart from that, the term “energy” is used loosely in colloquial speech.
  • Agent based concepts of information emerge naturally when we extend our interest from simple measurement and symbol manipulation to the more complex paradigm of an agent with knowledge, beliefs, intentions and freedom of choice. They are associated with the deployment of other concepts of information.

The emergence of a coherent theory to measure information quantitatively in the twentieth century is closely related to the development of the theory of computing. Central in this context are the notions of Universality, Turing equivalence and Invariance: because the concept of a Turing system defines the notion of a universal programmable computer, all universal models of computation seem to have the same power. This implies that all possible measures of information definable for universal models of computation (Recursive Functions, Turing Machine, Lambda Calculus etc.) are asymptotically invariant. This gives a perspective on a unified theory of information that might dominate the research program for the years to come. Related questions and problems are:

  • What is the relation or interaction between information and computation?
  • What is a computational process from a thermodynamical point of view?
  • Is computation in the real world fundamentally non-deterministic?
  • Can meaning be reduced to computation?
  • What is the relation between symbol manipulation on a macroscopic scale and the world of quantum physics?
  • What is a good model of quantum computing and how do we control its power?
  • Is there information beyond the world of quanta?

Conclusion and insights

There is quite a gap between what I thought this article must be about and what it actually is about. Even though it mentions epistemology, it is not an entry within that category, but rather in the philosophy of mathematics. My considerations in the expectations section are not at all related to the content of this SEP entry. I believe that Floridi’s SEP entry on semantic conceptions of information is closer to what I take to be the relevance of information for knowledge and epistemology. Maybe someday I will read the Routledge Handbook of Philosophy of Information, edited by Floridi (see Further Reading below).

However, there are some insights to take from this article, of course. Sixteen years after studying chemistry, particularly physical chemistry and thermodynamics, I now understand the outstanding significance of the Boltzmann equation, which allows a statistical description of the states of a system without knowing what the actual momentary state is. It formalizes information very elegantly and circumvents the difficulty of having too much information present at a given moment in the universe. What a genius!

I found it strikingly clear how Aristotle’s four-causes concept translates into modern views, especially how (and why) thinkers of the early scientific era had to dismiss the final cause as unscientific because it was not commensurable with their worldview. I believe we are one step further today and can make sense of normative choices (ends) as knowledge and, thus, as causal elements in decision theories. I say that as a proponent of naturalism and constructivism in meta-ethics. I find this highly important and practically relevant because it allows for the information-technological treatment of normative judgments and decision-making: normative knowledge, formerly often treated as tacit knowledge, can be formalized as descriptive knowledge and, thus, informationalised. Possibly, this will be a key step towards ethical decision-making by artificial entities like self-driving cars or control routines in cloud computing or the Internet of Things.

Evaluation and comments on the entry

This is a weird article. The historical survey (chapter 1) is much too general and lengthy when it describes the ancient Greeks, the empiricists, the rationalism-empiricism divide, or the early scientific era. Chapter 5 on the mathematical accounts of information, on the contrary, is way too specific and technical for an encyclopedia article (in my humble opinion). The author clearly demonstrates a profound in-depth understanding of the field, no doubt! Yet, I have the feeling he wrote this entry with the intention to do just that: show off what he knows. The effect was that I simply skimmed through more than half of the article, knowing that I wouldn’t miss anything philosophically important. Even though, as explained above, the epistemological insights of this essay are rather limited, Adriaans provides an impressive survey of the philosophy of information that, for most readers, I assume, is quite a challenge.

Further Reading

Luciano Floridi (ed.), The Routledge Handbook of Philosophy of Information, Routledge, Abingdon, UK, 2016.

Germans’ Problem with Face Masks

Let me start with some numbers from the SARS-CoV-2 pandemic:

  • Germany (83 million inhabitants): 199,919 confirmed infections, 9,071 COVID-19-related deaths.
  • Taiwan (24 million inhabitants): 451 confirmed infections, 7 COVID-19-related deaths.
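A quick back-of-the-envelope calculation (my own, using only the figures above) makes the per-capita gap explicit:

```python
# Per-million rates computed from the raw figures quoted above.
countries = {
    "Germany": {"population": 83_000_000, "infections": 199_919, "deaths": 9_071},
    "Taiwan":  {"population": 24_000_000, "infections": 451,     "deaths": 7},
}

for name, d in countries.items():
    infections_pm = d["infections"] / d["population"] * 1_000_000
    deaths_pm = d["deaths"] / d["population"] * 1_000_000
    print(f"{name}: {infections_pm:.0f} infections and {deaths_pm:.1f} deaths per million")
```

On these numbers, Germany’s per-capita infection and death rates are each more than a hundred times Taiwan’s.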

The reasons for this obvious discrepancy are manifold and complex (and certainly NOT the result of Taiwan not testing enough). Sensitised by the SARS outbreak years ago, Taiwan was one of the first countries to take measures against the spread of the new coronavirus. It had no lockdown, and with the help of digital technologies (like health tracking apps) even schools and universities could maintain onsite (F2F) classes the whole time. Taiwan is extremely densely populated, with 90% of the population packed into cities of more than 200,000 inhabitants. So, what’s the difference? It most likely has to do with wearing face masks.


Asia: Nothing special in this image.


Protests against corona protection regulations in Munich: “Face mask duty? Gone mad! #resistance2020”

It is known even in Taiwan that Germans (but also other Western nations) are reluctant to wear face masks. My friends here often ask me why, and I have difficulties answering. The following thoughts are not scientifically backed up, since I am not an expert on this. Academic fields to study this in depth are sociology, psychology, and possibly cultural studies and philosophy. First, some observations:

  • Germans almost never wear face masks in public. When they see pictures from East Asian cities where most faces are covered, they feel confirmed in their worldview that something must be going wrong there and that we here in Germany are smart enough to maintain a healthy and clean environment. Not having to wear face masks triggers German pride and arrogance. Moreover, many Germans think that only sick people wear masks. My Taiwanese friend in Tübingen (South Germany) got scolded on a bus because she wore a mask to protect herself: “If you are sick, why don’t you stay home?!” – “I am not sick!” – “Then why do you wear a mask?!”
  • Since there is no market for face masks, they are hard to get in stores. The production is low. That’s why Germany had quite some problems getting enough masks in the early phase of the virus outbreak.
  • Even though different sources communicated different recommendations on wearing face masks, in the early phase of the outbreak the view that they are useless dominated, and the majority of people sided with the opponents of a face mask mandate. A turning point was reached around the end of April. Meanwhile, studies indicate that many deaths could have been prevented if masks had been made obligatory in public interaction earlier.
  • On the major German news outlets, I sometimes read the comments under the news articles. It’s usually devastating, because 90% of the comments are stupid and idiotic and make me question the cognitive and intellectual sanity of the German population. Whoever had the idea of adding a comment function to news articles: Please remove it again! The general public is not capable of it! However, if these comments illustrate something like a common public mood, most people not only hate wearing face masks, they are also extremely stubborn in ignoring scientific insights. Even German society is more and more infested by postfactualism and a notorious anti-expert attitude (“My opinion is as good as that of those incompetent wisecrackers!“). Even if all evidence speaks in favour of wearing face masks, people question the numbers and refer to their personal freedom. This is not only asocial and unethical, it is also a very inconsistent conception of freedom.

That last point might be a good indication of why Germans have such a big problem with face masks. I believe there is much more behind it than simple inconvenience (“I can’t breathe properly, the string hurts, and my glasses get blurry!“) or economic (“I don’t want to spend money on it!“), environmental (“It produces too much trash!“) and aesthetic (“They look ugly!“) considerations. It seems to me it has to do with face, in an almost Confucian sense, but somehow much stronger. In the individualistic and humanistic conception of self-identity, the face is almost identical with personhood. The face is not only a channel of emotional and attitudinal communication, transmitting and receiving signals from the other, it is also a kind of ID tag that conveys either safety or threat. We Germans need and want to see the facial expression of the person in front of us, and based on that image we judge whether that person is trustworthy or potentially harmful. We feel extremely uncomfortable with masks, associating a covered face with crime, robbery, secrecy, intrigue and deception. There are laws that prohibit wearing masks in public, for example when driving cars (which is often a problem during the carnival season): The driver has to be clearly identifiable via his/her face.

This also explains why many Germans have such a huge problem with Muslim women wearing veils: No rational human being would voluntarily hide behind a veil, so the only explanation is that those old-fashioned Muslim patriarchs force the women to wear veils and, thus, violate basic human rights! Just another indication that Islam is misogynistic and oppressive! Many people can’t imagine that women with a Muslim commitment choose to wear the veil as an element of their cultural identity. Wearing a veil as an act of personal freedom? No way!

The same in this virus crisis: Wearing a face mask as an act of protection and social responsibility? No way! If I have to cover my face, my most personal trait and expression of my ego and identity, my human rights are drastically restricted! The interesting thing is that this importance of face in German self-understanding is even stronger and much more physical than the Confucian mianzi (face) concept. In Asia, you keep face through kindness, cooperativeness, respect, and obeying social rules. In Germany, apparently, you keep face by showing it with all its emotions and resentments. How can I give you my feedback if you can’t even see my face?

Maybe it just needs time. Germans are simply not used to wearing face masks yet. I also wasn’t, and I admit that I still don’t like it. But I am quite sure it is one of the factors that make a difference in dealing with SARS-CoV-2 between Germany and Taiwan. Currently, I feel very safe in Taiwan, partly thanks to the discipline of the Taiwanese people, who wear masks without any complaints.

SEP 1-41 – Feminist Epistemology and Philosophy of Science


Title: Feminist Epistemology and Philosophy of Science

Author: Elizabeth Anderson

Date: Published on August 9th 2000, last revision on February 13th 2020

Expectation: Why this topic?

The next on the list of different approaches in or to epistemology, according to Steup’s introduction article, is feminist epistemology (FE). So far, I had no special interest in or contact with feminist accounts in philosophy, but I heard that it’s a thing. I always perceived feminism as overshooting the mark: It is not about equality or emancipation, but constitutes an unnecessarily aggressive and angry attitude that wants to show the male part of our society that women are stronger, better, and smarter. Yes, I know that all this is true, so why do we have to fight about it? Freedom granted! Self-fulfilment granted! Equality and fairness granted! What is the damn problem? Of course, I may be ignorant and not aware of injustice, discrimination and all-pervading chauvinism. In my world, women and men stand on equal footing, so I don’t understand all that fuss. If I get it correctly, though, feminist approaches to anything philosophical are not the academic equivalent of the social movement labelled feminism, but address all those cases where the situation of the philosopher impacts his or her elaborations in the form of some bias. It seems feminists are almost always social constructivists in that they claim that gender, ethnicity, skin tone, cultural background, social class, etc., are classifications that cause certain modes of thinking and certain unquestioned default attitudes in the agent. That means a feminist would regard epistemological considerations as strongly tied to the predominant role clichés prevalent in a particular society. When only men do epistemology, our ideas of knowledge and its acquisition are very different compared to a situation in which women played a more vital and visible role. I cannot imagine in which way that could manifest itself concretely. Maybe Elizabeth Anderson can teach me more!


FE studies the ways in which gender does and ought to influence our conceptions of knowledge, knowers, and practices of inquiry and justification. It identifies how dominant conceptions and practices of knowledge attribution, acquisition, and justification disadvantage women and other subordinated groups, and strives to reform them to serve the interests of these groups. Various feminist epistemologists and philosophers of science argue that dominant knowledge practices disadvantage women by:

  1. excluding them from inquiry,
  2. denying them epistemic authority,
  3. denigrating “feminine” cognitive styles,
  4. producing theories of women that represent them as inferior, or significant only in the ways they serve male interests,
  5. producing theories of social phenomena that render women’s activities and interests, or gendered power relations, invisible,
  6. producing knowledge that is not useful for people in subordinate positions, or that reinforces gender and other social hierarchies.

Feminist epistemologists aim to:

  1. trace these failures to flawed conceptions of knowledge, knowers, objectivity, and scientific methodology,
  2. offer diverse accounts of how to overcome these failures,
  3. explain why the entry of women and feminist scholars into different academic disciplines has generated new questions, theories, methods, and findings,
  4. show how gender and feminist values and perspectives have played a causal role in these transformations,
  5. promote theories that aid egalitarian and liberation movements,
  6. defend these developments as epistemic advances.

FE conceives of knowers as situated in particular relations to what is known and to other knowers. What is known, and how it is known, reflects the situation and perspective of the knower. Relations of the knower to the object that is known are aspects of: embodiment; first- and third-person knowledge; emotions, attitudes, interests, and values; personal knowledge of others; know-how; cognitive styles; background beliefs and worldviews; relations to other inquirers. These factors influence knowers’ access to information and the terms in which they represent what they know. They bear on the form of their knowledge (articulate/implicit, formal/informal, and so forth). They affect their attitudes toward their beliefs (certainty/doubt, dogmatic/open to revision), their standards of justification, and the authority with which they lay claim to their beliefs and offer them to others. They affect knowers’ assessment of which claims are significant or important.

Of course, the social environment has an undeniable impact on this situatedness. Gender, as FE claims, is a mode of such social situation, and knowledge is regarded as gendered. Via the phenomenology of gendered bodies and gendered first-person knowledge, feminists arrive at gendered attitudes, interests, and values. Common questions include:

  • Can situated emotional responses to things be a valid source of knowledge about them?
  • Do dominant practices and conceptions of science reflect an androcentric perspective, or a perspective that reflects other dominant positions, as of race and colonial rule?
  • Do mainstream philosophical conceptions of objectivity, knowledge, and reason reflect an androcentric perspective?
  • How would the conceptual frameworks of particular sciences change if they reflected the interests of women?

Knowledge of others, know-how (skills), cognitive styles, background beliefs and worldviews – these factors are gendered, too. For example, deductive, analytic, atomistic, acontextual, and quantitative cognitive styles are labeled masculine, while intuitive, synthetic, holistic, contextual and qualitative cognitive styles are labeled feminine. It is seen as masculine to make one’s point by argument, feminine to make one’s point by narrative. Argument is commonly cast as an adversarial mode of discourse, like war, while narrative is viewed as a seductive mode of discourse, like love. These phenomena raise epistemic questions: does the quest for masculine prestige by using masculine methods distort practices of knowledge acquisition? Are some kinds of research unfairly ignored because of their association with feminine cognitive styles?

Feminist epistemologists have considered situated knowledge within three traditions:

  1. Standpoint Theory – Feminist standpoint theory claims that the standpoint of women has an epistemic advantage with respect to phenomena in which gender is implicated, relative to theories that make sexist or androcentric assumptions. Variants of feminist standpoint theory ground this epistemic advantage in different features of women’s social situation, by analogy with different strands of Marxist epistemology. For example, advocates of a feminine cognitive style claim epistemic advantage because ways of knowing based on caring for everyone’s needs produce more valuable representations than ways of knowing based on domination. Institutionalizing feminine ways of knowing requires overcoming the division of mental, manual, and caring labor that characterizes capitalist patriarchy.
  2. Feminist Postmodernism – Feminist postmodernist ideas are deployed against theories that purport to justify sexist practices – notably, ideologies that claim that observed differences between men and women are natural and necessary, or that women have an essence that explains and justifies their subordination. The claim that gender is socially or discursively constructed and that it is an effect of social practices and systems of meaning that can be disrupted finds a home in postmodernism. However, postmodernism has figured more prominently in internal critiques of feminist theories. One of the most important trends in feminist thinking has been exposing and responding to exclusionary tendencies within feminism itself. Another important aspect is the shifting plurality of perspectives that rejects objectivism and relativism and, instead, endorses the acceptance of responsibility for one’s epistemic situatedness and mobile positioning (trying to see from many perspectives).
  3. Feminist empiricism – Two apparent paradoxes encapsulate the central problematics of feminist empiricism. First, much feminist science criticism consists in exposing androcentric and sexist biases in scientific research, based on the view that bias is epistemically bad. Yet, advocates of feminist science argue that science would improve if it allowed feminist values to inform scientific inquiry. This amounts to a recommendation that science adopt certain biases (the paradox of bias). Second, much feminist science criticism exposes the influence of social and political factors on science. Scientists advance androcentric and sexist theories because they are influenced by sexist values in the wider society. This might suggest adopting an individualist epistemology to eliminate these social biases. Yet most feminists urge that scientific practices should be open to different social influences (the paradox of social construction). Feminist empiricists dissolve these paradoxes by rejecting their underlying assumptions employing three different strategies: pragmatic, procedural, and moral realist.

The standpoint theories have been criticized for generalizing too much over all women while, in fact, extrapolating from average middle-class (slightly privileged) white women and excluding minorities. Postmodernist and empiricist approaches seem better suited to describe the actual situation of women.

Feminist science criticism follows a pattern that seems common in most feminist interventions: First criticism and complaint, then constructive alternatives and progressive advancements. The early feminist science criticism focused on gender bias as a source of error, pointing out in which way the entire scientific endeavor is gendered and, thus, flawed. Later, such bias was taken as a resource for a pluralist re-conception of science. For example, feminist science should have a relational rather than an atomistic ontology, favor the concrete over the abstract, and encompass intuition, emotional engagement, and other feminine cognitive styles. Feminists have an interest in ontological heterogeneity (using categories that permit the observation of within-group variation, and that resist the representation of difference from the group mean as deviance), complexity of relationship (the development of causal models that facilitate the representation of features of the social context that support male power), and other values like the accessibility of knowledge, which diffuses power by being usable by people in subordinate positions. Such feminist cognitive values do not displace or compete with tending to evidence, because doing science as a feminist, like doing science with any other interest in mind (for example, medical or military interests), involves commitment to the cognitive value of producing empirically adequate theories.

Feminists question the claim that science is value-neutral (as in Lacey’s autonomy, neutrality, impartiality). Instead, they pick up Quine’s underdetermination argument that holds that all scientific hypotheses are built on background assumptions that are not necessarily scientifically backed up, and derive that science is always a value-laden inquiry. The chief dangers of value-laden inquiry – wishful thinking or dogmatism – are avoided by making findings accountable to public criticism and scrutiny. Feminist philosophers of science stress the variety of roles for social and political values in science, and the contingency of their effects. The following types of influence of social values on theory choice have been defended: Selection and weighting of cognitive values (e.g., accuracy, scope, simplicity, fruitfulness, internal consistency, and consistency with other beliefs (conservatism)); standards of proof; classification; methods; causal explanations, models, explanations of meaning, narratives; framework assumptions (for example, disciplinary boundaries).

Feminists regard the following conceptions of objectivity as problematic:

  1. Subject/object dichotomy: what is really (objectively) real exists independently of knowers.
  2. Aperspectivity: objective knowledge is ascertained through the view from nowhere, a view that transcends or abstracts from our particular locations.
  3. Detachment: knowers have an objective stance toward what is known when they are emotionally detached from it.
  4. Value-neutrality: knowers have an objective stance toward what is known when they adopt an evaluatively neutral attitude toward it.
  5. Control: objective knowledge of an object (the way it really is) is attained by controlling it, especially by experimental manipulation, and observing the regularities it manifests under control.
  6. External guidance: objective knowledge consists of representations whose content is dictated by the way things really are, not by the knower.

Instead, feminist conceptions of objectivity tend to be procedural. Products of inquiry are more objective, the better they are supported by objective procedures. Some influential feminist conceptions of objectivity include:

  1. Feminist/nonsexist research methods;
  2. Emotional engagement;
  3. Reflexivity;
  4. Democratic discussion.

Because inquiry is collaborative and reliant on testimony, what we believe is influenced by who we believe. Who we believe depends on attributions of epistemic authority, which rely on views about people’s expertise, epistemic responsibility, and trustworthiness. Feminist epistemologists explore how gender and other hierarchical social relations influence attributions of epistemic authority, considering their impact on (1) general models of knowledge; (2) the epistemic standing of knowers; (3) whose claims various epistemic communities do and ought to accept; and (4) how this affects the distribution of knowledge and ignorance in society. Some of these effects amount to epistemic and/or hermeneutical injustice against members of subordinated groups. Some feminist epistemologists have advanced conceptions of virtue epistemology to remedy epistemic injustice and ignorance.

Conclusion and insights

I have to admit that I feel emotionally detached from the feminist movement. As expressed in the beginning, I promote legal and political equality, support the destruction (or deconstruction) of prejudice, bias, discrimination and fascism of any kind, and would like to see that in our society categories like gender, ethnicity, sexual orientation or kinship status don’t matter or, at least, are not taken to justify the discrimination against minorities. But academically and intellectually, I am not very interested in these political issues. This being said, I’d like to focus on the biggest merit of FE in the past decades: The endorsement of social constructivism (in the coat of postmodernism) in realms like science and technology. I fully agree that science is a value-laden inquiry. I fully agree that the situatedness of the epistemic agent (here: the researcher, innovator, scientist,…, not just employing but constructing episteme) plays a crucial role for her knowledge acquisition and justification strategies. Anderson’s elaborations confirmed to me that my tree of knowledge about constructing meaning from experience makes very much sense. A patriarchal, androcentric society most likely produces chauvinists and misogynists or, more subtly, favours masculine practices and discourse approaches. Someone or something has to plant the seed for change, either radically or subtly. FE since the 1980s has certainly raised awareness of the biases and presuppositions that pervade scientific reasoning. This may have improved the situation of women, and maybe also that of minorities in micro-societies like the scientific community. Whether this has also advanced the scientific endeavor itself is another question that I don’t dare to answer, because this SEP entry does not provide any concrete examples in this direction (but many examples that show the disadvantages of male dominance in science).

Evaluation and comments on the entry

I noticed that Anderson revised her entry recently and made it much shorter (from 35 PDF pages to 24). I didn’t read the older version, but the current one is clear, well-organised, and sufficiently objective and critical of FE. The first chapter has exactly that aggressive and angry undertone of the social movement feminism, but later chapters explain feminist approaches more reasonably and academically. The article is enriched with many case studies (often not explicated, but indicated with references) and examples, which makes it a good resource for further reading. Most of the cited approaches and references are from the 1980s and 90s, which gives me the impression that FE might have lost its original appeal. Or, maybe, the widely accepted social constructivist view of science made the FE efforts obsolete: Mission accomplished? Feminists, I expect, will disagree vigorously!

SEP 1-40 – Evolutionary Epistemology


Title: Evolutionary Epistemology

Authors: Michael Bradie, William Harms

Date: Published on January 11th 2001, last revision on January 21st 2020

Expectation: Why this topic?

I have high-school-level knowledge about evolution. I claim to understand its basic principles – selection along the lines of fitness (a kind of suitability for the demands posed by the environment) of an organism, determined by a factor with a certain degree of variation and, thus, exhibiting different levels of success (for example, survival, conservation). In biological evolution, maybe its most prominent form, the success is the survival of the fittest, which is determined by an organism’s better adaptation (or adaptability), which in turn is the result of its genetic constitution (with mutations of DNA as the variation factor). What could evolutionary epistemology be? My first thought is that it applies the same principles to epistemic success: coming to knowledge via a selection process that rules out non-viable beliefs, unsustainable meaning construction, fallacious justifications, and erroneous truths (=falsehoods). Knowledge as the result of a refinement and selection process that works with the mechanisms of an evolutionary principle. This would make much sense to me, because that’s what I often do: resolve epistemic (but also ethical) conflicts by weighing the plausibility of the alternatives against each other, becoming clearer about the underlying premises, and narrowing down the options by defining which value needs to be served. This is a normative reasoning process, because the selection has to be grounded on something that is beneficial, desirable, advantageous, or simply good in some sense. At the same time, treating the cognitive agency towards epistemic success as evolutionary understands it in a naturalistic fashion, if I am not mistaken. Would this be a missing link to reconcile NE with VE or other normative approaches to epistemology? I also believe that Dawkins’ meme conception plays a role here, since these bits of information are the knowledge-related equivalent of DNA base pairs, susceptible to translation errors and thus causing mutations.
The environment that sets the parameters along which the selection proceeds may be the individual history of experiences, as a kind of ontogenetic evolution, or the society or culture, in a phylogenetic approach to EE. The analogue to the biological concept of inheritance would be learning in the epistemic context. Let’s see if my ideas fit with the experts’ competent survey of EE.


There are two distinct endeavours under the name evolutionary epistemology. One is the evolution of epistemological mechanisms (EEM), which focuses on how animals and humans developed cognitive faculties that allow them to have knowledge; the other is the evolutionary epistemology of theories (EET), which examines the evolution of ideas, scientific theories, epistemic norms and culture in general. EEM asks “What environmental conditions favoured the biological evolution of cognitive capability? Why this one and not another? In which way is having knowledge a form of fitness?“. EET asks “Why did some ideas and theories outlive others? What are the selection factors? If biological evolution leads to diversity, why would we expect epistemic evolution to lead to convergence on a more viable truth?“. Both understandings come in ontogenetic and phylogenetic variants.


EE is concerned with descriptive accounts of epistemology. There are three possible ways in which EE could relate to traditional normative epistemology (TE). First, it could be a competitive relationship in which either one or the other approach is the right one. Second, EE (or NE in general) could be a successor of TE by being more plausible and viable in contemporary worldviews. Or third, it could be complementary to TE, which is the most common view among EE endorsers. On this analysis, the function of the evolutionary approach is to provide a descriptive account of knowing mechanisms while leaving the prescriptive aspects of epistemology to more traditional approaches. At best, the evolutionary analyses serve to rule out normative approaches that are either implausible or inconsistent with an evolutionary origin of human understanding.

Within naturalistic epistemology, the EE approach is best suited for a formalisation and modelling of its object of study (knowledge systems). Static optimization models and their advancements with population dynamics under consideration of fitness (quantitatively and qualitatively expressed) can be transferred from established biological contexts to knowledge systems. Apparently, most of the biological parameters have analogues in the realm of epistemology, most famously the gene/meme analogy and the natural/cultural environment analogy. Moreover, and specifically, the evolution of meaning conventions under consideration of communicative acts and language has been modelled and simulated.
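The population-dynamic models mentioned here can be illustrated with a minimal sketch. The following toy simulation is my own illustration, not taken from the SEP entry: it applies discrete-time replicator dynamics – a standard formal model of selection – to competing “memes” instead of genotypes. Each variant’s share of the population grows in proportion to its fitness relative to the population mean; the fitness values below are entirely hypothetical.

```python
# Toy replicator dynamics for "memes" (hypothetical illustration).
# Each variant's frequency is rescaled by its fitness relative to
# the population's mean fitness, generation by generation.

def replicator_step(freqs, fitness):
    """One generation of discrete-time replicator dynamics."""
    mean_fitness = sum(f * w for f, w in zip(freqs, fitness))
    return [f * w / mean_fitness for f, w in zip(freqs, fitness)]

# Three competing theories/memes with made-up "epistemic fitness"
# values (standing in for, say, empirical adequacy or usefulness).
freqs = [0.6, 0.3, 0.1]     # initial shares among believers
fitness = [1.0, 1.2, 1.5]   # hypothetical selection advantages

for generation in range(50):
    freqs = replicator_step(freqs, fitness)
```

Under these assumed values, the initially rare but fittest third variant comes to dominate the population after a few dozen generations – the formal analogue of a more viable theory outcompeting its rivals, and the kind of dynamics that the gene/meme analogy transfers from biology to knowledge systems.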

Conclusion and insights

Looking back at what I wrote in the expectations section, my initial understanding of EE corresponds to what the authors label EET: refining knowledge following an evolutionary strategy. The other program introduced here, EEM, is, from my perspective, an issue of biology rather than philosophy. Yes, of course, the results from such studies should inform epistemology, but it will never provide the answer to the core questions in (normative) epistemology. I am slightly disappointed by this article because it doesn’t provide any hint whether there is or was any philosopher who tried to reconcile descriptive and normative accounts of epistemology in a fashion that consults evolutionary strategies for an evaluation of justification processes, meaning constructions or knowledge claims. I am totally convinced that the insights from EE are of utmost value for science theory! But where are the researchers who did that work?

This article confirmed to me that EE is an extraordinarily interesting and inspiring approach. Unfortunately, serving at best as a brief introduction, it doesn’t provide many details. Now and here is not the time and space to start elaborating on my own ideas and concepts. I will do more research first and sort my thoughts.

Evaluation and comments on the entry

The summary above is short because this is one of the shortest SEP entries in this series. This is very unfortunate, because I believe that much more could be said about EE! Large parts of chapter 2 (the second of only two chapters) are very general remarks about the formal description of selection and fitness, and not even about EE. Another weak point of this article is that, although the authors claim in the beginning that EEM and EET are distinct endeavours, they do not address this distinction in the remainder of the essay, so that it is often not very clear to which program their elaborations refer. Since the authors use a very readable and clear language, I wish they had taught me more about this interesting approach to epistemology! I will definitely do more research on it, starting with the handbook mentioned below.

Further Reading

Richard Joyce (ed.), The Routledge Handbook of Evolution and Philosophy, Routledge, Abingdon, UK, 2018 [there is a whole section (5 chapters) devoted to evolution and epistemological aspects]