
S&T Ethics 12

7. Examples

We are now equipped with all the tools and the overview knowledge required for an ethical evaluation of technological progress. We learned that technology is a socially and culturally embedded system that is strongly intertwined with many other social sub-spheres such as science, economy and politics. Therefore, we speak of socio-techno-scientific systems in order to capture the entirety of relations and pathways of action, rather than thinking of particular technical artefacts or techniques. These systems affect the environment and society, with particular implications for the economy as a social sphere. To have a normative framework for the evaluation of technological progress at hand, we introduced the concept of sustainability. The direct effect of the availability of certain technology-induced potentials of action – in philosophical terms: the opening-up of new means-ends-relations – is expressed in terms of risks and benefits as the unwanted and intended implications of technology. These intended benefits and potential risks relate to specific values that are at stake for certain entities, therefore called stakeholders (actors in society, the natural environment), ranging from functionality and safety aspects to economic wealth, health, social balance and others (see, for example, the VDI octagon). An important question for the proper guidance of progress was that of responsibility attribution: Who is held responsible by whom, for what, and in which respect? This endeavour of analysing and elaborating the ethical implications of technology is part of technology assessment. This approach starts with the acquisition of knowledge about specific cases, initiates a discourse with relevant stakeholders, evaluates the normative arguments, and implements decision-making and effective measures to steer technology development towards a sustainable outcome.

12-01

This may all sound rather abstract. We have seen many examples for the individual elements. However, we should now summarise all these considerations with the help of contemporary cases of socio-techno-scientific systems. First, we will have a closer look at Nanotechnology; then we will discuss current debates on nuclear energy generation.

7.1 Nanotechnology

I chose nanotechnology because it is a perfect example of a socio-techno-scientific system in which the borders between social sub-spheres, especially science and technology, get blurry, creating a need for new forms of responsibility and risk assessment. Moreover, after the experiences with biotechnology and genetics, the approaches of technology assessment shifted towards a stronger involvement of public participation and an analysis of the social values that are at stake.

Every reasonable assessment starts with a compilation of the state-of-the-art of knowledge. Before we can discuss ethical implications of nanotechnology we need to be sure that we know what it is and what it is able to do. Otherwise we might end up in speculation, which would be a waste of time and resources – a situation that actually occurred in nanotechnology assessments around the years 2000-2005. So, let’s first see what nanotechnology is and can do.

7.1.1 What is nanotechnology?

The Greek word “Nanos” means dwarf (侏儒). It was chosen as the prefix for the dimension 10⁻⁹. On a length scale, that is at the very short end. In our human world, we are used to thinking in meters, a length we can relate to our own body size. A thousand meters, which we give the prefix kilo, is still easy to grasp, since we can see it with our eyes or feel it when we drive a few kilometers in a car. A million meters (a megameter) or even a billion meters (a gigameter) can be visualised on maps or graphs, but approaches the limit of our imagination. You just can’t imagine the size of the sun! If you say you can, you are overconfident! The same goes for the other direction on the length scale: A millimeter, a thousandth of a meter, is still visible to us. A micrometer is already difficult, and a nanometer is beyond the abilities of the human eye and, therefore, beyond our imagination. One nanometer is a thousandth of a micrometer, which in turn is a thousandth of a millimeter.
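
As a quick sanity check of this prefix chain, here is a minimal Python sketch (the prefix values are standard SI, added by me for illustration):

```python
import math

# SI length prefixes as powers of ten (standard values)
prefixes = {"kilo": 1e3, "mega": 1e6, "giga": 1e9,
            "milli": 1e-3, "micro": 1e-6, "nano": 1e-9}

# One nanometer is a thousandth of a micrometer,
# which in turn is a thousandth of a millimeter.
nanometer = prefixes["nano"]
print(math.isclose(nanometer, prefixes["micro"] / 1000))        # True
print(math.isclose(prefixes["micro"], prefixes["milli"] / 1000))  # True
```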

12-02

We can get a rough idea of such a length dimension by comparison to something we know. Imagine we fill planet Earth with footballs of common size. Approximately 10²⁴ footballs fit into it. This number is unimaginably huge! Now we take a molecule that is 1nm in diameter, a “buckminsterfullerene” (C60), and fill a football with that kind of particle. Again, about 10²⁴ “buckyballs” fit into the football. That means the size ratio of the nanoparticle to the football is the same as that of the football to planet Earth. Another comparison I heard is this: When a seagull sits down on the largest carrier ship of the world, that ship goes 1nm deeper into the water. You can imagine it? Forget it! You can’t!
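
The two comparisons can be checked with rough numbers; the radii below are approximate values I supply for illustration (Earth radius ≈ 6,371 km, football radius ≈ 11 cm, buckyball diameter ≈ 1 nm), and packing losses are ignored:

```python
# Rough volume ratios behind the two comparisons in the text.
# All sizes are approximate values supplied for illustration.
earth_radius = 6.371e6   # m
ball_radius = 0.11       # m (a standard football)
bucky_radius = 0.5e-9    # m (C60, ~1 nm diameter)

def volume_ratio(big, small):
    """How many 'small' spheres fit the volume of one 'big' sphere
    (ignoring packing losses)."""
    return (big / small) ** 3

earth_per_ball = volume_ratio(earth_radius, ball_radius)
ball_per_bucky = volume_ratio(ball_radius, bucky_radius)

# Both ratios land within a couple of orders of magnitude of 10**24,
# which is the point of the comparison.
print(f"{earth_per_ball:.1e}")   # 1.9e+23
print(f"{ball_per_bucky:.1e}")   # 1.1e+25
```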

12-03

Two more remarks on the length scale and why it is important for Nanosciences and Nanotechnologies: First, as you might know, the wavelength of visible light (= recognisable by the human eye) is between 380 and 700nm. Structures smaller than that can’t be visualised with a classical microscope based on light, no matter how sophisticated the setup is! To visualise and analyse nanoscale materials or structures, different kinds of microscopes and spectroscopes are needed. More about that later. The other significant point is that most biomolecules – proteins, DNA, cell compartments, etc. – are in the range of nanometers. A carbon atom is about 0.1nm in size, making a chain of 10 carbon atoms roughly 1nm long. Through nanotechnological research, the border to biology and biotechnology becomes blurry. Therefore, we call NT a converging technology.
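
The size gap behind this resolution limit can be sketched in a few lines (values as given in the text):

```python
# Visible light wavelengths vs. typical nanoscale structures,
# using the rough values quoted in the text.
visible_nm = (380, 700)      # wavelength range of visible light, in nm
carbon_atom_nm = 0.1         # approximate size of a carbon atom, in nm
chain_of_10_carbons_nm = 10 * carbon_atom_nm   # ~1 nm

# A 1 nm structure is hundreds of times smaller than even the shortest
# visible wavelength - hence no light microscope can resolve it.
print(visible_nm[0] / chain_of_10_carbons_nm)  # 380.0
```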

Let’s bring more light into our understanding of NT by looking at its history, which, as it turns out, is not very long:

12-04

In a famous visionary lecture in 1959, Richard Feynman pointed out that “there is plenty of room at the bottom”, referring to the big potential hidden in the smallest dimensions of matter that – at that time – were almost unexplored due to technical difficulties. Feynman imagined the entire Encyclopedia Britannica written on the head of a needle. He envisioned the benefits and advantages of “building materials atom by atom”. The only problem: our “sticky fingers” that are just too big. The first person to use the term “Nanotechnology” was the Japanese scientist Norio Taniguchi. In the early 1980s the technical precondition for science at the nanoscale was invented: scanning probe microscopy. As mentioned before, visible light can’t be exploited to “see” nanostructures, since its wavelength is larger than the structures themselves. A scanning tunneling microscope instead detects the material it analyses through an electric current. Like a very small “fingertip”, its tip “scans” across the surface of the sample and detects height differences in the range of a few nanometers. Over the following decades, massive improvements and advancements have been developed, but Binnig and Rohrer’s invention was the first of its kind and – finally – gave access to the nanoscale of matter and its exploration. Not long after that breakthrough, the first concerns arose. Eric Drexler expected that researchers would sooner or later develop molecular machines that can do mechanical work on a molecular basis, but also “nanobots”, self-replicating autonomous nanosized agents that – once released – could impact the environment so massively that it can’t be fixed anymore – a scenario known as “grey goo”. Another breakthrough pushed the development of nanoscience in 1990: Scientists at IBM succeeded in manipulating single atoms. With the tip of a kind of scanning tunneling microscope they moved xenon atoms on a surface and arranged them to write “IBM”.
Feynman’s vision was one step closer to becoming reality. During the 1990s, great progress was achieved in research on nanoscale materials and the exploitation of their properties. In politics and industry, this was soon recognised as a huge potential for a new market of nano-products promising economic profit. The first nation to run a state-funded “Nanotechnology Initiative” was the USA in 2001, followed by the EU, Japan and South Korea. Taiwan, too, invested heavily in its “National Program on Nanotechnology”. Today, people even talk about a “Nano-Age” (as in “iron age” or “IT decade”), referring to NT as the most impactful technology of these times.

Let’s have a look at definitions of what “Nanotechnology” actually is. The Foresight Institute in the USA published this definition:

Structures, devices, and systems having novel properties and functions due to the arrangement of their atoms on the 1 to 100 nanometer scale. Many fields of endeavour contribute to nanotechnology, including molecular physics, materials science, chemistry, biology, computer science, electrical engineering, and mechanical engineering.

This definition refers solely to the length scale and defines the area of research activity by disciplines of science and engineering. It was criticised for not being precise enough and for inviting misunderstandings. We have seen that biomolecules are also in the range of a few to a few hundred nanometers. Does that make a cell or another organism “nanotechnology”? The definition used for the US-American NNI looks like this:

Nanotechnology is the understanding and control of matter at dimensions between approximately 1 and 100 nanometers, where unique phenomena enable novel applications. Encompassing nanoscale science, engineering, and technology, nanotechnology involves imaging, measuring, modelling, and manipulating matter at this length scale.

Here, an important aspect is added: NT requires an active manipulation of matter, a directed exploitation of nanoscale effects, to count as NT. We will see in a minute what these “unique phenomena” and “novel effects” are that these definitions talk about. We keep in mind that the basic idea of NT involves the scientific analysis and understanding of nanosized materials, exploitation of properties for engineering and product development, and the application of such products to constitute the “technology” that results from scientific activity. Besides these rather technical definitions, I would like to present another viewpoint, communicated by researchers from the IBM Watson research center:

“[Nanotechnology is] an upcoming economic, business, and social phenomenon. Nano-advocates argue it will revolutionize the way we live, work and communicate.”

This statement doesn’t contain any scientific or technological information. It simply claims – not without a certain critical undertone – that NT is a political or social construct, a concept to group certain scientific and technological activities together as a strategy to generate economic value. It points out its social dimension by mentioning its impact on the daily life of the general public – the way we live, work and communicate. This definition will be significant for the second part of this lecture. We will see how NT is discussed among different stakeholders and what kind of ethical, legal and social concerns it gives rise to. But first, let’s now see what is so special about nanosized matter:

12-05

Most of the properties of matter – chemical reactivity, electronic behaviour, magnetic susceptibility, etc. – depend on the surface size and configuration of the material, since only the surface atoms can reveal their electrons’ characteristics. The crucial parameter is the surface(表面)-to-volume(體積) ratio. Let’s make a thought experiment to understand how the surface of a material increases when it is cut into smaller and smaller particles. Imagine a cube with an edge length of 1000 atoms (about 100nm). It contains, therefore, 1 billion atoms. We can count 8 corner atoms, 12 edges with 998 atoms each (11,976 “edge atoms”), and 6 faces with almost 6 million face atoms. Altogether, about 6 million out of 1 billion atoms are on the surface; the others are inside the cube. The ratio of outer to inner atoms is 0.006. Now we cut this cube into 1000 cubes with an edge length of 100 atoms (=10nm). Each cube now has almost 59,000 surface atoms (corner + edge + face). 59,000 out of 1 million is a ratio of outer to inner atoms of 0.062. In other words: the one big cube had 6 million surface atoms; the 1000 smaller ones have, altogether, 59 million surface atoms, almost 10 times more. This increases dramatically when we cut such a 10nm cube into 1000 1nm-cubes with an edge length of 10 atoms: such a cube has 1000 atoms, of which 488 are surface atoms – almost half! In our thought experiment, cutting the first 100nm-cube into 1 million 1nm-cubes increased the number of surface atoms from 6 million to 488 million! You can imagine that this leads to a massive change of chemical and physical properties!
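
The counting in this thought experiment can be reproduced with a short Python sketch (my own illustration; the closed form n³ − (n−2)³ is simply a shortcut for summing corner, edge and face atoms):

```python
def surface_atoms(n):
    """Number of surface atoms in an n x n x n cubic lattice:
    total atoms minus the inner (n-2)^3 core."""
    return n**3 - max(n - 2, 0)**3

# 100 nm cube (edge of 1000 atoms): ~6 million surface atoms
print(surface_atoms(1000))   # 5988008
# 10 nm cube (edge of 100 atoms): ~59,000 surface atoms
print(surface_atoms(100))    # 58808
# 1 nm cube (edge of 10 atoms): 488 of its 1000 atoms on the surface
print(surface_atoms(10))     # 488

# Cutting one 100 nm cube into a million 1 nm cubes:
print(1_000_000 * surface_atoms(10))  # 488000000 surface atoms
```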

In order to understand how exactly particle size impacts material properties, we would need so much fundamental physics and chemistry that it would take an advanced course over a whole semester to get even a glimpse of it! It would be a course on quantum theory, electron vibrational modes, band gaps, and other topics that are much too complicated for an introductory lecture for non-scientists. Therefore, we won’t even try. What is sufficient for our understanding of NT are the following aspects concerning the electronic (電), optic (光) and magnetic (磁) properties of materials: All three are size-dependent, that means they differ with the size of the material. For macroscopic objects, for example, the electronic characteristics can be described using Ohm’s law (歐姆​定律). For nanoparticles it is no longer valid: electrons show a behaviour that deviates so much from the predictions of Ohm’s law that we can’t apply it to nanosized materials. This has to do with the energy states of electrons in the material: In smaller confinements (e.g. a nanoparticle) with fewer atoms, the electrons are less flexible and can’t change their positions freely; their energy states are more discrete. That means a material that shows high conductivity in bulk form (for example a cube of 1cm³ or a 10cm wire) can be non-conductive in nanoparticle form, or vice versa. Quantum effects (量子​力學) play a much bigger role than in larger-sized particles. Due to a completely different interaction with light, for example refraction or absorption, some materials have totally different optical properties, for example different emission maxima, expressed in different colours. The photo shows cadmium telluride (CdTe) nanoparticles of different sizes between 10 and 100nm. Each size has a different absorption maximum and, therefore, shows a different colour.
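
For those curious anyway, a toy “particle in a box” calculation – a standard textbook model I add here for illustration, not the full quantum treatment the lecture deliberately skips, and not a quantitative description of CdTe dots – already hints at why confinement size changes optical behaviour:

```python
# Toy illustration: ground-state energy of an electron confined in a
# one-dimensional "box" of width L, E = h^2 / (8 m L^2).
# Textbook model only; real nanoparticles are far more complicated.
H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
EV = 1.602e-19     # joules per electron volt

def ground_state_ev(box_nm):
    """Ground-state confinement energy (in eV) for a box of width box_nm."""
    L = box_nm * 1e-9
    return H**2 / (8 * M_E * L**2) / EV

# Shrinking the box by 10x raises the confinement energy 100x,
# shifting absorption/emission toward shorter (bluer) wavelengths.
print(f"{ground_state_ev(10):.4f} eV")  # 0.0038 eV for a 10 nm box
print(f"{ground_state_ev(1):.3f} eV")   # 0.376 eV for a 1 nm box
```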

12-06

Magnetic properties, too, can change drastically at smaller particle sizes: a macroscopically non-magnetic material can form ferromagnetic nanoparticles. Also, the strength of the resulting magnetic field and the coercive forces (the energy needed to switch the magnetic spin, “N” and “S” so to say) can vary in both directions (higher or lower for nanoparticles), depending on the material. Let me show you a famous example that we can see in our surroundings, at least in Europe, every day: Gold and silver don’t have a golden or silver colour in nanoparticle form. They show – like here on the left – all kinds of colours at different particle sizes, and even depending on the particle shape, for example spheres or prisms.

12-07

We can actually see these effects in coloured glasses, for example in these beautiful church windows from the Middle Ages or even older! Of course the glass manufacturers didn’t know what actually happened at the nanolevel, but they produced gold and silver nanoparticles by doping the silicate they made the glass from with traces of gold and silver salts.

12-08

from: Alan L. Porter & Jan Youtie, Nature Nanotechnology 4, 534–536 (2009)

In this figure we can see the main fields of activity of NT. Different areas and colours show the involved scientific fields: Classical natural sciences like Physics and Chemistry, material sciences, engineering on the right, the life sciences like Biology, Medicine, Cognitive sciences on the left. The size of a circle reflects the number of published research articles, which is a good measure for how much research is done in that field. The lines between circles reflect interdisciplinary collaborations of researchers from different fields. We notice two things from this overview: First, the two major fields of NT activity are material sciences (developing, investigating and improving nanomaterials) and biomedical research. Second, whereas the material sciences part is more active in producing and publishing research results, the life science part is conducting more interdisciplinary research. The high activity in the material sciences is not surprising for a new and emerging scientific field: They set the basis for the development by conducting fundamental research on properties and behaviour of new materials and their application potential. It is more remarkable that within a decade a very active body of interdisciplinary research in an “applied” field of NT arose: Nanomedicine and other human body related domains. We will see later why this is important to recognize!

We don’t have the time to go very deep into the different sub-fields of nano-particles, nano-devices and nano-techniques. The most prominent materials are inorganic nano-particles (e.g. iron, iron oxide, aluminum, silicon, etc.), carbon-based particles (fullerenes, graphene, carbon-nanotubes, etc.), organic nano-particles (capsules, almost cell-like nano-containers), and nano-structured surfaces (made by printing, scratching, burning, etc.). I’d like to show you just one example, which I believe is one of the most sophisticated applications of nano-particles, their design and their effect: hyperthermia.

12-09

This is a brain cancer therapy with iron oxide nanoparticles that are coated with a bioactive layer. It carries receptor proteins on the surface that recognise cancer cells specifically, so that the particles agglomerate in the cancer cells. The magnetic properties are exploited to kill the tumour. First, a dispersion of such nanoparticles is injected into the brain tumour. This is also the limiting factor at the moment and the reason why the method can’t be applied to liver tumours or other organs: The brain can be perforated with a needle without being destroyed; this is not possible with other organs. After the injection, the particles accumulate inside the tumour cells while healthy tissue remains unaffected. The patient is then placed into an MRI scanner that generates a high-frequency magnetic field, that is, a magnetic field that rapidly switches its poles. The magnetic nanoparticles align with the field, but due to the high frequency they start shaking or vibrating. This effect creates a lot of local heat. The tumour cells with the nanoparticles heat up until they are destroyed. The researcher who invented this treatment, Prof. Jordan from Berlin, founded a company, MagForce AG, that holds the patent and is now focusing on this kind of treatment. Clinical tests have been successful and it is now approved as a medical treatment.

To summarise, what we talk about in the context of current NT are mostly new forms of compounds and materials with newly engineered and designed effects. Fields of application are surface coatings, electronic parts, the sensor industry, materials for industrial and other applications, but also medicine (diagnosis and therapy) and the cognitive sciences. Within this range, social and ethical implications can and must be discussed. These range from toxicological studies to effects on social institutions such as the healthcare system, safety issues raised by new surveillance technology, and global equity in access to new nanotechnologies. The nano-specific issue, in most cases, is a miniaturisation of devices, which enables new ranges of performance, but also new situations of human interaction. What is still far from realisation are autonomous and self-replicating nanobots as envisioned by Eric Drexler.

7.1.2 Small Particles – Big Issues?

Why is NT such a hot topic for Technology Assessment and ELSI research? We have defined NT as a social and political endeavour earlier. Worldwide we can observe an almost campaign-like political support of NT, making it an R&D field with high social and environmental impact, not only through new substances and materials with unknown properties and uncertain effects, but also through technological output that might change “the way we live, work and communicate“. Some sociologists expressed their viewpoint that there actually is no such thing as “Nanotechnology” or “Nanoscience” since scientists, engineers and product developers do the same thing they did before it was labeled “Nanotechnology”. NT is, according to them, simply a political agenda that is run as a response to the experiences with the biotechnology/genetics field. Politicians and economists want to avoid making the same mistakes. In BT they “lost” the public to fears and irrational concerns, in NT they try hard to win the public’s trust by pointing out – almost advertising – the “great potentials” and “revolutionary benefits” of NT, facilitated by mass media and a new dimension of science communication. TA/ELSI research regards itself as the “balancing pole” to mediate between the public’s concerns and fears and the campaigners’ overconfident prospects.

Additionally, the special thing about NT is its emerging and converging character. Emerging means that its progress catalyses and accelerates its own further progress – the more phenomena are uncovered and exploited, the more fields of “Nano”-related R&D open up. Converging means that it covers and includes more and more disciplines, as we have seen in that “map of nanosciences”. Bridging traditional classifications of natural sciences and engineering, it is often grouped together with biotechnology (and genetic engineering), information and communication technology, and the cognitive and neurosciences as the “NBIC” sciences. Giving access and rise to new scientific and technological possibilities, to new technological artefacts and products that enter the industry and consumer market, it might have the potential to change societies. This is a regulatory challenge! With the idea of societal embedding in mind, the development process can’t be left unwatched on the one hand, but on the other hand a governing approach shouldn’t block useful and helpful innovations. The NT era lies beyond the positivistic S&T paradigm according to which all upcoming troubles and problems can be solved with the right policy. There are several realistic scenarios with NT-related technologies in which there actually is a threat of irreversible damage (e.g. the release of toxic nanoparticles). Finding the fine balance between careful precautionary regulation and supportive S&T governance is a big task for national and international legislators! A complex difficulty arises from the fact that NT and its scientific approach have the characteristics of an enabling technology. That means there is a lack of clear applications presenting definite means-ends-relations that could be analysed; rather, it must be expected that the knowledge and expertise in the handling and application of nanotechnological items (compounds, devices, etc.) gives access to a huge variety of potential and actual items with all kinds of effects and purposes – not all of which will be desirable!

Another achievement of NT is certainly a new rise of science ethics. The debate on the ethical aspects of this particular S&T field triggered a growing awareness of the connections between science and society, sensitised scientists and researchers to good scientific practice and an appropriate Code of Conduct in science (as you learned in the first part of this course), and led to debates on values and worldviews in general. We will see details on this later. A second effect of a similar type is the institutionalisation of technology ethics. More than any other techno-scientific system before, NT got into the focus of extensive TA – not only on the technical level, but especially also on the social and environmental level. Especially the new approaches of public participation in the discourse on NT are a merit of nanotechnological progress and its professional assessment.

With this premise in mind, we can apply the scheme of four different risk discourse strategies to the case of nanotechnologies. Remember that the same technology may have discourse elements in more than one, or even all four, fields.

Certainly, some of the NT-related risks fall into the category of the simple risk discourse. Toxicology of nano-materials in controlled and confined environments, for example in food packaging, technical coatings or in clothes, is readily possible. Here, no discourse on disagreements is necessary. However, the case of toxicology easily gets very complex, for example when the nano-materials enter “open space”, like an organism or the environment. Here, experts and researchers are needed to elaborate suitable toxicological measurement methods and assessment strategies. Due to the complexity, a disagreement on the proper methodology and the interpretation of results might arise. Therefore, a cognitive discourse is needed, but not so much a normative one, since the affected values (e.g. safety and health) are shared by all involved discourse participants.

Among the uncertainty-induced risk discourses – those arising from the fact that the newness of the development doesn’t allow comparison to previous experiences and that the future can’t be predicted – are impacts on social (sub-)systems (e.g. the healthcare system of a country) and on global balance and wealth. A big part of the NT debate was related to global equity and access to the benefits of NT (in the widest sense a matter of justice), and to opportunities and risks for developing countries. This discourse is an evaluative one, because stakeholders (besides nanotechnologists, these are politicians and regulators, but also economists) attempt to create a situation in which a certain value – justice and equity – is supported and its violation is avoided. However, since there is no experience, the implemented measures have to be reviewed and observed in order to correct them whenever necessary. The result is an iterative progress towards the desired situation, or at least away from the least desired one.

The fourth strategy, the ambiguity-induced risk discourse, applies to many heatedly debated issues arising in the context of NT. The most prominent one is nanomedicine with its methods of diagnosis and treatment (and their merging into one-step theragnostics); others are the NT-enabled possibilities of human enhancement, the synthetic food industry, or military and surveillance technologies. Here, the disagreement is on the question of which value(s) is/are at stake. Different stakeholders – which here includes the wider public, the third parties – have different preferences and viewpoints on what should be preserved and protected, or on which risk trade-offs are acceptable. For example, in the field of nanomedicine, the efficiency of treatments might go along with the generation of a huge amount of patient data and a silent corruption of privacy. Which one is to be rated higher than the other is the subject of the discourse. Here, a cognitive or evaluative analysis is not enough. It needs a normative assessment and deliberative approaches to find compromises and make decisions on how to proceed.

A problem for all non-simple risk discourses is the danger of speculation and science-fiction scenarios. When discourses are held on matters that have no reasonable foundation, a lack of credibility of the risk approach is created. There is a strong call for pragmatism by those who deal with NT-related risk issues. It is very difficult to predict the future and perform plausible technology foresight. Instead of debating future presents (situations that might or might not occur in the future) it is more reasonable to debate present futures (possible pathways of development that can be followed from the present situation onwards). TA elaborated several assessment tools that allow a convincing and meaningful analysis of technology development. As we have seen before, technology ethics finds its way into the debate in the form of a middle-way between top-down and bottom-up ethics, namely as the more pragmatic principlism approach. It requires a plausible connection between a technological context situation and a set of values that is at stake for at least some people or instances. I’d like to illustrate in the following NT-related example how this works in practice.

7.1.3 Technology Ethics in practice: The Nanopil

The Nanopil is a project by a Dutch team of nanoscientists and researchers in the framework of the Nanotechnology Program in the Netherlands. Based on the idea of an oncologist, they attempt to develop a lab-on-a-chip for the detection of colon cancer (腸癌). This form of cancer is one of the most frequent ones in Europe (or maybe worldwide). Therefore, the government is interested in simple and cost-effective screening methods. The current method requires the patient to send a stool sample to the doctor’s office and has a relatively high error margin. The nanoscientists conceptualised a pill-sized lab-on-a-chip – a tiny sensor array with sample filters and detectors – that is swallowed and travels through the digestive tract, collecting bowel fluid, detecting cancer-specific molecules, and responding upon a positive detection (the concept was presented in an animated video). There are two possible options for revealing the results: Either the lab-on-a-chip carries an RFID sender that transmits the detected concentration of the target molecule to a receiver, for example a smartphone, or the chip has a small capsule of blue dye which is released when the target molecule (the indicator for cancer) is found. The appeal of the scientists’ method is its simplicity: Patients don’t have to collect and send those humiliating stool samples anymore. Moreover, the sensitivity of the screening is greatly improved, making it much more efficient. The promoted values, therefore, are health, dignity, efficiency and maybe personal integrity.

The project was accompanied by a team of technology assessors who examined the ethical and social implications of the Nanopil. There are not many toxicological issues, since the lab-on-a-chip doesn’t release any substances into the intestines. Therefore, this project can serve as a perfect example of a NT-related development with impacts on values beyond mere toxicological risks. We may assess this project with a 3-tiered plausibility analysis. First, we ask whether the expected values are plausible and whether they are indeed plausibly affected by the new technology. Second, we ask whether the expected and promoted values are really desirable for those who are affected. Third, we point out the options that the Nanopil designers and developers have to make the Nanopil support desirable values, that is, how they can inscribe the desirable means-ends-relations into the new product.

The developers claimed that the Nanopil is a much simpler, less humiliating and more elegant detection method. Is this really true? It turned out during the research phase that the biggest obstacle to a working Nanopil is clogging and jamming with solid substances in the digestive tract. In order to work, the patient therefore has to drink 4 liters of laxative, a disgusting liquid, to clean the intestines. This part of the procedure was not mentioned in the animation, but it adds an important step that strongly affects the method’s applicability. The scientists also debated which indication method, RFID signal or blue dye, is more suitable. This has a technical dimension (the RFID chip is more difficult to implement than a blue dye capsule), but also a significant user dimension: In the case of the RFID signal, questions of data protection and privacy arise, for example who may receive the result on their smartphone – the patient only, or also the doctor? The blue dye capsule implies that the Nanopil is conceptualised as a self-test: The patient might feel insecure in interpreting the results, so that there is no added value of the Nanopil as an efficient screening method. Another important value, the screening efficiency, will strongly depend on the availability and distribution of the pill. If it is only given out by doctors to patients, the demand for produced Nanopils might be low, making it more expensive. If it is offered freely in pharmacies, the price will be low, but patients might show different patterns of usage, so that the detection method has to be reconsidered again.

The case has to be debated outside the confines of the researchers' lab, for example with doctors, patients, pharmacists, health insurers, etc. Doctors report that the majority of patients actually have no problem with the stool sample. In contrast, it will be very difficult to convince people to drink the laxative before applying the Nanopil. If this problem cannot be solved, the Nanopil will not be accepted, in their view. Patients are mostly concerned about the affordability of the screening; here, coordination with health insurers and distributors is required. Again, this has an impact on the actual pill design, so that insights into the actual value preferences in society have a direct impact on the research and development process. Last but not least, the design of the pill directly influences the doctor-patient relationship: patients feel insecure interpreting the results, especially with the blue dye option, and prefer a data exchange with the doctor. This, in turn, raises issues of data handling and data safety, especially with the RFID solution. Here, the case is an ambiguity-induced discourse that cannot be conducted by the scientists alone, not even together with the doctors. It requires the inclusion of the public and a decision by politicians (elected representatives) that directly affects the final design of the Nanopil. In any case, these considerations show that it is possible to assess a technology during its R&D phase and to implement the acquired insights into value relations in the design and conceptualisation of the item. This is the idea of constructive TA.

7.1.4 Nanotechnology in Taiwan

Let's leave the European perspective now and look at Taiwan in comparison. In the year 2000, the Executive Yuan's Science and Technology Advisory Group designated NT a "key area" of national R&D, a decision confirmed in 2002. In 2002 the National Science Council (NSC) approved, in its 157th meeting, the "National Program on Nanotechnology" (NPNT) for six years (2003-2008), later extended by a "Phase II" (2009-2015), approved in the 178th meeting of the NSC in 2008. All in all, 177 billion NT$ were pledged for nanoscience and innovation research in the two stages of Taiwan's NNI, making it the sixth-largest NT funding program worldwide in terms of financial volume. As far as communicated by the legislators, the major intention of running the NNI at this large scale was to facilitate the commercial development of nanotechnological applications. The research primarily supported by the program focused on nanobiotechnology, basic research on the characteristics of nanoscale materials, and the development of nanodevices and nanoprobes and their applications. In terms of the "science map" overview, the focus of the NNI in Taiwan was much more on contributions to the fields on the right side (material sciences, physics) than on the left side (interdisciplinary biomedicine, nanomedicine). Between 2003 and 2005, Taiwan's Environmental Protection Bureau invested 22 million NT$ in research into risk controls and environmental issues related to NT in labs and factories. 22 million out of 177 billion, that is about 0.012% (for comparison: the NNI in the USA earmarked 2% of its funding for EHS and ELSI research)! The "success" of an S&T program is difficult to determine. Its direct impact on the GDI, for example, is impossible to measure due to the complexity of the mechanisms at play between science, industry and the global markets. One hint can be derived from patent generation: here, Taiwan only ranks 15th in international comparison. For the sixth-largest NT funding program, this is rather disappointing.
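As a side note, the funding share quoted above can be verified with a quick back-of-the-envelope computation (a minimal sketch using only the figures cited in this paragraph):

```python
# Share of Taiwan's NT funding devoted to risk/environmental research,
# using the figures cited above.
ehs_funding = 22e6       # 22 million NT$ (Environmental Protection Bureau, 2003-2005)
total_funding = 177e9    # 177 billion NT$ (both NNI phases combined)

share = ehs_funding / total_funding * 100
print(f"{share:.3f}%")   # → 0.012%
```

Even rounded generously, the share stays more than two orders of magnitude below the 2% that the US NNI earmarked for EHS and ELSI research.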

In order to understand Taiwan's S&T policy approaches, we have to look a little into Taiwan's recent past. In the early years of KMT (國民黨) rule over Taiwan, the support of scientific and technological development was of no great importance to the leadership. The main goal of Chiang Kai-shek was to regain power over the mainland and leave the island again. The initiators of scientific and technological governance came from outside the elite (e.g. technocrats (工藝師) like Li Guoding (李國鼎) and foreign advisors) and needed the support of external factors and events. In the 1950s, US financial aid was paid under the precondition that certain amounts be invested in S&T development. In the late 1950s and early 1960s, Taiwan invested in nuclear physics, hoping to make advances in the development of nuclear weapons before the enemy (the PR China) would. In the 1970s, the ROC was derecognised by more and more countries, losing several sources of financial aid. At the end of the 1970s, the government, now under the leadership of Chiang Kai-shek's son, was finally convinced that only strong support of local S&T as an R&D motor under political guidance could lead to a stable Taiwan. The Hsinchu Science Park (新竹科學工業園區), the Industrial Technology Research Institute (工業技術研究院) and the Science and Technology Advisory Group were founded. Under strict authoritarian governance by the KMT (it was still the era of martial law), technological progress was fast and efficient, leading to the Taiwan Miracle. The success story following this era produced a strong scientism in Taiwan: the trust in S&T development as a guarantor of wealth and the good life. The viewpoint of the technocrats who pushed and promoted S&T support as the foundation of wealth lastingly shaped the public perception of technological progress. It is widely believed that the comforts and advances of today's Taiwanese lifestyle are the merits of a technological progress that led to higher international industrial competitiveness and economic growth.

Let me illustrate this with an example: the labelling system for nano-products. In the EU, the labelling of products containing nanotechnological components is still under discussion. It is understood as a warning label for protection, giving consumers the chance to decide not to buy such a product. Basically, it is a response to the high degree of scepticism and worry among consumers who are "afraid" of nano. It reflects the highest ethical values of European societies: freedom ("informed consent", or here "informed denial") and autonomy (self-determination). In Taiwan, the situation is completely different! The NanoMark, the world's first official nano-label, is understood as a marketing label showing consumers which of the products they can choose from contain nano-materials. It is based on the observation that the acceptance of NT among Taiwanese people is exceptionally high: they associate NT with innovation and with something fancy and new. It was found that products with the NanoMark increased their sales figures. In contrast to Germany, where companies have to be forced to label their nano-products, in Taiwan companies misusing the label for marketing purposes had to be forced to remove it, because they had put it on products that do not actually contain any nano-compounds. Trust also plays an important role: trust in the food and cosmetics industries is so low (due to a large number of reported scandals) that people have no choice but to trust the government, which implements a labelling system that gives people the feeling that there is at least something to rely on ("It has a label, so it must at least have been tested somehow!"). The considerations that motivated the introduction of the NanoMark mostly had the competitiveness of the local industry and social wealth in mind.

7.1.5 Nanoethics in books and movies

For "ethical laymen" (those who did not study philosophy or ethics), the ethical implications of science and technology (but also of other, maybe all, aspects of life) can sometimes be grasped more easily from narratives, that is, from stories and literature and their underlying moral messages. Here is a list of books and movies that feature nanotechnology and its issues:

Books:

  • Michael Crichton – Prey (2002) (麥可克萊頓 – 獵物 (奈米獵殺))
  • Stel Pavlou – Decipher (2001)
  • Neal Stephenson – The Diamond Age (1995)
  • Greg Bear – Blood Music (1985)

Movies:

  • Terminator (未來戰士)
  • The Day the Earth Stood Still (當地球停止轉動)
  • Ghost in the Shell (攻殼機動隊) (Stand Alone Complex)

7.1.6 Practice Questions

  • A nano-device-based artificial eye allows the perception of infrared (IR) light. It could help truck drivers drive more safely at night. What are the risks of implementing such a technology?
    • Remember the four levels of risk discourse
    • Hint: The availability of this device causes competition among truck drivers, indirectly forcing those who would deny such an enhancement to apply it.
  • Taiwan's NNI openly aims at "international economic competitiveness". Evaluate the plausibility of this goal with the VDI octagon.
    • By focusing on economic values, which values are specifically neglected?
  • Is it justified to talk about a "nano-age" (as in "stone age" or "IT age")? What makes nanotechnology so impactful?
    • Another way of asking: What justifies the acceptance of the high risk level?

 
