Technology Assessment
Full course description
In the 1960s, Technology Assessment (TA) came into being as a way of assessing the risks of innovation and averting undesirable consequences of new technologies. In principle, TA aims to anticipate unintended effects of new and emerging technologies. Although it is impossible to foresee every possible consequence of a given technology in advance, TA provides insight into its potential impact on society and can thus help to steer its development and implementation.
For a long time, TA focused on the ‘hard’ impacts of technology, such as quantifiable environmental harm, or health and safety risks. Yet the ‘soft’ impacts of new technologies - such as effects on ethics or social relations - have gradually received more attention. This is why today, various forms of TA exist, such as Constructive TA (CTA), Ethical Constructive TA (eCTA), or the Data Protection Impact Assessment (DPIA). This course introduces you to a variety of TA approaches and important related concepts, like risk (society), hazard, uncertainty, and responsibility. You will learn to apply TA frameworks and concepts to examples and cases of your own interest. In doing so, you will ask: what are the limitations of these frameworks? How do they function? And how can we use them successfully to avoid undesirable consequences of new technologies?
The course offers several lectures from experts in the fields of TA - such as scholars from the Rathenau Instituut in The Hague, the TA agency of the Netherlands - philosophy of technology, and Responsible Research and Innovation (RRI). The course also includes a collaborative workshop on Generative AI in higher education, developed with FASoS’ tech lab The Plant, as well as a writing workshop in which you receive peer feedback on your final report. The assessment of this course consists of two mandatory assignments: a collaborative group presentation in the middle of the course, and an individually written TA report on a digital technology of your choice at the end.
Course objectives
By the end of this course, students will be able to:
- Explain the societal importance of Technology Assessment and its approaches to technological risk and uncertainty.
- Explain key theories, concepts, and approaches in Technology Assessment.
- Evaluate an emerging digital technology of their choice from a Technology Assessment perspective and recommend future steps for research and governance.
- Produce a professional report that assesses an emerging digital technology of their choice, using a Technology Assessment framework.
- Understand and evaluate GenAI tools like ChatGPT or Perplexity and acquire practical skills in using such technologies in teamwork contexts.
Prerequisites
None
- D. Petzold