From Trust to Tracks: A Technology Assessment Perspective Revisited
The two generations of TA: From forecasting to social constructivism
Over the last three decades, technology assessment (TA) has evolved both in its concept of technology-society interactions and in its political or societal responsibilities. Traditionally, two generations of technology assessment are distinguished. The first relied on the concept of technological options and dealt with evaluating social impacts and developing scenarios for social responses. The anticipation of future changes and the democratization of the political decision-making process were at the core of TA activities. This first generation was marked by a sort of technological determinism that sustained a vision of an autonomous technology with an inner logic that would affect the future of our society in a predetermined and thus non-negotiated way. In this framework, the role of TA was understood as forecasting in order to advise political decision makers and the so-called public about sustainable and socially acceptable technological choices. This institutional organization of TA, with clear and separate roles attributed to the various actors (politicians decide, engineers design, the public does or does not accept the decisions), was hard to sustain in terms of empirical evidence.
This was also underlined by Bijker, who pointed out that the clear separation between decision makers, designers and users is an illusion when considering the socio-dynamism of technological deployment.
Since the 1980s, sociological and historical studies have developed a constructivist analysis of technology in contrast to the standard image of technology that was largely “technologically determinist”. The idea that technology is socially shaped, rather than an autonomously developing force in society or a primarily cognitive development, is not entirely new, but its present momentum and precise formulation are quite recent. Social shaping models stress that technology does not follow its own momentum nor a rational goal-directed problem-solving path, but is instead shaped by social factors. … Demonstrating the interpretative flexibility of an artifact makes clear that the stabilization of an artifact is a social process, and hence subject to choices, interests, value judgments – in short, to politics.1
As is well explained by Rip, the basic idea of Constructive Technology Assessment is
to shift the focus of TA away from assessing fully articulated technologies, and introduce anticipation of technology impacts at an early stage in the development. Actors within the world of technology become an important target group then, but the insight of recent technology studies – that impacts are co-produced in the implementation and diffusion stages – implies that technology actors are not the only ones to be involved. Within the world of technology, the preferred strategy for CTA is to broaden the aspects and the actors that are taken into account. More generally, one should work towards societal learning in handling, and sometimes managing, technology in society.2
This CTA is clearly based on the micro-analysis of technological cases and articulated around a dual vision of technology as simultaneously shaped by society and shaping it. If determinism can be seen as the major critique of the first stage of TA, the relativism related to the analysis of micro cases is one of the major risks of the second generation, since the assessment depends on the values and interests of the various actors involved in the technological dynamics. This focus on the actors, their values and their interests, together with a commitment to a descriptive methodology, makes this constructive assessment of technologies a bit disappointing regarding its political and ethical commitment to society. In other words, a sort of liberalism clouds this approach, suggesting that the “good” or the “fair” will eventually emerge from the social network involved in the construction of a particular technology. In this constructive approach, STS scientists consider that their responsibilities only apply to the social reflexivity generated by their description of the technological dynamism and its social construction.
The third generation: a revisited and militant TA
As social scientists, we demand that the next generation of technology assessment be less neutral, or more political and ethical, in its approach to new technologies. Following Introna, we consider every technological artifact as micro-politics, as a script that incorporates social and political orderings, norms, and values.3 The role of this revisited TA is to make this script transparent by explaining the different closures that shape its conception. This exercise of transparency needs some support to explore the script and to assess it. To a certain extent, we have to oppose other norms and values to the normative project implicit in the technological approach. If we do not explore those scripts by adopting a stance based on clear normative principles, we merely describe the technologies as they are decided and appropriated by actors in the field.
But is this sufficient to be sure that our society remains human? In a way, this constructive approach, by proposing that we are all actors in a technological construct, denies that those technological artifacts are dominated by vested and well-organized interests, introducing an unbalanced game of power. How can we move beyond the focus on the micro scale, in which constructive TA seems to remain, so that we can address societal issues and extend their deliberation to a larger audience? For all those reasons, we demand a more militant approach from social scientists when assessing technologies. This militant approach starts from the recognition that we have some values to defend, even if such a position is frowned upon in a general context still marked by the supposed neutrality and objectivity of science.
The first age of TA was macro-level and heavily marked by technological determinism and by institutional settings; the second age was micro-level and strongly marked by a sort of relativism due to the constructive frame. What was missing in both generations of TA was a “moral or ethical framing” based on defined principles to conduct the exploration of the artifact under consideration.
Let us briefly question the status and the meaning of those ethical principles.
According to Ladrière, ethics is based on ability or capability.4 It is not abstract knowledge, theoretical or normative, that one could define and transfer to others. Instead, it is a practice, an ability to face a situation ethically. This position is very close to that developed by John Dewey, who underlines that the search for universal and fixed norms in ethics can be compared to the quest for certainty in epistemology, which is at the source of so many problems that are badly defined and therefore never solved.5 In that sense, and according to Ladrière, the role of the so-called STS experts is not to decide in place of the concerned actors, but to make deliberation possible and to enlighten it by clarifying the ethical questions raised by the micro-politics at work.
Ladrière and Dewey suggest that we never approach an ethical problem from a “tabula rasa”, without using some ethical references or principles transmitted by tradition. But for Dewey, as for Ladrière, these principles are not fixed rules that could, like a cooking recipe, tell us by themselves what to do and how to act, determining almost mechanically the best way or the ethical course for our decisions and actions. For Dewey, these principles are explorative or analytical tools that are useful for shedding light on a particular situation and for assessing the various points of view expressed by the actors concerned. Dewey admits that general ideas such as justice, dignity, or fairness are of value as tools of inquiry to question and probe unknown ethical puzzles. They have no intrinsic normative force, but constitute a sort of moral background that may help us face an unknown moral situation.
What should those explorative principles be? In our TA practice, two explorative principles shape our analysis of technological artifacts: The first principle relates to the autonomy of the subject and the second to democracy, the two terms being intrinsically related by a process of co-originality, each being a necessary (but not sufficient) condition for the other.
Let us first introduce, very briefly, our concept of autonomy. This concept may appear very vague if we do not define it in a robust and pragmatic way. This is what Nussbaum and Sen6 do with their concept of capability, which they define by raising the Aristotelian question: which activities characteristically performed by human beings are so central that they seem to define a life that is truly human? They identify ten fundamental capabilities that make life human. Those capabilities help us understand the two faces of autonomy: autonomy as freedom from unreasonable constraints (from the state or from others) on the construction of one’s identity, and autonomy as control over (some) aspects of the identity one projects to the world. The second explorative principle, democracy, is strongly related to autonomy. Here again, the concept is very broad and barely operationalized for this explorative exercise. Along with Sen, we define democracy by distinguishing three critical ways in which it enriches the lives of citizens:
First, political freedom is a part of human freedom in general, and exercising civil and political rights is a crucial part of good lives of individuals as social beings. Political and social participation has intrinsic value for human life and well-being. To be prevented from participation in the political life of the community is a major deprivation. Second… democracy has an important instrumental value in enhancing the hearing that people get in expressing and supporting their claims to political attention (including claims of economic needs). Third…the practice of democracy gives citizens an opportunity to learn from one another, and helps society to form its values and priorities… In this sense, democracy has constructive importance, in addition to its intrinsic value for the lives of the citizens and its instrumental importance in political decisions.7
According to this approach, democracy is at the same time the condition for the autonomy of human individuals and conditioned by this autonomy.
Deep search engines: from democracy to autonomy
Based on these two explorative principles, let us examine the major issues related to deep search engines.
Deep search engines and democracy
Analyzing search engines as micro-politics means that these artifacts are not only to be considered as search tools, but also as infrastructures with an embedded social or political order. This is very clear when doing any research on the web with the help of different search engines. The result is different every time, even if some websites keep appearing on the first pages while others remain hidden, since they are not indexed at all or are ranked so low that no user will ever consult them. This is not neutral; it is not simply technology but mostly politics. This political vision of search engines is very accurately addressed by Introna and Nissenbaum, who state:
Make no mistake: These are political issues. What those who seek information on the Web can find will determine what the Web consists of – for them. We fear that technological limitations and commercial interests may conspire to disenfranchise those outside the mainstream and those who lack the resources or knowledge to promote their Web presence.8
The social shaping of those search engines and therefore their non-neutral requirements and specifications have been very well demonstrated by Cho and Roy. Exploring different engines, they point out that
most existing search engines use a “link-popularity” metric, called PageRank, to measure the “quality” of a page. Roughly speaking, the PageRank metric considers a page “important” or of “high quality” if the page is linked to by many other pages on the Web. For example, Google puts a page at the top of a search result (out of all the pages that contain the keywords that the user issued) when the page is linked to by the most other pages on the Web. In short, “currently popular” pages are repeatedly returned at the top of the search results by major search engines. The problem of this popularity-based ranking is that it is inherently biased against unknown pages. That is, when search engines constantly return popular pages at the top of their search results, more Web users will “discover” and look at those pages, increasing their popularity even further. In contrast, a currently unpopular page will not be returned by search engines (or ranked at the bottom), so few new users will discover those pages and create a link to it, pushing the page’s ranking even further down. This “rich-get-richer” phenomenon can be particularly problematic for the “high-quality” pages that were recently created. Even if a page is of high quality, the page may be completely ignored by Web users simply because its current popularity is very low. This situation is clearly unfortunate both for Web page authors and the overall Web users. New and valuable pages are ignored just because they have not been given a chance to be noticed by people.9
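The feedback loop Cho and Roy describe can be made concrete with a small simulation. The sketch below is a hypothetical toy model (not their actual methodology): it assumes each new user discovers a page with probability proportional to the page’s current link count and then links to it, i.e. preferential attachment.

```python
import random

def simulate_discovery(initial_links, new_users, seed=0):
    """Preferential-attachment sketch of the 'rich-get-richer' loop:
    each new user discovers one page with probability proportional to
    its current link count, then adds a link to it."""
    rng = random.Random(seed)
    links = list(initial_links)
    for _ in range(new_users):
        total = sum(links)
        r = rng.uniform(0, total)
        acc = 0.0
        for i, count in enumerate(links):
            acc += count
            if r <= acc:
                links[i] += 1
                break
    return links

# Page 0 starts with a small head start; page 1 is equally good but newer.
result = simulate_discovery([10, 1], new_users=500)
```

With a head start of ten links against one, the early leader captures the overwhelming majority of the 500 new links, even though both pages are, by assumption, of equal quality: popularity begets visibility, which begets more popularity.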
If we approach those search engines as filters or as scripts that mediate our access to information and knowledge, and therefore our vision of the world, we can consider them, along with Giddens10, as structures that condition our interactions. As structures, search engines cover three dimensions: meaning, since they operate a certain ordering of the world; power, since they introduce an implicit distribution of power between information operators; and norms, since they sanction certain types of behavior through indexing and ranking.
How is democracy affected by those new artifacts? Three main issues are at stake when examining search engines: first, equity and respect for minorities; second, the diversity of this new public sphere; and finally, the transparency of the regulation that supports its organization.
The equity of opportunities to exist and to be consulted on the Internet is the first and most evident issue raised by the “link popularity” metrics applied by most engines. This calls into question the diversity of the web as a public sphere and the chances for minority voices to be heard. Most search engine providers argue for the objectivity of their search results based on their metrics. For instance, Google invokes a sort of direct and participatory democracy that guarantees that the best sources of information are always offered to those interested.
Google works because it relies on the millions of individuals posting websites to determine which other sites offer content of value. Instead of relying on a group of editors or solely on the frequency with which certain terms appear, Google ranks every web page using a breakthrough technique called PageRank™. PageRank evaluates all of the sites linking to a web page and assigns them a value, based in part on the sites linking to them. By analyzing the full structure of the web, Google is able to determine which sites have been “voted” the best sources of information by those most interested in the information they offer. This technique actually improves as the web gets bigger, as each new site is another point of information and another vote to be counted.11
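The “voting” described in this self-presentation corresponds to the recursive definition of PageRank: a page’s score is a damped sum of the scores of the pages linking to it, each divided by its out-degree. The following is a minimal power-iteration sketch, not Google’s production algorithm; the toy link graph is invented, and the 0.85 damping factor follows the convention of the original PageRank paper.

```python
def pagerank(links, damping=0.85, iters=50):
    """Minimal power-iteration PageRank over an adjacency dict
    {page: [pages it links to]}. Pages with no outlinks distribute
    their rank evenly over all pages."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages  # dangling page: spread evenly
            share = damping * rank[p] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

# Toy graph: B, C and D all "vote" for A; A links back to B.
graph = {"A": ["B"], "B": ["A"], "C": ["A"], "D": ["A"]}
ranks = pagerank(graph)
```

Even in this four-page graph, the page that receives the most “votes” (A) ends up with the highest score, and B inherits much of A’s authority merely by being A’s sole outlink, illustrating how rank concentrates along the existing link structure.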
But the “good intention” of search engine operators regarding the fairness of their metrics can be disrupted both by their commercial strategy of selling good positions in their top slots and by the technical strategy of some content providers who use their skills to artificially push their ranking to the top.
Introna and Nissenbaum conclude that seekers will likely find large, popular sites whose designers have enough technical savvy to succeed in the ranking game.12 Hence a second critical issue is raised: that of the “tyranny of the majority” and the normalization or uniformity of social visions that could emerge from this process. Let us just recall the social network theory developed by Granovetter, which demonstrates the strength and importance of weak ties both for individuals and for societal wealth.13 This issue is further reinforced by the strong concentration of the field, dominated by a very small number of major search engines.
Transparency is the last but certainly the major issue in relation to the question of how search engines affect democracy. Most users are ignorant of how ranking operates and often consider it the true response to their queries and an “objective” vision of the world. This ignorance is further reinforced by the strict secrecy that shrouds the search algorithms and by the poor public information about metrics and methods published by the search engines themselves. This information, however, is critical for the trust people place in the information they get, and also for the role the web could play in sound democratic deliberation.
This brief assessment calls for better regulation of search engines in order to realize their potential to support democratic debate. This regulation can follow three paths, according to the regulation theory developed by Williamson: pure market regulation, hierarchical state regulation, and network regulation, namely heterarchy.14
Let us first examine free-market regulation. This is the one currently at work and the one claimed by major operators as the best practice to guarantee diversity and user satisfaction. But as demonstrated by Introna and Nissenbaum, search engines, and the Internet in general, are anything but a true free market where customers can access transparent information and can therefore express their preferences among clear and readable alternatives.15 Most lay users have no transparent information on the workings of these engines, let alone the technical capability to compare the ranking metrics used by the operators. Moreover, as seen previously, those free-market rules are routinely disrupted by opportunistic attitudes on the part of both the operators and powerful web page providers. To regulate those effects, operators usually propose self-regulation by adopting codes of conduct. But this regulation strongly depends on corporate and commercial interests and, more fundamentally, raises questions regarding the so-called privatization of what should be considered public space.
To restore trust, some users prefer to turn to social networks that they believe in and to which they belong. These networks play the role of intermediaries or gatekeepers between end users and the global information sphere. But here again, questions must be raised regarding the scattering effects of this strategy on online public space, which make inclusive and productive democratic debate between those intermediary scenes and their followers difficult. This also raises questions regarding the risks that “replis identitaires” (self-centered identity politics) pose for social cohesion and the development of our society.
The last regulatory path is the hierarchical one, through the hand of democratic states. What can a national state do when confronted with a global and international scene operated by transnational actors? And should a public actor intervene in this private sector? To answer those questions, it is important to consider the Geneva declaration of the World Summit on the Information Society, which defines the Internet as a global public good.16 “Public”, as was also underlined by Poullet, means accessible to everyone and giving everyone a true chance to actively participate in the Information Society.17
So, to maintain the Internet as a global public good, the Internet must be regulated. Even if this public regulation is difficult, states should at least play an active role in fostering the transparency of the patterns and metrics used by search operators in order to make their scripts as readable as possible. This could be done through different policies, such as granting certificates to search operators that provide transparent information about their metrics and ranking processes. It could also rely on public engines that help users compare what they get and do not get when using a specific engine, and that explain to users how to increase their chances of being ranked in good positions. This policy of transparency is already at work in other domains that are considered basic services but whose provision is privatized, as in the case of electricity, for instance.
Deep search and autonomy
Let us now look at the other side of the coin: the autonomy of users as citizens. Most search engines now offer new devices to contextualize and personalize delivered information. One of the values added by search engines consists of all the data collected on the search habits of their end users, which is subsequently used to shape profiles and preferences in order to push personalized and contextualized information to them. This can be considered as empowering the citizens, but it also has, as always, a reverse side. Let us just remember the story of AOL, which in 2006 accidentally allowed online access to its whole database, displaying more than 36 million queries made by 650,000 AOL users. With this error, the world discovered the back end of the search engines. All this collected data serves to infer a profile from the current searching and consuming acts of an end user in order to predict the future preferences of people sharing statistical similarities with him or her. This management of profiles and preferences is always presented as a benefit for end users and as increasing the efficiency of their search trajectory. At the same time, however, it constitutes an obscure digital iron cage that constrains users’ freedom and their capacity for self-determination.
Two points have to be addressed here: first, the lack of transparency in the way those profiles and preferences are generated and managed; second, individuals’ lack of capacity to manage their digital tracks, which means that they increasingly become “prisoners” of a story and of a social identity over which they can no longer exert any control.
This issue is traditionally addressed by legal considerations regarding privacy. In a recent article, Kessous demonstrates that the traditional regulations of privacy do not appear efficient enough to address this issue.18
Let us consider his argument. For Kessous, this regulation first endorsed a hierarchical pattern, with national and international laws and bodies aiming to protect privacy and individual freedoms. These public regulations appear quite incoherent and often ineffective and weak in a global context marked by strong liberalization and the absence of effective world regulation.
The second path is the market, based on the free will of the actors, supported by the concept of informed consent on the one hand and by opt-in and opt-out mechanisms on the other. This market regulation raises political issues regarding the concept of justice, since it creates a de facto asymmetry between the “haves” and the “have nots” in terms of their capability to act to protect their privacy and their autonomy. But this market mechanism can also prove counterproductive for the search engine operators, since their systems of preferences and profiles usually give clear primacy to the acting or clicking body as the ultimate access to the truth, rather than to the subject and his or her rhetorical or expressive capacity. In this search context, the clicking bodies are considered more objective, more reliable and more informative than the thinking or speaking persons, and more revealing of the “true” personal identities, personalities and lifestyles than whatever the individuals may tell or express. This “body paradigm” introduces a sort of paradox into a regulation inspired by the liberal frame of “free will”.
The third path suggested by the author is based on the technico-political empowerment of citizens, by providing them with technical facilities to write their story and their identity themselves by managing their digital tracks. Kessous calls these technologies “Maoïst cleaners”,19 giving people the opportunity to “reset” their profiles, to delete outdated or prejudicial links, and to restore their intellectual rights and the principle of reversibility of their social identity and life story. In my view, the hierarchical and market paths are necessary to protect people’s privacy rights, yet they are not sufficient to restore their autonomy and capacity for self-determination. This requires new technical innovations to support an effective political empowerment of the citizens.
The global economy is often treated as synonymous with the end of the national state, placed in a sort of asymmetric equation confronting large and well-organized transnational corporations. Does that mean there is no more space for an effective responsibility of national states to protect their citizens? As pointed out by Stiglitz, government definitely has a place, but it must know its place.20 The example of deep search engines demonstrates that there are still large margins for proactive roles of national states in guiding their citizens in the so-called Information Society. These roles concern education and innovation: education, by encouraging learning programs that help people better understand and decode the new search windows through which they access information and knowledge; innovation, by investing in research programs supporting projects based on “ethical value-added” engines, but also projects that empower citizens to manage and control their tracks… and hence return to them their property and restore their human right to their own identities.
1 Wiebe Bijker. “Democratization of Technology: Who are the Experts?” World Series on Culture and Technology (1995). http://www.angelfire.com/la/esst/bijker.html (accessed Dec. 2008)
2 Arie Rip. “Science & Technology Studies and Constructive Technology Assessment.” European Association for the Study of Science and Technology (1994). http://www.easst.net/review/sept1994/rip (accessed Dec. 2008)
3 Lucas Introna. “The Ethics of Things.” Working Paper, Lancaster University Management School, WP 2003/090, (2003): 19
4 Jean Ladrière. “L’éthique dans l’univers de la rationalité”. Namur: Artel / fides (1997).
5 John Dewey. Democracy and Education. The Macmillan Company (1916)
6 Martha Nussbaum and Amartya Sen (eds). The Quality of Life, Oxford: Clarendon Press; New York: Oxford University Press (1993)
7 Amartya Sen. “Democracy as Universal Value.” Journal Of Democracy 10.3 (1999): 3-17
8 Lucas Introna and Helen Nissenbaum. “Shaping the Web: Why the Politics of Search Engines Matters” The Information Society 16,3 (2000): 169-186
9 Junghoo Cho and Sourashis Roy. “Impacts of Search Engines on Page Popularity”, Proceedings of the World-Wide Web Conference (WWW), (May 2004): 20-29
10 Anthony Giddens. The Constitution of Society: Outline of the Theory of Structuration. Berkeley: University of California Press (1984)
12 Lucas Introna and Helen Nissenbaum (2000)
13 Mark Granovetter. “The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory 1 (1983): 201-233
14 Oliver E. Williamson. “The Modern Corporation: Origins, Evolution, Attributes.” Journal of Economic Literature (1981): 1537-1568
15 Lucas Introna and Helen Nissenbaum (2000)
16 WSIS. “Geneva Declaration of Principles.” World Summit on the Information Society, December 2003. http://www.itu.int/wsis
17 Yves Poullet. “Internet Governance: Some Thoughts after the Two WSIS.” In The Information Society: Innovation, Legitimacy, Ethics and Democracy, edited by Philippe Goujon, Sylvain Lavelle, Penny Duquenoy, Kai Kimppa and Véronique Laurent. Springer (2007): 203-224
18 Emmanuel Kessous. “La privacy dans les univers numériques : trois rationalités de la confiance.” In Variations sur la Confiance, edited by Claire Lobet-Maris, Benjamin Six and Robin Lucas. Bruxelles: Peter Lang. To be published 2009
19 This term was suggested to E. Kessous by F. Pallu in reference to the dethroned dignitaries of the Maoïst regime who were erased from the official pictures of the regime.
20 Joseph Stiglitz. “The Role of Government in Economic Development”. In Annual World Bank Conference on Development Economics. Washington DC (1996)
Project: Deep Search. The Politics of Search beyond Google – Deep Search conference 2008