The Wizards of OS 
"Information wants to be Free"
 
 
 

Information as a prime and primarily relational value
 

Sally Jane Norman
director of the Ecole Supérieure de l’Image
Angoulême/Poitiers, France
norman@wanadoo.fr


Current attempts to use digital tools to inventory humanity's material and immaterial assets, and to merchandise elements of our hitherto inalienable cultural heritage as information products, are both threatening and absurd. Threatening, insofar as corporate avarice today weighs heavily on certain kinds of previously accessible, shareable knowledge and experience. Absurd, insofar as the digital visionaries driving this commodification race are as short-sighted as Midas: information which is processed as packets of goods, cut off from the res publica from which it emerges and whereby it survives and evolves, is doomed. Turning information into nuggets of discrete digital gold is tantamount to killing it, because information is only meaningful in the context of human relations: it is generated, nurtured, and transformed - in short, brought and kept alive - through intercourse between active, interactive human minds. This paper attempts to focus on the participatory, social quality of information, and to stress the vanity - and danger - of information hoarding that fails to recognise this vital quality. 
 

Preface

Ideas developed here are largely fuelled by references gleaned from that vast, free source of information openly available on the internet. For many scholars, this novel way of collating source materials represents a veritable research ethic more than just a working convenience. Until recently, preparing an academic paper meant spending much time and/or money in libraries and bookshops to compile references; this is no longer so. The personal anecdote that follows is a modest indication of how freely available networked information is forging new social practices, which privilege the joint elaboration of knowledge: when asked a few years ago to write a Unesco working paper on culture and new media in keeping with an all-embracing brief and an equally unmanageable deadline, I accepted on condition that source materials be limited to those that could be harvested from the public domain, via the internet.(1) This exercise and the practical experience it imposed - e.g. the need to constantly cross-check and compare browser findings, and to zoom up- and down-stream to earlier and subsequent writings - brought home salient features of new media culture far more effectively than any theorising could have done. Moreover, it directly prompted correspondence with many authors of online texts, who generously answered queries, corrected misinterpretations, fed back their own questions, and formed an active environment for a piece of writing ultimately set in a specifically social working process. 

The text below likewise explores and exploits freely, publicly available online information dealing with the subject of freely, publicly available information. Consequently, it is heavily indebted to hypertextual meandering, trafficking, borrowing, and re-embroidering of other people's ideas. While sources are formally acknowledged as far as possible, the singular synthetic power and textual intimacy offered by this working environment are giving rise to new kinds of writing, shaped as much by editorial as by authorial activity. At the best of times, it is difficult to quantify "original" thinking, which is necessarily the fruit of long and subtle processes of apprenticeship and self-positioning with respect to others. The métissage offered by hypertextual tools further complicates the question of how we define an individual idea or thought process. Irrespective of what status is attributable to this particular piece of writing, its starting point at least should be clear: surely the most effective way to further the cause of the knowledge commons and to defend the principle of freely, publicly available information is to use it, promote it, and hopefully contribute towards it.(2)
 

THE INFORMATION TRADEOFF: "HARD FACT" VERSUS FUZZY NEW FINDINGS 

Reflection too strongly anchored in extemporaneous exchange is liable to date faster than reflection purged of the white noise of ongoing discussion and activity. As usual, there are two sides to the coin: the seductive freshness of contemporary commentaries fades fast, leaving the reader with wrinkled half-truths which are as dry as the driest axioms and a lot less useful. Just like earlier mass media - town criers, daily newspapers, radio, television - internet resources demand that we acutely apprehend the sudden immensity of available information in order to reposition, build, and exchange ideas capable of getting beyond the contingency of everyday updates and superficial descriptions. A finely gauged balance must be struck between information redeemingly, albeit fleetingly, grounded in and made relevant by pragmatic, ongoing experience, and information standing sufficiently aloof from this ongoing flux to bear more durable meaning. 

Generally speaking, information can be considered to consist of an at least momentarily irrefutable heritage of hard fact, together with new findings gleaned from ongoing arenas of activity. The constitution of a dynamic, meaningful social resource depends on a trade-off between these two components. The extent to which so-called hard fact is couched in and dependent on language - a decidedly slack, non-neutral medium riddled with preconceptions and ambiguity - is an age-old epistemological problem (notably analysed last century by Gödel and Wittgenstein). The extent to which uptake of new information generated by scientific discoveries depends on eminently human validation processes and readiness to deal with new paradigms is likewise an old issue (notably analysed last century by Kuhn and Feyerabend). Unfortunately, the current race to accumulate digital assets tends to overlook these traps, and thus to blithely ignore the fact that information is an ultimately social phenomenon, conveyed by arbitrary linguistic systems and ratified by authoritative bodies which, for all their pomp and glory, consist of mere fallible mortals. 

Knowledge derives its cohesion as a social medium from a combination of seemingly fixed elements inherited from a more or less distant past, and freshly acquired elements assimilated from the contemporary world. Young and old components challenge and highlight one another, in a constant confrontation characteristic of a living, as opposed to a dead, resource. If information is locked up in rules and procedures preventing this vital mix of the long-known and the newly-discovered, it will atrophy, just like certain ancient languages. Like those languages, it will hold keys to the era in which it was operational, shedding light on history, but will be accessible only to specialists able to use those keys. 
 

SOCIO-HISTORICAL PERSPECTIVES ON INFORMATION TRANSMISSION 

Information is largely a process-borne phenomenon, a collective undertaking dependent on exchange and contextualisation for its value and vitality, as opposed to a hard and fast commodifiable entity. Here, information is employed as a layman's term, loosely designating knowledge or learning in its transmissible and traditionally transmitted forms, preserved in, expanded, and handed down by monasteries, libraries, universities and academies, laboratories, printing and publishing houses, corporations and businesses, etc.(3) Grasped in this broader sense, information depends for its livelihood on multiple players and multiple perspectives; it makes sense for a given audience in a given context. Today's efforts to digitise and thereby sequester whole chunks of the human heritage are often sadly blind to the finesse and diversity of information recording and transmission techniques that mark thousands of years of civilisation.(4)

In ancient times, a close association of text and voice, of those who read aloud and those who gathered to listen, gave recorded information and knowledge a particular social flavour. The advent of the codex or compendium of manuscripts, replacing the scroll or tablet, was a decisive instrumental advance in that it physically freed the reader's body and hands: as the reader no longer had to stand to peruse unwieldy documents, the previously inevitable declamatory posture ceased to be a necessity. Reading could become an individual act for the seated scholar, whose free hands could annotate, illustrate, and make copies while reading (this development can be usefully compared with the one underway at present in the realm of computer-bound, hypertextual information supports(5)). Monastic and scholastic traditions subsequently fused copying and reading into a single meditative, spiritual and intellectual activity, undertaken by an individual and aimed largely at recognising given, recorded truths rather than at discovering new ideas. The Enlightenment, with its proliferation of publishing houses and the development of reading societies, book clubs, and lending libraries - as well as its wave of illicit publications and pamphlets - generated a storm of critical activity and a sense of detachment from the sources of authority previously called on to consecrate public and publishable information via royal and ecclesiastical seals. 

These are just a few of the innumerable milestones in the history of information transmission, each being obviously inseparable from a specific ideological context which in turn attaches different values and expectations to the notions and mechanisms of information circulation. Our information history, which dates back far into the predigital dark ages, can provide precious insight into current and emerging information practices, perhaps allowing better management and anticipation of their social repercussions. Without this broader historical perspective, the arrogance of corporations posturing as obligatory digital traders of our cultural legacy is likely to continue unchecked. In this case, our information heritage practices risk appearing to future generations as a formidable digital black hole, engulfing masses of knowledge and experience they are incapable of restoring in any meaningful way, because of the indiscriminate techniques used for inscription and encryption.(6)
 

ARTEFACTS OF SACRIFICE 

A metaphor might effectively bring this last point home: in pharmaceutical jargon, lesions in dissected laboratory animals that result from their slaughter for biopsy are technically called "artefacts of sacrifice". At times, artefacts of sacrifice are so severe as to obstruct and even completely rule out legibility of the awaited toxicology findings. In short, the very act of killing an animal to turn it into an object of science, a thoroughly controlled source of analytical measurements, may end up invalidating the scientific experiment that the animal was born and bred for in the first place. Poorly designed or poorly conducted experiments tend to bear within them the seeds of their own invalidity, their own downfall. Much experimental use of digital technologies for information storage purposes today runs this risk. Such technologies are being ever more vigorously exploited in the vainglorious hope of stockpiling and commodifying wholesale our entire knowledge heritage. But some of the knowledge being "put down" in the course of this digital race is, like the laboratory rats and rabbits mentioned above, being slaughtered twice over: 

  • A first time, in that digitised information is too systematically and wrongly considered to render redundant its original, non-digital support, its human or material bearer, which is thus casually and callously cast aside once the recording process is over (alternatively, it may be locked up by more circumspect digitising magnates, to secure their monopolies). 
  • A second time, in that the insufficiently sensitive digitising techniques used to cull knowledge sometimes irremediably mar and scar that same knowledge. Consequently, rather than a heritage of useful, revivable information, artefacts of sacrifice are leaving us with a legacy of useless, silent corpses. 
Today's and tomorrow's cultural assets reside not in packets of discretised information as such, but in the ways these packets are socially taken up and deployed. Information repositories are resuscitated and potentiated through acts of volition, recognition, and cognition that humans bring to bear on their contents. We do not read and write spontaneously. These skills must be learnt, and learning the codes that enable us to tap into the human knowledge heritage is necessarily a social undertaking. The strength of information resides in the hermeneutics used to process and assimilate it, in other words, in the social mechanisms which ensure the building of active thought and dialogue. 

Using digital tools intelligently means understanding the power of the standards and norms we are thereby implementing. It means apprehending their immense capacity to embrace the most heterogeneous phenomena, and their equally immense and indissociable capacity to homogenise, flatten, and devitalise the idioms and idiosyncrasies which ensure the dynamics of cultural diversity - for it is these same traits, these quirks, that contain the seeds of evolution, of adaptive energy. These obdurate points of difference are our mutant potential, and as such demand to be respected and preserved. Tools which cannot be used with the necessary discernment should be handled with care, to avoid their inflicting disastrously, definitively destructive artefacts of sacrifice on our living culture. 
 

PRIVATISED INFORMATION MINING 

The social practice of openly debating emerging ideas has traditionally subtended the building of a knowledge commons. Launching original concepts which upset the status quo always involves a certain danger: pillories and gallows, along with their modern media tribune equivalents, are regularly set up in public places to punish so-called heretics, often later acclaimed as intellectual trailblazers. Censorship of information that threatens accepted beliefs is beyond the scope of this paper, but what should be stressed here is the fact that radically new lines of thought, whether celebrated or condemned, have in the past tended to reach the public arena swiftly. Ideas challenging existing tenets, together with the reactions of refutation or assimilation they trigger, have long served to reinvigorate the knowledge commons. 

Recent large-scale shifts from public to proprietary places of information development and storage, and in parallel, from concept-driven fundamental research to product-oriented, market-honed research, are dramatically modifying this pattern. It would be silly to underestimate the degree to which past research has served vested interests: the desire to enhance the prestige of a court or empire has often motivated the creation of favourable working environments for artists and scientists - gilded prisons, in many cases. But whatever the drawbacks of such patronage, the quest for prestige guaranteed timely publication of new works: they were to be published as widely as possible within their benefactor's lifetime, to serve as testimony to his or her insight. If we look at scientific research in the modern nation state model, government expenditure until recently tended to comply with the social contract for science, which "ensured generous state funding of science in exchange for science helping to increase the security, wealth and health of nations."(7) Public-funded research remained relatively independent, and the uptake and transmission of new findings complied with an essentially linear model, whereby basic knowledge gradually cascaded to become practical know-how and, finally, the basis for material goods. 

Over the past decades, however, substantial portions of previously state-funded research programmes have been taken over by private sponsors seeking immediate financial returns on investment, making research more dependent on market imperatives and competition. Facilities which used to be havens of free exchange and debate, public arenas for developing and testing new ideas, are gradually being fenced off from the agora, as private interests tighten their grip on potentially marketable fields of investigation. (Foray and Kazancigil note that a turning point in public research ethics occurred in 1980 in the United States with the Bayh-Dole Act, authorising universities to patent publicly-financed findings(8).) The very notion of "long-term research" seems to be imploding, as the demand for spin-offs becomes increasingly explicit in public-funded programmes, whose shortening durations cater to product trends and lifespans more than to the philanthropic ideal of an enriched knowledge commons for the betterment of humanity. 
 

UNDERMINING THE KNOWLEDGE COMMONS 

To take an example of this shift in the conception of fundamental, long-term research, European Union framework projects wearing the "LTR" or long-term research label are generally three-year undertakings which place mandatory emphasis on the development of prototypes for potential EU market development. Although this approach should be understood in the context of a programme initially set up to boost industrial production, it has undeniably perverse effects when project proposals are tailored to appeal to product-hungry Commission review teams. Dazzling descriptions of third-year prototypes in project submissions tend to outshine and undermine prosaic descriptions of methodologies which, nevertheless, constitute the backbone of fundamental research. 

The more a project is geared towards building a predefined prototype, the less it qualifies as fundamental research. When the planned outcome is a near-marketable good, we are dealing rather with applied research to define, implement and test product specifications. The fact that more and more structures worldwide, including universities and other public-funded facilities traditionally devoted to intellectual exploration, are adopting a similar market-targeted stance means that fundamental research relatively shielded from short-term production pressure is beginning to look like a thing of the past. 

Then again, why can't "just-in-time", market-focussed applied research meet our turn-of-the-millennium social and cultural information needs? The answer is painfully simple: increasingly privatised research is logically giving rise to a proliferation of patents and a watertight locking off of specialist reserves subject to confidentiality restrictions.(9) As a result, arduous coordination is required to develop potentially ground-breaking areas of investigation, which frequently depend on the pooling of previously separate forms of expertise. 

Original research is of a very different nature from efforts to fine-tune products for the market so that they present a minimum cutting edge over their competitors. Such research ensues from, triggers, and subtends visions which are often the fruit of unprecedented multidisciplinary encounters; it stems as much from cross-contamination of information, from applying different lines of reasoning to a given problem, as from monodisciplinary lines of pursuit.(10) Yet it is hard to synergise cloistered information acquisitions: the phagocytic activity of major digital holdings is provoking understandable reactions from certain resistance groups, which are fiercely sitting on small but vital enclaves of hard-won proprietary knowledge, further aggravating counterproductive stalemates. Drawing together patents and licences, and breaking down multiple corporate walls of silence and secrecy, has become extremely onerous at the financial, legal and social levels. Ironically, what pretends to be a market- and profit-oriented approach to information is backfiring, as over-policed market practices are themselves preventing market development. To use a trigger-happy American expression, this phenomenon could be described as "shooting oneself in the foot". 

So one problem posed by current information privatisation trends is that they lead to a scenario known as "the tragedy of the anti-commons". To quote again from Foray and Kazancigil: "In such a situation, the risk of under-using knowledge and simply abandoning socially desirable fields of discoveries is high."(11) This deplorably wasteful situation perhaps bears salutary irony: with any luck, the mismatch between blind avarice and profitability has become so flagrant that the monopolistic stalemate may soon be visible even to its most short-sighted defenders. 
 

THE HARD DATA HERESY 

Apart from the negative social and economic effects of strait-jacketed proprietary information control, the hoarding of information on the assumption that it is stable, fixed, autonomous, and endowed with intrinsic preservable value is increasingly questionable. In many areas of hard-nosed industrial activity, supposedly particularly sensitive to issues such as the market value of proprietary information, production strategies paradoxically attach growing importance to time-bound contextualisation, and thus to the inevitable devaluation of one-time crucial information. 

To take one example of how the old hard data myth has been devalued in cutting-edge industries: if we look at such patent-prone sectors as pharmaceutical or precision equipment manufacture, or large-scale power-production plants, a key activity they have in common is that referred to as "document management". Document management involves tracking and updating specifications, standards, protocols, records, statistics, and forecasts; it consists of devising infallible seriation markers to identify potentially endless versions of reference documents. (The shift from canonical sources to evolving documents whose value derives from their place within a series is of course not limited to industry, as shown by parallel shifts from the generic ISBN (International Standard Book Number) to ISSN (International Standard Serial Number), URL (Uniform Resource Locator) or PURL (Persistent Uniform Resource Locator) type classifiers(12).)

In current information practice, the notion of hard data consigned to a definitive document thus appears increasingly anachronistic: even documents as seemingly finite as plant construction plans are only appreciable within their family tree, insofar as alterations and extensions, including and especially those made during the actual building phase, quickly become an integral part of the facility's genealogical history and production programming. Regarding documentary status, what matters most is the ability to read the most recent version against the backdrop of previous versions. Temporal or chronological contextualisation factors would thus seem to represent an intimate part of the value of information, giving it literally deeper, layered meaning. 
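To fix this idea of documentary genealogy in more concrete terms, the following minimal sketch (in Python) models a reference document as a chain of versions. The names used here - DocumentVersion, revise, lineage, the sample series identifier - are hypothetical illustrations, not features of any actual document management system; the point is simply that each state carries a seriation marker and a link to its predecessor, so that value attaches to the series as a whole rather than to any single "definitive" state.

    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class DocumentVersion:
        """One state of a reference document, identified by a seriation marker."""
        series_id: str                                   # stable identifier for the whole series
        version: int                                     # seriation marker within the series
        issued: date
        content: str
        predecessor: Optional["DocumentVersion"] = None  # link to the prior state

        def revise(self, new_content: str, issued: date) -> "DocumentVersion":
            """Produce the next version; the old one is kept, never overwritten."""
            return DocumentVersion(self.series_id, self.version + 1,
                                   issued, new_content, predecessor=self)

        def lineage(self) -> List["DocumentVersion"]:
            """Return the family tree, oldest first, so that the latest version
            can be read against the backdrop of all previous versions."""
            chain, node = [], self
            while node is not None:
                chain.append(node)
                node = node.predecessor
            return list(reversed(chain))

    # A (hypothetical) plant construction plan evolves during the building phase itself.
    plan = DocumentVersion("PLANT-A/CONSTR", 1, date(1998, 3, 1), "initial layout")
    plan = plan.revise("cooling circuit rerouted", date(1999, 7, 15))
    for v in plan.lineage():
        print(f"{v.series_id} v{v.version} ({v.issued}): {v.content}")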

Let us now look at contextualisation factors operative on the spatial, topological plane. Again, contiguity (in this case horizontal, as opposed to vertical, layering) and comparability enhance and indeed ensure the value of individual information items. In other words, a datum or parameter analysed on the horizontal plane lends itself to interpretation when read in the light of coexistent data or parameters. In a fossil fuel or nuclear power plant, for instance, a raw parameter like fuel load is significant only when associated with other contextualising data that make it meaningful. If we focus on production criteria, these data might consist of overall site output figures, the current status of the national grid, international market demands, etc. (data pertaining to production flow registers and forecasts, or previous and planned yield as a function of maintenance outage schedules, obviously constitute time-bound "vertical" factors). Alternatively, if we focus on fuel load from the physicochemical standpoint, the current situation can be read and interpreted in terms of fuel composition, filter conditions, burn-off and coolant system status, rod cladding condition and spent fuel pit capacity for nuclear loads, etc. 
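A comparable sketch can loosely illustrate this horizontal reading; every parameter name, threshold, and decision rule below is a hypothetical illustration, not real plant logic. It shows only that the same raw fuel-load figure is uninterpretable without coexistent data, and reads differently under different context meshes.

    # A raw parameter is meaningless in isolation: interpreting it requires a
    # mesh of coexistent ("horizontal") contextual data. All parameter names,
    # thresholds, and decision rules here are hypothetical illustrations.

    def interpret_fuel_load(fuel_load_pct: float, context: dict) -> str:
        """Read a raw fuel-load figure in the light of its context mesh."""
        required = {"site_output_mw", "grid_demand_mw", "coolant_status"}
        missing = required - context.keys()
        if missing:
            # Without its contextual fabric, the datum hangs in a void.
            return f"uninterpretable: missing context {sorted(missing)}"
        if context["coolant_status"] != "nominal":
            return "hold load steady pending coolant system inspection"
        if fuel_load_pct < 20.0:
            return "refuelling outage to be scheduled"
        if context["grid_demand_mw"] > context["site_output_mw"]:
            return "load supports ramp-up to meet grid demand"
        return "load adequate for current production programme"

    # The same raw figure yields different readings under different contexts.
    print(interpret_fuel_load(72.0, {}))
    print(interpret_fuel_load(72.0, {"site_output_mw": 850,
                                     "grid_demand_mw": 1100,
                                     "coolant_status": "nominal"}))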

Each reading of an information item or parameter brings into play a whole mesh of items and parameters, deployed across often tightly entangled temporal and spatial planes (the warp and the weft), providing a contextual fabric without which the datum is suspended in a meaningless void. In short, strategic information would seem to be increasingly characterised by its immense lability, and by its dependence on intricate, interwoven context meshes to acquire value. 
 

"INTERGENERATIONAL EQUITY"(13)

Debate surrounding the viability and value of the knowledge commons and public domain is inextricably tied up with notions of heritage: what do we intend to bequeath to future humankind? Within what kind of timespan - past, present, and future - should we assume responsibility for human action? How do we see ourselves in the transmission chain of which we are a part, and on which life depends? 

The human Freud aptly described as a "prosthetic god" (Civilisation and its Discontents) is being endowed with ever more powerful technical means to apprehend and comprehend the universe. Today's chronotelescopic eyes enable us to track, date, and quantify past activities, monitor current events, and conjure up increasingly substantiated prospective visions. Yet rather than inculcating a stronger sense of heritage and ethics, our new vantage point seems to have generated a cancerous form of social vertigo and amnesia. Observed through the lens of long-term history, our era is all too easily characterised as a time of hubris, of shortsighted avarice commanding immediate and maximum exploitation of an insufficiently respected legacy. 

As stated earlier, anachronistic information hoarding and partitioning practices may result in under-exploitation and even outright abandonment of socially necessary fields of discovery. In an attempt to break this deadlock, Foray and Kazancigil postulate and vindicate an ethics of "intergenerational equity", and the accompanying notion of a moral commitment to future generations to uphold the common good.(14) This ethics deserves to be discussed at all levels - in keeping with an "inter-epochal" as much as an "inter-generational" timeframe. 

Take the temporally vast issue of our environmental and ecological legacy: stochastically governed, non-human-centred evolutionary processes have ensured the sustainability of the biosphere for several billion years, notably via the recycling of basic elements. Over the past couple of centuries, however, the increasingly determinant action of technics has transformed our biospheric heritage, yielding growing quantities of dead-end byproducts that are not amenable to recycling, and irreversibly depleting certain natural resources. This rapid switch from a self-run biosphere to a human-run noosphere has been precipitated by an often laudable, legitimate desire to improve living conditions and, thereby, to increase the common good. But an unholy mix of short-, medium- and long-term interests has distorted development strategies, cutting them off from the broader planetary perspectives without which they cannot make sense. A billion-year biospheric bequest has been overthrown wholesale, without our ensuring that we have enough foresight to substantiate centenary, let alone millenary, forecasts.(15)

If this same concept of bequest is applied to current attitudes towards our cultural legacy, as has been done by Foray and Kazancigil, the current era can again be sadly slated for its hubris and avarice, for its failure to safeguard, nurture, and transmit a knowledge commons at least equal in magnitude to the knowledge commons we have been fortunate enough to inherit. It is surely no coincidence that pleas to counter and temper mankind's recent extravagance should be gaining strength in a world of networked immediacy, where pools of resources - air, land, water, knowledge - are becoming uncannily and forcefully visible at the level of the planet and of humanity as a whole. 
 

TECHNOLOGICAL PATHETIC FALLACY AND INFORMATION MAIEUTICS 

To continue and conclude on this vaguely optimistic note, certain emerging trends in the public arena have the merit of positively demarcating themselves from information sequestering and hoarding habits. These trends indicate recognition and valorisation of the subtle, tenuous, hybrid human-machine relationships that are being engendered by our information practices.(16) To return to the European Commission's IST programmes, it is interesting to see via current calls for proposals where much new research energy is being channelled and accompanied. A change in focus seems to have occurred during the past decade (this shift is already apparent in the IST or Information Society Technologies acronym, which has superseded the Commission's earlier IT or Information Technology acronym). Rather than machines and pipes, raw processing and containment capacity, what is being increasingly - or at least equally - highlighted these days in infocommunications research is the intelligence and adaptability of distribution mechanisms, the tailoring and scaling of information delivery rates, enhanced congeniality of hubs to ensure convergence and integration of heterogeneous databases and user populations, multicontextual experimentation with given toolsets, conversational functionalities, affective computing, and intelligent authoring systems. In short, we are witnessing a drive towards increasingly sophisticated, behaviourally-modelled attributes that stubbornly strive to come closer to intuitive human traits, valorising our ability to dialogue, to interact, to stimulate, catalyse, and spur on collective thinking. 

Examples of this evolution are evident in the recently chartered realm of "CSCW" - Computer Supported Cooperative Work - launched in the early nineties to optimise the use of computer groupware in networked situations. To come up with new approaches to shared virtual workspaces, computer scientists called on sociologists, psychologists, and ethnomethodologists to imbue previously "cold" architecture and software design with unfamiliar "soft science" criteria, often awkward to grasp but likely to flesh out these new workspaces in more humane ways. Presence, attention, recognition, and other concepts underlying the subtle proxemics of social interaction became central research issues for multidisciplinary platforms that indeed accomplished major breakthroughs - at both industrial and social levels. A few years ago, the need to further enrich use of new infocommunications tools led a number of CSCW pioneers to engage artist co-researchers, to heighten the aesthetic appeal and engageability of collective virtual spaces. One such endeavour is "eRENA", a project from the i3 family ("intelligent information interfaces"), recently completed under the aegis of the EU Esprit programme. In its focus on the potential dramatic and theatrical impact of networked electronic arenas, eRENA investigated such knotty issues as "mixed reality boundaries", seeking to map out and invest virtual space with mechanisms and spatiotemporal codes and symbols as effective and communicative as those employed on the agora, in the theatre or stadium.(17)

This tendency towards a more vivacious, inhabited conception of information architectures and tools might be seen to translate a kind of "technological pathetic fallacy". Pathetic fallacy is a metaphorical literary device which consists of attributing human behaviour and affect to non-human things. A simple example is the common name of Salix babylonica, which eloquently humanises this elegant tree with its "weeping willow" appellation. A trickier example is the title of this conference, which imputes human volition to the purportedly abstract phenomenon of information ("Information wants to be free") - then again, poetic licence here makes explicit the organisers' defence of a humanistic conception of information. Along similar lines, there seems to be a growing desire these days to "humanise" technology, to endow it with the reactive, responsive qualities which are likely to make it a more useful and stimulating sparring partner for its human makers. 

The level of complexity, the interdisciplinary challenges, and the largely unpredictable cross-disciplinary encounters instituted by this technological pathetic fallacy make cut-and-dried partitioning practices not only archaic and counterproductive, but plain ludicrous. The noosphere, the essentially intellectually driven world we henceforth inhabit, as opposed to the stochastically governed biosphere of a bygone evolutionary age, is generating a specific information systemics, a living domain of information sharing and culture. Within this systemics, richly polysemic boundary objects and exchange processes are progressively upstaging proprietary, commodifiable information values. The fact that certain research programmes are beginning to recognise these emerging forces is significant, even though we still need to invent appropriate social practices and attitudes to accompany and reinforce them. A kind of knowledge maieutics, a subtle skill of information midwifery, is called for if we want our information offspring to be viable rather than stillborn, to be alive and kicking rather than the cadaverous artefact of a sterile sacrifice. 
 
 

Sally Jane Norman 

Saintes, October 2000 





NOTES 
 

1. http://www.unesco-sweden.org/Conference/Papers/Paper9.htm. Written in 1998, this paper is now dated in certain respects, particularly given the strategy employed to collate source materials. Then again, physical publications are just as prone to dating, the real battle consisting of trying to extract viable longer-term sustenance from the grist of the contemporary information mill. Issues broached in the Unesco paper possibly of interest here include the problem of circumscribing public and/or private cultural property, and of potential violence due to thoughtless cultural decontextualisation and reappropriation (i.e. the rape of privy ethnic mores by digital cultural wholesalers). Also discussed was the inordinate power of standardised information storage and routing architectures, exclusively and logically developed by technologically privileged actors, and therefore often experienced as an unwelcome diktat by technologically underprivileged peoples faced with either cold digital surrender or cultural extinction.

2. Amongst the many online source materials appropriated in the course of this reflection, the French Solaris review has been a decisive influence, to which explicit references hereunder cannot do full justice. Founded by Jean-Max Noyer, Ghislaine Chartron, and Sylvie Fayet-Scribe, this review is directed by the InterUniversity Research Group in Information and Documentation Sciences. Its goal is to discuss certain issues pertaining to the production and circulation of information and knowledge, in the general context of digitisation of the sign, and the development of electronic networks. Cf. http://www.info.unicaen.fr/bnum/jelec/Solaris/

3. ""Information" is a polysemic term. Its historicity needs to be specified, to avoid anachronistically projecting into the past a concept which dates back to the nineteen-fifties. A broad definition is envisaged here because, from the historical viewpoint, information is globally linked with learning and knowledge, and with various specific forms of knowledge - theological, mythical, philosophical, technical, scientific ­ to be appropriated by individuals who do not yet have the status of "professionals" in society, but who work in privileged sites of knowledge and learning: libraries, monasteries, corporations, universities, laboratories, publishing hourses, academies, businesses, etc." Sylvie Fayet-Scribe, "Pourquoi poser la question de l'histoire de l'accès à l'information dans différentes disciplines et dans différentes temporalités", Solaris n°4, http://www.info.unicaen.fr/bnum/jelec/Solaris/d04/index.html (translation SJN).

4. Roger Chartier, "Du Codex à l'Ecran: les trajectoires de l'écrit", Solaris n°1, http://www.info.unicaen.fr/bnum/jelec/Solaris/d04/index.html

5. Cf. Chartier, op.cit.

6. "Fine-tuned recognition of contexts within which certain events are transmissible and others are not is indispensable, if we really wish to make shared virtual spaces more fully and humanly inhabitable. In Maori culture (as in many cultures with strongly articulated transmission protocols), fear of the dissolution of treasured knowledge through its wholesale delivery to the world at large is in some cases leading to quiet death of that knowledge, borne to the grave for want of a sufficiently comprehensive human relay, a vital new carrier. Too much human wealth is going gentle into that good night for precisely such reasons". S.J.Norman, "Kupenga, Knots, Have-Knots", http://intertwine.aec.at/it2texte/norman.html

7. Dominique Foray and Ali Kazancigil, "Science, Economics and Democracy: Selected Issues", document prepared for the World Conference on Science, UNESCO-ICSU, Budapest, 1999. http://www.unesco.org/most/foray.htm

8. Foray and Kazancigil, op.cit.

9. Foray and Kazancigil, op.cit.

10. The complex issue of developing coherent review and validation structures for multidisciplinary undertakings has been tackled by several Solaris contributors; see W. A. Turner, P. de Guchteneire, K. van Meter, "Merit Review, Digital Library design and cooperative cognition"; Hervé Le Crosnier, "Les journaux scientifiques électroniques ou la communication de la science à l'heure du réseau mondial"; Françoise Renzetti and Jean-François Tétu, "Schéma d'organisation de la presse périodique électronique"; Solaris n°3, http://www.info.unicaen.fr/bnum/jelec/Solaris/d04/index.html

11. Foray and Kazancigil, op.cit.

12. For a well-versed overview of the question of changing documentary standards in the digital era, see Elisabeth Giuliani, "Les enjeux de la normalisation à l'heure du développement de l'information "dématérialisée"", Solaris n°6, http://www.info.unicaen.fr/bnum/jelec/Solaris/d04/index.html

13. Foray and Kazancigil, op.cit.

14. Foray & Kazancigil, op.cit.

15. Josef Gitelson, Director Emeritus of the Institute of Biophysics, Krasnoyarsk, Siberia, described this situation in a paper on "Man-Made Noospheric Closed Ecosystems", at the Institute of Ecotechnics Conference, Aix-en-Provence, October 2000.

16. A spate of high-tech acronyms testifies to these trends: CHI (Computer-Human Interaction) and the tellingly inverted but less pronounceable HCI (Human-Computer Interface) conferences abound and propound a plethora of affective, behavioural, and intelligent (?) computing techniques. How human and computer thought processes are merging in our infocommunications networks is discussed by Jean-Max Noyer in "Vers une Nouvelle Economie Politique de l'Intelligence", and by William A. Turner in "Penser l'entrelacement de l'Humain et du Technique: les réseaux hybrides d'intelligence"; Solaris n°1, http://www.info.unicaen.fr/bnum/jelec/Solaris/d04/index.html

17. See http://www.nada.kth.se/erena/