Online Publications: Towards Cybersociety and “Vireal” Social Relations

From Printed to “Wikified” Encyclopedias: Sociological Aspects of an Incipient Cultural Revolution

Release 2.5, June 2007
1. Introductory remarks

The success of the Wikipedia is spectacular
insofar as it contradicts most current common sense assumptions as well as almost
all conventional theories about human motivation and social organization.
Even the founders – who originally aimed at a conventional elitist project
(Nupedia) – were completely surprised by the processes of incessant growth
and expansion they inadvertently set off in January 2001 when the
English WP was started.
Ironically, the most central premise of WP
(that the "swarm intelligence" [1] constituted
by all contributors together surpasses any individual wisdom) is regularly
compromised by the very inaccurate collective predictions users make about
the further development of their own project. For example, when English WP
users were invited to guess when the one millionth article would be posted
(it actually appeared on March 1, 2006), more than 220 of them considerably
overestimated the time needed, while only 46 were about right (or somewhat
below the actual time).[2]

From a sociological point of view, there are three major reasons for studying
the Wikipedia, because it illustrates so profoundly how modern human society
and culture are being reshaped and transformed by the Internet and its still
unexplored potentials of global digital communication.

1) On the macrosociological and
macrocultural level, studying the WP provides insights into how the shift from the
printing era to the digital age goes along with revolutionary new ways to
produce, organize, distribute human knowledge, and how such transformations
affect the worldwide interrelationships between different institutions, collectivities,
nations and cultural regions. In particular, we see that patterns of
monocentric elite guided cognitive systems give way to more open, dynamic and
polycentric knowledge cultures, and that knowledge may become more
independent from money and power as well as from the sphere of formalized
education and academic credentials. Thus, the question arises how these
changes affect the complexity, scope and dynamic adaptation of human
knowledge, its relationship to political, economic and academic-scientific
spheres, its coherence and acceptance thru different cultures and demographic
strata, and its characteristics on substantive and epistemological levels. 2) In a mesosociological perspective, the
WP (like Linux) contradicts the established (Weberian) wisdom that complex
cooperative performances and products can only be realized in centralized and
formalized settings of bureaucratic organization. Instead, we see that some
of the most complex of these productions can evidently also take place in
informal, decentralized open source networks continuously activated by unpaid
volunteers who deliver their contributions according to their own preferences
and judgments, without overall blueprint planning, formal role assignments
and hierarchical controls. After six years of unimpeded evolution, the
Wikipedia has grown into a complex organization from which many insights
about the preconditions, functional prerequisites, consequences and limits of
open source network communities can be gained.

3) On the microsociological plane, finally,
the Wikipedia begins to change the basic ways in which individuals (or small
groups) search, select, retrieve and apply knowledge at their workplace, in
school, as medical patients or voting citizens, or in any kind of private
situation. Particularly impressive are the new potentials to access relevant
knowledge without cost and delay under almost any circumstances and role
conditions (even while on the move), and the capacity to enact self-guided
learning processes by navigating through hypertext structures in a
personalized fashion. In addition, we see formerly passive "readers"
being transformed into hybrid "users" who switch flexibly between
receptive and contributive roles, and who use encyclopedias in a prosaic,
instrumental fashion - not to be compared with the intimidating status
display effects emanating from thirty-odd exclusive leather-backed
tomes on polished mahogany shelves.

Unquestionably, the breathtaking complexity
and dynamics of the WP make it extremely difficult to reach a sound,
scientifically founded judgment of the project as a whole - particularly
regarding its characteristics. Thus, there may never be a comparative study that includes
more than a few of the 250 WP's worldwide - simply because no research
team is acquainted with so many languages. And many studies on the accuracy
or comprehensiveness of the reported information may be curtailed because any
findings quickly become obsolete as a result of incessant updating
and modification. As a consequence, the following passages have the more
modest aim of carving out some insights on a more general level (by comparing
printed and wikified encyclopedias), and of illustrating the arguments with
anecdotal evidence from only three major Wikipedias: the English, German and
French.

2. The new "asymmetric competition" between open source networks and conventional bureaucratic organizations

When the Internet became popular some ten
years ago, many pundits predicted that like all the preceding conventional
media (press, radio and TV), it would soon become commercialized and
dominated by professional groupings and large bureaucratic organizations.
However, the subsequent developments provide little support for such upbeat
economic expectations.

On the one hand, the "dot-com crisis" of 1999/2000 illustrated that
many business models imported from the era of top-down (or one-to-many)
communication were ill-suited to the new Net environment, where everybody has
the same technical means for creating, transforming, storing, copying and
transmitting information. On the other hand, the decentralized and interactive
features of the new medium have recently come to dominate in unprecedented and
spectacular ways: particularly in the rising prominence and significance of
user-created content.

First of all, it is
striking that the biggest and most successful players on the Web are those
that rely on "bionic software" (You Mon Tsang) [3] by
aggregating and analyzing the information generated constantly by millions of
users. [4] By exploiting the "Long Tail"
(Anderson 2004) of small and irregular users, they gain more knowledge and
produce more useful services than conventional enterprises that typically
focus on a much smaller number of "essential clients". [5]

On a most elementary level, BitTorrent
exemplifies this basic principle of Web 2.0: such services improve with an
increasing number of users, because everybody contributes the computing
and storage capacities of his own personal computer.
On a more complex level, "eBay
enables occasional transactions of only a few dollars between single
individuals, acting as an automated intermediary. Napster (though shut down
for legal reasons) built its network not by building a centralized song
database, but by architecting a system in such a way that every downloader
also became a server, and thus grew the network." (O'Reilly 2005). Similarly, Google and
similar search engines base their page-ranking algorithms on the surfing
activities of users and on the hyperlinks set by all webpage producers
(Barnett 2005); Amazon derives its attractiveness from collaborative
filtering methods, through which users get recommendations about what they might buy
next; and in the case of "del.icio.us" [6],
thousands of users co-produce a search engine based on the public exchange of
bookmarks and the tagging of visited sites.
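To make the link-based part of this mechanism concrete, the following is a minimal illustrative sketch (added here, not part of the original text) of the idea behind Google's PageRank: pages are ranked by iterating over a toy link graph, so that pages receiving many links from well-linked pages end up on top. The graph, the damping factor and the iteration count are arbitrary assumptions, and the user-behavior component mentioned above is not modeled.

```python
# A minimal, simplified PageRank power iteration over a toy link graph.
links = {                      # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
damping, rank = 0.85, {p: 1 / len(pages) for p in pages}

for _ in range(50):            # iterate until the ranks stabilize
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        for target in outgoing:
            new_rank[target] += damping * rank[page] / len(outgoing)
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # most "linked-to" pages first
```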
In all of these cases, the Web as a platform gives rise to new manifestations of "emergence": in the
sense that qualitatively new "molar" products arise out of the
combination of a very high number of (sometimes extremely tiny)
"molecular" contributions. Secondly, we can observe the
rise of the “Blogosphere” as a new non-commercial and non-professional arena
of interactive public discourse: challenging the traditional monopoly of the
monological “mainstream media” to steer public agenda setting and to
shape public opinion. And thirdly, we are most fascinated
by the rise of peer-to-peer networks that successfully compete with big
corporations (or interorganizational systems) in generating goods and services of the highest complexity mankind has ever
produced. Thus, P2P file sharing networks are easily capable of substituting
for the conventional music industry in distributing songs on a worldwide basis;
by pooling their excess computational capacities, 4.5 million PC users are
able to constitute the most powerful supercomputer on earth (SETI@home) for
searching for signs of extraterrestrial civilizations; thousands of networked
software developers are able to compete with Microsoft in producing
GNU/Linux, an operating system comparable to Windows; and innumerable
unauthorized collaborators pool their knowledge to create the Wikipedia: an
encyclopedia that matches or even surpasses the Encyclopaedia Britannica or
the German “Brockhaus” in at least some crucial ways. As seen most succinctly in the rivalry
between Microsoft and Linux [7],
the emergence of open source communities has given rise to an
"asymmetric competition" between social organizations that produce very
similar products, but with completely different (even antagonistic)
cooperative structures.
Such “open content communities” (Reagle
2004) or networks of “commons based peer production” (Benkler 2006) are
characterized by at least twelve common characteristics that set them in a
sharp contrast to conventional bureaucratic organizations.
Of course, these twelve features are
functionally interrelated. For instance, the lack of payment is caused by the
fact that products cannot be commercialized, and it has the consequence that
no selective recruitment practices, hierarchical controls and rigid working
duties can be implemented. While we all know that voluntary activities are quite
competitive with formalized organizations in many modest small productions
(like cutting hair or cooking a meal), we are astonished to realize that
nowadays, they seem to challenge bureaucracies also in the realm of the most
complex products of goal-directed human cooperation: e. g. computer operating
systems and encyclopedic works.

3. The Wikipedia as an encyclopedic project

There is no doubt that the Wikipedia aspires
to be an “encyclopedia” in the precise sense of this highly traditional term,
because in his article “What the Wikipedia is not” [8],
founder Jimmy Wales takes great care to deny explicitly that it is something
else: a dictionary, a news feed, a collection of essays, an instruction
manual, a repository of links, a directory or (horribile dictu) a vehicle for
propaganda, self-promotion and advertising. Of course, this insistent
explicitness is highly necessary because thousands of contributors all over
the planet have to be precisely instructed so that they are able to behave
in conformity with the rules and to recognize and correct deviant entries.

It is advisable to take a short look at the
glorious tradition within which the Wikipedia insists on placing itself –
despite the cogent insight that by passing from paper to the Internet,
literally everything is different from what it ever was before. In its extensive
article on the concept, the Encyclopaedia Britannica defines encyclopedias as
“summaries of extant scholarship in forms comprehensible to their readers.” As such they have
existed for some 2000 years in very different sizes and formats and as products of
highly variable (individual and collective) modes of compilation. The term
“Enkyklios Paideia” originally implies that all human knowledge can be
represented in a closed circle and mirrors an ordered cosmos that can be
explicated in a consensual and definitive way, because human knowledge is
thought to be basically stable and the world to which it refers not subject to
fundamental change.

Typically, encyclopedias are representatives
of cultural epochs that (aim to) contain their most authoritative and respected
knowledge in condensed form. As a consequence, the most comprehensive
editions also tend to be written in the language in which most contemporary
knowledge is produced (Latin in the Middle Ages and English today).

Encyclopedias usually address themselves to
“interested educated laymen” who are consulting them in matters where they
are not experts, but about which they are able and motivated to gain reliable
basic knowledge. As a consequence, they have to strike the right balance
between a high-level educated language and a simple, widely readable style.
This also implies that the best contributors of articles are often not the
most eminent scholars (because they are often too much absorbed by current
research), but many second-rank experts (e. g. teachers, writers or
officials) who have professional reasons to acquaint themselves intimately
with a particular topic or who have accumulated their knowledge in the course
of their occupational experience or institutional career. Encyclopedias induce such
contributors to do their best to cast their specialized elitist knowledge
in a more popular form, so that rather broad strata of “interested readers”
are able to understand the texts without consulting auxiliary sources (like
dictionaries etc.).

In many conventional encyclopedias, there
was no guarantee that each article stems from an expert in the corresponding
field. In Diderot's Encyclopédie, for instance, about three quarters of all
articles are said to have been provided by a single collaborator (Chevalier
Louis de Jaucourt).

Until the Renaissance, most encyclopedias
didn’t address a general public, but a specific elite socialized within
specific circles or formalized settings (e. g. clerics). The printing
press then gave rise to a much less circumscribed, anonymous public:
consisting of expanding bourgeois strata, academics and “intellectuals” who
acquired knowledge by self-education or in other informal ways. In the
course of modernization, encyclopedias have changed due to the rapid
expansion and fragmentation of existing knowledge on the one hand and the
growing divergences of different knowledge spheres on the other. Thus, earlier
cosmological architectures have given way to neutral alphabetical orderings
of articles, because societal consensus about ontologies and the priority of
different knowledge spheres has evaporated. And tight connections to
educational systems and powerful cultural elites have been loosened, because
knowledge became increasingly distributed broadly among various population
segments (even rather marginal and politically dissident groups). In
addition, ambitions of authoritative knowledge codification were abandoned.
Instead, many more recent encyclopedias can be rather understood as a
reaction to rapidly increasing flows of new publications: by satisfying the
need for shorter digests that provide easily accessible information and
orientation (Yeo 2001).
A major step in this evolution was the appearance of Diderot's
“Encyclopédie” (published between 1751 and 1772), which was promoted and realized by
pre-Revolutionary intellectuals who maintained a critical distance toward
classical authors, and even an overt hostility toward the reigning political
and religious regime (Munzel 2003).

Objectivity was particularly cultivated in
highly “eclectic” epochs characterized by highly pluralistic cultural elites:
e.g. in the encyclopedias of the old Roman Empire like Pliny’s “Historia
naturalis”. In the 19th century, cultural pluralism
gave rise to highly objectivistic compilations (like the German Brockhaus)
devoid of all ambitions for synthesizing knowledge or transmitting it in a
pedagogical fashion.

While these traditions persist to the
present, the 20th century gave rise to huge governmentally sponsored
encyclopedias whose mission was to reflect the knowledge culture of a
specific nation. However, even highly authoritarian and totalitarian regimes
(e. g. the Soviet Union or Italy under Mussolini) have produced encyclopedias
in which most subjects are treated in a rather open-minded, non-ideological
way. By compiling existing knowledge from a multitude of sources,
encyclopedias seem to be intrinsically disposed to affirm the autonomy of
objectivistic cognitive orientations vis-à-vis the restraining influence of
powerful societal actors, reigning ideological fashions and established
cultural institutions. In the following, it shall be demonstrated
how the Wikipedia fits into this encompassing history of human endeavors to
articulate and transmit the essential canon of knowledge of a specific epoch
or culture. On the one hand, it is easy to show that the migration to an
interactive online hypertext format is a necessary step in order to realize
encyclopedias adapted to the conditions of contemporary societies, because
traditional paper editions are not able to keep pace with the amazing
variety, complexity and dynamic change of science and other current
cognitive cultures. If the Internet did not yet exist, it would have to be
immediately invented in order to secure the continuity of encyclopedic
ambitions. On the other hand, it is also evident that by trying to realize
old ideas with new technologies, something radically new is emerging for
which we don’t yet have adequate conceptual schemes. Within the short history of the Internet,
the Wikipedia stands out as one of the most successful non-commercial Web
projects, at least in quantitative terms: since its inception in January 2001,
it has branched out into more than 250 languages comprising more than 5 million
articles. [9] The English version alone has grown to
about tenfold the size of the Encyclopaedia Britannica and is currently adding
about 50 000 new articles every month. [10] In
September 2006, comScore reported that, with 154 000 000 unique visitors
per month, the web address "wikipedia.org" ranked
twelfth worldwide (behind giants like Google, Microsoft, eBay and Amazon). [11] With
56 million monthly visitors, the English Wikipedia occupies rank 10 among all
English websites, while the German version even maintains rank 6 among all
German-speaking sites (with 8.9 million visits in June 2006). [12]
The site’s traffic is heavily boosted by Google which places Wikipedia
entries regularly at the top of search result lists. Concerning criteria of quality, judgments
are controversial and difficult to verify objectively, because the system is
so big and volatile that nobody is able to cognize and evaluate it as a
whole. While fundamental criticism abounds in many publications [13],
websites and discussion fora, one of the leading researchers of open source
networks, Yochai Benkler, concludes intuitively that ”most of the commercial,
proprietary online encyclopedias are not better than Wikipedia along any
clearly observable dimension” (Benkler 2006: 168). By comparing 42 articles
of the WP with 42 analogous entries in the Encyclopedia Britannica, the
prestigious magazine "Nature" found that the average number of
errors was 2.92 in the EB and 3.86 in the WP - thus concluding that the level
of correctness is rather similar in both publications. (Less transparent are the differences in
softer spheres of human knowledge (e. g. in the social sciences, arts and
history), because inadequacies can less easily (and less consensually) be
assessed.)

These spectacular achievements contrast
with the very modest costs involved in building up the system and in running
the whole enterprise. Currently (2006), founder Jimmy Wales is managing his
enterprise with only four full time collaborators and a small yearly budget
of about 1.5 million dollars (provided mainly by small donations of between 50
and 100 dollars given to “Wikimedia”, the supporting parent
organization).

With such very small financial investments,
the WP has reached the highest status attainable on the Internet:

· the status of a “portal site” that serves as an entry page for millions of users;

· the status of a one-stop reference site not only for lay users, but for professional “multipliers” like journalists or teachers who disseminate Wikipedia knowledge in all other media.

It cannot be denied that the
Wikipedia has factually become a serious competitor to Encarta, Columbia,
Grolier, EB or other conventional encyclopedias, because it is increasingly
used as a unique reference source for reliable information.

A rather valid indicator for this growing
reference status is the rising frequency of the phrase "according to
Wikipedia" on a rising number of webpages. In May 2005, this expression
already got 22 000 hits on Google; by January 2007, this number had risen
to 740 000 [14].
In addition, there are very likely innumerable copyings of Wikipedia material
without indication of the true source. Such concealment is very common because
WP citations are still very much discouraged, especially in academic
settings.

Thus, the Wikipedia provides vigorous
evidence that some highly optimistic expectations about human online behavior
may under certain conditions come true – despite the fact that they collide
fundamentally with traditional common sense assumptions and established
theoretical concepts:

· that user-created content provided by unpaid voluntary collaborators can be highly attractive to worldwide web publics;

· that widely respected knowledge results from highly anarchic production processes in which everybody can participate without any (academic or other) credentials ("out of mediocrity, excellence");

· that thousands of unpaid collaborators can be found worldwide who engage in highly demanding work for purely intrinsic reasons: making contributions that add neither to their material wealth nor to their personal reputation;

· that a highly complex worldwide collaboration network can survive and continuously expand on a highly informal and non-economic basis: without being supported by large amounts of money, paid administrators and formal bureaucratic rules and sanctions.

4. “Wikis” as tools for focused and cumulative intellectual productions

A major shortcoming of the “dead wood era” was
that for mere physical reasons, writing on paper does not lend itself to
higher levels of collective cooperation. Just because it is cumbersome (or
plainly impossible) to circulate paper sheets so that everybody can add contributions, the notion of individualized
authorship is reinforced. Such high "transaction costs" are a major
reason why the production and application of knowledge is still characterized
by a rather low division of labor (Teece 1988; Ciffolilli 2003). In fact, most
intellectual work has remained on a primitive “handicraft” level: contrasting
increasingly with manual labor, which became ever more collectivized in the
course of industrialization. Thus, not only "monographs",
but even many "encyclopedic" works tend to be mainly anthologies of
articles written by individual authors. Cooperation is largely restricted to
the intervention of an editor (or editing committee) that may modify or
shorten the article or give it back for revision. Collective cooperation is
mostly confined to the “molar level”: interrelating articles, streamlining
their formats and fitting them into predefined blueprint structures.

In very early encyclopedias, even this
higher-level cooperation was largely absent, because they were produced by a
single editor (like Cassiodorus, Honorius Inclusus or Vincent of Beauvais)
who acted mainly as an anthologist: by just selecting and aggregating existing
texts. Even many eighteenth-century encyclopedias were the products of single
compilers, such as Chambers' Cyclopedia of 1728 and the first edition of the
Encyclopaedia Britannica, issued between 1768 and 1771. By the very end of
the century, however, the task of "compiler" (who collects given
texts) had metamorphosed into that of "editor" (who commands, directs,
selects and modifies incoming contributions). By contrast, digital media in general and
the computer networks in particular provide many alternatives for more
sophisticated forms of cooperation: ranging from completely open
collaborations in which everybody can participate to closed circles which
restrict access by various means of digital control. For the first time in
history, collaboration on the very micro level is also technically supported:
by software tools of “collaborative writing” that enable groups of any size to
work collectively on the same article or sentence and to influence even the
tiniest details of spelling, grammar and punctuation.

On the most general level, it can be said
that the Internet is equally apt to facilitate two basically divergent modes
of collective verbal communication: On the one hand, it supports discursive
communication [16] by
enabling users to express themselves personally, almost like in an oral
discussion. Such exchanges result in “threads” consisting of all the posted
messages filed in the order in which they have come in over time, without any
mechanisms available to synthesize or systematize what has been written [15].
As seen in newsgroups, web fora, chats, blogs and other forms of online
discourse, threaded online communication makes the achievement of consensus
usually more difficult than face-to-face discussions because even in very
large groups, everybody can articulate his personal opinion at any time,
without referring to what has been communicated by others. Consequently,
online discussion groups are more functional when an increase in complexity
is sought (e. g. by "brain storming") than when the reduction of
complexity is the goal (as in decision-directed deliberations; Kerr/Hiltz
1982: 99f.; Gallupe and McKeen
1990; Geser 2002). In addition, the diachronic
structure makes it very hard for readers to harvest the yield of the
discussion: especially in the case of newcomers who need much time to sift
through all the accumulated materials.

On the other hand, online communication
also supports synthetic document-mode communication, where the individual
contributions become bricks and mortar of a larger collective production:
regardless of the time of posting and the identity of their originators.
“Wikis” (invented by Ward Cunningham in 1995) can roughly be compared to naked
concrete walls that can be painted on by everybody.
However, while paint on walls tends to
stick irreversibly (or to leave traces when removed), entries in wikis can
easily be erased by anybody who does not agree. Thus, an evolution of memes
is started in which the "fittest for survival" are those with which
most participants do not disagree.
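To make this notion of cheap technical reversibility concrete, here is a minimal illustrative sketch (an addition to this text, not part of the original argument) of the data structure behind a wiki page: every edit is stored as a new revision, so removing an unwanted entry is itself just another edit that restores an earlier state.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WikiPage:
    """Minimal model of a wiki page: every edit is kept as a revision,
    so any contribution can be undone by reverting to an earlier state."""
    revisions: List[str] = field(default_factory=lambda: [""])

    def current(self) -> str:
        return self.revisions[-1]

    def edit(self, new_text: str) -> None:
        # An edit never destroys information; it only appends a revision.
        self.revisions.append(new_text)

    def revert(self, to_revision: int = -2) -> None:
        # Reverting is itself just another edit that restores an old state.
        self.revisions.append(self.revisions[to_revision])

page = WikiPage()
page.edit("Wikis were invented by Ward Cunningham in 1995.")
page.edit("Wikis were invented by aliens.")   # a vandalizing edit
page.revert()                                  # cheap, lossless undo
print(page.current())                          # the original statement is back
```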
However, technical reversibility is not
enough: there has to be incessant, intensive activity on the part of
users to correct any abusive entries within the shortest possible time. In
the case of conventional printed encyclopedias, every maintenance activity
can cease at the moment they are delivered and distributed. Wikipedias,
however, remain functional only as long as a very large number
of editing users remain watchful and active. Otherwise, vandalizations
– even if produced by very tiny user fractions – would remain uncorrected, so
that the whole system would be continuously degraded.

While thread communication boosts
subjective self-expression and individualization, Wikis support processes of
supraindividual community-building and objectification.
While the thread mode is functional
for facilitating communication processes, the document mode gives
priority to their results: as in the case of most conventional written
texts whose final form provides no information about the antecedent processes
that have led to their creation. Thread mode communication is based on the
“Heraclitean” (or Hegelian) premise that true knowledge emerges within a dialectic
discourse between diverging communicators and remains open to dynamic change.
Document mode productions rely on the “Platonic” assumption that true
knowledge takes the supraindividual form of objectified “ideas” or “theories”
whose eternal truth can finally be ascertained beyond all interpersonal
disputes. In
contrast to the “dialectical” blogosphere, Wikipedian philosophy relies on
the premise that true knowledge is produced in a continuous cumulative
process of aggregating and synthesizing information, not in a process of
dialectical discourse. Consistent with this epistemology,
Wikipedia participants are advised to engage in “constructive cooperation rather
than adversarial strife”.
In some cases, thread-mode productions are
subsequently transformed into documents in order to systematize and simplify
the information and to ease its diffusion to additional participants: e. g.
in the case of FAQ pages which briefly inform newcomers about the goals,
values and norms that have been elaborated in the preceding discussions. In a
similar fashion, the Wikipedia combines the two modes by paralleling each
article page with a discussion page where dissensual aspects concerning the
articles (e. g. conflictual views about scope or terminology) can be fought
out. But the relationship is highly asymmetric, because the discussions are
just an auxiliary tool for improving the quality of the article, while the
article is not seen as an input for fuelling the discussion. In a functional perspective, Wikis can be
considered the informational analogue to assembly lines in the industrial
era. Like the latter, they provide the technological basis for aggregating an
infinite number of modest individual performances into a highly complex end
product that stands out as an object dissociated
from all its individual co-producers. [18] In no other sphere of
text production has the shift from individual to collective authorship been
so fundamental as in the Wikipedia, where typical articles may well be the
product of several hundred edits executed by many dozens of collaborators.

The success of the Wikipedia even depends
highly on this “fine-grained modularity”, because extensive participation can
only be generated when even users with very modest skills, very little time
and rather low work motivations see the opportunity to make valuable
contributions (Benkler 2006: 100). This also has the effect that articles are
mostly “endogenous creations” shaped by the cumulative influences of the
different collaborators, so that exogenous dependencies (on earlier
encyclopedias) are less pronounced than in the printing age where new
encyclopedias were often very much influenced by their historical predecessors.
[19] While Wikis support convergent collectivist
cooperation, they nevertheless do not provide intrinsic mechanisms for
synthesis and systematization. In printed encyclopedias where each entry
is usually constructed by a single author, longer articles usually have a
highly structured, coherent architecture (e. g. by progressing from more
general to more specific aspects). In the Wikipedia, by contrast, articles
are usually the product of many independent contributions of piecemeal parts,
because nobody is given the responsibility to take care of the article as a
whole. Thus IBM researchers have found that most collaborators simply add or
cancel specific words, sentences or passages, while very few reorder
paragraphs or reorganize the article as a whole (Viégas et al. 2004). Like many other texts
that are digitally created these days, many Wikipedia articles thus tend
toward a low level of overall structuring and integration, because they are
the product of a loose sequence of copy-and-paste procedures spanning a
long period of time.
In many cases, the structure of an
article becomes irreversibly fixed at the time of its creation (or soon
afterwards), so that the synthetic capacities (or incapacities) of its
originators acquire a decisive weight.

5. Six Dimensions of WP growth and evolution

Analyzed from various perspectives,
the WP shows a consistently accelerating pattern of growth. Expansion rates
were particularly spectacular in 2006, when the total number of active
Wikipedians as well as the number of edits, articles, words, images and
internal links (in all WP's worldwide) more than doubled between Oct.
2005 and Oct. 2006.

In the following, it is demonstrated that the WP unfolds in a six-dimensional
space: all dimensions contributing to its quantitative size and ubiquity on
the one hand and to its qualitative significance on the other.

5.1 Worldwide multilingual diffusion

Since its inception in early 2001, the
Wikipedia has been a global project rapidly expanding to all major languages,
ethnicities and geographic regions. In Dec. 2006, the statistics page on
“multilingual ranking” listed currently active Wikipedias in no less than 250
(!) languages: among them dead idioms like Sanskrit and Latin as well as
almost all subnational languages of Europe that have little or no
tradition of writing (like Alemannic, Ladino, Piedmontese, Sorbian and
Greenlandic). [20] However, only 176 of
these had more than 100 articles, 110 more than 1 000, 52 more than 10 000 and
12 more than 100 000 entries. [21] As seen from Figure 1, all the larger and
middle-sized WP's drawing on more than 200 contributors were founded before
the end of 2003, while the smaller versions covering minority languages have
been steeply multiplying up to the present. While these versions diverge extremely in
terms of size and growth (in accordance with the population carrying them),
this rapid diffusion and ubiquity is most astonishing, because the question
arises what makes the acceptance and active support of WP's so independent
of any specific cultural and societal conditions.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

In the course of 2006, the WP has come to rank
among the 30 most frequented websites in most regions of the
world. It enjoys a particularly high status in all German-speaking countries
(Germany, Switzerland and Austria), where it occupies rank five or six. On
the other hand, its popularity is least pronounced in some East Asian
countries: because of blockages (in the case of China) or strong competition
from similar domestic sites (Table 1). [22]

Table 1: Position of Wikipedia in the rank
order of all websites visited by the country’s population (based on page
views on Dec. 15th, 2006) [23]

5.2 Staff expansion

Figure 2 shows that the total number of active
collaborators developed rather moderately in the first four years, and
then multiplied about eightfold (from 38 000 to more than 300 000) from
Oct. 2004 to the present (Dec. 2006). Evidently, the English WP has been far
better able to keep pace with these worldwide developments than the two
European WP’s, which have experienced a much smoother (while still perfectly
continuous) expansion.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Of course, part of this rise is
attributable to the fact that the curve is cumulative: thus including many
earlier users who have since discontinued their collaboration. In fact, only about
25-30% of all these contributors belong to the current “labor force” (=
individuals who have made at least 6 edits in the current month).
Interestingly, this percentage has remained rather stable during the last
three years: except in the German case, where the percentage of actives has
continuously dwindled.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Looking at the expansion of the active user
base, it becomes even more evident that the whole global WP system as well
as the English WP is experiencing exponential growth, while the German and
French Wikipedias are characterized by a much more moderate expansion (Figure
3).

As the growth data in the Zachte statistics are available for each month, it may
be asked whether the evolutionary pattern can be adequately modelled by a not
too complicated mathematical equation, from which short-range or even
middle-range future predictions may be derived. Given that new contributors
are constantly arriving while older ones are leaving, the cumulative
historical number of active Wikipedians is likely to rise without limits.
Therefore, a polynomial or exponential equation seems more fitting than a
logistic curve that approaches an unsurpassable highest value. By trying out
different formulas, it is found that the rising curve of total collaborators
within the last two years can almost perfectly be fitted by a quadratic
polynomial (in the German case) or a cubic equation (in the three other
cases) (Table 2).

Table 2: Modeling the growth in the total
number of contributors with cubic polynomial equations (covering the monthly
figures from Oct. 2004-Oct. 2006).
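A fit of this kind is easy to reproduce with standard numerical tools. The following is a minimal sketch (added for illustration, not taken from the original analysis) that fits a cubic polynomial to a monthly contributor series; the series shown is a synthetic placeholder, and the real figures would have to be taken from the Zachte statistics pages cited above.

```python
import numpy as np

# Synthetic stand-in for the monthly totals of contributors (Oct 2004 - Oct 2006);
# replace with the actual series from stats.wikimedia.org.
t = np.arange(25)                                            # months since Oct 2004
contributors = 38_000 + 2_000 * t + 150 * t**2 + 12 * t**3   # illustrative data

# Fit a cubic polynomial, as reported for the worldwide, English and French WP.
coeffs = np.polyfit(t, contributors, deg=3)
fitted = np.polyval(coeffs, t)

# Coefficient of determination as a simple goodness-of-fit measure.
ss_res = np.sum((contributors - fitted) ** 2)
ss_tot = np.sum((contributors - contributors.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print("cubic coefficients (a3, a2, a1, a0):", coeffs)
print("R^2 =", round(float(r_squared), 4))
```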
As is to be expected, the core of highly
active individuals (with more than 100 edits per month) is much smaller and less
subject to expansion. Unsurprisingly, its relative size was most prominent at
the incipient stages of the project, and it seems to decline constantly in
the course of ongoing expansion (Figure 4). Interestingly, the nucleus of
activist Wikipedians is significantly larger in France than in Germany or in
the Anglo-Saxon countries. While the number of worldwide (hyper)activists
has not kept pace with the broader base of participants, it has nevertheless
about doubled each single year: enlarging considerably the pool from which
administrators, arbitrators, bureaucrats and other incumbents of
supervisory and integrative duties can be drawn.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Nevertheless, the best curve fittings for the
absolute rise in highly active collaborators are achieved when logistic
(instead of polynomial) equations are used, in which the curve
approaches a maximum value that is not much higher than the present figures
(Table 3). Thus, the equations predict that this "ruling elite" of
administrators and other activists will only rise very modestly in the coming
years: at most by ca. 20 percent.

Table 3: Modeling the growth in the number
of highly active contributors (who posted more than 100 edits in the last
month) with a logistic equation (covering all monthly figures from 2001 to
Oct. 2006).
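In the same illustrative spirit, a logistic ceiling of the kind described above can be estimated with a standard nonlinear least-squares fit; the sketch below again uses a synthetic series of monthly counts of highly active contributors, so the printed numbers are placeholders rather than results from the actual data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Logistic curve that approaches the ceiling L."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Synthetic stand-in for the monthly number of highly active contributors
# (>100 edits/month) since 2001; the real series would come from the
# Zachte statistics.
rng = np.random.default_rng(0)
t = np.arange(60)                                      # months since 2001
y = logistic(t, L=5_000, k=0.15, t0=36) + rng.normal(0, 50, t.size)

# Fit the logistic model; p0 supplies rough starting values for L, k and t0.
params, _ = curve_fit(logistic, t, y, p0=[y.max(), 0.1, t.mean()])
L_hat, k_hat, t0_hat = params

print(f"estimated ceiling L = {L_hat:.0f}, growth rate k = {k_hat:.3f}, midpoint t0 = {t0_hat:.1f}")
print(f"implied headroom above the latest value: {100 * (L_hat / y[-1] - 1):.1f}%")
```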
5.3 Diversification

The multilingual proliferation of
Wikipedias leads primarily to an expansion on the level of articles, because
each collectivity contributes its own particular localities, personalities,
cultural productions and historical events. Compared to the skyrocketing
trend on the global level, the growth of every single Wikipedia (even the
English-language one) is rather modest (Figure 5).
To check whether expansion occurs in a
linear or in an exponential fashion, it is analyzed how expansion rates
change over time. As seen from Figure 6, the creation rates of new articles
have increased very much at the world level, while the rates of the other
encyclopedias have flattened out: indicating an almost linear pattern of
growth.

Very similar to the growth of contributors,
the recent increase in articles (2004-2006) also follows a curve that can be
adequately modeled by a cubic polynomial equation, in which one of the
coefficients (of the first or second degree) is at the zero level (Table 4).

Table 4: Modeling the growth in the total
number of articles with cubic polynomial equations (covering the monthly
figures from Oct 2004-Oct 2006)
5.4 Elaboration

While the global WP system expands mainly
by steep rises in the number of articles, the English, German and French
versions give more weight to an increasing elaboration of their entries: by
submitting them to many edits and enlarging their textual size. The German WP
in particular seems to compensate for its rather moderate addition of articles
by considerable efforts at their internal elaboration: so that the average
number of words per entry has increased fourfold (!) between 2002 and 2006
(Figure 7).

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Not unexpectedly, the German and English WP also excel in the number
of edits per article (which seems to have gained much momentum recently in
the Anglo-Saxon sphere) (Figure 8).

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Figure 9 shows that updating frequencies have
generally increased since Oct. 2003: with the exception of the most recent
time interval, in which Germany has experienced a decline.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

While many edits are directed at
enlargements, others serve the purpose of correcting errors, eliminating
vandalism or replacing obsolete with updated information. They are
typically made by assiduous "wikiclerks" more dedicated to formal
than to substantive aspects of encyclopedic work. Empirically, such aspects of
"diligence" can be grasped by relating the number of edits not to
the number of articles, but to the volume of words. As seen from Figure 10, the intensity of
such "maintenance" activities has evidently decreased in the
German WP, while it increased sharply (between 2003 and 2005) in the English
version, where nowadays more than six edits (instead of four in the other
cases) occur on average every month per thousand words.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Considering all these findings, we may
conclude that the expanding field of active participants (as seen in Figures
2, 3 and 4) constitutes the basis for more rapid
developments on the level of diversification, elaboration and diligence. This
is particularly true for the English Wikipedia which currently exceeds the
French and German sister WP’s by a fourfold creation rate of new articles (ca
60 000 per month) and by an almost twofold number of additional edits per article
(about 1.5 per month).

5.5 Increases in internal cohesion

A fifth evolutionary dimension concerns the
degree of internal integration which can be roughly operationalized as the
number of interlinkages between the different pages. As is to be expected, there is a monotonic
increase in the number of such hyperlinks in all Wikipedias, but despite the
exponential increase in collaborators and edits, these increments have recently
seemed to diminish, particularly in 2006. More than in other aspects, pronounced cultural divergences stand out here:
with the German and English WP in the forefront, while the French WP lags
considerably behind (Figure 11). Given that the potential number of interlinkages
increases roughly with the square of the number of articles, this decelerating
growth implies that the relative degree of connectedness between the articles is
on a sharp decline. [24]

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

5.6 External Embedment

Finally, there is a sixth dimension of
growth that refers to the embedment of the WP within the WWW. This "external
integration" is also continuously increasing, but (in comparison with
the internal interlinking) in a rather modest way.

Source: Erik Zachte's Wikipedia Statistics http://stats.wikimedia.org/EN/Sitemap.htm

Until the end of 2002, all Wikipedias had
evidently followed an "isolationist" strategy by restricting the
number of hyperlinks to other websites to almost zero. Since then,
the mean number of such links has edged somewhat above 1.25 in the global
average as well as in the French and German WP, while approaching the value
of 2.0 in the Anglo-Saxon edition (Figure 12).

These results show convincingly that up to
the present (2006), the Wikipedia clings to the printing age insofar as it
tends to define itself still as a rather self-contained system offering the
whole of human knowledge, rather than as a node within a Net in which for
every single entry, a lot of other equivalent (and in some cases much richer)
knowledge sources may be found.

6. On the potentials and limits of wiki-based open source encyclopedias: some preliminary conclusions after six years of experience

6.1 Free self-recruitment of collaborators

Conventional encyclopedias usually base
recruitment on previously achieved status characteristics that are interpreted
as valid indicators of expertise: e. g. by inviting only personalities with
professorial or doctoral degrees. In other cases, recruitment is expanded to
individuals enjoying an informally achieved public reputation (e. g.
intellectuals) or with status positions in non-educational institutions (e.g.
high ranking politicians or successful entrepreneurs). Of course, such
recruitment patterns help to keep the resulting knowledge canon within the
limits of institutionally established elitist culture. There was always a
strong bias against the inclusion of “indigenous” knowledge originating in
folk cultures or esoteric circles. Thus, magic and astrology have had no place in
European encyclopedias since the 12th century.

In modern societies, such ex ante criteria
are of limited value, for various reasons:

1) High scholarly reputation and status
achievement is not primarily based on the possession of existing knowledge, but on the
production of new knowledge. Thus, highly innovative researchers may have
rather limited knowledge about the broader structure and historical
development of their specialized topics.

2) Educational status characteristics are
always based on past achievements, so that they may not be consistent with
present qualifications (e.g. in cases where a scholar is no longer up to date
because of illness or advanced age).

3) There may be important spheres of
knowledge which are not within the reach of any formally educated scholars,
because their acquisition occurs mainly by self-education or by accumulating
practical experience. This is particularly true for most practical knowledge
(e. g. knowledge used in the production of goods and services) that was so
prominent in Diderot's Encyclopédie. And it is even more
true for any spheres of “subcultural” or “indigenous” knowledge
controlled and transmitted informally within special segments of the
population: e. g. knowledge about Heavy Metal music bands or computer games,
about sectarian religious belief systems or anthroposophic medical
treatments.

4) The knowledge about many subjects has
become so complex and multifaceted that it is distributed among many
individuals with different specialities and experiences. Thus, no single
person will be capable of producing comprehensive articles on “London”,
“Goethe” or “Renaissance” that treat all important aspects of the topic on
the same scholarly level.
In short: widening the pool of potential
collaborators, evaluators and correctors is indispensable when knowledge is
too complex to be mastered by any “expertocratic elite”. For instance,
systems of “peer review” fail when it is not possible to select ex ante the
people most capable of evaluating a specific contribution – e. g. because the
contribution refers to something so new or so specialized that no “reputable
experts” exist. By starting without any advance knowledge
about who possesses what kind of knowledge about what topic, the Wikipedia is
in sync with a highly complex, intransparent society where new areas of
previously nonexistent knowledge (e. g. about new technological or
cultural phenomena) are constantly arising, and where an unknown multitude of
knowledgeable individuals has to be taken into account.

According to an internal WP page, an avid
interest in Wikipedia has been known to...
The very lack of recruitment procedures has
the consequence that the composition of Wikipedia collaborators is rather
homogeneous, because the general “digital divides” segregating various
population segments are not only reproduced, but even amplified.

By giving everybody the capacity to edit and change existing pages,
all Wikis are evidently based on optimistic premises about their users. On the motivational level, this implies
that most participants are driven by positive intentions, not by drives of
intentional lying or vandalism. In the case of the Wikipedia, this premise is stated in the
principle of “good faith”.
On the skill level, this presupposes that any
information provided is more likely to stem from knowledgeable than from
uninformed individuals, and that more informed individuals have better
chances of having their contributions accepted. This again implies that most
people have a realistic self-assessment of their own knowledge – so that
they accept corrections when these originate from a more competent side.
In fact, 51% of benevolent and competent
participants would be sufficient to set a cumulative process in motion which
would lead to a gradual overall improvement of the Wiki, because at least in
the longer run, “good content” is more likely to be posted and to be
maintained, while “bad content” is more likely to be corrected or weeded out.
Of course, such low percentages would result in very slow improvement
processes: with the consequence that at each point in time, many errors would
be uncorrected, and many less visited hoax pages or vandalized articles would
not be eliminated. Consequently, the expectation of producing highly reliable
knowledge bases approaching those of professional encyclopedias rests on
the assumption that the percentages of competent and benevolent contributors
(as well as the correlations between knowledge level and influential
participation) are rather high.
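The quantitative intuition behind this argument can be made explicit with a deliberately crude simulation (an illustrative sketch added here, not part of the original text): a page is modeled as a set of statements, a wrong statement is fixed with probability p_fix whenever it is inspected, and a correct one is corrupted with probability p_break. Accuracy then drifts toward p_fix / (p_fix + p_break), so a bare majority of constructive interventions yields only slow and modest improvement, while professional-level reliability presupposes a much stronger preponderance of benevolent and competent edits.

```python
import random

def simulate(p_fix: float, p_break: float, statements: int = 1000,
             edits: int = 50_000, seed: int = 1) -> float:
    """Crude model of cumulative wiki improvement.

    A page consists of `statements` facts, half of them initially wrong.
    Each edit inspects one random statement: a wrong statement is corrected
    with probability `p_fix`, a correct one is corrupted with probability
    `p_break`.  Returns the final share of correct statements, which tends
    toward p_fix / (p_fix + p_break).
    """
    rng = random.Random(seed)
    correct = [i % 2 == 0 for i in range(statements)]   # start 50% correct
    for _ in range(edits):
        i = rng.randrange(statements)
        if correct[i]:
            correct[i] = not (rng.random() < p_break)
        else:
            correct[i] = rng.random() < p_fix
    return sum(correct) / statements

# A slight majority of constructive interventions barely lifts accuracy,
# while high reliability requires a strong majority.
for p_fix, p_break in [(0.51, 0.49), (0.7, 0.3), (0.9, 0.1)]:
    print(f"p_fix={p_fix:.2f}, p_break={p_break:.2f} -> "
          f"final accuracy ≈ {simulate(p_fix, p_break):.2f}")
```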
The Wikipedia certainly contributes to a leveling between experts and laymen as knowledge providers, because the
names, status positions and qualifications of contributors are not visibly
marked. No assertion is accepted as “true” just because it stems from a Wise
Old Man who is highly respected because many of his statements have proven to
be true in the past. As
knowledge is so much dissociated from personal communicators, it has to be evaluated
on the basis of its intrinsic merits, i.e. the empirical sources on which it
relies as well as its consistency with other facts or theoretical
constructions. "Wikipedia's articles on the British peerage system - clearheaded
explanations of dukes, viscounts, and other titles of nobility - are largely
the work of a user known as Lord Emsworth. A few of Emsworth's pieces on
kings and queens of England have been honored as Wikipedia's Featured Article
of the Day. It turns out that Lord Emsworth claims to be a 16-year-old living
in South Brunswick, New Jersey. On Wikipedia, nobody has to know you're a
sophomore." (Pink 2005). 6.2 Extensive and efficient exploitation of
intrinsic motivations Since the Renaissance, Western societies have
focused very much on culture as an arena of individual productions (texts,
pictures, musical compositions etc.) neatly attributable to single authors.
More than that, every new work should stand out from others by “originality”:
showing a singularity of features not realized anywhere in the past and not
repeatable in the future. Of course, strategies of individual attribution may
be functional for boosting individual motivations: a major reason why they
are also widespread in the academic sciences: serving as a driving force for
individual careers and reputations.

As this "romantic individualism"
is inimical to all forms of labor division, it has also undermined for
centuries all encyclopedic endeavors, because such projects have to be based
on a collectivism that doesn't leave much space for individual
self-aggrandizement. Thus,
the question “why does anybody participate” was legitimate during the whole
history of encyclopedias, because collaboration in such projects was seldom
an attractive way to gain individual rewards. Typically, the articles
delivered were not paid and did not contribute much to
personal reputation, because the names of authors remained concealed (or were
indicated only by initials, as in the newer editions of the Encyclopaedia
Britannica).

Long before the Wikipedia, therefore,
coauthors have been primarily stimulated by motives other than boosting
their personal reputation: e. g. by the extrinsic satisfaction of belonging to a specially selected, extremely prestigious scholarly
elite, or by the intrinsic satisfaction of co-defining the “official” canon of
knowledge of a given epoch or society.

For motivational psychology, therefore, the
Wikipedia does not pose radically new problems, because in studying any kind
of voluntary behavior, the social sciences have always done well to assume
that “homo sociologicus” (in contrast to “homo oeconomicus”) is driven by a
multitude of different motivations.
Compared to traditional paper publication
projects, the Wikipedia has enhanced capacities to harvest and aggregate such
diverse motivational resources, and to channel them efficiently into
constructive cooperative endeavors.

1) Many collaborators with moderate and low
levels of motivation may be won just because the thresholds to participation
and the costs of collaboration are so low. Thus, everybody with a connected
PC can log in and edit pages anytime, at any place on the planet.
Because no membership role with formalized duties has to be adopted,
participants remain free to decide on the modes, ways and intensities of
present and future collaborations; and contributions can be so “fine-grained”
(e. g. by just adding a single figure or correcting spelling mistakes) that
almost anybody can feel self-confident enough to add at least some modest
improvements. Again,
it is not unjustified to compare the Wikitechnology with the assembly line:
both share the principle of breaking down production processes into small
independent parts, so that lower skill levels are sufficient to make valuable
contributions:
This also explains why some sister projects
like Wikibooks have not taken off: because the minimal threshold that had to
be reached (e. g. to contribute useful textbook chapters) has been much too
high (Benkler 2006: 101).

2) The “law of large numbers” teaches that the more you
increase the number of participants, the more likely it
becomes that among them you will also have
individuals with a very wide variety of (also quite rare) characteristics,
skills and motivations.
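Before turning to concrete examples, the simple arithmetic behind this point can be sketched as follows (an illustration added here; the population figure is the one used in the next paragraph, while the share of people with the rare disposition is a purely hypothetical assumption).

```python
# Back-of-the-envelope illustration of the "law of large numbers" argument:
# even an extremely rare disposition yields thousands of potential volunteers
# in a population of hundreds of millions of Internet users.
population = 800_000_000      # Internet users (figure used in the text below)
p_rare = 1 / 100_000          # hypothetical share who enjoy, say, fixing typos

expected = population * p_rare
p_at_least_one = 1 - (1 - p_rare) ** population   # practically indistinguishable from 1

print(f"expected number of such individuals: {expected:,.0f}")
print(f"probability that at least one exists: {p_at_least_one}")
```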
Thus, only very few individuals may find deep
satisfaction in correcting the spelling and punctuation of other people's
texts, but in a population of 800 million Internet users, they may still number
in the thousands; and whoever succeeds in mobilizing them can generate a volume
of voluntary work worth millions of dollars if it had to be bought on
the labor market. Evidently, the success of the Wikipedia is based on such
effective filterings. For instance, hundreds of “police constables” are
patrolling to oversee and correct various cases of abusive behavior;
silent brigades of “janitor-minded” individuals are constantly active to
clean up after vandalizations; talented “mediators” feel urged to intervene
in order to moderate heated edit wars; and fussy “clerks” with a bureaucratic
mentality are highly useful to correct even very tiny errors in biographies
or statistical tables.

3) Given its constantly rising status as a
primary reference site and its spectacular impact on global knowledge
culture, the Wikipedia is attractive for anybody who draws satisfaction from
being part of such a big and influential project – even if his contribution
is minor and uncertain. Thus, a collaborator confesses that “knowing that an
article will instantly become a published part of a worldwide reference is an
intoxicating enticement.” (Wilson 2006); and another compares himself with
somebody working on the most sacred texts of human society.

Some contributors face for the first time
the opportunity to make proselytes by displaying the knowledge they have
privately accumulated in their hobby activities to a wider public. Thus, a
Madison-based software engineer named Sean Lamb derives personal
satisfaction from contributing articles about American railroad history: a
very specialized topic not likely to be treated by many others. (Patrick
2005).

On the other hand, however, collaborators may feel demotivated by the
prospect that their efforts may prove futile because their contributions
are deeply modified or even eliminated completely by subsequent editors.
This leveling implies that even the most
reputable professionals find themselves in fierce competition with
colleagues as well as with outsiders who may have gained their knowledge through
extraprofessional (e. g. autodidactic) channels. Following Rational
Choice theory, we might assume that higher level experts are strongly
discouraged to participate, because they gain much less acceptance than when
they use more conventional channels [29];
while lay persons are strongly encouraged because for them, wikis may provide
the only arena where they can successfully display their knowledge and their
ideas. [30] 6.3 Low needs for capital and organization Printed encyclopedias are highly ambitious
endeavors that have to be based on high investments of resources and
long-term editing commitments. Books in general need much care and effort in
order to avoid errors irreversibly fixed on printed paper. In the case of
encyclopedias, errors are particularly harmful because they may be copied and
diffused throughout society to the degree that the work is used as a
reference. As in the case of telephone books, the reputation of an encyclopedia depends fully on its perfect reliability, and intensive checks and controls are necessary for keeping up such standards.

Usually, only a few potent societal actors
or collectivities are capable of engaging in such a project: e. g. monarchs,
rich elite members, foundations or governmental institutions. Very often,
they have been created for the purpose of expressing the tradition and
thinking of a societal elite or of symbolizing a
national culture. Thus, most of the classic Chinese encyclopedias owe their
existence to the patronage of imperial rulers; the emperor Constantine VII of
the Eastern Roman Empire (913-959) was responsible for a series of encyclopedias,
and king Alfonso X of Castile (1252-1284) sponsored the “Grande e general estoria” (“Great and General History”).

Much less frequently, we find encyclopedias originating at the periphery of society: like the famous French “Encyclopédie” of Denis Diderot, which emerged within the Enlightenment movement that opposed the reigning religious institutions and the monarchical regime.

Since the 19th century, such independent
endeavors have almost vanished for various reasons: e.g. because the volume
of relevant knowledge has expanded, the demands for comprehensiveness and
reliability have risen, and the costs connected with new printing
technologies and distribution procedures have increased. The 20th century in particular was rich in “governmental encyclopedias” aiming to provide a most impressive picture of national culture and national achievements (e.g. the Enciclopedia Italiana, the Soviet “Granat” encyclopedia or the Enciklopedija Jugoslavije, first published 1955–71). This explains why in many cases, not only the most educated scholars, but also the most powerful personalities of their time (e.g. Lenin and Mussolini) made significant contributions.

Because of their high costs, conventional encyclopedias have a high expressive value as status symbols. Whenever an EB
or a Brockhaus is found on a shelf in a living room or in a private library,
a double message is sent out: that the owner has (had) enough money to buy it
and enough education to make use of it (while everybody knows and accepts
that it is de facto rather rarely consulted).

Seen in this wider historical perspective, the new digital media help encyclopedias to regain their independence from governmental power centers, economic enterprises
and other societal institutions: an independence that was quite remarkable in
the 18th century but was later lost in the course of rising nationalism and
industrialization. Thus,
labour costs are very low because so much motivation for unpaid
voluntary collaboration can be tapped (see 4.2); and capital costs are
negligible because like other Web projects, the Wikipedia thrives on hardware
and software resources that are already fully available for other reasons:
individually owned PCs already acquired for various private or professional
purposes, and excess capacities of networks that have been built for
telephone transmission or other commercial purposes. For several reasons, populations in rich
modern societies have high “discretionary resources” (in terms of free time,
money or skills) that are disposable for various new purposes because they
are not committed to work or family duties (McCarthy/Zald 1977; 1987). There
are many potential providers of such resources: e. g. temporarily jobless or
partially employed people, students, housewives or retirees. In addition,
changes in modern lifestyle contribute to a growing ”decommitment” of
resources; many adults live alone or with few or no children, many are rather
isolated immigrants far away from relatives and friends, and increasing
numbers do not participate in voluntary associations or political parties.
(Putnam 2000).

The Internet amplifies these potentials further by providing unlimited possibilities for data transfer and communication
and by harnessing them to a large variety of new purposes: e. g. by providing
interactive online networks where everybody has a chance to feed in his or
her contributions. A
major decline in labour costs is caused by the demise of many of the more "ritualistic" activities that make the finalization of printed works so cumbersome. For instance, much work is dedicated to "streamlining" and "homogenizing": e.g. by implementing standardized criteria of typography, grammar, orthography, titles, footnotes, bibliographies etc. While such standardizations do not contribute much to readability, they seem nevertheless indispensable for aesthetic or conventional reasons.

In the case of computerized hypertexts, there is much more tolerance for inhomogeneities: maybe because each page is seen in isolation, so that inconsistencies across pages become less visible than when skimming a book. As a consequence, much editorial overhead can be saved. On the other hand, this implies that it is highly difficult to transform electronic encyclopedias into a manuscript ready for publication. Such high prospective costs for "streamlining" were a major reason why the ambitious book edition of the German Wikipedia has repeatedly failed. [31]

Among many other consequences, the decline
in production and distribution costs implies that conventional limits of
growth and accessibility are completely removed.

1) In the printing age, there were always harsh limits on the total size of encyclopedias, and thus indirectly on the volume dedicated to various articles or systematic divisions. (For instance, the total text volume of the EB and Brockhaus has remained on the same level for about 150 years!) Therefore, editors were always required to exert selection: thus inevitably expressing their own personal preferences about what should be included and what deserves a shorter or more comprehensive treatment. This selectivity was highest in the one-volume pocket
encyclopedias which always tended to be heavily shaped by the personality of
a single author. However, the larger and the more anonymous the public, the
more pressing the need to broaden the scope in order to satisfy all the
different interests. But as the overall space was limited, this resulted in
an ever more atomized knowledge structure with a declining average size of
entries [32].

By contrast, digital encyclopedias can expand without such pressures toward atomization: by adding new articles and expanding existing ones at the same time. Because of its unlimited potentials for growth and diversification, the Wikipedia fares far better than printed works in exploiting the "Long Tail" (Anderson 2004): the large number of highly specialized information needs articulated by very many infrequent users. In fact, the Wikipedia builds its reputation heavily on the totality of mostly quite unpopular, rarely consulted articles, while classical encyclopedias founded it predominantly on a smaller number of more frequently used entries. [33] This implies that the Wikipedia is attracting a very large and highly diverse public, similar to Amazon, which lives from selling few copies of very many different books. On the other hand, the lack of physical constraints makes any kind of filtering and shortening difficult to legitimate, because it cannot be justified by technical or economic arguments: so that more ideological, philosophical or scientific reasons have to be provided (arguments likely to be quite dissensual across cultures, user groups and "Zeitgeist" fluctuations).

2) In the age of printing, there was a
rigid trade-off between volume and accessibility. Only very small
encyclopedias were cheap enough to get a large distribution and handy enough to be carried along. More user-oriented encyclopedias including “everything” were not only expensive, but so clumsy that their fate was to remain on rarely visited bookshelves in libraries or other rooms far apart from practical activities and “real” human life. Digital encyclopedias can grow unlimitedly without losing accessibility through high acquisition costs, clumsiness or other material factors. Soon, they will be fully available for portable handheld devices or in audio form, so that users have all information at their fingertips at the very moment they need it (e.g. within a meeting or while driving a car).

3) Traditional encyclopedias could only be
produced on the basis of sizable and rather wealthy collectivities;
preferably by populations organized in nation states with governmental
agencies and large publishing houses able to act as initiators and sponsors.

Wikipedias, by contrast, can flower everywhere, because even the tiniest groups have enough potential to set
such online processes of knowledge accumulation into motion. In fact, there
is a global diffusion of Wiki technology, protocols and software, because
this knowledge is so standardized and decontextualized that it can be copied,
transmitted and implemented everywhere, regardless of any socio-cultural and
linguistic conditions. Thus, Wikipedia clones have rapidly sprouted in about
250 languages, even if many of them are still “empty shells” waiting to be
brought to life by active users.

Given that all contributions to the Wikipedia are released under a free license, anybody disagreeing with the current encyclopedia is allowed to initiate a new one by simply eliminating the unwelcome entries and retaining all the rest. Such “forking” has given birth to the “Enciclopedia Libre Universal en Español” (which split off already in 2002) [34], and to the foundation of Wikinfo [35] in 2003 (an alternative project rejecting the rigid “neutrality” principles of the Wikipedia).

4) Given their large and long-term need for
subsidies, traditional encyclopedias could usually gain only limited autonomy
from their mighty economic and political sponsors; and given their linkages
to academic elites, they had no choice but to give priority to
institutionally established knowledge cultures. In comparison, the Wikipedia has hitherto
remained remarkably independent from economic corporations as well as from
governmental agencies and educational institutions. The economic autonomy is
dramatically highlighted by the complete lack of advertisements and the very
low operational budget that is mainly covered by a multitude of rather small
individual donations. While “blind spots” and censoring endeavors certainly exist, they seem to be associated more with idiosyncratic personal sensitivities than with larger-scale institutional interests and strategies. While this all-round autonomy is certainly a highly valuable asset, it is also a source of risks, because it makes the WP “underdetermined” (Berinstein 2006) and therefore vulnerable to all kinds of intrusions, assaults and even “kidnappings” by highly active particularistic groups.

6.4 Multicultural segmentation

By encouraging the most knowledgeable
individuals of each culture to support their own encyclopedic project, a multi-domestic
and multicultural repository of human knowledge in almost all written
languages may come into existence. In fact, constructing an encyclopedia means putting a language to a very hard test: stretching its verbal expression capacities to the utmost by conceptualizing and describing an unlimited number of different topics, by importing and assimilating an ever
growing manifold of terms from other languages, and by creating neologisms
for keeping pace with new developments and events. At the same time, a focus of collective
identity is created which may be particularly important when the speakers are
not living together in the same geographical location, but are dispersed in
diaspora where they have no opportunity to use their mother tongue. In some
cases, contributors seem to consider the WP as a vehicle for transporting and
reinforcing traditional folkways and other elements of traditional culture. [36] Given
the small number of speakers, such minority WPs have to be built on a very small group of collaborators who tend to shape them strongly according to their subjective preferences and views. In many
cases, they invest their limited energies into blueprint structures of many
different “stubs” (e. g. about each local community): leaving it to others to
insert corresponding content.

On the special leadership role of the English WP

How will the relationship between the few "big" and the numerous small WPs develop in the future? On the one hand, the rise of so many smaller WPs in practically all existing languages has the effect that the originally dominant WPs retreat into a relatively more modest position. In
the case of the English WP, however, this status loss is attenuated (or even
neutralized) by the fact that by representing the hegemonic Western knowledge
culture in the most dominant of all current languages, it occupies a singular
reference position of global reach – not to be compared with other Western WPs
that are more exclusively affiliated with "their" national or
linguistic culture. In this respect, the English WP may "inherit"
at least partially the supreme status of the Encyclopedia Britannica which is
- and always was - able to attract a worldwide public of readers as well as a
globally dispersed collectivity of first-rank contributors. This singular
significance may explain why since 2004, its share of collaborators has not diminished in the same way as its percentage of articles and words, but has stabilized at an astonishingly high level: well above 40%. Especially the share of total collaborators has lately been on the rise again, so that at the end of 2006, it had almost regained the level of 2003.

Source: Erik Zachte's Wikipedia Statistics, http://stats.wikimedia.org/EN/Sitemap.htm

As a consequence of this outstanding
hegemonic role, the English WP has a particular responsibility in
transmitting accurate, complete and consistent knowledge, because it serves
as an authoritative knowledge provider for so many users, including the
active contributors of many lower-scale Wikipedias all over the globe.
Therefore, it will come under particular pressure to install far-reaching mechanisms of internal control. Secondly, it is highly probable that the English WP will become an arena for all kinds of global controversies on scientific, ideological, philosophical or religious levels, including the imminent clash between Western and Islamic culture.

6.5 Flexible polymorphic organization

Conventional encyclopedias are produced
under the condition that many parameters are rather irreversibly fixed in
advance. For instance, the realms of knowledge ("ontologies") to be
included have to be circumscribed, lists of experts to be called for
contribution have to be compiled, and the organization of the whole
enterprise (in terms of roles, competences, norms, procedures, deadlines
etc.) must be defined. In many cases, there are additional exogenous constraints:
the publication has to insert itself into the tradition of antecedent
editions of the same product (e. g. Brockhaus or EB), it has to fit into the
larger edition program of the publishing house, and expectations of potent
sponsors may have to be satisfied. As a consequence, such printing projects
are likely to be overdetermined: far from being flexible for adapting
to environmental needs, the personnel and organizational parameters may even be in contradiction to the stated mission and the concrete functional needs.

Wiki encyclopedias do not require such antecedent decisions. They may start “from scratch” as very embryonic
projects without any explicit planning and design. Within a process of
unplanned incremental growth and “open-ended evolution”, it will be
determined ex post what contents are considered, who participates in what way
and what kind of organizational procedures and structures may develop as a
result of manifold smaller decisions and adaptations. Of course such projects
are likely to be “underdetermined” because on the one hand, structures
are flexible enough to adapt to task needs and environmental conditions; on
the other hand, such conditions are themselves not “given”, but subject to
changing collective decisions. Evidently, Wikipedias are better adapted to
highly complex and dynamic societies where the parameters needed for
organization building are not known ex ante, because

· the world of relevant knowledge is so rich and so volatile that it cannot be represented in a blueprint model;

· the distribution of knowledge among members of society is not known (and highly fluctuating), and

· the activities needed for selecting, formulating and synthesizing encyclopedic contents are so manifold that they cannot be reduced to formalized procedures.

Apart from its anchoring in the Wikimedia
foundation, the Wikipedia’s internal structure is primarily shaped by
endogenous forces, and it is highly flexible and self-transformative, because
it does not rely on the acknowledgement of externally generated status
criteria like educational degrees or professional reputation.
This means that participants have primarily
to rely on their own judgment as to whether they are knowledgeable enough to contribute to a specific topic, or what they have to learn additionally in order to possess all the relevant information. And when modifying already existing texts, they need enough self-confidence to be sure that they know better than their predecessors. Especially in the large sphere of more marginal articles rarely read by anybody, it is crucial that only the most knowledgeable individuals feel a motivation for writing, while the ignorant have
at least enough insight and self-control to abstain. While this “anarchistic individualism” has
proven to be a viable starting point (because it does not predetermine
specific organizational structures), there was never the intention to cling
to it for fundamentalist ideological reasons. Instead, the overall mission to
create a reliable encyclopedia made it necessary to create incrementally a
highly “polymorphic” system that combines elements of very diverse regime
types and organizational structures. [37]

First, it is still
“anarchistic” in the sense that everybody can actively participate without
membership duties, without even disclosing his or her personal identity.

Secondly, it is “autocratic”
insofar as the founder (Jimmy Wales) has the towering role of a “God King”
who can intervene in any possible way without constitutional controls (Pink
2005). There is even an element of “latent totalitarianism” in the sense that
an unlimited centralization of power is easily possible without any
constitutional controls. Thus, Jimmy Wales himself and some of his admins
sometimes exert the power to “redefine” even the history of Wikipedia: by
eradicating earlier text versions so that no hints remain that they have ever
existed. Evidently, the leadership of Wales was particularly crucial at the
inception of the Wikipedia project where it was important to define the
mission (to create an encyclopedia, and nothing else) as well as the most
important behavioral norms (e.g. the principle of neutrality). While the personal (somewhat “charismatic”) authority of Wales may currently be largely accepted because he is considered a “benevolent dictator”, it is questionable how this personal authority can ever be replaced when he leaves.

Third, the WP structure is
“bureaucratic” insofar as various roles with highly formalized competences
and duties have been created. In fact the Wikipedia confirms the regularity
that when open social systems want to maintain a higher state of order, they
are forced to generate high (and permanently increasing) levels of
formalization and highly sophisticated mechanisms of control, because they
have to deal with a large variety of problem cases and with very
heterogeneous collaborators. An increasing number of “admins” is regularly
patrolling the system: with special competencies to block editing, delete
articles and revert texts to earlier versions. Many of them do a rather routinized job: they are alerted whenever specific pages are changed, and then consult them in order to "approve" the additions or to revert unwelcome modifications. Their appointment has itself become a matter of highly formalized nomination and voting rules, so that superior “bureaucrat” and "steward" positions had to be created for taking the decisions or supervising the procedures, and a top elite of
"developers" has emerged that can implement direct changes to the
software and database. In addition, mediation bodies and a higher ranking
“arbitration committee” have been installed to rule in cases of severe edit
wars. And most importantly: an increasing canon of explicit rules, norms and
procedures has been formulated for informing all users about the provisions they have to observe when creating new entries or editing existing ones, and for
guiding their informal policing activities. However, these bureaucratic
mechanisms have only a subsidiary role insofar as they come into play only
“after substantial play has been given to self policing by participants, and
to informal and quasi-formal community based dispute resolution mechanisms.“
(Benkler 2006: 104).

Fourth, the Wikipedia is
“democratic” insofar as “admins” are selected by elective majorities (even if
such elections are not based on a representative participation), and as many
other decisions are the result of lively open discourse and deliberation (e.
g. on the “talk” pages).

Fifth, it is somewhat “plutocratic” to the degree that it depends financially on donors who may have a say over the strategy of the whole enterprise, and that considerable power is exerted by the governing board of the “Wikimedia foundation” – the non-profit umbrella organization which owns the material assets.

Sixth, it is “technocratic”
insofar as specialists determine the development of Wikipedia on the software
level (protocols, programs and network technologies).

And seventh, finally, it is certainly highly “meritocratic”, because only participants with high activity levels and excellent performance records have a chance of being appointed to
higher roles. [38] Given the lack of "vested
interests" usually pursued by fully paid employees and managers (e. g.
for securing employment or maximizing prestige and power), there are good
reasons to believe that the evolution of the WP's organizational structure
follows "contingency theory" [39] by
adapting flexibly to task types and environmental conditions.

A conspicuous characteristic of this organization is certainly the complementary interplay between decentralized and centralized structures: a constellation not too far from conventional encyclopedic productions, since all printed encyclopedias have always relied on a complementary relationship between contributing authors and a central editorial staff. There is some evidence that a similar labor division between "complexity generating" and "complexity reducing" agents has emerged within Wikipedia, without conscious planning. Thus, Aaron Swartz has
found that most of the text volume is provided by rather peripheral users,
while registered users and "admins" concentrate mainly on
additions, revisions, abridgments and deletions.
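Analyses of this kind can be reproduced with modest effort, since the complete revision history of every article is publicly queryable. The following is only an illustrative sketch, not Swartz's actual procedure (which traced whose text survives in the current version): it uses the standard MediaWiki web API and, as a crude proxy, credits each editor with the net growth in page size caused by his or her revisions. The article title and the Python "requests" package are assumptions chosen for the example.

import requests
from collections import defaultdict

API = "https://en.wikipedia.org/w/api.php"

def bytes_added_per_editor(title, limit=500):
    """Walk the revision history of one article (oldest first) and
    attribute every net increase in page size to the editor who made it."""
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvprop": "user|size", "rvlimit": limit,
        "rvdir": "newer",          # oldest revision first
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    added = defaultdict(int)
    previous_size = 0
    for rev in page.get("revisions", []):
        delta = rev.get("size", 0) - previous_size
        if delta > 0:              # count only net additions of text
            added[rev.get("user", "(unknown)")] += delta
        previous_size = rev.get("size", 0)
    return sorted(added.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    for user, nbytes in bytes_added_per_editor("History of rail transport")[:10]:
        print(f"{nbytes:>8} bytes  {user}")

If Swartz's finding generalizes, the top of such a ranking will mostly be occupied by otherwise inconspicuous occasional editors rather than by the small core of the most active Wikipedians.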
Such empirical findings plainly contradict
the position maintained by Jimmy Wales in innumerable speeches: that the
Wikipedia is predominantly written by a rather small community of about 3000
regular Wikipedians. They
imply that the growth and diversification of the Wikipedia is primarily
dependent on an expanding number of actively contributing users - not on an
increased work load carried by an invariant core.

While it is highly important to make participation easy and rewarding for such large masses of occasional contributors, it is difficult to do this because these peripheral users have very little say and influence in the whole system.
On the other hand, the "editors"
are an important factor in maximizing downloads and readership: e. g. by synthesizing
materials, by eliminating technical jargon and by presenting the material in
clearly arranged forms. The future of the Wikipedia will heavily
depend on the equilibrium between decentralized contributions and
centralizing coordinations. It could easily be stifled if admins turned to a heavy-handed regime, and it could dissolve into chaos if these editing services weakened (e.g. because not enough unpaid volunteers are found
for such rather "bureaucratic" tasks). Another important problem is
that these occasional contributors are too little involved in the discussions
and modifications made after their postings. Very often, they may not consult
"their" pages frequently enough to see the changes and deletions
made by other users: so that modifications for the worse may remain
uncorrected. A similar symbiosis is found on the strategic level where
centralized leadership is needed in order to direct the efforts of content
producers into predetermined channels. For instance, some more responsible
editors create "stub articles" about hitherto neglected topics they
think deserve encyclopedic attention. By doing this, they invite experts to channel their work energies into these topics. In other
cases, articles are characterized (with a remark below the title) as
insufficient: needing elaboration or a better indication of sources. In the
future, more efforts may be needed to bundle such unsatisfactory pages into
topical categories, and to address such bundles to specific groups of
scholars: inviting them to contribute their valuable expertise. It is important to see that internal
centralization is fostered to the degree that the WP is confronted with
external attacks to which it has to respond quickly and in a decisive fashion. Thus, Jimbo Wales has installed the policy of "office action" in order to avoid conflicts arising from imminent legal action or informal complaints (e.g. in cases of problematic biographical entries). Whenever a serious complaint is directed at the Wikimedia Foundation (the legal person responsible for Wikipedia), Wales or one of his delegates removes the article temporarily, so that harm (e.g. personal slander or libel) is avoided while the justification of the complaint is examined. [40]

6.6 Community embedment

Formal organizations are typically embedded
in larger, less formalized structures. Many are components of societal
institutions (like economy, polity, military, education) from which they
derive their basic values, norms and structural patterns (Powell/DiMaggio
1991), and others are parts of ethnic or religious collectivities or
world-wide social movements (Zald/McCarthy 1987: passim). In the case of the Wikipedia, this larger
substrate may well be called a “community”: in the sense of a rather stable
collectivity that acts as a breeding ground for common values and behavioral
standards, as a group context for interpersonal communication, as a reference
group for personal identification and as a supraindividual agency for
effective socialization and social control.
In contrast to communities that base their
identity on a common history and tradition, on a specific locality or even on
a particular founder, Wikipedians anchor their cohesion in the visible output
of their cooperative endeavors: the Wikipedia as it flowers and rises in
global recognition and reputation. As an objectified structure, the Wikipedia
has a dual quality: on the one hand, it constitutes a centralized focus on
which all contributors fix their attention; on the other hand, it constitutes
the decentralized environment in which every user easily finds his own "working
niche". While all contributors deal independently with their particular
subprojects, they at the same time feel a sense of togetherness: like masons
working on different walls of the same cathedral. In contrast to the
cathedral, however, the Wikipedia can become the workplace of thousands or
even millions of (simultaneous) contributors without compromising this basic
unity and centeredness which is the basis of communal integration. This integration is rather independent of any horizontal interaction, because the
vertical reference (of any peripheral member to the common focal center)
generates enough sense of unity and social integration. In fact, this
vertical (or radial) integration is so potent that users can engage in a
multitude of controversies and conflicts without running the risk of
disintegration. (The only threat is the exit option of "forking"
with its segregative implications). As seen in many cases, the Internet gives
rise to new collectivities by allowing self-recruiting activists and self-constituted
groupings to gain worldwide visibility and reputation by pooling their
efforts and resources to produce widely accepted goods and services. Thus, the
Linux community has become potent enough to challenge Microsoft on the level
of operating system software, and the Wikipedia community seems to approach
the status of global “cognitive authority” defining the canon of “uncontested
human knowledge” as a result of a very extensive and long-term process of
cooperative interaction.
Because formal hierarchical control is weak
or nonexistent, control has to be provided by the contributors in the form of
self-guidance on the one hand and mutual peer-control on the other. Thus, the
anticipation of being corrected by others subsequently can act as a powerful
motivation to avoid errors.
As a consequence, the integration of
Wikipedia articles in the WWW is relatively low. (see
Figure 12 above). In some aspects, the Wikipedia is dominated
by a “geek adhocracy” [43]:
an aggregation of self-recruited activists whose dedication to the project is
expressed in a very large amount of editing activity. “A vigilant army of self-styled Wikipedians defend the site and
enforce community policies based on the principle that Wikipedia is an
encyclopedia and not a forum for advertisements, slanderous remarks or
pictures of your cat. They police the site to try to establish a neutral
point of view, warn users against violating copyrights, and call for respect
toward the contributions of others.” (Wilson 2006). On the elementary levels of daily
activities and interaction, the communal culture is manifested primarily in a
particularized language. Thus, Wikipedians "revert" (or even "re-revert") pages when they reinstate an earlier version; they love
"Wikignomes" who are dedicated to patient low-profile tasks like
correcting grammar mistakes or broken links; and they hate "Wikitrolls"
who permanently violate guidelines and engage in various disruptive
behaviors. Among the values of the WP community, a
"passion for correctness" stands out that is manifested in many
"edit wars" that appear highly ritualistic because they focus on very
tiny points like orthography and punctuation. For instance, there was
extensive discussion in the “September 11, 2001 attacks” article about whether a
second comma should be inserted (after 2001).
As dissensus and quarreling lingered on for weeks and months, the page was not promoted to the status of a “featured page” in January 2005. [45] Similarly trivial was a fight concerning the entry about Scientology, where contributors
argued for nine months over whether the Scientologist method of childbirth
should be called "silent birth" or "quiet birth." Like most communal collectivities in the
RealWorld, the WP community functions as a breeding ground for groupings that
arise easily among the members for dealing with specific temporary tasks:
Whenever the edition of an article is
dominated by a highly consensual group, it becomes very hard for outsiders to
intrude and to make their own contributions (Cormaggio 2006). This illustrates
that the principle of openness of the project (implying that every anonymous
user can edit all pages) can only be maintained when rather weak community
ties among collaborators exist. In fact, community is functional for purposes
of integration: e. g. for implementing homogeneous standards of filtering or
for fighting vandalism, but it is rather dysfunctional for diversification
and growth: because such expanding activities demand openness for any new
contributions. Interestingly, the Wikipedia community has
reproduced in the digital sphere the same dichotomy between
"frontstage" and "backstage" performances that -
according to Goffman - is a general characteristic of groups operating before
a public (Goffman 1959). On the one hand, there is the frontstage of serious
work relationships: resulting in the articles everybody can see. Here,
individuals are under pressure to behave in a highly disciplined way:
conforming to collective norms that strongly forbid the expression of
subjective emotions, opinions or the playing out of intimate interpersonal
relations. On the other side, there is a backstage of talk pages, online
conference meetings and bilateral exchanges that allow the playing out of
spontaneous personalized activities and the satisfaction of socio-emotional
needs. “Like at a paid job, some people choose to extend the relationships
they have within the ‘workplace’ to a context outside the workplace.
Metaphorically (and, sometimes, literally!), they stop by the pub with their
workmates and have a few beers. They may joke about situations "on the
job", they may talk about their personal lives.
They may even do back-of-the-napkin brainstorming sessions that fix problems
nobody expected. "Beers after work" happens on talk pages, User
talk pages, on the mailing lists, in edit summaries, in person-to-person
meetups, in private email, in IRC or Jabber chatrooms... the list goes on and
on. Whenever Wikipedians drop their business-like demeanor and address each
other as human beings, with warmth and personality, there's the smell of beer
somewhere in the digital air.” [46]

6.7 Keeping pace with current events and discoveries

Given the very long-term production and diffusion processes, conventional encyclopedias all had a strong bias in
favor of past knowledge and knowledge about the past. This distance from
current knowledge was aggravated by the need to check everything thoroughly
(in order to preserve the reputation for reliability), and a strong tendency
to rely on texts that had already appeared in earlier encyclopaedic
editions. Therefore,
they always focused very much on knowledge as a stable, fully consensual
canon of immutable facts (e. g. historical events, geographical locations, or
just the meaning and spelling of words) or regularities (e. g. mathematical
or natural laws), and they abhorred the fields of insecure and volatile
knowledge where the state of the art changes weekly as a consequence of
additional data, ongoing controversies or new scientific publications. Many
(like Zedler's gigantic “Grosses vollständiges Universal-Lexicon”) did not
include biographies of living persons. Most in harmony with the clumsy
printing technology were thus dictionaries, gazetteers and historical
encyclopedias: mere compilations of atomized information pieces that never had to be revised. The bias toward stable knowledge also had the effect
that many encyclopedias presented themselves as treasuries of highly
established elitist culture. Thus, the Brockhaus “Konversationslexikon” had
the explicit goal of making bourgeois parvenus fit for successful
participation in the more polite aristocratic circles of their time. Therefore, only encyclopedias of very
traditional societies could maintain the conception that they were mirroring
the whole of true knowledge: e.g. the medieval encyclopedia “Speculum majus” of Vincent de Beauvais (1244), which aimed to provide a definitive view of “the world as it is and as it should become”.

At
least since the Renaissance, encyclopedias have given up such ambitions by
acknowledging that in a dynamic society with permanently advancing knowledge,
attempts to crystallize out stable knowledge compilations can only have very
limited success. The more their focus shifted toward natural sciences and
technological branches, the more they had to face a universe of constantly
changing knowledge – without ever becoming able to react to such changes in
any flexible way. A very clumsy way to keep pace was to add periodic
updates in the form of monthly additions (e.g. the “Larousse mensuel
illustré” since 1907) or yearly volumes (e.g. the “Britannica Book of the
Year” since 1938). As there was no way to integrate the chronological
additions into the alphabetical or systematic structure of the original
encyclopedia itself, they were not very helpful for the readers, because an
ever growing number of chronological volumes had to be consulted. However,
they had a useful function in complementing news media by setting daily events in a broader, encyclopedic perspective.
Of course, there was also a sharp trade-off
between size and updating possibilities. Only one-volume lexica (like the
medical "Pschyrembel") could be frequently updated to keep pace
with rapidly changing knowledge and terminologies. Paradoxically, updating
was most difficult in the case of encyclopedias most needing it: large
multivolume works that contained detailed information much more subject to
change than shorter dictionary entries. This explains why the intervals between
subsequent editions of the Encyclopaedia Britannica have increased between
the 18th and the 19th century [48],
despite the immensely accelerated production of new knowledge that would have
made more frequent updates highly desirable. Similarly, there was always a very
unfortunate trade-off between updating capacities and the degree of
interdependence and synthesis of the knowledge presented. Updating is easiest in the case of highly atomized, fragmented knowledge structures like dictionaries or gazetteers, because each change or addition affects only
a single entry. The more knowledge is presented in interdependent structures,
the more frequently it occurs that a modification of one article has an
impact on several other entries. For instance, when the biography of a
politician has to be reassessed, this may have implications for other
articles like the history of his country or his political party, in which
this same person is involved. Similarly the emergence of a new scientific
theory may affect many articles where its impact on the interpretation and
explanation of different phenomena is discussed.

One of the most fundamental and most
problematic innovations of the Wikipedia is to expand the notion of
“encyclopedic knowledge” to phenomena of contemporary society and culture, to
living persons and to current developments and events – thus entering into
competition with journals, magazines and other news sources in the Mainstream
Media system as well as in the Net. In fact, Wikis are highly efficient tools
for aggregating information about current events that are experienced by many
witnesses from different angles: such as earthquakes, hurricanes and
floods, war battles, city riots, pandemics etc.

In such cases, they can act as platforms
for the inductive collection of knowledge by many self-recruited contributors
who may effectively enlarge and enrich (or also relativize
or falsify) the information provided by professional journalists or from
official sources (Dorroh 2005). On several occasions, the WP has already
proven its status as an authoritative news source because numerous
contributors are busy keeping pages tightly up to date with unfolding developments and events. Thus, the Israel-Lebanon conflict in summer 2006 quickly gave rise to a corresponding page that experienced more
than 10 000 edits within a few days: offering a multifaceted and highly
balanced account of the unfolding war while keeping pace tightly with all the
incoming news. [49] Likewise, it took only
four hours for the "Execution of Saddam Hussein" entry to evolve
through 630 edits into a detailed account of the event as well as of the international reactions. Together with all the external hyperlinks, it reached a size of more than 1300 words. [50]

Contrasting with the isolated short-term news reports in the media, such Wikipedia entries often combine timeliness
and historic depth at the same time. By attracting a large number of
contributors, such articles become sites of very time-compressed history
construction "from below": by aggregating highly diverse
information that cannot yet be integrated in overarching blueprints and
concepts because the event – as well as the way it is interpreted by the media
and the general public - is still under way. The question arises whether such
articles arising from current news are later reorganized in the light of
subsequent developments and the broader, more distanced interpretations that
usually go along with the passage of time.

This hybrid role of the WP as a news source
and a historical source has several far-reaching implications. First, much of its content focuses on matters far removed from the canon of classical culture (like that transmitted in institutions of formal education): e.g. computer games, TV series or heavy metal music productions. Given the rather low average age of many of the most prolific collaborators (see 4.1), it is not astonishing that the entry about Augustine is less comprehensive than that about Britney Spears.

Secondly, many articles are
inevitably incomplete, erroneous and controversial, because they refer to
subjects still in the realm of change and ongoing public discussion: e. g.
recent scientific discoveries, still active writers or singers or unfolding
political events. While the Wikipedia certainly derives a major part of its
skyrocketing popularity from the fact that it can also be consulted in such
current matters, it also suffers from the additional unrealistic expectations
associated with fulfilling this widened role. This problem has been dramatically
highlighted at the death of Kenneth Lay (mentioned above), when the Wikipedia
was heavily criticized because the true cause of death was only reported with
a delay of some hours. Of course, such denouncements just reveal to what
degree the Wikipedia has already gained the status of a universal
news-knowledge provider – not to be compared with a conventional encyclopedia
which is given years for collecting, checking and reporting such kinds of
information. However,
it is remarkable that even the Wikipedia has preserved a conservative bias by
disallowing the publication of any “original scientific studies” [51].
In other words: encyclopedic knowledge is still “second hand knowledge” that
has already been certified by the academic community: so that very new, not
yet certified knowledge has no place.

Another ”traditionalizing” effect of Wikipedia stems from the ease with which already existing online texts can be included by simple “copy and paste”. Thus, the Wikipedia contains much material from rather antiquated encyclopedias that have been published on the WWW because they have become part of the public domain (e.g. the Encyclopaedia Britannica of 1911 and the “Catholic Encyclopedia” of 1913).

Given its permanent
modifications, however, the Wikipedia remains in a state of fluidity that
makes it difficult to integrate its contents into larger and more stable
cultural productions. For instance, it is difficult to cite a Wikipedia
article in any other text (online or offline), because one always has to
indicate at what exact time the article has been retrieved.

6.8 Changing usage patterns and user roles

6.8.1 Increased accessibility

Conventional encyclopedias are ridden by an
unfortunate trade-off between size and usability. Only small pocket editions are accessible in such a way that their knowledge can be retrieved in a multitude of
places and become part of many different human activities and social
communications. Larger editions are clumsy to handle, typically stationary in
libraries or other rooms where few activities other than mere reading take
place. As a consequence, many explicit ambitions of paper encyclopedias had
to remain utopian: the Brockhaus or Meyer notion that it should support
educated human discourse (“Konversationslexikon”) as well as the even older
concept that by providing all relevant human knowledge, encyclopedias could
help individuals to carry out all their daily tasks on a higher level of
competence. The mere physical problem of handling many heavy volumes is an
obstacle to cognitive synthesis. In fact, increasing the number of articles inevitably has to be paid for by a rising fragmentation of knowledge,
because even when many cross-references are included (as in the classical
Brockhaus editions), they are not likely to be followed because too much time
and effort is needed to switch between different tomes. Thus, the trend
toward dictionary-type encyclopedias with many smaller entries since the 19th century (Brockhaus, Meyer, Larousse) has almost eliminated the possibility of transmitting more complex knowledge structures that
transcend the atomized level of explaining the meaning of particular words,
concepts or names. The Wikipedia’s accessibility is much
higher for three different reasons:

1. Hundreds of millions of users can reach it almost anytime and anywhere on the WWW. Given their easy accessibility (irrespective of size and internal fragmentation), digital encyclopedias can better fulfill the function for which their conventional forerunners were already explicitly conceived: encouraging individuals to enlarge their
cognitive world by acquiring at least some basic knowledge about a topic
beyond their daily experience and professional expertise.

2. Every single user can access it in a large variety of situational conditions and roles: e.g. for immediately getting specific practical information for solving a current problem or
for inserting it in a developing document or mail communication. The chances
that such knowledge is actually mobilized when a need arises are far greater
– and they will increase additionally when Wikipedia knowledge is universally
available on mobile handheld devices. Such portable WP editions exist already
for the iPod [52] as
well as for notebooks and handheld ebook readers [53]
and for the mobile phone (“Wapipedia”). [54]

3. Given the densely knit hypertext
structure of the Wikipedia, every user has at every moment an unlimited number of alternatives for
navigating through the system: thus realizing his specific preferences or optimizing
the way new knowledge can be integrated with what he or she already knows. In
other words: individuals are better able to transform decontextualized
universal encyclopaedic knowledge into contextualized individual knowledge
that can be assimilated to particular individual thoughts and activities as
well as social communications and cooperations. This easy integration into microsocial
contexts and individual roles provides good preconditions for expanding the encyclopedic universe from
factual and theoretical “know-what” knowledge to practical “know-how”
knowledge that can be used in everyday life for guiding any kind of human
action. For instance, by including advice about how to counter hiccups, how to relieve headaches or how to prepare espresso coffee, the Wikipedia revives
encyclopedic traditions of the 18th century where a similar weight was laid
on such practical knowledge (e. g. the first edition of the Encyclopaedia
Britannica 1768-71) – but with far better chances that it will actually be applied.

6.8.2 Combining receptive and participative roles
In the era of handwritten manuscripts, the producers and recipients of encyclopedic works formed small, surveyable circles. To the degree that these people knew each
other personally, there could also be high informal feedback – among the
recipients as well as between recipients and producers. In the following
period, printing technology was responsible for a drastic segregation between
producers and consumers. As editors were increasingly confronted with an
anonymous unknown public, they lacked the necessary information for matching
their works with the recipient’s preferences. As
a consequence, we see the spread of extremely “producer-guided” encyclopedias
that are primarily conceived for expressing the cultural tradition of a
national elite or the ideology of an intellectual movement (like the French
encyclopédie), much less for
satisfying any needs of potential readers. This dissociation was reinforced
by the almost complete lack of backchannels: so that editors got no feedback
from the reader’s side that would have helped them to adapt better to their
wishes. In other words: there was a rigid trade-off between expansion of
readership and feedback: The wider the distribution (thanks to mass
printing), the less it was possible to anticipate the structure and
composition of recipients, and the more completely producers were socially isolated from their readers.

When using a conventional encyclopedia, I
completely embrace the role of a pure recipient who is confident that the
information found is correct. This generalized confidence may of course be
based on the high reputation of the encyclopedia as a professional and
reliable source, but in addition, it is also made inevitable because as a
reader of the encyclopedia, I have no immediate access to alternative
information sources and no possibility to communicate with the responsible
producers. Under these conditions, of course, reliability is absolutely
essential; unreliable encyclopedias are completely useless. As a consequence,
printed encyclopedias resemble phone directories, road maps or train
timetables by aiming at an ideal state of “complete accuracy” – which implies heavy costs, because high marginal costs are associated with finding and
eliminating the very last remaining error. Thus, conventional encyclopedias
foster the regressive role of an “unconditional believer” who doesn’t take
any critical stance. Publishing occurs only at the final point of a very long
and complicated editing process that is usually completely hidden from the
eyes of outside observers. This implies the premise that readers are only
interested in the product, not in the intermediate processes of production. By contrast, Wikipedias develop in public,
so that all participants can not only observe and evaluate all successive
stages of development, but also participate in the formation and modification
of the rules by which these processes are guided, and intervene whenever they
see a reason. Thus, the categorical dichotomy between “producers” and
“readers” gives way to the hybrid role of the “participant” or “user” who can combine both roles in a way completely at his own
choosing: by sifting through materials others have written at one moment and
by posting his own contribution at another. As every article is permanently
“under construction”, users feel invited to read everything with a critical
eye and fundamental provisos: ready to validate any information by additional
sources whenever absolute certainty has to be achieved. As a Wikipedia user, therefore, my role is
rather complicated, because I have to combine my stance as a “faithful
recipient” with an element of skeptical role-distance: maybe the current
content contains errors or has been vandalized ten minutes ago – and in both
cases, I may assume the responsibility of not only noticing, but actively
eliminating such flaws. It has often been remarked that in contrast
to oral speech, written texts facilitate critical reflexivity because they
stand out as objectified artifacts that can be interpreted by anybody at very
different occasions from widely different angles. Thus, they give rise to a
communicative meta-level where they themselves become the object of oral talk
or written commentaries. However, these reflexive capacities could not
develop fully in the printing age, because in most cases, readers had no
feedback channels available for expressing and communicating their thoughts.

Digital texts on the Internet are apt to catalyze much higher levels of reflexivity for two reasons. First, wikis have a particular capacity to evoke critical reflections, because they make it extremely easy for every user to implement changes and add commentaries, while keeping everything that was ever written ready for retrieval. Secondly, on a more general level, the Wikipedia catalyzes reflection because millions of users contribute a very wide range of opinions and preferences – engendering controversies on the primary level of substantive knowledge as well as on the metalevel of procedural norms. For instance, highly sophisticated discussions about the neutrality
principle are constantly going on: giving rise to a heightened awareness of
all the subtle, implicit ways it can be violated (e. g. by using insinuating
“weasel words”).
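The technical basis of this retrievability is that every saved revision keeps a permanent identifier, so that any past state of an article can be addressed by a stable URL. The following is a minimal sketch (assuming the Python “requests” package and the standard MediaWiki web API; the article title is an arbitrary example) of how a reader can pin down exactly which version he or she has consulted – which also eases the citation problem mentioned at the end of section 6.7.

import requests

API = "https://en.wikipedia.org/w/api.php"

def permanent_link(title):
    """Return the timestamp and a permanent URL of the article's current revision."""
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvprop": "ids|timestamp", "rvlimit": 1,
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    rev = page["revisions"][0]
    # oldid links always point to this exact revision, regardless of later edits
    url = f"https://en.wikipedia.org/w/index.php?title={title.replace(' ', '_')}&oldid={rev['revid']}"
    return rev["timestamp"], url

if __name__ == "__main__":
    timestamp, url = permanent_link("Wikipedia")
    print(f"As retrieved on {timestamp}: {url}")

The resulting “oldid” link continues to display the cited text even after the live article has been modified further.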
This reflexivity is particularly manifested
in the human sciences where there are many scientific concepts that have an
intrinsic ideological bias because they have been created and elaborated by
people sharing a particular (e. g. political) view. For instance, this is
the case for the term "Right Wing Authoritarianism" that has come
under fire from conservatives who claimed that it has an intrinsic leftist bias.
In such cases, it is very helpful that in
the article's heading, it is indicated that a discussion about its neutrality
has arisen: so that readers get sensitized to these problems of which they
otherwise would not be aware. Such examples illustrate to what degree the
Wikipedia has the potential to internalize dissensus and conflict instead of
communicating a fictitious impression of universal agreement. While
conventional encyclopedias support the notion of a canon of
"unquestioned truth" (by simply leaving out dissensual views), the
Wikipedia is open to reflect any kind of manifest dissensus - thus submitting
all truths to a much harsher test of acceptance.

By its very lack of reliability, the
Wikipedia demands mature recipients that are capable of receiving information
while at the same time preserving a critical attitude: motivating them to
corroborate the information by consulting additional sources. A critical
stance is particularly encouraged in cases where an article is highlighted as
being “controversial” (e.g. about abortion, homosexuality, the Taliban,
etc.): so that users know in advance that they have to rely on their own
judgment, instead of absorbing a nonpartisan, “absolutely neutral” point of
view. Evidently, all these possibilities for
personal participation provide ample opportunities to solve tensions and
conflicts in smooth, inconspicuous ways. If I disagree with an entry in a
conventional printed encyclopedia, I have no alternative but to protest
harshly or even sue the editors legally. When the same happens with a
Wikipedia entry, I have many other less offensive options: correcting the
entry myself or writing to admins that it should be corrected. Similarly,
when articles are of low quality and convey erroneous information, this may not be a reason for denouncing the whole publication and for turning to alternative encyclopedias, but may just engender the motivation to contribute personally to an improvement.

6.9 Public visibility of production processes and resilient adaptation

We all know to what degree major cultural
achievements are the products of widespread and enduring collective efforts.
Thus, the evolution of law has been promoted by a multitude of infinitesimal
contributions like court judgments, legal commentaries or academic opinions;
and advanced technologies are the results of manifold improvements enacted by
anonymous engineers and technicians. However, we usually see just the final
products, while the production processes remain hidden: inaccessible for
analysis as well as for deliberate control. Thus, when Berger/Luckmann follow
Husserl and Schütz in characterizing empirical reality as an “intersubjective
construction”, they just point at the result without unveiling the underlying processes that have led to it – so that it remains unclear who has participated to what extent, and whether the said processes could have led to
alternative results (Berger/Luckmann 1999: passim). The word “tradition” is usually applied to
such past legacies in which we find ourselves embedded as in natural biotopes, unable to know why and how they came into existence and unable to determine their further development in the future. Seen from this perspective, the online productions in Peer-to-Peer networks are innovative in the sense that they make cultural production processes explicit and completely visible to all interested eyes. They share with "traditions" the basic feature that the products of collective endeavors degrade individual authors to the modest role of mere "contributors". But unlike "traditions", these molecular inputs can be identified, regulated, modified or reverted at will, and the system of rules under which these contributions are generated can be explicitly stated and systematized as well as changed by specified
authorities and transparent formal procedures.
All encyclopedias must find ways to ensure that the information they convey is accepted as "authoritative": in the sense that normal recipients believe that it is reliable and that it represents the most advanced state of knowledge available at the current time. In the printing age, there was no alternative but to rely on the indirect authority of personal credentials: the authority of encyclopedic knowledge was derived from the high reputation of its contributors, e. g. their academic degrees, Nobel Prizes, etc. Of course, this implied a
high trust in the formal institutions responsible for distributing such
credentials: e.g. in the quality of academic education and certification. By
contrast, the Wikipedia can make itself independent from such derived
authority sources because it is able to produce its own primary authority
which emerges from collective online interaction. In other words: Wikipedia
articles are not trustworthy to the degree that they stem from reputable scholars, but to the extent that they are the (preliminary) end product of all the preceding edits and discussions to which many collaborators with different perspectives and knowledge backgrounds have contributed. Why do these procedures make
knowledge authoritative? Because they have taken place in full public view and have been stored in a way that allows them to be recapitulated by anybody at any time: at present and in the future. Thus, the Wikipedia exemplifies Luhmann's hypothesis that in modern societies, traditional legitimation is replaced by "procedural legitimation" ("Legitimation durch Verfahren"; Luhmann 1968). (Another example is the evolution of the political system, where modern law derives its authority no longer from tradition or the charisma of a founding leader, but from widely accepted and transparent law-enacting procedures, e. g. popular votes or parliamentary decisions.)
From a functional point of view, this procedural transparency provides
the basis for flexible self-correction processes that enable the Wikipedia to
cope successfully with a wide range of exogenous and endogenous disturbances.
Social systems can be classified according to the way they deal with events that may threaten their essential structures and functioning. At one extreme, there are "resistant systems" that defend themselves against disturbances by preventing their emergence (by suppressive activities) or their intrusion (by boundary controls and filtering). At the other extreme, we find "resilient systems" that allow any disturbances to enter, but then mobilize self-correcting mechanisms in order to eliminate them in due time or to make them compatible with their own structures and goals.
Conventional encyclopedias are clearly
“resistant systems” that emerge in the context of formalized and centralized
organization. By applying highly selective methods of recruitment,
bureaucratic rules and permanent supervisory controls, they take care that from the outset, no deviant productions caused by dilettantism or intentional vandalism are generated. Such unbending discipline is all the more important as texts are finally frozen on paper, so that no corrections can be made ex post. Processes of improvement and growth typically take the form of discrete major steps (e. g. "editions"), each of which is characterized by a multitude of smaller changes (or even a
major change in the work's architecture). By contrast, Wikis develop continuously
over a very large number of minor revisions, so that users may find a
slightly modified version at every moment of consultation. Thus, they remain
forever in the unfinished stage of "Perpetual Beta" (Tim O'Reilly), inviting users of any specific article to adopt an attitude of "critical acceptance" that synthesizes two contradictory expectations at the same time: that the information offered is basically correct and useful, but still so incomplete and faulty that corrections, improvements and updates are needed (O'Reilly 2005). In the case of commercial goods or
services, such a philosophy of "continuous improvement" is
difficult to adopt, because customers find themselves at a loss when they try
to gather sufficient information about the products' current quality (and
corresponding price). Thus, the Wikipedia exemplifies the resilient type of system that remains permanently vulnerable to all kinds of disturbances, so that the maintenance of order is completely dependent on the self-correcting activities that set in after they have intruded. The way it works is by having a large number of people who keep track of recent changes, often through watch lists, which notify the user whenever a page they have marked has been edited. As all the subsequent versions of an article are stored in the "page history", it is technically extremely easy for anybody to cancel any recent changes by just restoring an older version. This feature results in a "conservative bias" which is of course functional for fighting vandals or fierce ideological crusaders, but which may also discourage new contributions (because of the fear that even very laborious contributions are
just wiped out).
In a study of the page histories of Wikipedia's English language version, the MIT and IBM researchers Viégas, Wattenberg and Dave have demonstrated that most Wikipedia vandalizations are corrected within a very short time (a few minutes), so that they escape the notice of most users (Waldman 2004).
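Such revert times can be read directly off a page history. The following minimal sketch in Python illustrates the idea; the revision format with "timestamp" and "sha1" fields is an assumption made for illustration, not a description of the actual MediaWiki interface:

```python
from datetime import datetime
from typing import Dict, List

def revert_latencies(revisions: List[Dict]) -> list:
    """Estimate how long deviant revisions survive before a page is restored
    to an earlier state. 'revisions' is assumed to be ordered oldest-first,
    each entry carrying an ISO 'timestamp' and a 'sha1' content hash
    (illustrative field names)."""
    seen = {}        # content hash -> index of the first revision with that content
    latencies = []
    for i, rev in enumerate(revisions):
        sha = rev["sha1"]
        if sha in seen and seen[sha] < i - 1:
            # Revision i restores an earlier state, so revisions
            # seen[sha]+1 .. i-1 were effectively reverted.
            start = datetime.fromisoformat(revisions[seen[sha] + 1]["timestamp"])
            end = datetime.fromisoformat(rev["timestamp"])
            latencies.append(end - start)
        seen.setdefault(sha, i)
    return latencies
```

Taking the median of the returned intervals over many vandalized pages would yield the kind of "a few minutes" figure reported by Viégas, Wattenberg and Dave.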
The efficiency in dealing with vandalism
demonstrates vividly that Wikipedians constitute a tight community – despite
the large geographical distances and very weak personal ties among the
members:
Resilience implies that at any given moment, the system looks somewhat degraded or even chaotic, because it contains a certain number of (as yet) uncorrected errors. Of course, no such deficiencies are tolerable in cases where information has to be absolutely reliable because highly consequential actions are based on it (e. g. timetables, price lists, legal codes, telephone directories etc.). On the
other hand, resilience provides flexibility and openness for innovation,
because systems remain free to decide which of the intrusions have to be
treated as negative disturbances to be eliminated, and which should be seen
as enriching “innovations” that should be kept (or even subject to further
elaboration). Whenever a topic is controversial (e. g.
for ideological and emotional reasons), a transitory period of “irrational”
postings characterized by extreme opinions can be observed, before more objective,
neutral formulations take the lead. Sometimes, fierce "edit wars" are engendered between participants who permanently erase each other's versions. To set limits to such escalations, the 3RR rule was established, forbidding any single user to enact more than three reversions of a page within 24 hours (except in cases of manifest vandalism). [57] In addition, temporary protection of a page can be requested in order to cool down heated editorial warfare. [58]
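The logic of such a revert limit can be pictured as a simple sliding-window count. A minimal sketch, assuming a hypothetical log of revert events rather than Wikipedia's actual implementation:

```python
from datetime import datetime, timedelta
from typing import List, Tuple

# Each entry: (user, page, time_of_revert) -- a hypothetical revert log.
RevertLog = List[Tuple[str, str, datetime]]

def would_breach_3rr(log: RevertLog, user: str, page: str, now: datetime) -> bool:
    """Return True if one more revert by 'user' on 'page' would exceed the
    three-reverts-per-24-hours limit. The exception for manifest vandalism
    is deliberately not modelled here."""
    window_start = now - timedelta(hours=24)
    recent = [t for (u, p, t) in log
              if u == user and p == page and window_start <= t <= now]
    return len(recent) >= 3
```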
Such temporary measures are highly effective because most edit wars are associated with current public moods and discussions that rapidly fade away when other topics come up. The more controversial a topic, the longer the time period during which users may be confronted with rather one-sided, opinionated entries. But in the longer run, emotions tend to cool down, so that extremist passages are weeded out and substituted by more neutral formulations in accordance with the official "Neutral Point of View" (NPOV).
Mechanisms of resilient self-correction
are highly functional for dealing with smaller, decentralized problem cases
that can easily be handled by the voluntary patrollers. However, they reach
limits in cases of sudden massive disturbances that may lead to a “work
overload” of these policing members. In such cases, resilience has at least
partially to be substituted by defensive resistance measures, so that
intrusions are blocked before they enter the system.
Such a situation occurred on August 1st, 2006, when the American comedian and satirist Stephen Colbert told his viewers to update the Wikipedia article "Elephant" in order to include the information that "the population of African elephants has tripled within the last three months." After this broadcast, dozens of viewers flocked to the Wikipedia site in order to insert this addition, while policing users quickly became equally active in reverting such massive vandalizations. Very soon, administrators exercised their authority to semi-protect the page: making it temporarily impossible for any unregistered or new users to implement changes. As even registered users continued to insert the misinformation, the site was then temporarily immunized against any changes by setting it under "full protection."
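The two measures just mentioned amount to a simple mapping from a page's protection level and an editor's status to an edit permission. A minimal sketch of that logic; the names and exact conditions are invented for illustration, and Wikipedia's real rules are more nuanced:

```python
from enum import Enum

class Protection(Enum):
    NONE = "none"   # anyone may edit
    SEMI = "semi"   # only registered accounts that are not brand new may edit
    FULL = "full"   # only administrators may edit

def may_edit(level: Protection, registered: bool, new_account: bool, admin: bool) -> bool:
    """Illustrative permission check for the protection levels described above."""
    if level is Protection.NONE:
        return True
    if level is Protection.SEMI:
        return registered and not new_account
    return admin    # Protection.FULL
```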
This example illustrates that, in contrast to their printed predecessors, digital encyclopedias can combine resilience and resistance in highly variable ways: e. g. by limiting protection to particular pages, user categories and/or specific spans of time. Of course,
the deliberations on such decisions also add to the hypertrophic overhead of "meta-discussions" as well as to the never-ending expansion of formalized procedures and rules.
6.10 Unguided incrementalism and unplanned "memetic evolution"
The WP relies on a complex process of "cultural Darwinism" [59], which is based on the complementary interplay between three mechanisms:
1) Production of variation: generated by a broad base of users who produce a large pool of memetic variants by creating new articles, inserting additional information and proposing alternative formulations.
2) Selection procedures: provided by collaborators (including admins and bureaucrats) who scan and filter all these new entries in order to weed out nonsense and to keep the WP's evolution in line with specific standards.
3) Mechanisms of stabilization: based on a third layer of activities preventing and reverting deletions and vandalism, so that the acquired quality level of the WP is maintained.
The speed and direction of evolution
depends heavily on the working of these three sets of mechanisms and on the specific way they are combined. For instance, excessively high rates of variant production will strain even highly efficient filtering mechanisms beyond the limits of their capacity; on the other hand, variant production may shrink drastically when collaborators see that most of their contributions are constantly weeded out.
We may safely contend that at least in
these early phases, the open source model of the Wikipedia favors variation
over selection and stabilization, because in decentralized peer-to-peer networks, there are no hierarchical agencies deciding about right and wrong, effectiveness and uselessness, or falsity and truth. Instead, such authoritative decisions have to be substituted by horizontal control processes among the collaborators: preferably guided by norms of universalism, communism, disinterestedness and "organized skepticism" similar to those that (should) reign in ideal-typical scientific communities (Merton 1942).
Of course, in the case of highly
specialized entries where the number of experts and visitors is very small,
simple lack of manifest dissensus will not be a sufficient indicator that
consensus has been reached, because even major insufficiencies and flaws can persist for long time spans when nobody takes notice or is motivated to make any additions. The higher the user activity, however, the more justified is the assumption that the absence of criticism indicates that "everybody" (or at least:
many visitors with very different viewpoints) actually agrees – or that some
disagree so little that they don’t find it worthwhile to articulate dissent
or make corrections. As we can learn from successful scientific
or technical communities, such horizontal peer exchanges are most functional
when all members can easily agree whether a contribution is valuable, a specific problem has been solved, or a particular goal has been achieved, because the outcomes can be objectively assessed and evaluated. This is certainly the case in open source software production projects (e. g. Apache or Linux), where any piece of proposed code can immediately be tested to see whether
it is functional or not. Under such conditions, no hierarchical evaluations
and authoritative selection processes are necessary because successes and
failures stand out objectively, so that they can easily be verified and
corrected by any member of the community. It is evident that in open source
encyclopedias, such preconditions are not persistently fulfilled. To be sure, there are many contributions whose truth or falsity can easily be assessed, because they relate to highly indisputable, objective facts, natural laws or mathematical-logical operations. Here, errors may be rapidly eliminated, because whenever a correction "for the better" has been made, nobody has any sound reason to return to the earlier version.
However, many contributions are "arguable"
in the sense that they rely on viewpoints, opinions and evaluations that vary
between the contributors as well as between the sources on which they rely.
In such cases, the return to hierarchical controls may be inevitable in order
to end “edit wars” that would never end by themselves because there is no
objective test for adequacy or truth (Schiff 2006).
The idea of a Wikipedia would be particularly misplaced if a "constructivist" epistemology were maintained, because this would mean that instead of general theories competing for universal recognition (in a Popperian sense), there are only co-existing "narratives" which are consensually accepted only within confined and transitory "discourse communities". The most adequate epistemology for the Wikipedia is evidently an objectivist paradigm of truth: the belief that knowledge about everything can reach a definitive form on which all reasonable human beings can (or even must) agree. It is no surprise that Jimmy Wales clings firmly to an objectivist understanding of knowledge, which gives him the confidence that contributions finally converge toward a definitive intersubjective and intercultural truth [60].
Contrary to most contemporary epistemological philosophers, true Wikipedians
believe in an absolute aperspectively constructed truth existing beyond all
cleavages of particularistic and idiosyncratic human opinions and
convictions: While the WP shares this premise with
traditional encyclopedias, it contrasts sharply by following not a deductive,
but a highly inductive way of objectification.
Printed encyclopedias have an affinity toward deductive processes of reasoning and classification, because their top-down organization makes it necessary to begin with blueprint knowledge structures which are then filled out by the different contributors. In the natural sciences, for instance, the editing committee typically relies on highly accepted taxonomic systems, so that specialists can be sought out and invited to deliver contributions about specific chemical elements, or about the different orders, genera and species of animals and plants. By functioning as ex ante premises of encyclopedia organization, such conventional conceptual frameworks are reinforced rather than called into question, because scholars who maintain deviant concepts and typologies will not be invited.
A most outstanding example of this deductive top-down conceptualization is the Propaedia that came with the 15th edition of the Encyclopedia Britannica (in 1974): a 1000-page volume offering an extremely detailed outline of all spheres of human knowledge by classifying it into ten major spheres and by disaggregating each sphere on seven hierarchical levels [61]. It may be considered one of the most conservative books in recent history, because whoever uses it has no alternative but to let his searching activities be guided tightly by these authoritative conceptual schemes.
Wiki-based
online encyclopedias certainly also cling to these pre-existing conceptual structures, because most collaborators identify with them, and because editors use them for channeling incoming contributions (e. g. by creating "stub" articles about concepts that deserve a more elaborate treatment). In addition, however, they have an intrinsic leaning toward inductive conceptualizations that arise out of an uncoordinated multitude of independent proposals. Such "folksonomies" are characterized by a more prototypical than categorical way of categorization, so that imprecise and overlapping interpretations and attributions may occur.
Such inductive terminologies have the
advantage that they remain open for flexible innovation - due to the rise of
new phenomena or the change of relevant differentiations (e. g. when new cultural fashions - like music styles or art forms - or unprecedented ideological or religious movements arise). On the other hand, they have severe shortcomings because their usage remains basically restricted to the
collectivities that have produced them, and they remain ambiguous (e. g.
because often several different meanings are given to the same terms).
In the wide areas (like politics, ideologies and religion) where objective truth can never be attained, the Wikipedia tries to achieve consensus by clinging to the "Neutral Point of View" (NPOV): one of the three highest-ranking guiding principles of the official Wikipedia policy, defined to be immutable even if all editors were to agree on a modification. [63]
By aiming at a "neutral point of view", the WP embraces an optimistic belief in the possibility of reaching at least a minimal universal canon of human knowledge that is accepted consensually by all
“rational human subjects”, because it cannot be meaningfully refuted. In the tradition of
rationalistic strands of philosophical thinking (Leibniz, Kant and Habermas),
it is supposed that there are highest level principles of “formal reason” on
which all human subjects – irrespective of any divergences on any “material”
questions – may voluntarily agree.
In a multicultural world, such a consensus about evident truth can most often not be reached on the primary level of substantive evaluations or empirical facts, but only on the secondary, formal level: on the assertion that there exist people who hold certain principles to be valid or who hold certain facts to be true.
Thus, only noncontroversial topics can be
treated on a primary level (=discussion of facts); all controversies have the
effect that a topic can only be discussed on a meta-level: representing
“fairly” all the different positions and beliefs. (Sanger 2001). [66]
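This shift from the primary to the secondary level can be pictured as a change of data structure: a contested assertion is stored not as a bare fact but as a set of attributed positions. A small illustrative sketch, with all names hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Position:
    holders: str   # who holds the view, e. g. "most evolutionary biologists"
    claim: str     # the assertion as its holders state it

def meta_level_text(topic: str, positions: List[Position]) -> str:
    """Render a contested topic as a list of attributed viewpoints instead of
    a single asserted fact -- the textual move that NPOV articles fall back on."""
    lines = [f"Views on {topic} diverge:"]
    for p in positions:
        lines.append(f"- According to {p.holders}, {p.claim}")
    return "\n".join(lines)
```

For example, an article on the origin of species would not assert one of the competing claims, but list each claim together with the group that holds it.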
Sanger's statement clearly demonstrates how difficult it may be to avoid all perspectivism, even on subtle, inexplicit levels of textual structuring and linguistic expression. For instance, the
mere sequence in which positions are represented (or the volume of text
allocated to them) implies decisions which are most certainly guided by
subjective preferences. Similarly, authors will reveal their subjective
opinions in innumerable other ways: e. g. by characterizing various positions
as “popular”, “sectarian” or “empirically founded” views, or by focusing
content ethnocentrically on their own nation and culture (Sanger 2001). If it is difficult to describe an empirical
fact or development fairly, why should it be less difficult to describe
disputes about such facts or developments in fair, objective terms? Can any
contributor be expected to have full knowledge about any such dispute and
about the number and quality of its supporters (even within a small time span
and geographical area), especially in cases where they have been shaped by
many scientists and intellectuals with very different positions? As Sanger states, it is useful to treat this as an empirical rather than a philosophical question. It can be answered affirmatively in all cases where articles have reached a stage where they are de facto accepted (i.e. they do not generate any additional controversial discussions) (Sanger 2001).
In fact, however, such highly relativistic
principles are not fully upheld in the Wikipedia, because in most cases, the positions that claim "scientific" validity are privileged in relation to "sectarian" exotic positions (even when the latter have a higher absolute number of believers): e. g. Darwinist evolution theory is taken much more seriously than creationist views. If this "unity of scientific doctrine" were abandoned, the Wikipedia would degenerate into a universe of ethnographic narratives that would have to give room to all indigenous cultures and all (even highly exotic) minorities of dissident believers. While the strategy of representing different viewpoints or theories cannot be stretched to include every possible position maintained by any individual or tiny group, it can nevertheless be applied in order to end "edit wars" between highly articulate disputants. Thus, we arrive at the
conclusion that the “truth” developed in the Wiki process merely represents a
reconciliation between positions actively maintained by online editors: just
a “truce” between adversaries who have decided to end edit wars because they
all find their own different views adequately represented – or because they
have become just too tired to fight on.
As current events and developments
(discussed in the media) are most likely to engender heated debates, an
effective measure to deescalate conflicts may be called the "strategy of
deactualization". For instance, there was much debate about the articles called "Persecution by Christians" (or Muslims, or Jews), and votes nearly resulted in their deletion. However, these pages were kept, but partially neutralized by being renamed "Historical persecution of Christians" [68] (or Muslims [69]), in order to avoid overt conflicts about
current events. (Unsurprisingly, an even stronger measure of neutralization
was implemented in the case of Jews by renaming the entry "Ancient
historical persecutions by Jews" [70]). Evidently, the Wikipedia invites us to see
the process of human knowledge production as a process of Darwinian
"memetic evolution" [71].
The cognitive patterns fittest for survival are those maintained by strong,
highly articulate individuals or groupings motivated and able to defend their
views successfully in “edit wars”. If they are completely victorious, they
may be able to define their opinion as the only “scientifically founded
position”: so that alternative positions receive much less (or even no)
explicit recognition. Thus, the Wikipedia is exposed to the same critical arguments as were directed by the ancient Greek sophists against any consensualist theory of truth:
The problem arises from the fact that
whenever there is a memetic competition, it is highly probable that the
engagement of the different sides is not equal in strength. For instance,
religious believers may be extremely determined that the entry on their
founder does not contain any "negative" biographic information,
while all the outsiders may have very little interest in this whole matter. As a consequence, the believers' zeal to keep the article "clean" is not counteracted by a similar effort of nonbelievers to keep it in accordance with the standards of the "Neutral Point of View".
It is evident that the Wikipedia has to rely
very much on widespread groups of liberal nonbelievers who are ready to fight for their Western standards of tolerance, openness and objectivity with the same fervor and zeal with which religious fundamentalists defend their dogmatic beliefs. Evidently,
this implies an openness toward multiple and
changing viewpoints that is not consistent with closed dogmatic belief
systems as they are maintained by Islamists or other adherents of religious
fundamentalism. It's no surprise therefore that such medieval minds feel
threatened by an intellectual enterprise in which they see no chance to
dominate and to eradicate unwelcome "dissident" views. This position is well
formulated in an essay by Abid Ullah Jan, who criticizes that in the WP article on Islamism, "cultists" like Ahmadis, Habashis and Ismaelis are considered to be Muslims despite the fact that, in contrast to "True Moslems", "...they do not believe in the totality of the Qur'an and the finality of the Prophethood" (Abid Ullah Jan 2006a). A WP editor has
responded that these groups are considered to be Moslems because they
themselves maintain such an identification.
Abid Ullah Jan's essay makes it evident that from an Islamist point of view, the Wikipedia is a particularly effective weapon in the war of "Islamophobes against Islam", because it contains innumerable formulations that appear faulty, inimical or even blasphemous from a strictly fundamentalist perspective: statements hard to fight against because they stem from so many different (and mostly anonymous)
sources:
Of course, trying to synthesize a “neutral
assessment” is in itself an authoritarian endeavor because all other (e. g.
monographic) representations are implicitly degraded as one-sided and
ethnocentric, as they have not passed through this elaborate process of
synthesis and purification. While the "neutral article" occupies the center of attention, all these more subjective or ethnocentric articulations are marginalized by being diverted to the collateral "discussion page", where controversies can go on that may later have visible impacts on the article itself. These "talk pages" are the very fora where memetic evolution processes go on and where everybody can observe how "reality" is constructed
as an emerging result of free intersubjective communication. Such
constructive endeavors are particularly prominent in the case of unprecedented, newly unfolding events or developments, where fundamental problems of conceptualization have to be solved. This was vividly illustrated in the entry "Israeli-Lebanon conflict" in the summer of 2006. While an impressively equilibrated exposition was soon achieved as a result of 9000 edits (between July 12th and July 29th), extensive controversies about very subtle terminological points were fought out on the parallel discussion page: whether the process described should be named "conflict" or "war", or whether Israeli soldiers had been "captured", "kidnapped" or "abducted".
Some articles may even become temporarily
protected from editing until fundamental disputes have been resolved. For
instance, the article on "New anti-Semitism" was frozen by administrators in May and June 2006 "until disputes on the talk page have been resolved". [75] The controversy resulted from the fact that the concept "new antisemitism" is used by rightists for defaming the political left: by attributing to it a generalized new tendency to take sides against Israel (and, even worse, to sympathize with blatantly anti-Jewish Moslems). Leaving the page unprotected would have resulted in a permanent edit fight between rightists who want to uphold this attribution and leftist liberals who deny the justification of the term because they want to draw a clear dividing line between decrying Israel and defaming the Jews.
While this controversy cannot be avoided, of course, it is dealt with in a deescalating manner by diverting it to the discussion page associated with the article. In this particular instance, however, protection was lifted after two months without the conflict having been settled by discussion. Instead, some steam was removed in the meantime because the leftist opponents of the page founded a "revenge page" about Israel's alleged "Apartheid" policy. [77] From such examples, we may draw the unsurprising conclusion that, like the UN and other global institutions, the Wikipedia cannot be expected to solve persistent global conflicts, but at best to offer some new opportunities for extensive discourse and sophisticated verbal clarification.
6.11 "WP-Notability" as a new digital divide
In contrast to printed encyclopedias, the
total volume of the Wikipedia is not limited by physical and economic
factors. Nevertheless, in proportion to the huge number of edits, the WP shows rather modest rates of growth: many new articles are quickly eliminated by admins who think that the topic is "not notable" enough to be included in a repository of universal knowledge, and many enlargements of existing articles are quickly "reverted" because the information is judged to be too trivial or beside the point. The problem with this filtering is that it is not guided by any consensual explicit rules and not executed by a clearly defined decision-making body. Anybody can post a request that a specific article should be deleted, and for all sorts of opaque reasons, it may occur that a "majority" for such an action can be
found. Given the rising significance and
popularity of the WP as a reference source of information, such filterings
become increasingly important because a Wikipedia article may soon be
considered as an indicator of relevance, eminence, popularity and reputation
- for persons as well as for music bands, art works, localities, historical
events and any kind of voluntary association.
Currently, such decisions are guided by a multitude of informal criteria that primarily reflect the personal values and preferences of the "Wiki mandarins" (mostly aged between 20 and 30), because they have never been submitted to a public vote or any other legitimating procedure.
It is no surprise that the WP leadership is
often inundated by protest emails from the "victims" of such harsh
elimination procedures - users who do not know about these rules or who do
not agree with them. Of course, such elimination strategies may promote the installation and growth of provincial "minority language Wikipedias", because these provide at least a small forum for many "domestic" personalities and topics that have no chance of being
considered in the global English edition.
In the future, we will certainly see much more conflictive action concerning the "rules of notability" as well as concerning the admission or omission of particular entries. This "politicization" of exclusion/inclusion will certainly raise the need to clarify selection criteria and rules - as well as the procedures dedicated to their constitution, change and specific application. Such processes will of course be facilitated by the fact that filtering takes place in full public view. For instance, everybody can consult the daily lists of articles
nominated for deletion. For the first time in history, a broad open
discussion about "encyclopedia notability" has been started that
has already given rise to intensive debates and detailed - while still
unfinished and unofficial - lists of possible criteria. In the guideline page dedicated to the notability of people, for instance, it is stated that, among others, persons with the following characteristics should be included:
· Published authors, editors and photographers who received multiple independent reviews of or awards for their work;
· Painters, sculptors, architects, engineers, and other professionals whose work is widely recognized (for better or worse) and who are likely to become a part of the enduring historical record of that field;
· Persons achieving renown or notoriety for their involvement in newsworthy events, such as by being assassinated.
Such sentences - like many others -
illustrate that the Wikipedia sees itself as a publication that relies on
reputation that has already been produced ex ante: especially when it is
based on consensual mass media judgment or - in the case of lesser known
individuals - on different smaller, but mutually independent sources. Of
course, this policy does not acknowledge that a Wikipedia entry may itself
become a factor in reputation building: especially when the information that
this entry exists is propagated by journalists and other potent "multiplicators".
7. Conclusive remarks
The Wikipedia is an extremely comprehensive object of study, because it is at the same time a product, a production process and an organizational structure.
Given its amazing complexity and volatility
as a product as well as a production process and organizational structure, it
is difficult to achieve any definitive assessment of whether it is currently approximating, equaling or even surpassing conventional encyclopedias on any criteria of quality, or whether it has any chance to continue its spectacular growth (or at least survive at the present level) in the near and more distant future.
From a minimally controversial functionalist point of view, nobody will deny that the Internet offers a technological platform particularly instrumental for very large-scale collective publication projects, so that the old idea of producing a universal encyclopedia seems
more realizable than in any earlier period of history. Evidently, online encyclopedia projects imply the possibility:
· to realize collaboration among any number and composition of contributors, irrespective of their geographical location or any status characteristics and institutional affiliations;
· to make use of the widest spectrum of highly specialized and volatile expertise, whose whereabouts do not have to be known in advance;
· to give a voice to knowledgeable individuals who may have no other channels for expression;
· to lower overhead costs to a minimum by relying on "discretionary resources", already existing infrastructure and privately owned "means of production";
· to allow highly accessible and flexible ways of collaboration without compulsory commitments;
· to ease collaborative writing in a way that not only articles, but even the smallest passages and wordings can be collectively produced;
· to create multimedia productions where texts can be amalgamated with pictures, videos and audio files;
· to keep even the largest and most complex bodies of knowledge tightly integrated by hyperlinking;
· to keep pace with even very sudden new events and developments by immediately adding new entries or updating existing ones;
· to facilitate processes of intersubjective knowledge production by providing discussion fora where dissensus can be explicitly expressed and consensus-seeking deliberation processes can be enacted;
· to make encyclopaedic knowledge easily accessible in any individual role context and situation, so that it can penetrate any area of everyday culture, human activity and social cooperation;
· to increase the congruence between demand and supply of knowledge by encouraging recipients to become contributors ("customer-made production");
· to cope with abuses and other disturbances by relying on "user patrolling" and by creating, in a democratic fashion, various protective structures, norms and procedures;
· to create separate encyclopedias in all languages and within even tiny ethnicities and cultures almost without any costs and efforts (by simple "forking");
· to document the whole process of production by saving (and keeping fully retrievable) all intermediary steps;
· to increase the stock of "public domain" knowledge that can flow freely because it is not subject to copyright or any other proprietary control.
Since he initiated his project in January
2001, Wikipedia founder Jimmy Wales has gone a long way to realize his bold
promise to “distribute a free encyclopedia to every single person on the
planet in their own language”. In the meantime, about 5 million articles in
more than 150 languages have been created, and the number of visitors is
currently (November 2006) higher than that of any other non-commercial site. More than that: the Wikipedia has grown to be not only one of the most popular web platforms, but also one of the most authoritative Net institutions, which is consulted daily by thousands of students, teachers, journalists and others who pass WP knowledge on, orally or in writing, to many other recipients. This trend is supported by the exploding mass
of web information sources that causes most surfers to reduce complexity by
confining their regular surfing to about eight to ten Web sites (the
equivalent of "anchors" in shopping malls) which they deem
reliable, timely, accurate, objective, authoritative, and credible.
Many of these visitors may not be aware that the Wikipedia is the product of anarchic and amateurish procedures; they fully trust the information they find, and they are too careless (or lazy) to consult additional corroborating sources. As a consequence, the Wikipedia has ever more influence on worldwide processes of knowledge acquisition and knowledge diffusion. Unquestionably, it shapes the information conveyed by innumerable academic papers, magazine articles, written memoranda and oral talks and lectures all over the world.
Given all these striking measures of
success (and indicators of unimpeded further growth), there are still reasons to doubt whether the whole project is sustainable, because with increasing size and societal prominence, it may become more manifest that it is built on rather shaky grounds. First of all, the breathtaking popularity of the WP contrasts sharply with the fact that it has no secure basis for trust. Its rising status as a first-order web knowledge resource is somewhat free-floating, because there is no correlative emergence of actors to which such far-reaching responsibilities could be attributed: no individuals or collective bodies that could be made accountable for the information existing
or lacking in this amorphous heap of collective contributions (Brandt 2006). Somewhat similar to democratic votes, the resulting articles have to be seen as the products of anonymous collective processes that derive their legitimacy and acceptance from the fact that a set of unknown participants have come to a certain (at least majority) agreement. Not only is there a lack of professional expertise as a source of authority: users must live with the suspicion that any page they visit has been vandalized recently or is the product of completely uninformed authors. As anybody can edit and modify anything, even people maintaining highly optimistic views about human nature will not be ready to fully trust any article or bit of information. Thus, the Wikipedia is constantly accused of being unreliable or, even more strongly, of being just a garbage can filled with trivia and trash. This
lack of trust has grave behavioral consequences, because for several reasons, the Wikipedia is more disposed than conventional encyclopedias to be heavily criticized from many sides:
Since its inception, the Wikipedia has been vehemently denounced by individuals who base their judgment not on extensive empirical research, but just on deductive common-sense arguments: as everybody can edit and change articles, there must be a high level of vandalism and misinformation; as nobody is paid for fact-checking, it is certain that errors remain uncorrected; as experts face the risk that their contributions are subsequently modified or erased by laymen, their motivation to collaborate will inevitably be reduced to zero; and as nobody can be held liable or legally pursued, slander will spread without limits. Of course, such deductive arguments abound because it is much more cumbersome to base judgment on inductive procedures: by selecting a representative sample of Wikipedia articles and analyzing to what degree they meet standards of quality, consistency and reliable truth. While
this unprotected exposure is a source of vulnerability, it is on the other
hand also an excellent precondition for further learning processes and evolution:
e. g. for developing norms and organizational procedures in order to raise
the level of linguistic expression and the reliability of information. The problem to be solved is the following:
which minimal measures of access control, hierarchical supervision and
professional expertise are necessary in order to wipe out vandalism and
errors and to ensure reliable, high-quality contributions? Instead of relying on a thin elite of professional authors and editors from the outset, the Wikipedia has begun with an extremely open structure which of course can be modified ad libitum according to emerging needs. Its open-ended evolution is based on principles similar to those of the liberal state, where the problem is to find out which minimal constraints on the citizens' freedom
are indispensable in order to prevent public disorder.
It would seem very reasonable to raise the trust in Wikipedia entries by aggregating user judgments: either judgments of experts who evaluate entries within their specialized fields, or general user judgments as is done on many other Web 2.0 sites today. Paradoxically, the Wikipedia does not lend itself well to such procedures, because any aggregation of judgments has to rely on the premise that the object to be judged remains invariant over time. The WP's openness for modification has not only the consequence that every user may meet a different article, but also that judgments themselves may cause such changes: to the degree that judges themselves immediately correct the errors they see. This second consequence could mean: the larger the number of judgments, the less useful the aggregated judgment, because the object to which it refers has considerably changed.
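A small sketch of why such aggregation runs dry: if every judgment is tied to the revision it actually rated, only the ratings of the currently displayed revision remain meaningful, however many judgments have accumulated in total. The field names are hypothetical:

```python
from statistics import mean
from typing import Dict, List, Optional

def current_revision_rating(judgments: List[Dict], current_rev: int) -> Optional[float]:
    """Average only those ratings that refer to the revision currently on
    display; ratings attached to earlier revisions describe a different
    object and are therefore discarded ('score' and 'rev_id' are
    illustrative field names)."""
    scores = [j["score"] for j in judgments if j["rev_id"] == current_rev]
    return mean(scores) if scores else None
```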
In a way, the Wikipedia resembles physical quantum objects in the sense that it cannot be observed without the observation itself causing it to change. Thus, journalists may not find it fruitful to write critical articles about the WP on the basis of major errors they have found in it, because only hours after publication, these same
errors may have already been eliminated.
A second vulnerability stems from the rising eagerness of individuals and organizations to manipulate the Wikipedia's contents in accordance with their interests and preferences. The higher the popularity and reference status of the site, the less a politician can afford to ignore it when his biography contains embarrassing and compromising facts, and hundreds of supporters, campaign managers and "media advisers" may become active to "correct" the corresponding entries. Likewise, every corporation will take care that its economic performance and the way it treats its employees and customers are described in a favorable way, and it will mobilize its public relations specialists to do the necessary job. This inherent danger is illustrated by the launch of "MyWikiBiz.com" in August 2006: a firm that offers companies the service of authoring Wikipedia articles about their enterprise and their operations. While living persons, active organizations
and contemporary events may be most affected by such massive interventions, even historical articles (e. g. about the dead founders of still-living religions) may become the center of heated editing contests. Thus, the "resilient" capacities of the Wikipedia may be more profoundly tested in the future, and ever higher numbers of highly motivated and activated "true Wikipedians" may be necessary to cope with such collective attempts at manipulation. While
straightforward “vandalizations” often stand out so clearly that they are
easily corrected (sometimes even by automated "Vandalbots" without
human intervention), such manipulations may be much more difficult to
discover, because only very few “patrollers” have the respective knowledge. Somewhat different dangers arise from the
inherent tendencies of "self-accelerating growth". The more popular the Wikipedia, the more individuals will develop an interest in finding themselves and their acquaintances, as well as their home village, their high school and their favorite movies and music bands, adequately represented. Thus, universal encyclopedic knowledge will give way to "multi-particularistic" knowledge serving the idiosyncratic interests of families, localities or sectarian movements. More and more, organized attempts may be made to instrumentalize the Wikipedia for purposes of "self-presentation" or even to "hijack" it for specific ideologies or propaganda purposes. For instance, in July 2006 the Akron Beacon Journal in Ohio published an article in which readers were invited to write additional Wikipedia entries related to the history of the city of Akron, and detailed technical instructions were provided on how articles are generated, edited and changed. Sometimes,
even competitive races are unleashed that may lead to uncontrolled
self-escalating editing endeavors:
Such collective "assaults" could well lead to a highly unbalanced coverage of different geographic regions and locations, and it is not clear how such one-sided hypertrophies could be held in check. Evidently,
they can only be counteracted by cultivating strong, highly explicit and
consensual views within the “Wikipedia community” about the scope and limits
of “encyclopedic knowledge”: so that all contributions transcending such limits
will be rapidly eliminated. A third latent instability arises from the
spectacular degree to which the whole project is based on a highly
regularized flow of unpaid voluntary collaboration. Such volunteering may
well encourage the creation of ever new articles, because many participants
may be highly motivated to leave their personal footprints by adding
something new. However, the more articles, the higher the subsequent volume
of constant maintenance work that has to be carried out by the whole WP community. The more the Wikipedia diversifies into millions of entries, the less it is possible to allocate the "watching capacities" in a way that all articles are promptly corrected when vandalizations or other forms of degradation occur. In fact, the Wikipedia community and the administrators maintain highly specific assumptions about which sites are very likely to be attacked and which sites are highly important to keep clean. This explains why vandalizations of the G. W. Bush article usually don't survive longer than two minutes: it is constantly patrolled by policing participants who get readily alerted whenever revisions are made (Kelley 2005). Usually, such
maintenance work is much less motivating because authors find little room for
creative performance. Consequently, the probability is very high that the
Wikipedia process will soon be slowed down or stopped by simple fatigue, especially when alternative projects that allow more creative expression absorb the volunteers' attention.
Unhappily, the transition to fully paid staff is no viable alternative, because thousands of employees would be necessary to carry out all the volunteering activities. Therefore, stagnation and decline will only be prevented if active participation is stabilized either by very tight internal community controls or by exogenous institutionalized norms. For instance, schools and universities could oblige their students to engage in "Wikipedia maintenance work" for acquiring some of their points and grades; scholars may accept the informal responsibility to look constantly after the WP entries closest to their specialized field; and even national or worldwide associations may emerge just for the purpose of keeping "their" Wikipedia sections up to date. In addition, the strict anonymity of
contributions may in the long run be dysfunctional, because collaborators see no chance of gaining any personal reputation (Ciffolilli 2003). As many articles are in large part written by single contributors (or very small groups of them), it would be possible to make at least these names visible - in contrast to all the smaller contributors who have only added words, commas, references or links.
Some of these problems are aggravated by
the fact that the Wikipedia is not a “Net Encyclopedia” in its fullest sense,
but an intermediary product that still clings to some premises and
constraints of the printed paper era. When seen in isolation, it is certainly
impressive how radical WP has implemented new online technologies in
literally all its activities. When looked at as a component of the larger
Internet, however, it is conspicuous that it has still problems to define its
place and hesitates astonishingly to make full use of the potentialities of
the World Wide Web. Like a conventional multivolume
encyclopedia that can be put on a library shelf (and like the EB or Encarta
on CD-Rom), it still aspires to remain a relatively closed, self-contained
universe: so that visitors are supposed to navigate mainly within the
site to find all necessary information. As explained above, this
self-isolation may be understood as a correlate of community building and
collective identity formation. It is expressed in the emphatic assertion that
the WP should not be a “web directory”, and in an “external hyperlink paranoia”
(see 6.6) for keeping visitors away from propaganda or commercializations.
However, this “isolationist” stance ignores that the WP is just a node within
an ever expanding web of knowledge resource sites, and that unlike the
community-oriented “Wikipedians” who want to perfect their mighty cathedral,
typical visitors are quite indifferent whether they find the desired
information within the Wikipedia or on any other accessible site. The Internet makes it fundamentally easy to
corroborate any kind of information by searching for second or third opinions
on different websites. As a consequence, the idea that the WP should be right in all matters is fundamentally flawed: it is a relic of the printing age, when owning the EB usually excluded the possession of alternative encyclopedias, so that there had to be complete trust in exactly this single publication. Thus, for the WP, the way to perfection does not mean becoming error-free, but making available gateways for corroborating information: e. g. by adding hyperlinks to more specialized and professionalized sources. As it is used as a portal
site by so many users, it should accept its responsibility to be exactly such
a gateway: by guiding users from the more fundamental information provided in
its own articles to deeper and more detailed information on other sites. By
setting links to the primary sources from which it has drawn its information,
errors would also become less consequential (and therefore: more tolerable),
because users would be enabled to make independent checks (Benkler 2006:
218). Of
course, this would imply that Wikipedia editors accept the duty to evaluate
and select such external sites: so that the Wikipedia would not just be an
encyclopedia, but also an encompassing directory: a universal gateway to
human knowledge by connecting to all sorts of high-quality informational
resources. More than that: it would constantly adjust its mission in relation
to complementary sources arising on the WWW: carving out an ever more
specialized and more precisely defined niche.
Only by stripping off all aspirations of isolative self-sufficiency will the Wikipedia burn its mental bridges to the old age of printing and become a true contemporary of the Internetted Digital Age.
Finally, we may speculate that the most
profound effect of the WP is associated with a much more encompassing process
it has set in motion: the rapidly proceeding "wikification" of the
World Wide Web. On the one hand, there has already been a rapid multiplication of Wikipedias in almost all human languages; on the other hand, there is an emergence of specialized Wikis centering on particular topics. Such processes have been catalyzed by the foundation of the "Wikicities" site, which offers the free "MediaWiki" software to everybody who wants to install his own Wiki: e. g. on Star Trek, Harry Potter, basketball, genealogy or quitting smoking. As exemplified by the "Beijingology" page, which aims to collect all available knowledge on this major Chinese city, geographical entities like countries, provinces or municipalities may be particularly prone to become attractors for wiki-guided knowledge aggregation, because such knowledge is very multifaceted and distributed among a large and constantly changing variety of residents, visitors and external observers.
While such proliferations may weaken the
central encyclopedia endeavors by diluting work capacities across a multitude of smaller projects, they themselves have a centralizing impact: e. g. by convincing former authors of individual websites to pool their endeavors. Thus, the Psychology Wiki founded in January 2006 expanded so quickly that already by the end of the same year it had become one of the most comprehensive psychology sources on the Net (with about 22000 pages). While the general Wikipedia still functions as a model and paradigm, such specialized Wikis may have better chances of survival and continuous upgrading because most of their contributors may possess rather high expertise. Starting in January 2007, these services have been expanded by openserving.com, which also offers free bandwidth and storage space to all Wiki holders.
It is evident that apart from the
encyclopedic project, the Wikipedia has now kicked off a far-ranging process
of "Wikification" that may easily spread over major parts of the
Internet subsystem by giving rise to thousands of knowledge accumulation
projects united by the use of the same standardized Wiki software as well as by dense mutual hyperlinking and uninhibited content transfers (based on "free licenses").
Abid Ullah Jan 2006a Wikipedia: A tool for expediting
the clash of religions. Media Monitors Network, Febr. 22, 2006.
Abid Ullah Jan 2006b Wikipedia: Good Intentions, Horrible Consequences. Al-Jazeerah, Febr. 27, 2006.
Allen, C. 2005 Future topics. Life with Alacrity. http://www.lifewithalacrity.com/2005/04/future_topics.html
Anderson, Chris 2004 The Long Tail. Wired Magazine, Issue 12.10. http://www.wired.com/wired/archive/12.10/tail.html
Barnett, Cynthia 2005 Wiki Mania. Florida Trend, pp. 1, 9.
Bauwens, Michael 2005 P2P and Human Evolution: Peer to peer as the premise of a new mode of civilization. http://www.networkcultures.org/weblog/archives/P2P_essay.pdf
Benkler, Yochai 2006 The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, New Haven and London.
Berger, P. / Luckmann, T. 1967 The Social Construction of Reality: A Treatise in the Sociology of Knowledge. Doubleday, Garden City, NY.
Berinstein, Paula 2006 Wikipedia and Britannica: The Kid's All Right (And So's the Old Man). Information Today, March 2006.
Blau, P. M. 1994 Inequality and Heterogeneity: A Primitive Theory of Social Structure. Free Press, New York.
Brandt, Daniel 2006 Wikipedia's accountability problem. Press Action, Jan. 21, 2006.
Campbell, Donald T. 1965 Variation and Selective Retention in Socio-Cultural Evolution. In: Barringer, Herbert R. et al. (eds.) Social Change in Developing Areas. Schenkman Publishing Company, Cambridge, Mass., 19-48.
Ciffolilli, Andrea 2003 Phantom authority, self-selective recruitment and retention of members in virtual communities: The case of Wikipedia. First Monday, Volume 8, Number 12. http://firstmonday.org/issues/issue8_12/ciffolilli/index.html
Dawkins, Richard 1993 Viruses of the Mind. In: Dahlbom, Bo (ed.) Dennett and His Critics: Demystifying Mind. Blackwell, Cambridge, Mass.
Dorroh, Jennifer 2005 Wiki: don't lose that number. American Journalism Review, August/September. http://www.ajr.org/Article.asp?id=3947
Gallupe, R. / McKeen, J. 1990 Enhancing computer-mediated communication: An experimental investigation into the use of a group decision support system for face-to-face versus remote meetings. Information and Management, 18, pp. 1-13.
Geser, Hans 2002 Toward a (Meta-)Sociology of the Digital Sphere. Zuerich. http://socio.ch/intcom/t_hgeser13.htm
Greenstein, Shane / Devereux, Michelle 2006 Wikipedia in the Spotlight. Kellogg School of Management.
Goffman, Erving 1959 The Presentation of Self in Everyday Life. Doubleday, Garden City, New York.
Kelley, Jeffrey 2005 Wikipedia is always a work in progress. Richmond Times-Dispatch, Sept. 3.
Kennedy, James / Eberhart, Russell C. 2001 Swarm Intelligence. Morgan Kaufmann.
Kerr, Elaine B. / Hiltz, Starr Roxanne 1982 Computer-Mediated Communication Systems: Status and Evaluation. Academic Press, New York.
Klein, Naomi 2000 The Vision Thing. The Nation, July 10, 2000.
Kleinz, Torsten 2006 Fünf Herausforderungen für die Wikipedia. Telepolis, Jan. 5, 2006. http://www.heise.de/tp/r4/artikel/21/21787/1.html
Luhmann, Niklas 1968 Legitimation durch Verfahren. Westdeutscher Verlag, Opladen.
Mehegan, David 2006 Bias, sabotage haunt Wikipedia's free world. Boston Globe, February 12, 2006.
Segal, David 2006 Look me up under "Missing Link". Washington Post, Dec. 3, 2006.
Teece, D. 1988 Technological change and economic theory. Pinter, London.
Erving 1959 The Presentation of Self in Everyday Life. Doubleday: Garden
City, New York. Kelley,
Jeffrey, 2005 Wikipedia is always a work in progress. Richmond Times-Dispatch
Sept. 3. Kennedy,
James, Eberhart, Russell C. 2001 Swarm Intelligence. Morgan Kaufmann Klein,
Naomi 2000 The Vision Thing. In: The Nation, July 10, 2000. Luhmann,
Niklas 1968 Legitimation durch Verfahren. Westdeutscher Verlag, Opladen Mehegan,
David 2006 Bias, sabotage haunt Wikipedia's free world. Boston Globe,
February 12, 2006 Segal,
David 2006 Look me up under "Missing Link". Washington Post Dec 3,
2006. Teece,
Notes

[2] http://en.wikipedia.org/wiki/Wikipedia:Million_pool#March_2005
[3] http://radar.oreilly.com/archives/2006/03/bionic_software_1.html
[4] http://www.nielsen-netratings.com/pr/PR_060810.PDF
[5] For extensive discussions of the Long Tail concept, see the blog http://longtail.typepad.com/the_long_tail/
[8] http://en.wikipedia.org/wiki/WP:NOT
[10] http://yro.slashdot.org/article.pl?sid=06/07/10/2224223
[11] http://avc.blogs.com/a_vc/2006/05/comscore_world_.html
[12] http://www.spiegel.de/netzwelt/netzkultur/0,1518,429099,00.html
[13] See for instance Lanier 2006.
[14] In comparison, there were only 26,500 hits for the phrase "According to the Encyclopaedia Britannica" (in Jan. 2006).
[15] http://c2.com/cgi/wiki?ThreadMode
[16] http://c2.com/cgi/wiki?DocumentMode
[17] Wikipedia: Resolving Disputes. http://en.wikipedia.org/wiki/Wikipedia:Resolving_disputes
[18] Following the metaphor a step further, it could be maintained that, in contrast to Fordist production systems, no individual "self-estrangement" (in the Marxian sense) is created, because
[19] See: Encyclopaedia Britannica 2004; entry "encyclopedia".
[20] http://stats.wikimedia.org/EN/TablesDatabaseWords.htm
[21] http://en.wikipedia.org/wiki/Wikipedia:Multilingual_ranking_December_2006
[22] In Taiwan, this competition stems particularly from the "Encyclopaedia of Taiwan" (http://taipedia.cca.gov.tw/), which has also included Wiki features since 2005. However, it is a highly nationalistic endeavour, because it covers exclusively domestic topics and editing access is restricted to citizens of Taiwan.
[23] http://en.wikipedia.org/wiki/Wikipedia:Awareness_statistics
[24] This concept of "connectness" is extensively discussed in Blau 1994.
[25] Cited in: Greenstein/Devereux 2006.
[26] http://en.wikipedia.org/wiki/Wikipedia:Wikipediholic
[27] http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias
[28] http://en.wikipedia.org/wiki/WP:AGF
[29] In fact, while large numbers of active collaborators identify themselves as graduate students, rather few of them identify as professors (Read 2006).
[30] http://www.theregister.co.uk/2005/10/24/wikipedia_letters/
[31] "Erneut scheitert Buchprojekt mit Wikipedia" ["Book project with Wikipedia fails again"]. Sueddeutsche Zeitung, 23.12.2006. http://www.sueddeutsche.de/computer/artikel/152/84068/article.html
[32] Encyclopaedia Britannica 2004, article on "Encyclopedia".
[33] http://en.wikipedia.org/wiki/The_Long_Tail
[34] http://enciclopedia.us.es/index.php/Enciclopedia_Libre_Universal_en_Espa%F1ol
[35] http://www.wikinfo.org/wiki.php?
[36] In the Corsican WP, for instance, many entries contain popular proverbs (e.g. the pages dedicated to each month).
[37] "Power structure". http://meta.wikimedia.org/wiki/Power_structure
[38] See: Wikipedia: Machtstruktur. http://de.wikipedia.org/wiki/Wikipedia:Machtstruktur
[39] For a discussion of this term, see for instance Schoonhoven 1981.
[40] http://en.wikipedia.org/wiki/Wikipedia:Office_Actions
[41] Critical Views of Wikipedia. http://www.wikinfo.org/wiki.php?title=Critical_views_of_Wikipedia
[42] Wikinfo: Critical Views about Wikipedia.
[43] For the meaning of this term, see Klein 2000.
[44] http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2005-01-10/Features
[46] The Wikipedia Community.
[47] Encyclopaedia Britannica 2004; entry "Encyclopedia".
[48] The first edition (1768–71) was replaced by an essentially new and enlarged second edition in 1777–84, while the ninth edition (1875–89) remained in print until 1910.
[49] http://en.wikipedia.org/wiki/2006_Israel-Lebanon_conflict
[50] http://en.wikipedia.org/wiki/Execution_of_Saddam_Hussein
[51] Wikipedia: No Original Research. http://en.wikipedia.org/wiki/Wikipedia:No_original_research
[52] http://encyclopodia.sourceforge.net/en/index.html
[53] http://infodisiac.com/Wikipedia/index.html
[54] http://www.wapipedia.org/wikipedia/mobiledefault.aspx
[55] Encyclopaedia Britannica 2004; article "Encyclopedia".
[56] http://en.wikipedia.org/wiki/Talk:Right_Wing_Authoritarianism
[57] Wikipedia: Three-revert rule. http://en.wikipedia.org/wiki/WP:3RR
[58] Wikipedia: Protection policy. http://en.wikipedia.org/wiki/Wikipedia:Protection_policy
[60] http://dv.wikipedia.org/wiki/Jimmy_Wales
[61] http://en.wikipedia.org/wiki/Prop%C3%A6dia
[62] http://en.wikipedia.org/wiki/Folksonomy
[63] http://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view
[64] http://en.wikinews.org/wiki/Wikinews:Neutral_point_of_view
[65] Wales, Jimmy: "Neutral Point of View" (2001). http://web.archive.org/web/20010416035757/http://www.wikipedia.com/wiki/NeutralPointOfView
[66] Sanger, Larry: Neutral point of view - draft (20 Dec. 2001).
[67] Sanger, Larry: Neutral point of view - draft (20 Dec. 2001).
[68] http://en.wikipedia.org/wiki/Historical_persecution_by_Christians
[69] http://en.wikipedia.org/wiki/Historical_persecution_by_Muslims
[70] http://en.wikipedia.org/wiki/Ancient_historical_persecution_by_Jews
[71] For a clarification of this term, consult Dawkins 1993 and Lynch 1998.
[72] Comment of User:Rcpaterson in: Wikipedia: Expert retention. http://en.wikipedia.org/wiki/Wikipedia:Expert_Retention
[73] User:Nikodemos/Asymmetric controversy.
[74] Cited in Abid Ullah Jan 2006b.
[75] http://en.wikipedia.org/w/index.php?title=New_anti-Semitism&diff=50525733&oldid=50525704
[76] http://en.wikipedia.org/w/index.php?title=Talk:New_anti-Semitism&diff=51209794&oldid=51207003
[77] http://en.wikipedia.org/wiki/Israel_apartheid
[78] http://en.wikipedia.org/wiki/Wikipedia:Notability_%28people%29
[79] http://mywikibiz.com/ordernow.html
[80] http://www.ohio.com/mld/ohio/news/15133629.htm
[81] Wikipedia:Wikipedia Signpost/2005-05-23/In the news
[83] http://en.wikipedia.org/wiki/WP:NOT
[84] http://www.wikia.com/index.php/Wikicities
[85] http://beijing.wikia.com/wiki/Main_Page
[86] http://psychology.wikia.com/wiki/Main_Page
[87] http://www.openserving.com/