31 July 2007

Digital Inequality

Introduction

This paper examines dialogue in the academic arena and dialogue in the public arena, and how they are being shaped by the digital age represented by the Web. One of the major areas of concern is ‘digital inequality’, and I will examine some remedies to the ills of operating academically and publicly on the Internet, including the implications for Australian culture.

Academic dialogue: researching, writing, publishing

Scholarly communication

In an article titled ‘Rethinking Scholarly Communication’, the authors argue that there is ‘growing dissatisfaction with the established scholarly communication system’ (Van de Sompel et al. 2004, p. 1). They highlight concerns such as journal prices, copyright, the time lag between research and publication, and restrictions on what is published and how it is published.

The authors contend that the way scholarly research is conducted has changed, with improvements in digital technologies enabling ‘research practices that are highly collaborative, network-based, and data-intensive’ (Van de Sompel et al. 2004, p. 2). Established systems for publication have not kept pace with these changing research practices, and neither have scholarly communication systems. While electronic publishing has partly displaced paper-based publication, it has not reached the level of maturity demanded by changing research practices. Systems face problems of interoperability, workflow, service sharing, and information modelling (Van de Sompel et al. 2004, p. 2). Van de Sompel et al. assert that the scholarly communication system should be integrated with scholarly work and should not be a tacked-on afterthought.

Van de Sompel et al. point out that the notion of a ‘unit of communication’ needs revision: away from the journal publication, which is dominated by textual information and handles non-textual materials poorly, towards a new perspective that includes entities like datasets, simulations, software, dynamic representations, and complex documents. These could be considered as representations in their own right or be aggregated into a related combination to function as a new unit. They argue for early registration of units such as preprints, raw datasets, or prototype simulations, allowing for ‘collaborative network endeavours’ and ‘speed of discovery’.

Roosendaal and Geurts (1997) have identified functions of scholarly communication that are independent of the system used:

  • Registration allows a claim of precedence for a finding.
  • Certification establishes the validity of a scholarly claim.
  • Awareness allows participants to be aware of the latest claims and findings.
  • Archiving preserves the scholarly record.
  • Rewarding rewards individuals for their performance in the communication system, based on metrics derived from the system.

Recent digital implementations of this functionality within such a scholarly communication system have enabled more flexibility in the system compared with paper-based implementations, but they are in their infancy. The institutional repository movement is ‘leading to the creation of many new hubs of scholarly content’ - using systems such as DSpace, Fedora, and E-Prints (Van de Sompel et al. 2004). When combined with Grid or other network technologies, heterogeneous repositories become openly accessible and these new units of communication can help with the formation of ‘value chains’ such as ‘quality control (certification), discovery (awareness), and archiving’.
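
The interoperability these repositories offer typically rests on metadata-harvesting protocols such as OAI-PMH, which DSpace, Fedora, and E-Prints all expose. As a minimal sketch, assuming a hypothetical repository endpoint, the following Python fragment requests Dublin Core records and lists their titles:

    # Minimal sketch only: harvesting Dublin Core metadata from a
    # repository's OAI-PMH interface. The endpoint URL is hypothetical;
    # substitute a real repository address.
    from urllib.parse import urlencode
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    BASE_URL = "https://repository.example.edu/oai"  # hypothetical endpoint

    def list_titles(base_url=BASE_URL):
        params = urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
        with urlopen(f"{base_url}?{params}") as response:
            tree = ET.parse(response)
        # Dublin Core titles sit inside each harvested record
        return [el.text for el in tree.iter() if el.tag.endswith("}title")]

    if __name__ == "__main__":
        for title in list_titles():
            print(title)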

Already we are seeing the emergence of many distributed hubs, each performing a specific scholarly function. A single unit of scholarly communication may proceed simultaneously through different value chains across the network. In this situation, it is argued that a scholarly communication system should consist of interconnected services ‘as if they were a part of a global scholarly communication workflow system’ (Van de Sompel et al. 2004). Such a system would permit the chaining of Roosendaal and Geurts’s functions in a flexible way.

Flexible combinations would facilitate innovation, adaptability, and democratisation (Van de Sompel et al. 2004). With such a dynamic scholarly communication network, the authors argue the need for a way to record the dynamics of scholarship.

They provide an example of a scholarly manuscript uploaded to a system as a pre-print, which is then peer reviewed and published in an electronic journal. The publisher enters a metadata record which enables scholars to discover, read and cite the paper. Finally, a government or agency conducts an audit process in order to reward the scholarship demonstrated by this process. This process could also be tracked and recorded.
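
One way to picture how such dynamics could be recorded is to attach to each unit of communication the value-chain events it passes through. The sketch below is purely illustrative: the identifier and event names are invented, and it models Roosendaal and Geurts’s functions rather than any real repository API.

    # Illustrative sketch only: a unit of scholarly communication that
    # accumulates value-chain events (registration, certification,
    # awareness, rewarding) as it moves through networked services.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class Event:
        function: str                      # e.g. "registration"
        actor: str                         # e.g. "preprint server"
        timestamp: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class Unit:
        identifier: str                    # invented identifier scheme
        kind: str                          # "manuscript", "dataset", ...
        history: list = field(default_factory=list)

        def record(self, function: str, actor: str) -> None:
            self.history.append(Event(function, actor))

    paper = Unit("hypothetical:2007/001", "manuscript")
    paper.record("registration", "preprint server")         # uploaded as a pre-print
    paper.record("certification", "e-journal peer review")  # reviewed and published
    paper.record("awareness", "publisher metadata record")  # discoverable and citable
    paper.record("rewarding", "government research audit")  # scholarship rewarded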

What authors want

In this digital age, how do scholarly authors want to publish? We can get some insight from a report written by Rowlands, Nicholas & Huntington (2004). Their report documents the findings of a survey of nearly 4,000 researchers from 97 countries with regard to scholarly communication in the digital environment. It was commissioned in response to what they see as a crisis in journal publication, brought about by the failure of institutional purchasing power to keep pace with the number of new journals emerging as the pursuit of knowledge becomes more specialised.

They point out that what academic authors want has not changed much over time (Rowlands, Nicholas & Huntington 2004, p. 1):

They want the ability to target a very specific group of key readers, narrowcasting to those working on similar problems, and they want the imprimatur of quality and integrity that a good peer-reviewed, high impact title can offer, together with reasonable levels of publisher service.

These seem very traditional views and values, so what has changed? The means of publishing have also proliferated, but awareness of the issues surrounding alternative forms of publishing seems confined to the publishing and library communities rather than the research community.

Here is a snapshot of some of the main attitudes to emerging publishing models that authors reported in the Rowlands study:

Self-Publishing: Defined as making ‘some of their materials available on their home page or departmental website’; 32% of researchers had self-published in this way, and 53% said they would consider it in the future. Researchers under 35 were more likely to self-publish (Rowlands, Nicholas & Huntington 2004, p. 19).

Institutional Repositories: Defined as a ‘collection of scholarly materials in digital form that is managed – at an institutional level – by a research community, typically a university or sectoral grouping’ (Rowlands, Nicholas & Huntington 2004, p. 20). 21% had deposited into institutional repositories, and, once again, younger respondents were more likely to do so (28%). There were mixed attitudes to publishing in purely electronic forms (Rowlands, Nicholas & Huntington 2004, pp. 20-21).

Open Access Publishing: The architects of this study used a narrow definition of open access publishing. They referred to journals that ‘use a funding model that does not charge readers or their institutions for access. In an open access journal, readers are able to read, download, copy, distribute, and print papers and other materials freely from the Web. The costs of producing this type of journal are met by charging authors for publishing services provided by a third party’ (Rowlands, Nicholas & Huntington 2004, p. 21). They cite the example of BioMed Central as a commercial open access publisher. The survey found that debates around publishing business models are of little concern to researchers: “more than a third (34%) admitted they know ‘nothing at all’ about open access journals while 48% said they knew ‘little’ ” (Rowlands, Nicholas & Huntington 2004, p. 22).

While very few authors had experience in open access publishing, the idea of reader open access was appealing, and ‘respondents seem to associate the idea of open access with reasonably high-quality, well-indexed materials that are free at the point of use’ (Rowlands, Nicholas & Huntington 2004, p. 23). However, authors did not associate open access with a business model where the authors bear the cost. There were no significant concerns regarding the preservation of scholarly records in an electronic environment.

Many authors felt that open access systems would lead to easier access to articles, and this view was especially prevalent among younger authors as well as authors from Africa and Asia. Negative views were that fewer papers would be rejected, papers would be longer, and quality might decline.

Almost half said they wouldn’t accept a business model that involved authors contributing to the costs of publication.

Public dialogue: online communication and Web 2.0

What about writing in the public domain? Traditionally, criticism and writing on music or the arts have been the preserve of critics in newspapers or feature writers in specialist publications such as Rolling Stone, Downbeat or Sounds Australian.

In the digital age, the development of the World Wide Web fostered the idea that the ordinary person could become both reader and writer (or publisher). eZines flourished, online interest groups were established, and artist websites allowed feedback from fans.

However, the real revolution in Web publishing has come in recent years with the development of the notion of Web 2.0. Paul Anderson has cleverly characterised these developments as the ‘tale of two Tims’ (Anderson 2007, p. 5) – Tim Berners-Lee as the ‘inventor’ of the Web, and Tim O’Reilly whose company first coined the term Web 2.0 (O’Reilly 2005).

So what is Web 2.0? While there is no one definitive answer, writers tend to talk about two basic aspects of Web 2.0. One aspect is the collection of ideas behind Web 2.0 – ways of thinking about Web 2.0. The other is the collection of technologies that support Web 2.0 modes of working.

In terms of Web 2.0 concepts, Tim O’Reilly and John Battelle summarised some key principles of Web 2.0 ‘applications’, but Anderson (2007) has slightly reconceptualised these to produce the following list of ‘key ideas’:

  1. Individual production and user-generated content.
  2. Harness the power of the crowd.
  3. Data on an epic scale.
  4. Architecture of participation.
  5. Network effects.
  6. Openness.

The people formerly known as the audience

User-generated content refers to the ‘audience’ being able to upload their own content. "Alternatives to this phrase include content self-publishing, personal publishing (Downes 2004) and ‘self expression’." (Anderson 2007, p. 15)

Sites such as YouTube, Flickr, and Odeo not only allow users to browse images, video, or audio, but also facilitate uploading of user content. Motivation to publish in this manner can vary from fortune to fame (Anderson 2006), and several authors have advocated a healthy scepticism regarding the scale of participation: ‘Over 10 Million of the 13 Million blogs in Blogger, a major blog provider, are inactive …’ (Anderson 2007, p. 15)

While we may quibble over numbers, there is no doubt that today’s audience wants to create their own content:

A little group of Web 2.0 technologies…is placing the media creation and distribution firmly into the hands of ‘the people formerly known as the audience’ (Rosen 2006).

People power or mass hysteria?

The phenomenon that O’Reilly calls ‘harnessing collective intelligence’ and Surowiecki terms the ‘wisdom of crowds’ is something that Anderson prefers to label ‘harnessing the power of the crowd’, since the demonstration of intelligence or wisdom is difficult to evaluate when analysing crowd behaviour. But what is this phenomenon? It is the idea that by ‘asking the audience’, you’re more likely to get a correct answer than by asking a single individual.

A tool used in this kind of ‘straw polling’ is tagging: an individual tags an item of Internet content (usually a URL), categorising it or giving it some meaning. Individual tags are aggregated, and a so-called folksonomy emerges from the crowd, showing important sites, interesting ideas or popular trends.
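
The aggregation step itself is simple counting: gather every user’s tags for an item and rank them. A rough Python sketch, with invented tag data, shows the idea:

    # Rough sketch of folksonomy aggregation: individual users tag the
    # same URL, and the aggregated counts become the crowd's categories.
    from collections import Counter

    # Invented example data: each user's tags for one bookmarked URL
    user_tags = {
        "alice": ["electroacoustic", "australian-music", "research"],
        "bob":   ["australian-music", "sound-art"],
        "carol": ["research", "australian-music"],
    }

    folksonomy = Counter(tag for tags in user_tags.values() for tag in tags)

    for tag, count in folksonomy.most_common():
        print(f"{tag}: {count}")   # australian-music: 3, research: 2, ...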

Another form of collective intelligence is what Anderson (2007) and others have described as ‘crowdsourcing’. This is similar to open source research and development where a number of people in a loose online community are working on the same or a similar problem. When a solution is found, it is shared with the community. If more than one solution is found, then crowd behaviour may determine which solution is picked up, for example through voting, or by which solution is more celebrated.

Drowning in data from ‘invisible electric rain’

By now you may be starting to think: ‘Gee, we’ve digressed a long way from discussing writing, analysis, and criticism.’ But wait, there’s more, there’s … data.

All of this audience-created material produces an ever-increasing amount of data: ‘Information gently but relentlessly drizzles down on us in an invisible, impalpable electric rain’ (von Baeyer 2003, p. 3). Web 2.0 companies have sprung up to deal with large amounts of data. Companies like Google, Amazon, and eBay collect, process, analyse and sort data. They not only deal with data, but they also analyse human behaviour in the way that information is used.

Another way of accumulating data is in the form of a ‘mashup’. Mashups assemble data or media from disparate sources into a new presentation form, and there are many Web 2.0 applications that have been developed with this specific task in mind.
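
Under the hood, a mashup is essentially a join across sources. The following sketch combines two hypothetical feeds – a concert listing and a geocoding service – into one structure; the URLs and field names are invented for illustration only.

    # Illustrative mashup: merge a (hypothetical) concert feed with a
    # (hypothetical) geocoding service so events can be plotted on a map.
    import json
    from urllib.request import urlopen

    EVENTS_URL = "https://example.org/api/concerts.json"       # invented
    GEOCODE_URL = "https://example.org/api/geocode?venue={}"   # invented

    def fetch_json(url):
        with urlopen(url) as response:
            return json.load(response)

    def mashup():
        combined = []
        for event in fetch_json(EVENTS_URL):
            place = fetch_json(GEOCODE_URL.format(event["venue"]))
            combined.append({**event, "lat": place["lat"], "lon": place["lon"]})
        return combined   # ready to hand to a mapping widget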

These applications and behaviours provoke a serious question: who owns what data? And since the audience are the producers, it is their data that Google, Amazon, eBay and MySpace are messing about with, and making huge amounts of money from, in the process. The tension between privacy and openness is of immediate concern in the Web 2.0 world. Issues of access and dissemination will be addressed further in the ‘digital divide’ section below.

I’ll get my network to talk to your network

Much has been made of the so-called social networking applications of Web 2.0, e.g. MySpace and Facebook. Arising from the notion of user-generated content, collaboration, and an interest in common pursuits, such social networking sites rely on openness and a willingness to participate. The theory is that when a new participant joins the network and contributes, all other participants benefit. ‘Open content’ production allows for re-use within the online community, just as open source software development provided for code re-use in the pre-Web 2.0 era.

According to Anderson, the most successful Web-based services are ‘those that encourage mass participation and provide architecture (ease-of-use, handy tools, etc.) that lowers the barriers to participation’ (Anderson 2007, pp. 19-20). These sites tend to have facilities for users to set up a profile, a blog, a file repository, and some means of creating an interest group.

Wikipedia’s entry on social networking provides a listing of notable social networking websites. The following table provides a small selection that demonstrates the mass participation that is occurring in these sites (Wikipedia 2007):

Name                 | Description                     | Users
BlackPlanet.com      | African-Americans               | 16,000,000
Babbello             | Australian teenagers            | 30,000
Care2                | Green living and activism       | 6,900,000
Facebook             | European Young Adults           | 11,000,000
Hi5                  | General                         | 50,000,000
LinkedIn             | Business                        | 9,000,000
LiveJournal          | Blogging                        | 10,921,263
MySpace              | General                         | 140,000,000
Playahead            | Swedish teenagers               | 530,000
Vampire Freaks       | Gothic industrial culture       | 766,000
Windows Live Spaces  | Blogging (formerly MSN Spaces)  | 30,000,000

Digital divide – Digital inequality

The above discussions on academic dialogue and public dialogue assume that contributors not only have access to the Internet, but also have access to the knowledge and the tools that allow them to participate. But even if you can contribute, who is going to read, view, or listen to your creations?

Hargittai has extended the notion of the ‘digital divide’ into ‘digital inequality’, and states that previous academic researchers have characterised the digital divide in very binary terms: ‘someone either has access to the medium or does not, someone either uses the Internet or does not’ (Hargittai 2003, p. 3). With respect to the ‘digital divide’ in the United States, Hargittai shows that while Internet use increased from 12.77% of the population in 1994 to 54.66% in 2001, it spread at varying rates across different parts of the population. Differentials based on race, ethnicity, income, education, and location (urban/non-urban) were observed. Merton (1973) called this the ‘Matthew Effect’, whereby advantages are amplified over time: ‘unto every one who hath shall be given’.

Further studies highlighted that not only does connectivity need to be considered, but so too does the potential for inequality resulting from the ways in which the Internet is used. Wilson (2000) talked about four components for ‘full social access’: financial access; cognitive access; production of content access; political access. Hargittai adds to these measures of access with the following: technical means, autonomy of use, social support networks, and experience (Hargittai 2003, p. 10).

Digital inequality is also manifest on a global scale. Hargittai demonstrates that while the number of Internet users grew from 20 million in 1995 to 520 million in 2001, the proportion of use in different geographic regions varies widely. In 2001, the USA and Canada had 5.22% of the world’s population but 41.05% of Internet users. The Asian-Pacific region had 60.95% of the world’s population but only 25.76% of Internet users, and Africa only had 0.76% of Internet users although it had 12.65% of the world’s population.
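
Those proportions become vivid if each region’s share of Internet users is divided by its share of world population; a ratio above 1 indicates over-representation. A quick calculation with the 2001 figures quoted above:

    # Over/under-representation of regions among Internet users in 2001,
    # computed from the percentages quoted above (Hargittai 2003).
    regions = {
        "USA and Canada": {"population_pct": 5.22, "users_pct": 41.05},
        "Asia-Pacific":   {"population_pct": 60.95, "users_pct": 25.76},
        "Africa":         {"population_pct": 12.65, "users_pct": 0.76},
    }

    for name, share in regions.items():
        ratio = share["users_pct"] / share["population_pct"]
        print(f"{name}: {ratio:.2f}x their population share")
    # USA and Canada: 7.86x, Asia-Pacific: 0.42x, Africa: 0.06x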

There are also other forces that shape how the Internet is used. While there are billions of webpages accessible to the ‘public’, and the public is theoretically able to contribute to this vast resource, Hargittai refers to ‘attention scarcity’ as a phenomenon whereby a content creator needs to attract the attention of users to their content. Creators have come to rely on ‘gatekeepers’ to ‘channel their material toward users’ and users rely on services to find content (Hargittai 2003, p. 17). In particular, users rely on search engines to get access to an ever-increasing amount of Web material.

Search engines are not neutral in determining which sites they index and which ones they exclude. A whole industry has grown up in which commercial interests guide content selection: ‘The concern is that search engines that are guided by profit motives will point people away from the most relevant and the best-quality sites in favour of those that have paid the highest bids for placement on the results page’ (Hargittai 2003, p. 18). Hargittai cites studies showing that the vast majority of users (85%) view only the first page of search results, and other studies have shown that this behaviour has become more pronounced over time. So inequality exists in terms of content production and dissemination as well as ‘use’.

The work of Hargittai and others debunks the notion that the Web is some kind of force for democratisation.

Summary and recommendations

Academic dialogue

In summary we can say for academic dialogue:

  • Academics need to ‘publish’ aggregates of both text and non-text data types quickly.
  • There needs to be more support for Institutional Repositories for publication and archiving of research.
  • There is a role for professional organisations in the ‘value chain’ of certification for quality and integrity.
  • But what about discovery and awareness of research outputs?
  • Academics want to target specific groups of readers, viewers, etc. Perhaps the function of an online journal is similar to that of an aggregator.
  • There is a lot of support for reader open access, but who pays is a different story.

Public dialogue

For public dialogue we can say:

  • It is facilitated by user-generated content sites.
  • Public dialogue is a mass phenomenon.
  • It is driven by what is popular and what appeals to a wide audience.
  • It encourages mass participation and is driven by commercial models.

Digital inequality

The problems of digital inequality are principally:

  • The most serious digital inequality is lack of access, but we need to think of lack of access in the following forms: financial, cognitive, production of content, and political.
  • Barriers to effective use are: technical, autonomy of use, social support, and experience.
  • Mass participation creates problems of ‘attention scarcity’ for individuals’ creations.

Remedies

In terms of the Australian cultural debate and dissemination of discourse from the academic point of view, we need:

  • Academics committed to analysing and debating Australian music. I’m not convinced we have that commitment at this point in time.
  • A far greater investment in Institutional Repositories to facilitate publication and archiving of research outputs. At the moment the efforts of educational institutions in this country have been tokenistic at best. There needs to be a twenty-fold increase in investment in this area.
  • Professional organisations could provide research quality endorsement for digital publications, or aggregations of related research into digital journals or collections. I’m not sure which organisations could do that role for Australian music at the moment, since the few organisations we have are fractured and in disarray. There have been several attempts at documentation and dissemination of Australian music on the Web, such as Ros Bandt and Ian Mott’s Australian Sound Design Project (www.sounddesign.unimelb.edu.au) and the Mikropolyphonie online journal, but these have relied on the honorary efforts of individuals after initial grants have run out. They then struggle to be sustainable – Mikropolyphonie, for example, is currently relegated to an archive site available only through the National Library’s Pandora service.

In the public arena, the documentation and dissemination of Australian music is up against a global juggernaut. To overcome it we need:

  • To increase the participation by ordinary Australians through better access (financial), education (cognitive), specialised sites (technical tools for production), change in government policies (eg. extend broadband and make it more affordable).
  • To encourage niche sites and niche social networks. There is a role for the Australian Music Centre (www.amcoz.com.au) here. It could provide some of the social networking tools and infrastructure on its website to its members. Further government assistance must come into play here.
  • To try and overcome the ‘attention scarcity’ problem through government-funded initiatives such as targeted search engines that are not commercially driven. This might be achieved through existing infrastructures such as the National Library, but existing sites like MusicAustralia (www.musicaustralia.org) fall short in that they use an outdated ‘transmission’ model and are probably grossly under-funded.

Australia is a small country and it is up against the might of other English-speaking cultures such as those of the USA and the UK in terms of claiming a place on the world’s virtual stage – the Web. The Horizon Report tells us that: ‘New forms of scholarship, including fresh models of publication and non-traditional scholarly products, are evolving with the changing process’ (Horizon Report 2007, p. 21). We run the risk of ignoring these new possibilities through lack of resources and attention from our educational institutions, libraries, and governments.

Australia’s new music culture is at risk of drowning in the relentless drizzle of ‘an invisible, impalpable electric rain’ – global data.

References

[All websites below were accessed on 27/03/2007 except for Wikipedia.]

Anderson, C. 2006, The Long Tail: How endless choice is creating unlimited demand. Random House Business Books, London.

Anderson, P. 2007, ‘What is Web 2.0? Ideas, technologies and implications for education’. Report for JISC Technology Standards Watch, February 2007. http://www.jisc.ac.uk/whatwedo/services/services_techwatch/techwatch/techwatch_ic_reports2005_published.aspx

von Baeyer, H. 2003, Information: The New Language of Science. Weidenfeld & Nicolson, London.

Downes, S. 2004, ‘Educational Blogging’, EDUCAUSE Review 39 (5) (September/October 2004), pp. 14-26. http://www.educause.edu/pub/er/erm04/erm0450.asp?bhcp=1

Hargittai, E. 2003, ‘The Digital Divide and What to Do About It’. Pre-print of a book chapter to appear in Jones, D. New Economy Handbook. Academic Press, San Diego. http://www.eszter.com/research/c04-digitaldivide.html

Horizon Report. 2007, The New Media Consortium and EDUCAUSE Learning Initiative. http://www.educause.edu/LibraryDetailPage/666?ID=CSD4781

Merton, R. 1973, The Sociology of Science: Theoretical and Empirical Investigations. University of Chicago Press, Chicago.

O’Reilly, T. 2005, What is Web 2.0: Design Patterns and Business Models for the next generation of software. O’Reilly website, 30 September 2005. O’Reilly Media Inc. Available Online at: http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html

Rosen, J. 2006, ‘The People Formerly Known as the Audience’, PressThink, June 27. http://journalism.nyu.edu/pubzone/weblogs/pressthink/2006/06/27/ppl_frmr.html

Roosendaal, H. & Geurts, P. 1997, ‘Forces and functions in scientific communication: an analysis of interplay’. Cooperative Research Information Systems in Physics, August 31-September 4 1997, Oldenburg, Germany. http://www.physik.uni-oldenburg.de/conferences/crisp97/roosendaal.html

Rowlands, I., Nicholas, D. & Huntington, P. 2004, Scholarly communication in the digital environment: What do authors want? Centre for Information Behaviour and the Evaluation of Research, Department of Information Science, City University, London.

Van de Sompel, H., Payette, S., Erickson, J., Lagoze, C. & Warner, S. 2004, ‘Rethinking Scholarly Communication’, D-Lib Magazine 10 (9). http://www.dlib.org/dlib/september04/vandesompel/09vandesompel.html

Wikipedia. 2007, Entry for Social networking sites: http://en.wikipedia.org/wiki/Social_networking_sites [Accessed 6/02/2007]

Wilson, E. 2000, Closing the Digital Divide: An Initial Review. Briefing the President. The Internet Policy Institute, Washington, May 2000.


David Hirst is an academic and electroacoustic music composer with a PhD in auditory cognition and music composition. Formerly the Head of Music at La Trobe University, he is currently Senior Lecturer in Educational Technologies and Senior Fellow in the School of Behavioural Science at the University of Melbourne.
