The Net Generation is a popular term used to describe children born after the development of the Internet, a publicly available, global communications network. It was first used in 1998 by Donald Tapscott, a social commentator observing how young people were using new Internet technologies such as the World Wide Web with apparent ease and confidence. When the
term first appeared it was quickly circulated via the Internet, and the attributes assigned to the Net Generation were rapidly adopted by the popular media. Tapscott's book Growing Up Digital: The Rise of the Net Generation (1998) is written in a conversational style that is easily accessible to the general public. His commentary was timely, appearing as the Internet's popularity increased seemingly overnight and the network began to become part of mainstream society in homes and in the workplace. Understanding the background of the Internet helps to place these claims in context.

By 1984 the Domain Name System (DNS) had been introduced. DNS is a distributed Internet directory service used mainly to translate domain names into network addresses and to control the delivery of email (Salamon, 2008). Developments like these opened up the Internet to other universities and made robust and widespread connectivity possible. By 1985 the first registered domain names began appearing, and the basic structure and nomenclature we associate with the Internet today were in place. William Gibson's cyberpunk classic Neuromancer was also published in 1984 and introduced the idea of cyberspace, a virtual reality of the future, by alluding to contemporary developments in technology (Gibson, 1984).
As has often happened in the past, terms used in popular science fiction literature were adopted by researchers and users working with these emerging technologies (Technovelgy, 2012). Much later, terms such as cyberspace, virtual reality and virtual worlds would be picked up by the media and become part of popular culture.

By 1984 the first microcomputers were also appearing in schools. While initial forays into the use of technology in education occurred in the early 1970s, cost, poorly designed software, a lack of support from the teaching profession, the complicated systems required to manage these projects, and a lack of recognition for innovators meant that computers in classrooms were the exception (Charp, 1997; Combes, 2005a). This changed in the mid-1980s, and by the end of this decade
computers in the workplace and education
were becoming more commonplace. The convergence of these developments perhaps explains the 1984 cut-off date between the X Generation and the Net Generation often used by social commentators. Tapscott’s original work was also based on observations of how teenagers who were born around this period were using the Internet. By the end of the 1980s
and during the early 1990s, members of the academic community were developing online communities using Telnet and Usenet, early adopters were conducting collaborative email projects in schools, and tertiary institutions were experimenting with student-led discussions and bulletin boards as the first virtual campuses appeared (Ring & Watson, 1995; Anderson,
Clayden, Combes, Ring & Williams, 2005; Combes & Valli, 2007). However, use of the Internet by non-specialists was still in its infancy. The cost of the technology and connection fees were still prohibitive for most people and using the Internet required a certain level of technical knowledge and skill. Up until the end of the 1980s development and use of the
Internet occurred mainly behind
the closed doors of the military and in universities. It was not readily available to the general public.

A series of major developments in technology occurred in the early 1990s which would have far-reaching effects on the global information, political, social and economic landscapes. These events also exacerbated the perception among the general public that the rate of technological development was outstripping, and continues to outstrip, the ordinary person's capacity to cope with it. In the early 1990s the user-friendly graphical user interface (GUI), pioneered by Apple, became part of the Microsoft Windows operating system, and through clever marketing during the 1980s Microsoft's operating system had become the standard for IBM computers in business organisations (The Linux Information Project, 2006). During
1990–1993 Tim Berners-Lee developed and refined the concept and technologies that would become the World Wide Web (WWW). Berners-Lee was motivated to produce a single information space that would be universal and consistent. It would be a system that would ‘not constrain the user’ but one that would lead to an ‘enormous, unbounded world’ (Berners-Lee
& Fischetti, 1999, pp. 36-37). The early Internet was closely aligned with academic research institutions, required some technical knowledge and worked on a complex arrangement of protocols, trust and a sense of shared community (Coyne, 2001). This philosophy was carried over to the Web by its developers, and sharing and collaboration have remained two of the principal values that underpin both the core technologies and the philosophy of the Internet.
The Web, with its easily accessible graphical interface, however, would be an equaliser and provide access to information for everyone, a perception that still has resonance with the general public, politicians and educators.

By 1994 the first commercial browsers appeared (Living Internet, 2012), alongside a global computer network that utilised the existing telephone network to provide worldwide connectivity and graphics-based interfaces that were
more user-friendly. The cost of computer hardware and software had also been falling steadily, making personal computing at home a real possibility for the average citizen for the first time. 1995 was the breakout year for the Internet, when the connection of the large online service populations to the Web made it known throughout the world. After a lot of technical and popular press covered use of the Web in university and corporate


