Net 11 ~ Assignment #1

Annotated bank of resources
~
being an advanced Internet user

Module 1

Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Lawrence G. Roberts, Stephen Wolff (23 Jan 1999). A Brief History of the Internet

Retrieved 20 January 2008 from http://arxiv.org/html/cs/9901011

While a nine-year-old document on the Internet is usually out of date, this paper, owing to its topic and the credibility and prestige of the authors, is one of the most comprehensive summaries available. The authors are academics and lead researchers, hands-on participants in the development of the Internet's fundamental technologies.

This paper delivers a precise summary of the technology and the basic sociological and commercial drivers in the evolution of the Internet. It is a valuable starting point for communication and computing students as well as professionals wishing to gain a thorough understanding of the history of the Internet and its development.

The integration of telecommunication and computer technology revolutionised the information and communication world in an unprecedented way by enabling resource sharing on a global scale. The paper summarises the technological, sociological and commercial development over a period of nearly 50 years which led to the Internet as we knew it at the millennium. The authors examine information from the DARPA and ARPANET projects, and discuss packet switching, the TCP/IP protocol suite and the other key components that enabled an open architecture for the network. This free structure and design ensured scalable development without any centralised or global control. The paper also discusses the problem of the ever-growing number of interconnected networks and the issue of their identification, which was resolved by the Domain Name System (DNS).
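To make the naming problem concrete, here is a minimal sketch of my own (not taken from the paper) of a DNS lookup using Python's standard library; the hostname is a placeholder:

```python
import socket

# Ask the Domain Name System to translate a human-readable host name
# into the numeric address the network actually routes on.
# "example.com" is a placeholder host used purely for illustration.
hostname = "example.com"
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")
```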

~~~

Dr. Daniel Leo Lau (7 June 2005). Human-Computer Interfacing

Retrieved 22 January 2008 from http://www.engr.uky.edu/~dllau/Research/humancomputer.html

A brief overview of the relatively recent achievements of human-computer interfacing, with some suggestions on the future direction of research.

The article illustrates, with the aid of video presentations, existing technology and research lab prototypes. The investigation pays special attention to enabling people with disabilities to command computers, whether using body-mounted equipment such as special gloves (with emphasis on shortcomings, e.g. expense) or video and flash technology that recognises body movements and facial expressions.

The study doesn't mention voice commands, even though this is an advanced field within the area, with successful implementations already available in consumer products such as voice recognition in mobile phones and call-centre menus.

The article lacks references to other studies conducted in parallel. The document hasn't been updated for two and a half years, which may suggest that the research has been suspended at the University of Kentucky's Department of Electrical and Computer Engineering, where it was conducted. Despite these shortcomings, the article and the supplied presentations provide valuable information, and the videos help the reader appreciate the potential of the technology.

The article is brief and covers only the surface of the subject matter; for this reason, it is best used as a starting point.

Dr. Daniel Leo Lau CV

Module 2

Tom Van Vleck (10 Sept 2004). The History of Electronic Mail

Retrieved 21 January 2008 from http://www.multicians.org/thvv/mail-history.html

A technical explanation and overview of the evolution and development of email as a technology and a form of communication, from its 1961 inception through to the invention of the user@host formula in 1972. Within this eleven-year period, the author examines all the major challenges of electronic mail, from placing "addressed" files in shared remote folders via FTP to writing and implementing the first MAIL command on CTSS.

The paper points out the cornerstones of the development work, including dates and contributors, and provides further references for research into mail protocols from 1972 onwards, up to the settlement on the current SMTP solution. The author also mentions the first spam and the immediate recognition of the problems surrounding unsolicited email.
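As a present-day illustration of the user@host formula and the SMTP solution the paper leads up to, here is a minimal sketch of my own using Python's standard smtplib; the addresses and the server name are placeholders, not taken from the article:

```python
import smtplib
from email.message import EmailMessage

# Compose a message using the user@host addressing invented in 1972.
msg = EmailMessage()
msg["From"] = "alice@example.org"   # placeholder sender
msg["To"] = "bob@example.net"       # placeholder recipient
msg["Subject"] = "Hello via SMTP"
msg.set_content("Email still relies on the user@host formula.")

# Hand the message to an SMTP server (placeholder host) for delivery.
with smtplib.SMTP("mail.example.org") as server:
    server.send_message(msg)
```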

By mentioning solutions he later regretted and insights gained after the fact, the author demonstrates an analytical approach to his own work. As a result, the article is an unbiased, clear and critical summary of the evolution of email.

The paper provides technical insight for students inquiring about the development of the Internet mailing system. The details given concern mostly obsolete technology (as far as email is concerned), which means the article is, as titled, a historical summary.

Tom Van Vleck CV

~~~

Elisabeth Byrne (November 1994). The Formation of Relationships on Internet Relay Chat (IRC)

Retrieved 22 January 2008 from http://www.irchelp.org/irchelp/communication-research/academic/byrne-e-cyberfusion-1993/thesis1.html

A thesis submitted for the Bachelor of Arts (Honours) in Applied Communication Studies in the early days of the Internet (1994), six years after the IRC scene emerged. The paper discusses the similarities and differences between human relationships and their formation via computer-mediated communication (CMC) in contrast to face-to-face (FTF) interactions. The author spent nearly a whole year researching, interviewing and logging IRC users and activity, and has been an active participant in the IRC community herself. It was inevitable that the writer would fully integrate into the culture she researched, and the article communicates a genuine interest in, and respect for, the medium. The research is comprehensive and the author's approach is well thought through, open and impartial.

While IRC has been one of the revolutionary applications of synchronous CMC, the 14 years since this article was published have seen newer applications appear with more user-friendly interfaces, designed to accommodate the less computer- and Internet-savvy audience of the wider world. These applications have succeeded over IRC and have moved to further levels (chat + VoIP = Skype); however, the way people communicate via IRC remains the same. IRC may not be in the headlines nowadays, but it is still a very useful method of communication.

The thesis is a helpful resource and an enjoyable read, particularly if approached from the fields of psychology, anthropology, communication or Internet science.

Module 3

Bill Stewart (7 January 2000). Hypertext Markup Language (HTML)

Retrieved 23 January 2008 from http://www.livinginternet.com/w/ww_html.htm

HTML is the simplest and most widespread (technical) language on the Internet. Its purpose is the structuring of data, and its aim is to provide a cross-platform data architecture for the sake of accessibility. The combination of XML and HTML in the 1998 draft specification of XHTML 1.0 resulted in the basic web language being enhanced with additional data management capabilities. The most commonly known technology utilising XML is the RSS feed.
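As a small illustration of the data management role of XML mentioned above, the sketch below (my own example) reads an RSS feed with Python's standard library; the feed URL is a placeholder and the element names follow the common RSS 2.0 layout:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Fetch an RSS feed (placeholder URL); RSS is plain XML,
# so a generic XML parser is all that is needed to read it.
url = "http://example.com/feed.xml"
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

# Print the title of every item in the feed.
for item in tree.iter("item"):
    print(item.findtext("title"))
```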

Stewart emphasises the difference between static and dynamically generated HTML. Static HTML (whether hand-coded or created in a desktop application such as Dreamweaver) is first created locally and later transferred as a finished document to the web server; this resource bank document is a typical example. By contrast, dynamically generated HTML pages (not to be confused with DHTML, which is a combination of techniques: HTML, CSS and JavaScript at the least) typically pull separate blocks of data and append them together on the fly, presenting them as a single document (a web page) at the time of the client's request. This method is a key component of today's interactive web experience, often referred to as Web 2.0.
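To make the distinction concrete, here is a minimal sketch of my own (not Stewart's) of a dynamically generated page: separate data blocks are pulled together at the time of the client's request and returned as a single HTML document. The in-memory dictionary is a hypothetical stand-in for a database:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical "data blocks" standing in for database records.
BLOCKS = {
    "/": "Welcome to the resource bank.",
    "/html": "HTML structures data for cross-platform accessibility.",
}

class DynamicPage(BaseHTTPRequestHandler):
    def do_GET(self):
        # Pull the block for the requested path and assemble the
        # HTML document on the fly, at the time of the request.
        body = BLOCKS.get(self.path, "Page not found.")
        html = f"<html><body><p>{body}</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(html.encode("utf-8"))

HTTPServer(("localhost", 8000), DynamicPage).serve_forever()
```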

Livinginternet.com is a comprehensive online source of basic information on virtually any topic related to the technology, design and history of the Internet. Its introduction to HTML is probably the most compact and digestible description I have yet read. Each topic is supplemented with external references to credible sources such as the W3C website.

Comments from significant figures in the web's history verify the work and give credit to its author. The site has also won Scientific American's award for computer science.

~~~

Tim Berners-Lee (27 October 2006). Reinventing HTML

Retrieved 22 January 2008 from http://dig.csail.mit.edu/breadcrumbs/node/166

The article from "the father of the WWW" discusses the difficulties of web standards creation, which first and foremost lie with the basic language of the web, the Hypertext Markup Language. Nearly a decade after the first draft of the XHTML 1.0 specification was published, the discussion now divides the web-publishing public and the professionals between two proposed standards, HTML 5 and XHTML 2. The problem with creating double standards is that they may undermine the original concept of accessibility, especially when all efforts are dedicated to that exact purpose. The difficulty lies in the fact that while new standards could provide hack-free solutions for developers, the system must at the same time remain backwards compatible to preserve existing content. This is made even more complex by the different interpretations applied by the different browser types. Berners-Lee collects and presents the opinions of all parties: developers, browser makers and end users.
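One way to see the tension between strict new standards and tolerant legacy content is the following sketch of my own: markup with an unclosed tag passes a lenient, browser-style HTML parser but is rejected by an XML parser of the kind XHTML demands. The markup string is an invented example:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Legacy-style markup: the <br> tag is never closed.
legacy = "<html><body>Hello<br>world</body></html>"

# A lenient, browser-style HTML parser tolerates it silently.
HTMLParser().feed(legacy)
print("HTML parser: accepted")

# An XML parser, as XHTML well-formedness requires, rejects it.
try:
    ET.fromstring(legacy)
except ET.ParseError as err:
    print(f"XML parser: rejected ({err})")
```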

While it's not explicitly communicated, the article's tone conveys the frustration and exasperation of the author. It is apparent that the task at hand is incredibly difficult due to the pressure exerted by large corporations, the vast general public and very heated freedom activists (the open-source community), each party trying to prioritise its own interests.

The article was published on Tim Berners-Lee's blog at the MIT CSAIL (Computer Science and Artificial Intelligence Laboratory) domain. The article may be of interest to anyone who has ever published, or will ever publish, documents on the Internet. For further information on the author, visit Tim Berners-Lee's profile page.

Module 4

Yochai Benkler (Dec 2002). Coase's Penguin, or, Linux and The Nature of the Firm

Retrieved 24 January 2008 from http://yalelawjournal.org/images/pdfs/354.pdf

This paper appears in a number of different versions on the Internet. The version referred to herein is the final version published in The Yale Law Journal, Vol. 112, Issue 3, December 2002: 369-446, which I believe to be the original and most reliable source.

The paper critically analyses the emergence of the Open Source Software movement. Loosely coordinated by the FSF (Free Software Foundation) and protected by the GNU General Public License, these projects are still considered an economic miracle: there is no market pressure or management order distributing tasks among the participants, as every task is picked up by volunteers. Yet this method has delivered to the world and the audience of the WWW such products as the Apache server package and the Linux operating system, to mention only a sample of the movement's flagship products. It is in fact the Internet that made it possible for such a large number of people to work together and coordinate their work. There are tens or hundreds of thousands of participants at various levels, hence open-source projects often being referred to as commons-based peer production.

The article is a detailed academic resource which explores and explains the motivation, situation and economic force of the OSS movement and its position in the global market at the time of its publication.

At the time of publishing, the author held the position of Professor of Law at the New York University School of Law. For further details, see his CV.

~~~

Peter Lyman and Hal R. Varian (2003). How Much Information? 2003

Retrieved 23 January 2008 from http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/printable_report.pdf

A study produced by faculty and students at the University of California, Berkeley. It was conducted in 2000 and repeated in 2003 to evaluate the volume of information produced throughout our history compared with the volume produced, on a daily basis, at the present time. The paper presents some surprising numbers, such as the fact that the daily volume of data transferred by email is hundreds of times greater than the volume of information available in a university research library.

The study collects information on printed and electronic media (e.g. radio, television, the Internet) and also takes into consideration the volume of storage media produced (such as hard disks and CDs). The authors also try to assess the volume of information available, but mostly undetected, on the Internet. This data is largely inaccessible to search engines, often because it is stored as segments in databases and only assembled on request from a client application. The article notes that 30% of a random sample of web site index pages contained the term "search", suggesting complex (often dynamically assembled and often not indexable) data structures in the background.
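The kind of check the authors describe can be sketched in a few lines; this is my own rough illustration (the URL is a placeholder, and the real study sampled many index pages at random):

```python
import urllib.request

# Fetch a site's index page; "example.com" is a placeholder.
url = "http://example.com/"
with urllib.request.urlopen(url) as response:
    page = response.read().decode("utf-8", errors="replace")

# A "search" feature hints at database-backed content that is
# assembled on request and thus invisible to search engine crawlers.
if "search" in page.lower():
    print("Index page mentions 'search': deep content likely behind it.")
else:
    print("No 'search' term found on the index page.")
```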

A study noteworthy for students and professionals alike, as it not only provides overwhelming numbers in kilobytes and terabytes, but also presents a number of approaches to assessing this data.

The authors gained information and data from a wide variety of reliable sources and provide a substantial list of references for further research.

For further information on the authors, visit Peter Lyman's profile page and Hal Varian's profile page.

INFO

This page was submitted to Curtin University of Technology as an assignment on 26 January 2008 at 3:50 am AEST.

For further information, please contact the author Peter Borbely.
