Sunday, February 2, 2014

TheGuardian.com: Battle for the Internet

“The Online Copyright War: The Day the Internet Hit Back At Big Media” by Dominic Rushe

On January 18, 2012, Wikipedia went dark and Google blacked out its logo. This wasn’t a power outage but a deliberate protest against the proposed Stop Online Piracy Act, or SOPA. Introduced in October 2011, SOPA continued the fight against online copyright infringement by authorizing aggressive pursuit and prosecution of online service providers that host or promote websites featuring copyrighted material. Essentially, its purpose was to make it even harder, and less desirable, for individuals not only to gain access to but also to share any information deemed outside the public realm.

This week’s readings discussed the advent of the Internet and the World Wide Web (the Web), and how users have been adapting this medium to create faster ways to communicate and to share the plethora of information we create every day. As librarians and information specialists, we are keenly invested in providing easy access and dissemination points for this information. But as Dominic Rushe’s article reminds us, there are other organizations just as invested in denying that access and blocking online dissemination points. It is a question of a free, open Web versus one ruled by what global free culture leader Elizabeth Stark calls “a closed, copyright-protected world from before the digital age.” Similarly, Jimmy Wales, Wikipedia co-founder, believes that while copyright laws have some validity, “applying these rules to the digital age isn't going to work.” For now, SOPA has been defeated. The Internet fought back, which is to say that millions of Internet users made their opinions heard. But only time will tell, and I believe that library professionals will have a big say in which way this battle goes.

Visit theguardian.com’s Battle for the Internet series for more editorials and comments.

How Stuff Works: ARPANET

I love How Stuff Works (the show and the website), and I was curious to learn more about the Advanced Research Projects Agency (ARPA) and the interconnectedness of networks. I think it's interesting how many inventions and new ideas come about because of the government; in this case, the government wanted a way to keep distributing information in the event of a catastrophe. The first model of ARPANET was a network of only four computers, linked together with phone lines and Interface Message Processors (IMPs). The sites were chosen based on their research relationships with the government: UCLA, the Stanford Research Institute (SRI), the Culler-Fried Interactive Mathematics center at UC Santa Barbara, and the University of Utah. A toy model of that first network is sketched below.
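
Here is a small illustrative sketch in Python of that four-node network. The link layout follows the commonly reproduced December 1969 topology diagram, so treat it as an approximation rather than a definitive record:

# A toy model of the original four-node ARPANET.
# The link layout follows the commonly reproduced 1969 topology
# diagram and is illustrative, not authoritative.
arpanet = {
    "UCLA": ["SRI", "UCSB"],
    "SRI":  ["UCLA", "UCSB", "Utah"],
    "UCSB": ["UCLA", "SRI"],
    "Utah": ["SRI"],
}

def reachable(net, start):
    # Simple graph traversal: which hosts can a message reach from start?
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for neighbor in net[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(neighbor)
    return seen

print(reachable(arpanet, "UCLA"))  # every host can reach every other

Even at four nodes, the redundant links (UCLA reaches SRI directly or via UCSB) hint at the design goal: a network that keeps distributing information even when part of it fails.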

For more info: ARPANET

The Internet Under the Hood by Robert Molyneux


The Internet Under the Hood by Robert Molyneux is a solid basic introduction to the concepts behind network applications. Published in 2003 and designed for librarians, educators, and information professionals with varying levels of technology experience, this edition may now serve more of a reference purpose. As noted in a Library Resources & Technical Services book review (Swanson & Gregory, 2005), the author provides readers “with bibliographies for future reading” and succeeds in “combining information on aspects of his subject that are generally found in several different types of works into one book.” Its four sections, Overview, Technology, Applications, and Social Issues, are complemented by case studies and additional material on privacy, legal issues, and intellectual property related to Internet use.

How to Infrastructure

Chapter Summary

In the chapter “How to Infrastructure,” information specialists Susan L. Star and Geoffrey C. Bowker explore the processes by which communication infrastructures, the prime example here being the internet, are formed and maintained. Though many attributes of successful infrastructures are listed in the chapter, the main thrust of the piece is that the continuation of infrastructures depends entirely on the implementation of standards. The authors also draw attention to concerns about how information infrastructures are creating divisions between users who can access the modes of communication and those who cannot. As Star and Bowker suggest, there are always political, ethical, and social choices that come into play when an information infrastructure becomes more visible (233).


While the authors are quick to point out that the common sense definition of infrastructure, “that upon which something else rides, or works, a platform,” is a bit too simplistic, it offers a good launching point for their arguments. Infrastructures are everywhere, and the more successful ones share the commonalities of having stable bases upon which people can build, compatibility between systems, and an offer of support to a community of users (231). A helpful set of examples to consider is local gas, electric, and sewage infrastructures. Many of these were set in place long ago and require relatively little upkeep to continue serving users (237). If infrastructures are frameworks that reach beyond temporal usage, the question becomes: how do our modern communications infrastructures maintain their usefulness?


With a stable base from which to build, infrastructures only thrive when they can offer standards. The authors eloquently imagine information standards as a “shared set of handshakes among the various media they pass through” (234). This means that information systems must be able to communicate with existing programs while still leaving room for updates and advancement. To put it in a more practical context, the use of standards is what allows e-mails from Yahoo accounts to be read by Gmail, and what allows people to work on the same document in Microsoft Word 2003 and 2007 (a small sketch of such a “handshake” follows below). These standards require constant communication to ensure each infrastructure agrees with the next, and this is why Star and Bowker claim “the internet is only virtually stable.”
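
As a concrete, if simplified, illustration, the Python sketch below parses a message in the shared RFC 5322 e-mail format using the standard library. The addresses and text are hypothetical; the point is that any standards-compliant mail program, whether Yahoo’s or Gmail’s, could have produced or could read the same bytes:

from email import message_from_string
from email.policy import default

# A minimal RFC 5322 message. The addresses and content are made up;
# what matters is the shared, standardized format.
raw = (
    "From: alice@example.com\r\n"
    "To: bob@example.org\r\n"
    "Subject: Standards as handshakes\r\n"
    "\r\n"
    "Any compliant mail client can read this body.\r\n"
)

msg = message_from_string(raw, policy=default)
print(msg["Subject"])             # Standards as handshakes
print(msg.get_content().strip())  # Any compliant mail client can read this body.

Because sender and receiver honor the same standard, neither needs to know anything about the other’s software; the “handshake” is the format itself.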


To complete their investigation of what makes infrastructures effective, the authors claim that accessibility must be accounted for. When groups of users are left out of receiving the benefits of an infrastructure, it creates gaps in progress. In using the example of how lack of internet access creates sets of people who are “information rich” and “information poor,” Star and Bowker imply that open access for all could be the next information infrastructure that needs to be put into place (239).

How is this Information being used?

This chapter is part of the Handbook of New Media, first published in 2002, which explores how new technologies affect the social aspects of everyday life. The work is geared toward more scholarly analysis, and a search for articles that cite “How to Infrastructure” reveals that many of the discussions pointing to the work of Star and Bowker continue the conversation of how cultures respond to the ever more expansive information infrastructure. One example is Sonia Livingstone’s article “The Changing Nature of Audiences: From the Mass Audience to the Interactive Media User,” which explores how more personalized media choices (for example, on-demand programming) challenge the established infrastructure, and how consumers interpret changes based on their preferences. While this article was chosen somewhat at random and is not a perfect example, it continues the line of thinking Star and Bowker started by examining how accessibility plays a large role in how individuals “buy in” to a new infrastructure.

Conclusion

The most fascinating idea encountered in this chapter is that advancements in media only exist when infrastructures “allow” for such progress. The authors give examples of how products we may think of as revolutionary are only slight modifications of existing infrastructures. These include the rise of computers, which Star and Bowker attribute to the changing needs of the office infrastructure, and the rise of the health care industry and increased life expectancy, which they trace to changes in the water sanitation infrastructure (233). When viewing our media as such small parts of a larger picture, one realizes the impermanence of the “next great technological breakthrough.” But because of the argument set forth in this chapter, there also comes a realization that any good infrastructure will allow for and absorb this change as it progresses toward something new.

__________________________________________________________________________
Sources 

Susan Leigh Star and Geoffrey C. Bowker, "How to Infrastructure," in Leah Lievrouw et al., eds., Handbook of New Media (London: Sage Publications, 2006), pp. 230-245.

Sonia Livingstone, "The Changing Nature of Audiences: From the Mass Audience to the Interactive Media User," in Angharad Valdivia, ed., A Companion to Media Studies, Blackwell Companions in Cultural Studies (6) (Oxford, UK: Blackwell Publishing, 2003), pp. 337-359.

Saturday, February 1, 2014

"What Things Regulate"

“What Things Regulate” is the seventh chapter in Code: and Other Laws of Cyberspace by Lawrence Lessig. Published in 1999, the book discussed the impending regulation of cyberspace and what laws or codes would look like on the internet. This particular chapter examines the forces of law, social norms, the market, and technology/architecture, and how they constrain the choices an individual might make. Lessig discusses these four constraints using various legal, "real space" examples to illustrate what could happen in cyberspace.
This book is intended to be a call to action. Lessig argues that people need to understand how regulation works and “think beyond the narrow threat of government. The threats to liberty have never come solely from government, and the threats to liberty in cyberspace certainly will not” (86). Lessig believes it is the responsibility of citizens to guide the regulation of the internet in their favor rather than accept the "invisible hand" principle, which would allow government or commerce to write the code of cyberspace. Left to its own devices, Lessig argues, cyberspace would fall victim to direct and indirect government control.

Lawrence Lessig is currently a professor at Harvard Law School, and he still advocates for limited government regulation and the power of the citizenry. An updated version of his work was published in 2006 as Code: Version 2.0. According to the book description on Amazon.com, it is "the first reader-edited version of a popular book," as many revisions came from readers via a wiki on his website. Before the days of Code, Lessig served as a court-appointed special master in the Microsoft antitrust case in 1997. More recently, in April 2013, Lessig shared his opinions in a TED Talk, “We the People, and the Republic We Must Reclaim.” This Talk was not about internet regulation but about how political campaigns are funded by the smallest percentage of Americans and why citizens should be concerned. He continues to encourage the citizenry to take charge and make their democracy what they want, just as he encourages them to take responsibility for cyberspace in Code.
Lessig's Code was well received upon its publication. Temple University law professor David Post published an interesting review in which he discussed and countered some of Lessig's points about the detrimental effects of allowing the "invisible hand" to determine cyberspace law. Although this book was published over ten years ago, government regulation of the internet remains a relevant issue. As information professionals, we have an obligation to promote equal access to information. How does government regulation of the Internet, the largest information resource at our disposal, affect the LIS field? Should the internet be truly free of legal, norm, market, and technology constraints? Is there any need for government intervention in cyberspace?

Lessig, Lawrence. “About.” Lessig. Web. 2 Feb. 2014.
Code and Other Laws of Cyberspace. Web. 2 Feb. 2014.
"Jacket." Code and Other Laws of Cyberspace. Web. 2 Feb. 2014.
"Code: Version 2.0." Amazon.com. Web. 2 Feb. 2014.
Lessig, Lawrence. "What Things Regulate." Code and Other Laws of Cyberspace. New York: Basic Books, 1999. 85-99. Print.
Markoff, John. “Judge’s Ruling Is a Setback for Microsoft.” The New York Times 12 Dec. 1997. NYTimes.com. Web. 2 Feb. 2014.
Post, David G. “What Larry Doesn’t Get: Code, Law, and Liberty in Cyberspace.” Stanford Law Review 52.5 (2000): 1439-1459. JSTOR. Web. 2 Feb. 2014.
"We the People, and the Republic We Must Reclaim." TED Talks. Web. 2 Feb. 2014.

The ALA and CISPA


This section of the ALA’s Advocacy and Legislation page summarizes the ALA’s stand against CISPA (the Cyber Intelligence Sharing and Protection Act of 2011), SOPA (the Stop Online Piracy Act), and PIPA (the Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act). It lists the actions the ALA has taken against the bill and posts an interesting explanation of why the ALA has sided against CISPA and why it urges its members to take a stand for online privacy. The webpage also gives useful information on actions regarding CISPA (as the most recent incarnation of the bill) over the past year.

This information ties in specifically with our reading on government regulation and cyberspace, but it is also a useful resource for current and future LIS professionals, regardless of whether CISPA passes the Senate or fails, only to reincarnate under another name and try again.
"The Cybersecurity Intelligence Sharing and Protection Act (CISPA) ", American Library Association, April 17, 2012.
http://www.ala.org/advocacy/cispa (Accessed February 1, 2014)

The internet's first and next 30 years


In "The first 30 years of the internet through the lens of an academic library," Beth Sandora Namachchivaya claims, “The internet is arguably the single most significant technology advancement to occur at the end of the twentieth and the beginning of the twenty-first centuries.” She supports this claim through examples from three time slices, 1982-1991 or the “startup” decade, 1992-2002 or the “discovery, access, and organization” decade, and 2003-2012 or the decade of “integration and interdependence” (624-625).
By providing the reader with a 30-year history of library technologies and systems in less than 20 pages, Namachchivaya supports her claim and gives the reader context, focusing on the academic library at the University of Illinois at Urbana-Champaign.
She concludes the article with, “For libraries, the internet’s greatest strength is, paradoxically, its greatest weakness: its openness and chaotic nature makes it a natural environment for support for access to published and unpublished text, gray literature, reality media, twitter feeds, blogs, the making of social and political history – all the content that libraries typically collect, and that scholars integrate into their research. The opportunities for individuals, groups, and organizations to contribute to the utility and the market around the internet are vast. This openness also contributes to one of the biggest challenges for libraries – the fleeting nature of the content and services that are built around it” (639).
Keeping this conclusion in mind, I’d like to expand on a few sentences referencing Brett Sutton, “Sutton also had the foresight to incorporate digital preservation into the many useful roles that the internet might support in libraries, suggesting that research libraries could play a key stewardship role on behalf of smaller libraries in the development of repositories for documents, software, journals, and other research content” (631). What is happening now that will improve the libraries, archives, and museums of the future and who is at the forefront?
This question led me to Alan Inouye’s “The Future of Libraries at Thirty Thousand Feet” from the fall of 2013. Inouye argues that libraries need to be centers of content production (12). He goes on to describe libraries as publishers and as among the few noncommercial places the public trusts. Viewing libraries as places of production fits with the progression of the internet as a way to connect people not only to each other but also to the information they need to produce what they find important, meaningful, or necessary. Is production what is happening now that will improve the development of libraries, archives, and museums of the future, or is it something else? I’m interested in hearing your thoughts. What do you think is happening now that will serve library and information services 30 years from now? Who do you feel is leading the field of library and information services? Why might production be where information services are headed?
I look forward to reading your comments on these questions as well as anything else you’d like to propose related to current practices and how they will serve the future of library and information services.
Works Cited
Alan S. Inouye. "The Future of Libraries at Thirty Thousand Feet: Strategy and Public Policy." Young Adult Library Services 12:1 (2013), pp. 9-12.

Beth Sandore Namachchivaya, "The first 30 years of the internet through the lens of an academic library," Library Hi Tech 30:4 (2012), pp. 623-642. 

"The Internet" - Chapter 12 from Computer: A History of the Information Machine



The chapter in Computer: A History of the Information Machine focusing exclusively on the birth and evolution of the Internet (titled, cleverly enough, “The Internet”) is an account of the beginnings of the networked world. It begins with a focus on the idea of the “World Brain,” a concept dreamed up by that science-fiction Nostradamus, H. G. Wells. Though inspired by the idea of accumulating a wealth of human knowledge in a network, Wells sadly was never able to work actively on the project thanks to World War II. The concept was later taken up by several other visionaries, including Vannevar Bush (credited with the idea of the “memex” information-retrieval device) and Tim Berners-Lee, who is credited with creating the World Wide Web as we know it. The chapter then outlines more modern additions to Internet culture, including (but not limited to) the evolution of the smartphone, short biographies of the internet’s heavy hitters (Amazon, Google), and the evolution of the social network.

Also of note is how carefully the chapter outlines the reasoning that many of the credited creators used when contributing to the Internet's creation. There is a great amount of information about the early days of web browsers and their fierce battles for supremacy (I admit, I fondly remember using Netscape as a child). It also helps explain logistical decisions, such as why the United States is exempt from a country suffix in URLs.

The chapter doesn't make claims in a traditional sense; it’s a recollection (and often a dramatization) of how the Internet as we know it came to be. It looks to clear up misconceptions with documented facts and helps give context to how these developments arose. It shows that, much as the Internet is a collection and collaboration of many different computers, it was also a collection of many people’s ideas and efforts as it evolved into the cultural staple we know today.

This is a relatively new chapter; this version is from the third edition of the book, released in July of 2013. This means it is more up-to-date than its previous iteration (from 2004), which didn't know what an iPhone was and couldn't fathom how influential a social media site like Facebook, then newly dreamed up by Mark Zuckerberg, could become.


As mentioned earlier, this book is a third edition, meaning the authors are aware of the mercurial nature of technology and its effects on society and have revised accordingly; the first edition dates back to 1996, with the second published in 2004. The primary author, Martin Campbell-Kelly, certainly knows his stuff about computer systems and their influence, as he is an emeritus professor at the University of Warwick. Campbell-Kelly had help from distinguished colleagues as well: William Aspray (a professor at the School of Information at the University of Texas at Austin), the only collaborator present from the first edition onward; Nathan Ensmenger (a professor of information technologies at Indiana University); and Jeffrey R. Yost of the Charles Babbage Institute at the University of Minnesota.

TED Talk: Andrew Blum, "Discover the Physical Side of the Internet"



This TED Talk by Andrew Blum discusses the physical side of the internet. Blum observes that we treat the internet as a fictional place that nonetheless functions within our reality: it connects people, yet people do not fully understand it. He sought to find the physical reality of the internet. This fits with the week's discussion, as many of the articles cover the history and makeup of computers and the internet and how they have helped form connections that might otherwise be impossible. The internet not only connects people in cyberspace; it physically connects them as well.

Blum, A. (2012, June). Andrew Blum: Discover the Physical Side of the Internet [Video file]. Retrieved from http://www.ted.com/talks/andrew_blum_what_is_the_internet_really.html