Use of Internet in China 2005
By the spring of 2005 it was estimated that over 100,000,000 people in China used the Internet.
The Internet Archive in San Francisco archived forty billion web pages from 1996 to 2005.
The National Science Foundation funded research headed by Gabor Forgacs at the University of Missouri-Columbia on what was called "Organ Printing," to "further advance our understanding of self-assembly during the organization of cells and tissues into functional organ modules."
From ABC News 2-10-2006:
"In what could be the first step toward human immortality, scientists say they've found a way to do all of these things and more with the use of a technology found in many American homes: an ink-jet printer.
"Researchers around the world say that by using the technology, they can actually 'print' living human tissue and one day will be able to print entire organs.
" 'The promise of tissue engineering and the promise of 'organ printing' is very clear: We want to print living, three-dimensional human organs,' Dr. Vladimir Mironov said. 'That's our goal, and that's our mission.' "
"Though the field is young, it already has a multitude of names.
" 'Some people call this 'bio-printing.' Some people call this 'organ printing.' Some people call this 'computer-aided tissue engineering.' Some people call this 'bio-manufacturing,' said Mironov, associate professor at the Medical University of South Carolina and one of the leading researchers in the field."
Tall as a four-story building, the Mitsubishi DIAMONDSTAR 90, produced by Mitsubishi Heavy Industries Printing & Packaging Machinery Ltd., Hiroshima, Japan, was the world's fastest double width newspaper offset press, with a printing speed of 90,000 full color, 96-page broadsheet copies per hour.
"a multi-year undertaking of Corpus Christi College, the Stanford University Libraries and the Cambridge University Library, to produce a high-resolution digital copy of every imageable page in the 538 manuscripts described in M. R. James Descriptive Catalogue of the Manuscripts in the Library of Corpus Christi College, Cambridge (Cambridge University Press, 1912), and to build an interactive web application in which the manuscript page images can be used by scholars and students in the context of editions, translations and secondary sources" (Parker Library on the Web site, accessed 11-27-2008).
The project was expected to be completed in 2009.
The author/editor of this database, Jeremy Norman, issued From Gutenberg to the Internet: A Sourcebook on the History of Information Technology.
This printed book was the first anthology of original publications, reflecting the origins of the various technologies that converged to form the Internet. Each reading is introduced by the editor.
Google launched Google Earth, a virtual globe, map and geographical information program, which mapped the Earth by the superimposition of images obtained by satellite.
The program, which Google acquired when it purchased Keyhole, Inc., was originally called EarthViewer 3D.
American journalist and non-fiction writer Richard Louv published Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder.
In this book Louv studied the relationship of children and the natural world in current and historical contexts, creating the term “nature-deficit disorder” to describe possible negative consequences to individual health and the social fabric as children move indoors as a result of immersion in television, Internet, and computer games, and away from physical contact with the natural world – particularly unstructured, solitary experience.
Louv cited research pointing to attention disorders, obesity, a dampening of creativity and depression as problems associated with a nature-deficient childhood. He amassed information on the subject from practitioners of many disciplines to make his case, and is commonly credited with helping to inspire an international movement to reintroduce children to nature.
I first learned about Louv's book in a lecture by paleontologist, educator, and television broadcaster Scott D. Sampson, held at Marin Academy in San Rafael, California on October 26, 2011. Sampson's lecture was the first in a science lecture series organized by my son, Max, in his junior year of high school. An extremely engaging speaker, Sampson uses electronic media to encourage disengagement from media and active exploration of nature, especially in childhood. He also uses social media to promote individual scientific exploration of nature in each person's own locality.
"Features of NarusInsight include:
"♦ Scalability to support surveillance of large, complex IP networks (such as the Internet)
"♦ High-speed Packet processing performance, which enables it to sift through the vast quantities of information that travel over the Internet.
"♦ Normalization, Correlation, Aggregation and Analysis provide a model of user, element, protocol, application and network behaviors, in real-time. That is it can track individual users, monitor which applications they are using (e.g. web browsers, instant messaging applications, email) and what they are doing with those applications (e.g. which web sites they have visited, what they have written in their emails/IM conversations), and see how users' activities are connected to each other (e.g. compiling lists of people who visit a certain type of web site or use certain words or phrases in their emails).
"♦ High reliability from data collection to data processing and analysis.
"♦ NarusInsight's functionality can be configured to feed a particular activity or IP service such as security, lawful intercept or even Skype detection and blocking.
"♦ Compliance with CALEA and ETSI.
"♦ Certified by Telecommunication Engineering Center (TEC) in India for lawful intercept and monitoring systems for ISPs.
"The intercepted data flows into NarusInsight Intercept Suite. This data is stored and analyzed for surveillance and forensic analysis purposes.
"Other capabilities include playback of streaming media (i.e. VoIP), rendering of web pages, examination of e-mail and the ability to analyze the payload/attachments of e-mail or file transfer protocols. Narus partner products, such as Pen-Link, offer the ability to quickly analyze information collected by the Directed Analysis or Lawful Intercept modules.
"A single NarusInsight machine can monitor traffic equal to the maximum capacity (10 Gbit/s) of around 39,000 DSL lines or 195,000 telephone modems. But, in practical terms, since individual internet connections are not continually filled to capacity, the 10 Gbit/s capacity of one NarusInsight installation enables it to monitor the combined traffic of several million broadband users.
"According to a company press release, the latest version of NarusInsight Intercept Suite (NIS) is "the industry's only network traffic intelligence system that supports real-time precision targeting, capturing and reconstruction of webmail traffic... including Google Gmail, MSN Hotmail, Yahoo! Mail, and Gawab Mail (English and Arabic versions)."
"It can also perform semantic analysis of the same traffic as it is happening, in other words analyze the content, meaning, structure and significance of traffic in real time. The exact use of this data is not fully documented, as the public is not authorized to see what types of activities and ideas are being monitored" (Wikipedia article on Narus [company], accessed 01-14-2012).
"Nothing is more striking, over the years covered by this survey, than the progressive dematerialization of the means by which texts are prepared for reproduction. At one extreme, in 1915, are the thousand pages of hand-set type for Fortescue's History, waiting to be printed at R & R Clark's works in Edinburgh. At the other are the resources used to produce this book, where the single concrete realization of the completed text that existed before printing was begun was the output from a laser imagesetter. In betwen are the disappearance of three-dimensional punches, matrices and type that came with direct-photography photocomposition, and the disappearance of the photographic matrix with the electronic technologies that followed.
"Over the same period the means used to produce the types with which text is composed have followed a similar course. The ranks of drawing desks or pantographs receding into the distance at Salfords date from the great days of the Monotype Corporation between the wars; but even in 1918 Rudolf Koch's Die Schriftgeisserei im Schattenbild shows 27 men, two women, two boys and two horses at work on the manufacture and despatch of foundry type. By contrast, the team that worked on the Colorado project, which in two years after 1995 produced all the type used for residential and business entries in telephone directories for most of the western United States, was made up at its largest of six people. The work was done in three different countries; the only concrete objects exchanged between the participants were character drawings and photocomposed proofs of type.
"In some ways the end of the twentieth century has brought the business of type manufacture back almost to where it began. Claude Garamont cut the punches for the grecs du Roy himself and had the matrices justified by Paterne Robelot, whom he chose for the task because he was clever at it. In the last couple of decades the development of computer-based typemaking tools and the world-wide web have meant that designers can now make and distribute type entirely on their own; though unless, like Garamont or the Colorado group, they are fulfilling a specific commission, marketing their work is still a problem.
"For the manufacture and composition of printer's type, paradoxically enough, the first decade of the twenty-first century is a period of relative technological calm. The basic tools - PostScript, TrueType, networked personal computers, page makeup programs and desktop laser printers - all appeared in the whirlwind of the 1980s. Increasing computing power has meant that more can now be done with them, and done more quickly; but the processes of type design, and the fundamental tehcnologies that underlie them, are very much the same today as they were for Sumner Stone in the 1980s.
"If typemaking tools changed beyond recognition in the early 1960s and again in the 1980s, there has been no change at all since 1445 or so in the task that types for composing text - or rather, the character images the types give rise to- are required to perform. The first objective in the design, manufacture, composition and reproduction of text types remains the same as it has always been: to put legible character images, legibly arranged, before the reader's eyes. It is the means of doing this that changed during the twentieth century, not the objective itself. The second objective - to give the type a voice of its own to speak with - has also remained the same, altthough changes in rendering techniques have had more effect on the difficulty of achieving this" (Southall, Printer's type in the twentieth century. Manufacturing and design methods  223-24).
Three former employees of Paypal — Steven Chen, Chad Hurley, and Jawed Karim — founded the video sharing website, YouTube. Its first headquarters were above a pizzeria and Japanese restaurant in San Mateo, California.
On March 17, 2005 The European Library, a free service that offered access to the resources of the 48 national libraries of Europe in 20 languages, was launched from its headquarters at the Koninklijke Bibliotheek, Den Haag (The Hague), Netherlands. Resources included both digital and physical items (books, posters, maps, sound recordings, videos, etc.).
"Currently The European Library gives access to 150 million entries across Europe. The amount of referenced digital collections is constantly increasing. Quality and reliability are guaranteed by the 48 collaborating national libraries of Europe. The European Library is a non-commercial organisation" (European Library website, accessed 11-21-2008).
Sony's original PlayStation and the redesigned PS one reached "a combined total of 102.49 million units shipped", becoming the first video game console to reach the 100 million mark.
The U. S.- China Economic and Security Review Commission (USCC.gov) issued the report of Xiao Qiang, University of California, Berkeley, on The Development and the State Control of the Chinese Internet.
By February 2011 this brief video had been viewed 4,282,497 times.
The Huffington Post, which launched on May 9, 2005 with a meager $1 million investment, and grew into one of the most heavily visited news sites in the country, announced that it would be acquired by AOL for $315 million, $300 million of it in cash and the rest in stock.
"Arianna Huffington, the cable talk show pundit, author and doyenne of the political left, will take control of all of AOL’s editorial content as president and editor in chief of a newly created Huffington Post Media Group. The arrangement will give her oversight not only of AOL’s national, local and financial news operations, but also of the company’s other media enterprises like MapQuest and Moviefonea' (http://www.nytimes.com/2011/02/07/business/media/07aol.html?_r=1&hp).
"The company that brought dial-up Internet to millions of people is dead. In its place is a massive media empire that refuses to be ignored.
"With its blockbuster acquisition of The Huffington Post, AOL has catapulted itself back into relevancy. It has sent a clear signal to the rest of the world that it is a media company and it is in this game to win.
"AOL has been on a content acquisition spree recently, not only acquiring the technology blog network TechCrunch, but also snagging up Thing Labs, Brizzly and most recently About.me in the past few months" (http://mashable.com/2011/02/07/aol-huffington-post/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed:+Mashable+(Mashable), accessed 02-07-2010).
"The invention of the printing press with movable type fanned religious wars in the 16th century. The onset of telegraphy, photography, and the power-driven printing press in the 19th century created mass journalism that fulminated nationalistic passions and world wars in the 20th century. The arrival in the late 20th century of instantaneous, networked, global communication may well have facilitated the targeted propaganda, recruitment, and two-way communication of transnational terrorist organizations more than it has helped combat them.
"We are now discovering—painfully and much too slowly—that deep conflict between cultures is in many ways being fired up rather than cooled down by this revolution in communications, as was the case in the 16th and 19th centuries. Whenever new technology suddenly brings different peoples closer together and makes them aware of certain commonalities, it seems simultaneously to create a compensatory psychological need for the different peoples to define—and even assert aggressively—what is unique and distinctive about their own historic cultures."
In the wake of the July 7, 2005 London bombings and the Buncefield oil depot fire, the British Broadcasting Corporation (BBC) expanded its user-generated content team, established in April 2005. After the Buncefield disaster the BBC received over 5,000 photos from viewers. This may have marked the beginning of the adoption of citizen-generated journalism by mainstream industrial media.
Beth Noveck, director of New York Law School's Institute for Information Law and Policy, issued “Peer to Patent” (PtoP): A Modest Proposal in her blog. The proposal "would shift the patent-application process away from individual examiners to an internet-based, peer-review method."
In response to copyright problems Google announced a moratorium on the scanning of copyrighted books for its Google Print Library Project.
"Khipu [quipu] are knotted-string devices that were used for bureaucratic recording and communication in the Inka [Inca] Empire. We recently undertook a computer analysis of 21 khipu from the Inka administrative center of Puruchuco, on the central coast of Peru. Results indicate that this khipu archive exemplifies the way in which census and tribute data were synthesized, manipulated, and transferred between different accounting levels in the Inka administrative system" (Science).
"Researchers in the US believe they have come closer to solving a centuries-old mystery - by deciphering knotted string used by the ancient Incas.
"Experts say one bunch of knots appears to identify a city, marking the first intelligible word from the extinct South American civilisation.
"The coloured, knotted pieces of string,known as khipu, are believed to have been used for accounting information.
"The researchers say the finding could unlock the meaning of other khipu.
"Harvard University researchers Gary Urton and Carrie Brezine used computers to analyse 21 khipu.
"They found a three-knot pattern in some of the strings which they believe identifies the bunch as coming from the city of Puruchuco, the site of an Inca palace.
" 'We hypothesize that the arrangement of three figure-eight knots at the start of these khipu represented the place identifier, or toponym, Puruchuco,' they wrote in their report, published in the journal Science.
" 'We suggest that any khipu moving within the state administrative system bearing an initial arrangement of three figure-eight knots would have been immediately recognisable to Inca administrators as an account pertaining to the palace of Puruchuco.' (http://news.bbc.co.uk/2/hi/americas/4143968.stm, accessed 04-28-2009).
"The home page consists of a million pixels arranged in a 1000 × 1000 pixel grid; the image-based links on it were sold for $1 per pixel in 10 × 10 blocks. The purchasers of these pixel blocks provided tiny images to be displayed on them, a Uniform Resource Locator (URL) to which the images were linked, and a slogan to be displayed when hovering a cursor over the link. The aim of the site was to sell all of the pixels in the image, thus generating a million dollars of income for the creator. The Wall Street Journal has commented that the site inspired other websites that sell pixels.
"Launched on 26 August 2005, the website became an Internet phenomenon. The Alexa ranking of web traffic peaked at around 127; as of 18 February 2009 (2009 -02-18)[update], it is 42,735. On 1 January 2006, the final 1,000 pixels were put up for auction on eBay. The auction closed on 11 January with a winning bid of $38,100 that brought the final tally to $1,037,100 in gross income" (Wikipedia article on The Million Dollar Homepage, accessed 05-08-2009).
"By its one-year anniversary in August 2006, LibraryThing had attracted more than 73,000 registered users who had cataloged 5.1 million individual books, representing nearly 1.2 million unique works; in May 2008 they reached over 400,000 users and 27 million books" (Wikipedia article on LibraryThing, accessed 12-15-2008).
Classes began at the University of California, Merced. At the opening of this new campus focused on math, science, and engineering the library included approximately 10,000 journal subscriptions, all available online, with no print journals. This "21st century research library" contained a limited collection of about 30,000 physical books, and offered interlibrary loans from other University of California libraries. It emphasized providing access to digital books and the "deep web"—databases available by subscription:
"The Internet is wide-ranging, but the bulk of the information needed for scholarly study and research is not freely available and cannot be found in a Google search. The UC Merced Library acquires and manages subscriptions to millions of scholarly articles in electronic journals, tens of thousands of electronic books, and hundreds of databases. Thanks to the Library, UC Merced students and faculty can access these scholarly electronic resources at any time with a connection to the Internet.
"The collection has what you want.
"The Library has many books and DVD movies on the shelves to support study in the areas of UC Merced specialization and to also provide a break from study with recreational reading and viewing. If what you need is not in the building, then use the University of California systemwide library catalog to request free, overnight courier delivery for any of the 32 million volumes at the other UC campuses" (UC Merced Library website, accessed 01-28-09).
The National Archives and Records Administration (NARA) selected Lockheed Martin Corporation to build the Electronic Records Archives (ERA) system, a permanent electronic archives system to preserve, manage, and make accessible the electronic records created by the federal government. The ERA system would capture electronic information – regardless of its format – save it permanently, and make it accessible on whatever hardware and software are in use at any point in the future. Development of the system would continue over the next six years, and cost $308,000,000.
The second International Conference of the Preservation of Digital Objects took place in Göttingen, Germany. (The first international conference in this series took place in 2004 in Beijing.)
Neuroscientists Olaf Sporns of Indiana University, Giulio Tononi of the University of Wisconsin, and Rolf Kötter of Heinrich Heine University, Düsseldorf, Germany, published "The Human Connectome: A Structural Description of the Human Brain," PLoS Computational Biology 1 (4). This paper and the PhD thesis of Patric Hagmann from the Université de Lausanne, From diffusion MRI to brain connectomics, coined the term connectome:
In their 2005 paper Sporns et al. wrote:
"To understand the functioning of a network, one must know its elements and their interconnections. The purpose of this article is to discuss research strategies aimed at a comprehensive structural description of the network of elements and connections forming the human brain. We propose to call this dataset the human 'connectome,' and we argue that it is fundamentally important in cognitive neuroscience and neuropsychology. The connectome will significantly increase our understanding of how functional brain states emerge from their underlying structural substrate, and will provide new mechanistic insights into how brain function is affected if this structural substrate is disrupted."
In his 2005 Ph.D. thesis, From diffusion MRI to brain connectomics, Hagmann wrote:
"It is clear that, like the genome, which is much more than just a juxtaposition of genes, the set of all neuronal connections in the brain is much more than the sum of their individual components. The genome is an entity it-self, as it is from the subtle gene interaction that [life] emerges. In a similar manner, one could consider the brain connectome, set of all neuronal connections, as one single entity, thus emphasizing the fact that the huge brain neuronal communication capacity and computational power critically relies on this subtle and incredibly complex connectivity architecture" (Wikipedia article on Connectome, accessed 12-28-2010).
Global sales of J. K. Rowling's Harry Potter book series surpassed 300,000,000 printed copies.
Scientists at the Armed Forces Institute of Pathology deciphered the genetic code of the 1918 influenza virus, an H1N1 virus of avian origin which killed as many as 50,000,000 people worldwide, from a victim exhumed in 1997 from the Alaskan permafrost. They reconstructed the virus in the laboratory and published the genetic sequence.
Google CEO Eric Schmidt speculated that it might take three hundred years to index all the world's information and make it searchable.
" 'We did a math exercise and the answer was 300 years,' Schmidt said in response to an audience question asking for a projection of how long the company's mission will take. 'The answer is it's going to be a very long time.'
"Of the approximately 5 million terabytes of information out in the world, only about 170 terabytes have been indexed, he said earlier during his speech."
Microsoft announced that it was joining the Open Content Alliance founded by Brewster Kahle of the Internet Archive. The Open Content Alliance was formed partly in response to Google Print, renamed Google Books.
The National Nuclear Security Administration (NNSA) announced that the BlueGene/L supercomputer built by IBM at Lawrence Livermore National Laboratory performed at 280.6 trillion operations per second (teraflops) on the Linpack benchmark, the standard by which major supercomputers were measured. This shattered the previous high mark of 135.3 teraflops.
"IBM said in a statement that if every person in the world had a handheld calculator it would still take decades to perform the number of calculations Blue Gene performs every single second."
"a crowdsourcing marketplace that enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do."
This was the first business application using Collaborative Human Interpreter, a programming language "designed for collecting and making use of human intelligence in a computer program. One typical usage is implementing impossible-to-automate functions."
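The pattern behind languages like the Collaborative Human Interpreter is a program that posts a task a computer cannot do and blocks until a person answers. A hypothetical sketch of that human-in-the-loop pattern (the API below is invented; it is not CHI's actual syntax):

```python
# Hypothetical sketch of the human-in-the-loop pattern behind a language
# like the Collaborative Human Interpreter. The classes and methods here
# are invented for illustration; a plain in-process queue stands in for
# a real crowdsourcing marketplace.
import queue

class HumanTaskQueue:
    """Stands in for a marketplace where people answer posted tasks."""
    def __init__(self):
        self._answers = queue.Queue()

    def post(self, question):
        # A real system would publish the task to human workers and wait;
        # here we simulate a worker answering immediately.
        self._answers.put(f"(human answer to: {question})")

    def wait_for_answer(self):
        return self._answers.get()

def describe_image(task_queue, image_url):
    """An 'impossible-to-automate' function delegated to a person."""
    task_queue.post(f"Describe the image at {image_url}")
    return task_queue.wait_for_answer()

q = HumanTaskQueue()
print(describe_image(q, "http://example.com/photo.jpg"))
```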
"The sudden and unexpected importance of the Wikipedia, a free online encyclopedia created by tens of thousands of volunteers and coordinated in a deeply decentralized fashion, represents a radical new modality of content creation by massively distributed collaboration. This talk will discuss the unique principles and values which have enabled the Wikipedia community to succeed and will examine the intriguing prospects for application of these methods to a broad spectrum of intellectual endeavors."
The Library of Congress announced a plan to create the World Digital Library of works in the public domain. Google donated $3,000,000 toward the costs of planning this project.
The Google Print project morphed into Google Books.
The British Library, with about 150,000,000 physical items on 625 km of shelves, might have been the world's largest physical library in 2005, though the U.S. Library of Congress also made this claim. The British Library added about 3,000,000 physical items per year, occupying about 12 km of new shelving. At the end of 2005 the Library of Congress held about 130,000,000 physical items and had more than 8,000,000 digital items online.
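The British Library figures are internally consistent: the overall shelf density roughly matches the density of new acquisitions.

```python
# Consistency check of the quoted British Library figures.
items_total, shelves_km = 150_000_000, 625
items_per_year, new_shelves_km = 3_000_000, 12

print(round(items_total / shelves_km))        # ~240,000 items per km overall
print(round(items_per_year / new_shelves_km)) # 250,000 items per km of new shelving
```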
In December 2005 Heritage Preservation, the U.S. National Institute for Conservation, and the Institute of Museum and Library Services published The Heritage Health Index Report on the State of America's Collections. Among the conclusions of this report were that there were 4.8 billion cultural heritage materials in the U.S. and over 1.3 billion of those items were at risk. Forty percent of the surveyed institutions that housed those items reported no budget allocated for preservation while 80% of the institutions had no disaster plan.
A peer-review comparison, conducted by the journal Nature, published in London, of selected science articles in the printed Encyclopedia Britannica, published in Chicago with 65,000 articles by 4,000 contributors, against the corresponding articles in the online user-edited Wikipedia rated Wikipedia nearly as accurate as Britannica.
The Museum of Modern Art (MoMA), New York, opened PIXAR: 20 Years of Animation:
"The Most Extensive Gallery Exhibition that MoMA has ever devoted to Animation along with a Retrospective of Pixar Features and Shorts."
Notably, MoMA found it unnecessary to characterize the exhibition as "computer animation," since by this time virtually all animation was done by computer. The museum published a 175-page printed catalogue of the exhibition.
Google issued their first monthly newsletter for librarians, the Google Librarian Newsletter.
"Librarians and Google share the same mission: to organize the world's information and make it universally accessible and useful. The goal of this newsletter is to highlight ways that we can work together to fulfill that mission, for patrons, students, and users."
The Wayback Machine, a digital time capsule at the Internet Archive, San Francisco, contained almost 2 petabytes of data, and was growing at a rate of 20 terabytes per month, a two-thirds increase over the 12 terabytes/month growth rate reported in 2003. Its contents eclipsed the amount of text contained in the world's largest libraries, including the Library of Congress.
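The quoted growth figures check out:

```python
# Wayback Machine growth figures quoted above: 20 TB/month in 2005
# versus 12 TB/month reported in 2003, with ~2 petabytes accumulated.
rate_2003 = 12   # terabytes per month
rate_2005 = 20

increase = (rate_2005 - rate_2003) / rate_2003
print(f"{increase:.1%}")   # 66.7%, i.e. the "two-thirds increase"

# Months of growth the ~2 PB archive would represent at the 2005 rate
print(2000 / rate_2005)    # 100 months, if the whole archive grew at 20 TB/month
```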
The Center for Informatics Research in Science and Scholarship (CIRSS), formerly the Library Research Center (LRC), of the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign, began funding the Data Curation Education Program (DCEP).
"Data curation is the active and on-going management of data through its lifecycle of interest and usefulness to scholarly and educational activities across the sciences, social sciences, and the humanities. Data curation activities enable data discovery and retrieval, maintain data quality, add value, and provide for re-use over time. This new field includes representation, archiving, authentication, management, preservation, retrieval, and use. Our program offers a focus on data collection and management, knowledge representation, digital preservation and archiving, data standards, and policy, providing the theory and skills necessary to work directly with academic and industry researchers who need data curation expertise. To this end, DCEP has established a number of educational collaborations with premier science, social science, and humanities data centers across the country to prepare a new generation of library and information science professionals to curate materials from databases and other formats. We anticipate that our graduates will be employed across a range of information-oriented institutions, including museums, data centers, libraries, institutional repositories, archives, and private industry."
The program began with a focus on "data curation curriculum and best practices for the LIS and scientific communities. IMLS provided additional funding in 2008 to extend the curriculum to include humanities data" (Data Curation Education Program website, accessed 01-28-2009).
PricewaterhouseCoopers reported that US$16.9 billion was spent on Internet marketing in the U.S. during 2006 (PricewaterhouseCoopers website, accessed 05-10-2009).
In 2009 MySpace acquired iLike.
Springer, which initiated its eBook program in 2006, announced the publication of its 50,000th eBook on January 19, 2012, available through SpringerLink.com. Springer also stated that it would digitize nearly all books it had published since its foundation in 1842. This would increase the number of titles to over 100,000. Based on the number of titles available in January 2012, Springer claimed to be the "largest eBook publisher."
Having initially registered the domain name for free, and after temporarily losing it to a con man, Gary Kremen won a lawsuit and sold Sex.com to Boston-based Escom LLC for $14,000,000, or "$15 million in cash and stock." This was the highest price obtained for a domain name up to that time, and perhaps ever.
In 2006 free file-sharing of digital music on the web exceeded sales of digital music downloads many times over:
"Total music sales - including online - are off some 20 percent from five years ago. Songs traded freely over unlicensed Internet sites swamp the number of legal sales by thousands to one."
Dirk Brockmann, a theoretical physicist and computational epidemiologist at Northwestern University in Evanston, Illinois, L. Hufnagel, and T. Geisel published "The scaling laws of human travel," Nature 439 (2006) 462-65.
Using statistical data from the American currency tracking website, Where's George?, the paper described statistical laws of human travel in the United States, and developed a mathematical model of the spread of infectious disease.
[By January 31, 2009, Where's George? tracked over 149 million bills totaling more than $810 million. (Wikipedia).]
Apple launched iTunes U, a service that offered college-level lectures via podcasts.
On January 31, 2006 The Electronic Frontier Foundation (EFF) filed a class-action lawsuit against AT&T accusing the telecom giant of violating the law and the privacy of its customers by collaborating with the National Security Agency (NSA) in "its massive illegal program to wiretap and data-mine Americans' communications."
"In Hepting v. AT&T, EFF sued the telecommunications giant on behalf of its customers for violating privacy law by collaborating with the NSA in the massive, illegal program to wiretap and data-mine Americans’ communications.
"Evidence in the case includes undisputed evidence provided by former AT&T telecommunications technician Mark Klein showing AT&T has routed copies of Internet traffic to a secret room in San Francisco controlled by the NSA.
"In June of 2009, a federal judge dismissed Hepting and dozens of other lawsuits against telecoms, ruling that the companies had immunity from liability under the controversial FISA Amendments Act (FISAAA), which was enacted in response to our court victories in Hepting. Signed by President Bush in 2008, the FISAAA allows the Attorney General to require the dismissal of the lawsuits over the telecoms' participation in the warrantless surveillance program if the government secretly certifies to the court that the surveillance did not occur, was legal, or was authorized by the president -- certification that was filed in September of 2008. EFF is planning to appeal the decision to the 9th U.S. Circuit Court of Appeals, primarily arguing that FISAAA is unconstitutional in granting to the president broad discretion to block the courts from considering the core constitutional privacy claims of millions of Americans" (https://www.eff.org/nsa/hepting, accessed 01-14-2014).
In D-Lib Magazine researchers at Cornell University from the departments of Computer Science, Information Science, and the Cornell Theory Center described plans for A Research Library Based on the Historical Collections of the Internet Archive. The library, a super-computing application consisting of 10 billion web pages, was intended to be used by social scientists.
By some estimates 92 percent of all cameras sold in 2006 were digital.
Vital US infrastructure, including power grids and banking systems, was put under simulated attack in a week-long security exercise called Cyber Storm.
♦ FROM THE U.S. GOVERNMENT'S PUBLISHED INTERPRETATION OF THE RESULTS
"The U.S. Department of Homeland Security’s (DHS) National Cyber Security Division (NCSD) successfully executed Cyber Storm, the first national cyber exercise Feb. 6 thru Feb. 10, 2006. The exercise was the first government-led, full-scale cyber security exercise of its kind. NCSD, a division within the department’s Preparedness Directorate, provides the federal government with a centralized cyber security coordination and preparedness function called for in the National Strategy for Homeland Security, the National Strategy to Secure Cyberspace and Homeland Security Presidential Directive 7. NCSD is the focal point for the federal government’s interaction with state and local government, the private sector and the international community concerning cyberspace vulnerability reduction efforts."
"The exercise simulated a sophisticated cyber attack campaign through a series of scenarios directed at several critical infrastructure sectors. The intent of these scenarios was to highlight the interconnectedness of cyber systems with physical infrastructure and to exercise coordination and communication between the public and private sectors. Each scenario was developed with the assistance of industry experts and was executed in a closed and secure environment.
"Cyber Storm scenarios had three major adversarial objectives:
"* To disrupt specifically targeted critical infrastructure through cyber attacks
"* To hinder the governments' ability to respond to the cyber attacks
"* To undermine public confidence in the governments' ability to provide and protect service" (http://www.dhs.gov/xnews/releases/pr_1158340980371.shtm, accessed 08-09-2009).
♦ A LESS OPTIMISTIC INTERPRETATION FROM THE WIKIPEDIA
"The Cyber Storm exercise was a simulated exercise overseen by the Department of Homeland Security that took place February 6 through February 10, 2006 with the purpose of testing the nation's defenses against digital espionage. The simulation was targeted primarily at American security organizations but officials from Britain, Canada, Australia and New Zealand participated as well.
"The exercise simulated a large scale attack on critical digital infrastructure such as communications, transportation, and energy production. The simulation played out a series of incidents, which included:
" * Washington's metro trains mysteriously shutting down.
" * Bloggers revealing locations of railcars containing hazardous materials.
" * The airport control towers of Philadelphia and Chicago mysteriously shutting down.
" * A mysterious liquid appearing on a London subway.
" * Significant numbers of people on "no fly" lists suddenly appearing at airports all over the nation.
" * Planes flying too close to the White House.
" * Water utilities in Los Angeles getting compromised.
"During the exercise the computers running the simulation came under attack by the players themselves. Heavily censored files released to the Associated Press reveal that at some time during the exercise the organizers sent everyone involved an e-mail marked "IMPORTANT!" telling the participants in the simulation not to attack the game's control computers.
"Performance of participants
"The Cyber Storm exercise highlighted the gaps and shortcomings of the nation's cyber defenses. The Cyber Storm exercise report found that institutions under attack had a hard time getting the bigger picture and instead focused on single incidents, treating them as 'individual and discrete.'
"In light of the test the Department of Homeland Security raised concern that the relatively modest resources assigned to cyber-defense would be 'overwhelmed in a real attack' " (Wikipedia article on Cyber Storm Exercise, accessed 08-09-2009).
"Zillow allows users to see the value of millions of homes across the United States, not just those up for sale. In addition to giving value estimates of homes, it offers several unique features including value changes of each home in a given time frame (such as 1, 5, or 10 years), aerial views of homes, and prices of homes in the area. Where it can access appropriate data, it also provides basic information on a given home, such as square footage and the number of bedrooms and bathrooms. Users can also get current estimates of homes if there was a significant change made, such as a recently remodeled kitchen. Zillow provides an application programming interface (API) and developer support network.
"As a part of its API, Zillow assigns a numerical integer to each of the 70 million homes in its database, which is plainly visible as CGI parameters to the URLs to individual entries on its website. The identifier is not obfuscated and is assigned in sequence for each house or condo on the side of a street. Zillow reports on individual units, such as providing street address, latitude and longitude. When integrated with the features of a typical online reverse telephone directory and wiki-mapping services such as WikiMapia, it allows for nationwide "seating assignments" of U.S. neighborhoods for each house that has a listed phone number with a real human name" (Wikipedia article on Zillow.com.)
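As a rough illustration of what "plainly visible as CGI parameters" means, the numeric home identifier can be read straight out of a listing URL's query string. This is a minimal sketch; the parameter name `zpid` and the function name are assumptions for illustration, not a documented part of Zillow's API:

```python
from urllib.parse import urlparse, parse_qs

def extract_listing_id(url, param="zpid"):
    """Pull the numeric home identifier out of a listing URL's
    query string; returns None if the parameter is absent or
    non-numeric. The parameter name is a hypothetical example."""
    values = parse_qs(urlparse(url).query).get(param, [])
    return int(values[0]) if values and values[0].isdigit() else None
```

Because (as the quoted passage notes) the identifiers are sequential rather than obfuscated, adjacent integers tend to correspond to adjacent properties on the same street, which is what makes the "seating assignment" scenario possible.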
Using object detection technology, researchers at the University of Buffalo, the University of Massachusetts at Amherst, and the Adaptive Information Cluster at Dublin City University, in association with Google, developed software for scanning historical manuscripts in a way that recognized handwriting to make electronic texts of these manuscripts searchable.
On the 60th anniversary of the public announcement of the ENIAC, Computerworld published a previously unknown interview with J. Presper Eckert on the origins of the ENIAC.
The Apple iTunes Store surpassed one billion downloads.
RLG opened ArchiveGrid, a new search engine providing access to nearly a million archive collection descriptions in thousands of libraries, museums, and archives.
Marc Weber and William B. Pickett founded the World Wide Web History Center.
Reflecting the influence of the Internet on physical library access and usage, the Library of Congress published The Changing Nature of the Catalogue and its Integration with Other Discovery Tools by Karen Calhoun.
Carmen Bambach of the Metropolitan Museum of Art, New York, discovered on the priceless manuscript of Leonardo da Vinci's Codex Atlanticus, preserved in the Biblioteca Ambrosiana in Milan, an extensive invasion of molds of various colors, "including black, red, and purple, along with swelling of pages."
In 2008 the Opificio delle Pietre Dure in Florence "determined that the colors found on the pages weren't the product of mold, but instead caused by mercury salts added to protect the Codex from mold."
By May 9, 2009 the video had been viewed 119,378,381 times. At that date it was the Most Viewed (All Time) Video, the Most Favorited (All Time) Video, and the eighth Most Discussed (All Time) Video on YouTube.
In April 2006 the first experimental beta Espresso Book Machine was installed at the World Bank InfoShop in Washington, D.C. to print and bind World Bank publications on demand.
"In September 2006 ODB installed a second beta machine at The Library of Alexandria, Egypt, to print books in Arabic. The first EBM Version 1.5 was introduced for ninety days at the New York Public Library during the summer of 2007."
In September 2008 the first Espresso Book Machine in a retail commercial setting was installed at Angus & Robertson in Melbourne, Australia.
♦ Link to the PDF brochure for Espresso Book Machine 2.0 at ondemandbooks.com, accessed 08-31-2009.
♦ In November 2012 it was my pleasure to see the Espresso Book Machine in operation at the privately owned Harvard Book Store in Cambridge, Massachusetts. Humorously nicknamed "Paige M. Gutenborg," the machine produced remarkably high quality paperback books at a speed of around five minutes per book. Customers supplied fully formatted black and white text as a PDF plus a separate PDF containing their design for a full color cover. The machine combined a double-sided xerographic laser printer with an ingenious binding and trimming mechanism. It printed the text on regular book paper and the color cover on coated cover stock. Since the binding machine was enclosed in plexiglass it was possible to observe the various binding processes, concluding with the machine dropping each finished copy out a small chute. When I watched the machine in operation it was being observed by a human operator. My impression was that the machine required certain adjustments and worked best when "supervised" by a human.
Representing the Library of Congress Professional Guild, Thomas Mann published a critical review of Karen Calhoun's paper of March 17, rebutting various assertions in the Calhoun report.
"From the days of Sumerian clay tablets till now, humans have "published" at least 32 million books, 750 million articles and essays, 25 million songs, 500 million images, 500,000 movies, 3 million videos, TV shows and short films and 100 billion public Web pages. All this material is currently contained in all the libraries and archives of the world. When fully digitized, the whole lot could be compressed (at current technological rates) onto 50 petabyte hard disks. Today you need a building about the size of a small-town library to house 50 petabytes. With tomorrow's technology, it will all fit onto your iPod. When that happens, the library of all libraries will ride in your purse or wallet — if it doesn't plug directly into your brain with thin white cords. Some people alive today are surely hoping that they die before such things happen, and others, mostly the young, want to know what's taking so long. (Could we get it up and running by next week? They have a history project due.)"
"for the act of taking a job traditionally performed by an employee or contractor, and outsourcing it to an undefined, generally large group of people, in the form of an open call. For example, the public may be invited to develop a new technology, carry out a design task, refine an algorithm or help analyze large amounts of data."
Borrowing a technique from genetics, S. Blair Hedges, professor of biology at Penn State, University Park, Pennsylvania, published "A method for dating early books and prints using image analysis," Proc. R. Soc. Lond. A: Mathematical, Physical, and Engineering Sciences 462 (2006) 3555-3573, describing the "print clock" method for dating examples of printing, including books and copperplates, issued from hand-operated presses. A supplementary appendix was available from Hedges' website.
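The general idea behind such a "print clock" parallels the molecular clock of genetics: if defects in a woodblock or copperplate accumulate at a roughly constant rate over its working life, then prints of known date can calibrate a regression, which is then inverted to date an undated print. The sketch below illustrates only that calibration-and-inversion logic; the function names and sample data are hypothetical, not Hedges' actual procedure or data:

```python
def fit_print_clock(samples):
    """Least-squares fit of cumulative defect count against known
    print year. samples: list of (year, defect_count) pairs from
    dated prints. Returns (slope, intercept) of
    defects = slope * year + intercept."""
    n = len(samples)
    sx = sum(year for year, _ in samples)
    sy = sum(defects for _, defects in samples)
    sxx = sum(year * year for year, _ in samples)
    sxy = sum(year * defects for year, defects in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def estimate_year(defects, slope, intercept):
    """Invert the clock: given a defect count observed on an
    undated print, estimate its year of printing."""
    return (defects - intercept) / slope
```

For example, calibrating on dated prints from 1500, 1510, and 1520 showing 10, 20, and 30 defects would date an undated print with 25 defects to roughly 1515.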
"advance offerings and drive efficiencies for libraries, archives, museums and other research organizations worldwide."
Rice University Press, which shut down in 1996, announced that it was re-opening as an entirely digital operation:
"As money-strapped university presses shut down nationwide, Rice University is turning to technology to bring its press back to life as the first fully digital university press in the United States.
"Using the open-source e-publishing platform Connexions, Rice University Press is returning from a decade-long hiatus to explore models of peer-reviewed scholarship for the 21st century. The technology offers authors a way to use multimedia -- audio files, live hyperlinks or moving images -- to craft dynamic scholarly arguments, and to publish on-demand original works in fields of study that are increasingly constrained by print publishing.
" 'Rice University Press is using Rice's strength in technology to innovatively overcome increasingly common obstacles to publication of scholarly works,' Rice University President David Leebron said. 'The nation's first fully digital academic press provides not only a solution for scholars -- particularly those in the humanities -- who are limited by the dearth of university presses, but also a venue for publishing multimedia essays, articles, books and scholarly narratives.'
Charles Henry, Rice University vice provost, university librarian and publisher of Rice University Press during the startup phase, said, 'Our decision to revive Rice's press as a digital enterprise is based on both economics and on new ways of thinking about scholarly publishing. On the one hand, university presses are losing money at unprecedented rates, and technology offers us ways to decrease production costs and provide nearly ubiquitous delivery system, the Internet. We avoid costs associated with backlogs, large inventories and unsold physical volumes, and we greatly speed the editorial process.
" 'We don't have a precise figure for our startup costs yet, but it's safe to say our startup costs and annual operating expenses will be at least 10 times less than what we'd expect to pay if we were using a traditional publishing model,' Henry said.
"The digital press will operate just as a traditional press, up to a point. Manuscripts will be solicited, reviewed, edited and resubmitted for final approval by an editorial board of prominent scholars. But rather than waiting for months for a printer to make a bound book, Rice University Press's digital files will instead be run through Connexions for automatic formatting, indexing and population with high-resolution images, audio and video and Web links.
" 'We don't print anything,' Henry explained. 'It will go online as a Rice University Press publication in a matter of days and be available for sale as a digital book.' Users will be able to view the content online for free or purchase a copy of the book for download through the Rice University Press Web site. Alternatively, thanks to Connexions' partnership with on-demand printer QOOP, users will be able to order printed books if they want, in every style from softbound black-and-white on inexpensive paper to leather-bound full-color hardbacks on high-gloss paper.
" 'As with a traditional press, our publications will be peer-reviewed, professionally vetted and very high quality,' Henry said. 'But the choice to have a printed copy will be up to the customer.'
"Authors published by Rice University Press will retain the copyrights for their works, in accordance with Connexions' licensing agreement with Creative Commons. Additionally, because Connexions is open-source, authors will be able to update or amend their work, easily creating a revised edition of their book. W. Joseph King, executive director of Connexions and co-director of the Rice University Press project, said, 'Connexions' mission is to support open education in all forms, including the publication of original scholarly works. We believe that Connexions has the ability to change the university press at Rice and in general.'
"In the coming months, Rice University Press will name its board of directors and appoint an editorial board in one or two academic disciplines that are especially constrained by the current print model. Over time, Rice University Press will focus on:
"1. Putting out original scholarly work in fields particularly impacted by the high costs and distribution models of the printed book. One such field is art history, in which printing costs are exceptionally high. Over the years, many university presses have slashed the number of art history titles, severely limiting younger scholars' prospects of publication, Henry said. Rice University Press has identified art history as a field that would benefit immediately and therefore it will be the press's first area of major effort.
"2. Fostering new models of scholarship: With the rise of digital environments, scholars are increasingly attempting to write book-length studies that use new media -- images, video, audio and Web links -- as part of their arguments. Rice University Press will easily accommodate these new forms of scholarship, Henry said.
"3. Providing more affordable publishing for scholarly societies and centers: Often disciplinary societies and smaller centers, especially in the humanities, publish annual reports, reflections on their field of study or original research resulting from grants. For smaller organizations, the printing costs of these publications are prohibitive. Rice University Press will partner with organizations to provide more affordable publishing.
"4. Partnering with large university presses: In the wake of rising production costs and overhead, many university presses have closed or reduced the number of titles they publish, especially in the humanities and social sciences. As a result many peer-reviewed, high quality books are waiting on backlog. Rice University Press will work with selected university publishers to inexpensively publish approved works. Henry said two major university presses have already expressed an interest in working with Rice University Press to reduce backlogged titles. Rice University Press plans to partner with these and other presses to produce such works as dual publications.
" 'Technological innovations suffuse academia, but institutional innovation often seems more challenging. The initiative to resuscitate Rice University Press as a fully digital university press is thus doubly exciting,' said Steve Wheatley, vice president of the American Council of Learned Societies, an umbrella organization of 70 scholarly societies in the humanities and social sciences. 'It is particularly encouraging to note that the revived press will give special attention to scholarship that is born digital. Equally commendable -- and perhaps even more important -- is the commitment of the university to support this initiative at this crucial phase for scholarly publishing' " (http://media.rice.edu/media/NewsBot.asp?MODE=VIEW&ID=8654, accessed 05-23-2010).
♦ "Rice University Press ceased operations on September 30, 2010. Certain publications continue to be available on Connexions."
At Siggraph2006, held in Boston, Massachusetts, BioVisions, a scientific visualization program at Harvard’s Department of Molecular and Cellular Biology, and Xvivo, a Connecticut-based scientific animation company, introduced the three-minute molecular animation video, The Inner Life of the Cell.
The film depicted marauding white blood cells attacking infections in the body.
In August 2006 Google began introducing its web-based Google Apps productivity software.
Le Document à la lumière du numérique (The Document in the Digital Era) was published in print by a collaborating group of information researchers writing under the collective pseudonym Roger T. Pédauque. The surname of the pseudonym means "web-footed."
A feature of the Sony Reader PRS-500's electronic-paper display was that it consumed power only when a page was turned. Thus, theoretically, 7,500 pages could be read on the device with one battery charge.
The journal Nature announced that it was opening the peer review process to comments online in the form of a blog.
IBM, the largest patent holder in the U.S., announced that it "will publish its patent filings on the Web for public review as part of a new policy that the company hopes will be a model for others."
The start-up company Obvious, in San Francisco, launched the social networking and micro-blogging service Twitter, built around the prompt "What are you doing?". Twitter "allows its users to send and read other users' updates (otherwise known as tweets), which are text-based posts of up to 140 characters in length." This limit is under the 160-character limit of the SMS communication protocol for mobile phones.
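The gap between the two limits left room for a sender name when tweets traveled over SMS. A minimal sketch of that arithmetic; the "username: " delivery format shown here is an assumption for illustration, not Twitter's documented behavior:

```python
SMS_LIMIT = 160    # single-message limit of the GSM 7-bit SMS encoding
TWEET_LIMIT = 140  # Twitter's original per-tweet cap

def fits_in_one_sms(username, tweet):
    """Check that a tweet prefixed with 'username: ' (a hypothetical
    delivery format) still fits within a single 160-character SMS."""
    return len(username) + len(": ") + len(tweet) <= SMS_LIMIT
```

Under this sketch, a maximum-length 140-character tweet leaves 20 characters of headroom for the sender's name and separator, which is consistent with the 140-versus-160 relationship the entry describes.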
Between downloads on YouTube and on the Will it Blend? website, the advertising program, which featured the blending of many absurd items such as an iPhone (many of them listed and linked in the Wikipedia article on Will it Blend?), became one of the most successful Internet marketing campaigns, surpassing 100,000,000 hits by May 2009.
The Royal Society of London announced that The Royal Society Digital Journal Archive, dating back to 1665 and containing the full text and illustrations of more than 60,000 articles, was available online.
The Walters Art Museum reported through The New York Times that the Archimedes Palimpsest, the unique tenth century source for two treatises by Archimedes: The Method and Stomachion, and the unique source for the Greek text of On Floating Bodies, also contains ten pages of previously unknown speeches by Hyperides, "one of the foundational figures of Greek democracy," "illuminating some fascinating, time-shrouded insights into Athenian law and social history." The palimpsest includes parchment from seven texts, including two texts which remain to be identified.
This manuscript was purchased by a private collector at an auction at Christie's in New York on October 28, 1998. After a decade of scientific study all the Archimedes Palimpsest images were released to the public on Google Books on October 29, 2008. At the time this was the earliest text available on Google Books.
♦ Several videos, audio presentations and articles about the project are available at www.archimedespalimpsest.org
In November 2006 there were more than 100 million websites on the Internet; according to Netcraft.com, the precise count was 101,435,253. Between January and November 2006, 27.4 million sites were added to the web.
Google and various print newspapers, including The New York Times, announced that they would test a modified version of Google's AdWords program to place advertisements in print newspapers.
Google completed the purchase of YouTube for $1.65 billion in Google stock.
"A consortium of seven newspaper chains representing 176 daily papers across the country is announcing a broad partnership with Yahoo to share content, advertising and technology . . . . In the first phase of the deal, the newspaper companies will begin posting their employment classified ads on Yahoo’s classified jobs site, HotJobs, and start using HotJobs technology to run their own online career ads.
"But the long-term goal of the alliance with Yahoo, according to one senior executive at a participating newspaper company, is to be able to have the content of these newspapers tagged and optimized for searching and indexing by Yahoo."
The Boston Globe reported that the Environmental Protection Agency (EPA) had begun closing its nationwide network of scientific libraries, effectively preventing EPA scientists and the public from accessing vast amounts of data and information on issues from toxicology to pollution. Several libraries were already dismantled, with their contents either destroyed or shipped to repositories where they were uncataloged and inaccessible.
"Anshe Chung [Real life: Ailin Graef] has become the first online personality to achieve a net worth exceeding one million US dollars from profits entirely earned inside a virtual world.
"Recently featured on the cover of Business Week Magazine, Anshe Chung is a resident in the virtual world Second Life. Inside Second Life, Anshe buys and develops virtual real-estate in an official currency, known as Linden Dollars, which is convertible to US Dollars. There is also a liquid market in virtual real estate, making it possible to assess the value of her total holdings using publicly available statistics.
"The fortune Anshe Chung commands in Second Life includes virtual real estate that is equivalent to 36 square kilometers of land – this property is supported by 550 servers or land "simulators". In addition to her virtual real estate holdings, Anshe has 'cash' holdings of several million Linden Dollars, several virtual shopping malls, virtual store chains, and she has established several virtual brands in Second Life. She also has significant virtual stock market investments in Second Life companies.
"Anshe Chung's achievement is all the more remarkable because the fortune was developed over a period of two and a half years from an initial investment of $9.95 for a Second Life account by Anshe's creator, Ailin Graef. Anshe/Ailin achieved her fortune by beginning with small scale purchases of virtual real estate which she then subdivided and developed with landscaping and themed architectural builds for rental and resale. Her operations have since grown to include the development and sale of properties for large scale real world corporations, and have led to a real life "spin off" corporation called Anshe Chung Studios, which develops immersive 3D environments for applications ranging from education to business conferencing and product prototyping.
"Ailin Graef was born and raised in Hubei, China, but is currently a citizen of Germany. She runs Anshe Chung Studios with her husband Guntram Graef, who serves as CEO of the company. Anshe Chung Studios has offices in Wuhan, China and is currently seeking to expand its workforce from 25 to 50" (http://www.anshechung.com/include/press/press_release251106.html, accessed 01-27-2010).
Ranking members of congressional committees wrote to Stephen Johnson, Administrator of the U.S. Environmental Protection Agency, demanding that the agency desist from destroying its libraries:
"Over the past 36 years, EPA's libraries have accumulated a vast and invaluable trove of public health and environmental information, including at least 504,000 books and reports, 3,500 journal titles, 25,000 maps, and 3.6 million information objects on microfilm, according to the report issued in 2004: Business Case for Information Services: EPA's Regional Libraries and Centers prepared for the Agency by Stratus Consulting. Each one of EPA's libraries also had information experts who helped EPA staff and the public access and use the Agency's library collection and information held in other library collections outside of the Agency. It now appears that EPA officials are dismantling what is likely one of our country's most comprehensive and accessible collections of environmental materials.
The press has reported on the concerns over the library reorganization plan voiced by EPA professional staff of the Office of Enforcement and Compliance Assurance (OECA), 16 local union Presidents representing EPA employees, and the American Library Association. In response to our request of September 19, 2006, (attached), the Government Accountability Office has initiated an investigation of EPA's plan to close its libraries. Eighteen Senators sent a letter on November 3, 2006, to leaders of the Senate Appropriations Committee asking them
to direct EPA "to restore and maintain public access and onsite library collections and services at EPA's headquarters, regional, laboratory and specialized program libraries while the Agency solicits and considers public input on its plan to drastically cut its library budget and services"
(attached). Yet, despite the lack of Congressional approval and the concerns expressed over this plan, your Agency continues to move forward with dismantling the EPA libraries. It is imperative that the valuable government information maintained by EPA's libraries
be preserved. We ask that you please confirm in writing by no later than Monday, December 4, 2006, that the destruction or disposition of all library holdings immediately ceased upon the Agency's receipt of this letter and that all records of library holdings and dispersed materials are being maintained."
In 2006 publishers in the U.S. sold 3.1 billion books. This was up just 0.5 percent from the 3.09 billion sold in 2005. Of the 3.1 billion, 263.4 million were religious books, then the fastest growing category in U.S. book publishing.
Julian Assange and others founded Wikileaks, a website, with no official headquarters, that published anonymous submissions and leaks of sensitive governmental, corporate, or religious documents, while attempting to preserve the anonymity and untraceability of its contributors.
Within one year of its foundation the site grew to 1,200,000 documents.
"The site states that it was 'founded by Chinese dissidents, journalists, mathematicians and startup company technologists, from the US, Taiwan, Europe, Australia and South Africa.' The creators of Wikileaks were unidentified as of January 2007, although it has been represented in public since January 2007 by non-anonymous speakers such as Julian Assange, who had described himself as a member of Wikileaks' advisory board and was later referred to as the 'founder of Wikileaks.' "
"Wikileaks describes itself as 'an uncensorable system for untraceable mass document leaking'. Wikileaks is hosted by PRQ, a Sweden-based company providing 'highly secure, no-questions-asked hosting services'. PRQ is said to have 'almost no information about its clientele and maintains few if any of its own logs'. PRQ is owned by Gottfrid Svartholm and Fredrik Neij who, through their involvement in The Pirate Bay, have significant experience in withstanding legal challenges from authorities. Being hosted by PRQ makes it difficult to take Wikileaks offline. Furthermore, 'Wikileaks maintains its own servers at undisclosed locations, keeps no logs and uses military-grade encryption to protect sources and other confidential information.' Such arrangements have been called 'bulletproof hosting' (Wikipedia article on Wikileaks, accessed 11-25-2009).
"WikiLeaks was originally launched as a user-editable wiki site, but has progressively moved towards a more traditional publication model, and no longer accepts either user comments or edits. The site is available on multiple online servers and different domain names following a number of denial-of-service attacks and its severance from different Domain Name System (DNS) providers" (Wikipedia article on Wikileaks, accessed 12-08-2010).
Yahoo and Reuters introduced programs to place photographs and videos of news events submitted by the public, including cell phone photos and videos, throughout Reuters.com and Yahoo's new service entitled YouWitnessNews. Reuters said that in 2007 it would also start to distribute some of the submissions to the thousands of print, online and broadcast media outlets that subscribed to its news service. Reuters also said that it hoped to develop a service devoted entirely to user-submitted photographs and video.
Time Magazine named "You" as the Person of the Year:
"The "Great Man" theory of history is usually attributed to the Scottish philosopher Thomas Carlyle, who wrote that 'the history of the world is but the biography of great men.' He believed that it is the few, the powerful and the famous who shape our collective destiny as a species. That theory took a serious beating this year.
"To be sure, there are individuals we could blame for the many painful and disturbing things that happened in 2006. The conflict in Iraq only got bloodier and more entrenched. A vicious skirmish erupted between Israel and Lebanon. A war dragged on in Sudan. A tin-pot dictator in North Korea got the Bomb, and the President of Iran wants to go nuclear too. Meanwhile nobody fixed global warming, and Sony didn't make enough PlayStation3s.
"But look at 2006 through a different lens and you'll see another story, one that isn't about conflict or great men. It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes."
Shortly after the foundation of Wikileaks, Julian Assange published a kind of Wikileaks manifesto on the Internet:
"The non linear effects of leaks on unjust systems of governance
"You may want to read The Road to Hanoi or Conspiracy as Governance [second essay following]; an obscure motivational document, almost useless in light of its decontextualization and perhaps even then. But if you read this latter document while thinking about how different structures of power are differentially affected by leaks (the defection of the inner to the outer) its motivations may become clearer.
"The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie. This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption.
"Hence in a world where leaking is easy, secretive or unjust systems are nonlinearly hit relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance.
"Only revealed injustice can be answered; for man to do anything intelligent he has to know what's actually going on" (http://cryptome.org/0002/ja-conspiracies.pdf, accessed 12-08-2010).
By 2007 the Universal Digital Library at Carnegie Mellon University and partners had scanned over 1,000,000 books, surpassing its original goal set in 2001.
In 2007 it was estimated that more than 4.7 billion Bibles (in whole or in part) had been printed since the publication of the Gutenberg Bible in 1455-56.
4.7 billion was more than five times the estimated number of 900 million printed copies of Quotations from Chairman Mao Zedong, the enormous distribution of which occurred in the second half of the 20th century because it was "an unofficial requirement for every Chinese citizen to own, read and carry it at all times under the latter half of Mao's rule, and especially during the Cultural Revolution."
The Universal Digital Library estimated that there were "no more than 10,000,000 unique book and document editions before the year 1900, and perhaps 300 million since the beginning of recorded history."
According to the Book Industry Study Group, in 2007 3,200,000,000 books were sold in the United States. According to the Association of American Publishers, net book sales in the U.S. were $25,000,000,000, an increase of 2.5 percent over 2006.
According to Bowker, as cited by Robert Darnton in Publishers Weekly, 976,000 new book titles were published worldwide in 2007. This represented a significant increase over the 859,000 published in 2003, and the 700,000 published in 1998.
David Ferrucci, leader of the Semantic Analysis and Integration Department at IBM’s Watson Research Center, Yorktown Heights, New York, and his team began development of Watson, a special-purpose computer system designed to push the envelope on deep question and answering, deep analytics, and the computer's understanding of natural language.
The oldest continuously published newspaper in the world, Post- och Inrikes Tidningar (Post and Domestic Newspaper) of Stockholm, the government newspaper and gazette of Sweden, which had been published on paper without interruption since 1645, switched to web publication exclusively.
"Hitachi Global Storage Technologies [San Jose, California] is first to the mat with an announcement of a 1-terabyte hard disk drive. Industry analysts widely expected a 1TB drive to ship sometime in 2007; Hitachi grabbed a head start on the competition by announcing its drive today, just before the largest U.S. consumer electronics show starts next week.
"According to Hitachi, the drive ships in the first quarter of 2007, and will cost $399--less than the price of two individual 500GB hard drives today. The drive, called the Deskstar 7K1000, will be shown this weekend in Las Vegas at the 2007 International CES, also known as the Consumer Electronics Show, as well as at the Storage Visions storage conference" (http://www.pcworld.com/article/128400/hitachi_introduces_1terabyte_hard_drive.html, accessed 06-04-2009).
"Information is expanding 10 times faster than any product on this planet - manufactured or natural. According to Hal Varian, an economist at UC Berkeley and a consultant to Google, worldwide information is increasing at 66 percent per year - approaching the rate of Moore's Law - while the most prolific manufactured stuff - paper, let’s say, or steel - averages only as much as 7 percent annually."
In the February 2007 issue of Wired James Gleick wrote:
"Is the universe actually made of information? Humans have talked about atoms since the time of the ancients, and ever-smaller fundamental particles of matter followed. But no one even conceived of bits until the middle of the 20th century. The bit is a fundamental particle, too, but of different stuff altogether: information. It is not just tiny, it is abstract - a flip-flop, a yes-or-no. Now that scientists are finally starting to understand information, they wonder whether it’s more fundamental than matter itself. Perhaps the bit is the irreducible kernel of existence; if so, we have entered the information age in more ways than one."
On his main website, barackobama.com, presidential candidate Barack Obama launched my.barackobama.com. This social networking site built an online community of over a million members before the presidential election.
"Following their march from standard processors to dual-core and quad-core designs in 2006, Intel Corp. researchers have built an 80-core chip that performs more than a trillion floating-point operations per second (TFLOPS) while using less electricity than a modern desktop PC chip ... 80 cores [on] a 275-square-millimeter, fingernail-size chip ... Intel ... [is] using the chip to explore new forms of tera-scale computing, in which future users could process terabytes of data on their desktops to perform real-time speech recognition, conduct multimedia data mining, play photorealistic games and interact with artificial intelligence.
"Shrunk onto a single chip, that power would allow average consumers to use their PCs in new ways. They could use improved search functions on the vast amounts of digital media stored on home desktops, searching large photo archives for specific attributes such as all the shots where a certain person is smiling, or where that person is posing with a friend."
A technology developed at Keio University, Tokyo, Japan, carried with it the possibility that bacterial DNA could be used as a medium for storing digital information long-term—potentially thousands of years.
"Keio University Institute for Advanced Biosciences and Keio University Shonan Fujisawa Campus announced the development of the new technology, which creates an artificial DNA that carries up to more than 100 bits of data within the genome sequence, according to the JCN Newswire. The universities said they successfully encoded "e= mc2 1905!" -- Einstein's theory of relativity and the year he enunciated it -- on the common soil bacteria, Bacillius subtilis."
Using techniques of computational bibliography, in collaboration with Paul Needham at Princeton's Scheide Library, Blaise Agüera y Arcas also did significant original research on the technology of the earliest printing from movable type.
An article in The New York Times entitled "History Digitized (and Abridged)" pointed out that economic and copyright considerations required the digitization of library and archival collections to be very selective. According to the article, the U.S. National Archives estimated that, at its current rate of digitization, converting its 9 billion paper text records to digital form could take 1,800 years.
The Walters Art Museum reported through BBC News that through the technique of multispectral imaging a previously unknown commentary on Aristotle was discovered in the Archimedes Palimpsest, which was purchased by a private collector at Christie's in New York on October 28, 1998.
"The crucial part of Schaeffer's computer proof involved playing out every possible endgame involving fewer than 10 pieces. The result is an endgame database of 39 trillion positions. By contrast, there are only 19 different opening moves in draughts. Schaeffer's proof shows that each of these leads to a draw in the endgame database, providing neither player makes a mistake.
"Schaeffer was able to get his result by searching only a subset of board positions rather than all of them, since some of them can be considered equivalent. He carried out a mere 1014 calculations to complete the proof in under two decades. 'This pushes the envelope as far as artificial intelligence is concerned,' he says.
"At its peak, Schaeffer had 200 desktop computers working on the problem full time, although in later years he reduced this to 50 or so. 'The problem is such that if I made a mistake 10 years ago, all the work from then on would be wrong,' says Schaeffer. 'So I've been fanatical about checking for errors.' " (http://www.newscientist.com/article/dn12296-checkers-solved-after-years-of-number-crunching.html, accessed 01-24-2010).
Based on this proof, Schaeffer's checkers-playing program Chinook can no longer be beaten. The best an opponent can hope for is a draw.
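Schaeffer's proof is far beyond any snippet, but the underlying idea, exhaustively assigning every reachable position a proven game value and storing the results in a database, can be illustrated on a toy subtraction game (take 1-3 stones; taking the last stone wins). The game here is hypothetical and chosen only because it is small enough to solve completely.

```python
from functools import lru_cache

# Illustrative only: exhaustive game solving on a toy game, the same
# principle Schaeffer applied to checkers at vastly larger scale with
# a 39-trillion-position endgame database.

@lru_cache(maxsize=None)
def is_win(stones: int) -> bool:
    """True if the player to move can force a win from this position."""
    return any(not is_win(stones - take)
               for take in (1, 2, 3) if take <= stones)

# The "database": the proven value of every position up to some size.
database = {n: is_win(n) for n in range(21)}

# Known result for this game: multiples of 4 are losses for the mover.
assert all(database[n] == (n % 4 != 0) for n in database)
```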
On May 25, 2007 Google introduced the Street View feature of Google Maps in the United States. It provided panoramic views from positions along many streets, and its coverage eventually became extremely comprehensive, extending even to the very small road on which I live in Novato, California.
On April 16, 2008, Google fully integrated Street View into Google Earth 4.3.
In response to complaints about privacy, on May 12, 2008 Google announced in its "latlong" blog that it had introduced face-blurring technology for its images of Manhattan. It eventually applied the technology to all locations.
In a real-world announcement, Carl Bildt, Foreign Minister of Sweden, opened the Second House of Sweden, an embassy in the virtual world of Second Life. A replica of the Swedish Embassy to the United States, this was the first embassy of a real country in a virtual world.
On May 31, 2007 the genome of James D. Watson, co-discoverer of the double-helical structure of DNA, was sequenced and presented to Watson. It was the second individual human genome to be sequenced; the first was that of J. Craig Venter, whose DNA was the primary source for Celera Genomics' draft human genome sequence, a first working draft of which was published in February 2001.
On June 29, 2007 Apple released the iPhone, an Internet-connected multimedia smartphone with a virtual keypad and keyboard in place of physical keys.
"our vision of a voice-driven ecosystem parallel to that of the WWW. WWTW is a network of interconnected voice sites that are voice driven applications created by users and hosted in the network. It has the potential to enable the underprivileged population to become a part of the next generation converged networked world. We present a whole gamut of existing technology enablers for our vision as well as present research directions and open challenges that need to be solved to not only realize a WWTW but also to enable the two Webs to cross leverage each other."
England's Coventry University developed an MSc course in clinical management that held problem-based learning groups for students in Second Life. The course trained students in managing healthcare facilities, and was the first healthcare course to use Second Life as a learning platform.
"An innovative tool to analyse and identify computer file formats has won the 2007 Digital Preservation Award. DROID, developed by The National Archives in London, can examine any mystery file and identify its format. The tool works by gathering clues from the internal 'signatures' hidden inside every computer file, as well as more familiar elements such as the filename extension (.jpg, for example), to generate a highly accurate 'guess' about the software that will be needed to read the file. . . .
"Now, by using DROID and its big brother, the unique file format database known as PRONOM, experts at the National Archives are well on their way to cracking the problem. Once DROID has labelled a mystery file, PRONOM's extensive catalogue of software tools can advise curators on how best to preserve the file in a readable format. The database includes crucial information on software and hardware lifecycles, helping to avoid the obsolescence problem. And it will alert users if the program needed to read a file is no longer supported by manufacturers.
"PRONOM's system of identifiers has been adopted by the UK government and is the only nationally-recognised standard in its field."
In November 2007 The Watchtower had an average semi-monthly printing on paper of 28,578,000 copies in 161 languages. This may have been the largest and most linguistically diverse circulation printed on paper of any periodical at that time.
Jeff W. Lichtman and Joshua R. Sanes, both professors of Molecular & Cellular Biology in the Department of Neurobiology at Harvard Medical School, and colleagues, published "Transgenic strategies for combinatorial expression of fluorescent proteins in the nervous system," Nature 450 (7166): 56–62. doi:10.1038/nature06293. This described the visualization process they called "Brainbow."
"Detailed analysis of neuronal network architecture requires the development of new methods. Here we present strategies to visualize synaptic circuits by genetically labelling neurons with multiple, distinct colours. In Brainbow transgenes, Cre/lox recombination is used to create a stochastic choice of expression between three or more fluorescent proteins (XFPs). Integration of tandem Brainbow copies in transgenic mice yielded combinatorial XFP expression, and thus many colours, thereby providing a way to distinguish adjacent neurons and visualize other cellular interactions. As a demonstration, we reconstructed hundreds of neighbouring axons and multiple synaptic contacts in one small volume of a cerebellar lobe exhibiting approximately 90 colours. The expression in some lines also allowed us to map glial territories and follow glial cells and neurons over time in vivo. The ability of the Brainbow system to label uniquely many individual cells within a population may facilitate the analysis of neuronal circuitry on a large scale." (From the Nature abstract).
Historian Anthony Grafton of Princeton University published "Future Reading: Digitization and its Discontents" in The New Yorker. This was revised and reissued as a small book entitled Codex in Crisis (2008), and reprinted as the last chapter of Grafton's Worlds Made by Words: Scholarship and Community in the Modern West (2009).
On December 18, 2008 Grafton spoke about Codex in Crisis at Google, Mountain View, in the Authors@Google series. A video of this presentation was available on YouTube at http://www.google.com/cse?cx=002920929640144004653%3A7yibd0sz9ny&ie=UTF-8&q=codex+in+crisis&x=48&y=9.
Amazon.com introduced the Kindle. This unconventionally named ebook reader differed from other ebook readers because it incorporated a wireless service for purchasing and delivering electronic texts from Amazon.com without a computer. The six-inch electronic-paper screen was limited to grayscale at 167 ppi resolution. At its introduction 90,000 titles were available for download to the 10 oz. device. The first Kindle could store about 200 books.
Concurrently with the Kindle ebook reader, Amazon launched Kindle Direct Publishing for authors and publishers to publish their books directly to Kindle and Kindle Apps worldwide. This publishing platform was in open beta testing as of late 2007.
"Authors can upload documents in several formats for delivery via Whispernet and charge between $0.99 and $200.00 per download.
"In a December 5, 2009 interview with The New York Times, CEO Jeff Bezos revealed that Amazon.com keeps 65% of the revenue from all ebook sales for the Kindle. The remaining 35% is split between the book author and publisher. After numerous commentators observed that Apple's popular App Store offers 70% of royalties to the publisher, Amazon began a program that offers 70% royalties to Kindle publishers who agree to certain conditions.
"Other criticisms involve the business model behind Amazon's implementation and distribution of e-books. Amazon introduced a software application allowing Kindle books to be read on an iPhone or iPod Touch. Amazon soon followed with an application called "Kindle for PCs" that can be run on a Windows PC. Due to the book publisher's DRM policies, Amazon claims that there is no right of first sale with e-books. Amazon states they are licensed, not purchased; so unlike paper books, buyers do not actually own their e-books according to Amazon. This has however never been tested in the courts and the outcome of any action by Amazon is by no means certain. The law is in a state of flux in jurisdictions around the world " (Wikipedia article on Amazon Kindle, accessed 12-29-2011).
Changeling, an American historical drama film set in Los Angeles in 1928 and based on a true story, produced and directed by Clint Eastwood, written by J. Michael Straczynski, and starring Angelina Jolie and John Malkovich, was released in 2008 and had many features to recommend it. Rather than describing the plot in detail and spoiling the story, I will mention that the film is tangentially relevant to the topics covered in this database: the central figure played by Jolie works as a supervisor in a telephone exchange, then a manual operation. In the film the operation of the exchange appears to be accurately depicted.
"Later exchanges consisted of one to several hundred plug boards staffed by telephone operators. Each operator sat in front of a vertical panel containing banks of ¼-inch tip-ring-sleeve (5-conductor) jacks, each of which was the local termination of a subscriber's telephone line. In front of the jack panel lay a horizontal panel containing two rows of patch cords, each pair connected to a cord circuit. When a calling party lifted the receiver, a signal lamp near the jack would light. The operator would plug one of the cords (the "answering cord") into the subscriber's jack and switch her headset into the circuit to ask, "number please?" Depending upon the answer, the operator might plug the other cord of the pair (the "ringing cord") into the called party's local jack and start the ringing cycle, or plug into a trunk circuit to start what might be a long distance call handled by subsequent operators in another bank of boards or in another building miles away. In 1918, the average time to complete the connection for a long-distance call was 15 minutes. In the ringdown method, the originating operator called another intermediate operator who would call the called subscriber, or passed it on to another intermediate operator. This chain of intermediate operators could complete the call only if intermediate trunk lines were available between all the centers at the same time. In 1943 when military calls had priority, a cross-country US call might take as long as 2 hours to request and schedule in cities that used manual switchboards for toll calls" (Wikipedia article on Telephone exchange, accessed 04-26-2009).
"AdWords offers pay-per-click (PPC) advertising, and site-targeted advertising for both text and banner ads. The AdWords program includes local, national, and international distribution. Google's text advertisements are short, consisting of one title line and two content text lines. Image ads can be one of several different Interactive Advertising Bureau (IAB) standard sizes" (Wikipedia article on AdWords, accessed 06-09-2009).
On its 450th anniversary, the Bayerische Staatsbibliothek offered selected web services and highlights of its unique collections, as well as a communication forum for library users in Second Life.
"The virtual presence of the Bayerische Staatsbibliothek is a reproduction of the famous library building in Ludwigstrasse, Munich, which is almost true to the original. The floor plan and the outside facades are true to the scale of the original building that was erected from 1832 to 1843. On the inside of the building the historical staircase, the Fürstensaal (prince's hall), the Friedrich von Gärtner hall and the marble hall were reproduced in accordance with the originals by means of state-of-the-art 3D technology. The reproduction of the staircase is particularly impressive in that it is accurate in every detail.
"The Fürstensaal contains an exhibition of a number of valuable manuscripts and historical printed works from the collections of the Bayerische Staatsbibliothek, which can even be browsed virtually. Further virtual exhibits can be seen in the major reading room, which was also reproduced taking account of all details of the original room in Ludwigstrasse. A virtual exhibition of presentation boards informs about the eventful history of the Staatsbibliothek since its foundation in the year 1558. //However, the presence in Second Life primarily offers in-world access to the most frequently used web services of the Bayerische Staatsbibliothek: access to the online catalogue and the web site, a link to the virtual information service "question point" and the comprehensive offer of digital texts of the "Munich Digitisation Centre". Moreover, every avatar can directly access the "Bayerische Landesbibliothek Online" offering a broad variety of information and digital sources on Bavarian culture and history.
"Just like its role model in real life, the virtual Staatsbibliothek is intended to become a place of communication and interaction. Therefore the virtual inner courtyards offer a conference centre for virtual specialized and information events and a coffeehouse inviting visitors to interact casually. Regular in-world events are planned, which will introduce, among others, the multifaceted offers and services of the Bayerische Staatsbibliothek" (http://www.bsb-muenchen.de/Virtual-Services-in-Second-Lif.2264+M57d0acf4f16.0.html, accessed 01-03-2010)
At the Macworld Conference and Expo Apple introduced the MacBook Air, with a tapered design just 0.16 in. thick at the front, weighing 3 lb., and with an optional solid-state drive. Apple claimed that the MacBook Air was the thinnest notebook computer.
The MacBook Air was pitched as a laptop for frequent travelers.
"The oldest known oil painting, dating from 650 A.D., has been found in caves in Afghanistan's Bamiyan Valley, according to a team of Japanese, European and U.S. scientists.
"The discovery reverses a common perception that the oil painting, considered a typically Western art, originated in Europe, where the earliest examples date to the early 12th century A.D.
"Famous for its 1,500-year-old massive Buddha statues, which were destroyed by the Taliban in 2001, the Bamiyan Valley features several caves painted with Buddhist images.
"Damaged by the severe natural environment and Taliban dynamite, the cave murals have been restored and studied by the National Research Institute for Cultural Properties in Tokyo, as a UNESCO/Japanese Fund-in-Trust project.
"Since most of the paintings have been lost, looted or deteriorated, we are trying to conserve the intact portions and also try to understand the constituent materials and painting techniques," Yoko Taniguchi, a researcher at the National Research Institute for Cultural Properties in Tokyo, told Discovery News.
" 'It was during such analysis that we discovered oily and resinous components in a group of wall paintings.'
"Painted in the mid-7th century A.D., the murals have varying artistic influences and show scenes with knotty-haired Buddhas in vermilion robes sitting cross-legged amid palm leaves and mythical creatures.
"Most likely, the paintings are the work of artists who traveled on the Silk Road, the ancient trade route between China, across Central Asia's desert to the West" (http://dsc.discovery.com/news/2008/02/19/oldest-oil-painting.html, accessed 07-11-2009).
"The four-dimensional framework described by De Freitas and Martin (2006), plus the learning types described by Helmer (2007), as well as the different aspects of emergent narrative described by Murray (1997) have provided the basis for the design of these game-based learning activities for virtual patients under two different categories: context and learner specification, and narrative and modes of representation. Phase I of this project focused on the delivery of a virtual patient in the area of Respiratory Medicine following a game-based learning model in Second Life."
A video of Phase 1 of this project was available on YouTube.
Statistician Nate Silver correctly predicted on March 7, 2008, roughly eight months before the election, that Barack Obama would be elected President of the United States.
"The U.S. Department of Homeland Security (DHS) is conducting the largest cyber security exercise ever organized. Cyber Storm II is being held from March 10-14 in Washington, D.C. and brings together participants from federal, state and local governments, the private sector, and the international community.
"Cyber Storm II is the second in a series of congressionally mandated exercises that will examine the nation’s cyber security preparedness and response capabilities. The exercise will simulate a coordinated cyber attack on information technology, communications, chemical, and transportation systems and assets.
" 'Securing cyberspace is vital to maintaining America’s strategic interests, public safety, and economic prosperity,' said Greg Garcia, Homeland Security Assistant Secretary for Cyber Security and Communications. 'Exercises like Cyber Storm II help to ensure that the public and private sectors are prepared for an effective response to attacks against our critical systems and networks.'
"Cyber Storm II will include 18 federal departments and agencies, nine states (Calif., Colo., Del., Ill., Mich., N.C., Pa., Texas and Va.), five countries (United States, Australia, Canada, New Zealand and the United Kingdom), and more than 40 private sector companies. They include ABB, Inc., Air Products, Cisco, Dow Chemical Company Inc., Harris Corporation, Juniper Networks, McAfee, Microsoft, NeuStar, PPG Industries, and Wachovia" (http://www.dhs.gov/xnews/releases/pr_1205180340404.shtm, accessed 08-09-2009).
By 2008 broadband technologies had spread to more than 90% of all residential Internet connections in the United States.
"When one considers a Nielsen’s study conducted in June 2008, which estimated the number of U.S. Internet users as 220,141,969, one can calculate that there are presently about 199 million people in the United States utilizing broadband technologies to surf the Web" (Wikipedia article on Internet marketing, accessed 05-10-2009).
Roadrunner, the first hybrid supercomputer, designed and built by scientists at IBM and Los Alamos National Laboratory from components originally designed for video game machines, became the first computer to go petascale: capable of performance in excess of one petaflop, or one quadrillion floating point operations per second. On May 25, 2008 Roadrunner sustained a performance of 1.026 petaflops, becoming the world's first TOP500 Linpack sustained 1.0 petaflops system.
"To put the performance of the machine in perspective, Thomas P. D’Agostino, the administrator of the National Nuclear Security Administration, said that if all six billion people on earth used hand calculators and performed calculations 24 hours a day and seven days a week, it would take them 46 years to do what the Roadrunner can in one day."
By June 2008 Apple's iTunes Store had reportedly sold five billion songs.
Encyclopaedia Britannica, first published in 3 volumes in 1771, announced in its blog that it would include wiki-style collaboration from users in its online edition. At Britannica,
“readers and users will also be invited into an online community where they can work and publish at Britannica’s site under their own names.”
The core encyclopedia itself
"will continue to be edited according to the most rigorous standards and will bear the imprimatur ‘Britannica Checked’ to distinguish it from material on the site for which Britannica editors are not responsible.”
Francis W. M. R. Schwarze of the Section of Wood Protection and Biotechnology, Wood Laboratory, Swiss Federal Laboratories for Materials Testing and Research (Empa), St. Gallen, with Melanie Spycher and Siegfried Fink, published "Superior wood for violins – wood decay fungi as a substitute for cold climate," New Phytologist 179 (2008) 1095-1104.
"• Violins produced by Antonio Stradivari during the late 17th and early 18th centuries are reputed to have superior tonal qualities. Dendrochronological studies show that Stradivari used Norway spruce that had grown mostly during the Maunder Minimum, a period of reduced solar activity when relatively low temperatures caused trees to lay down wood with narrow annual rings, resulting in a high modulus of elasticity and low density.
"• The main objective was to determine whether wood can be processed using selected decay fungi so that it becomes acoustically similar to the wood of trees that have grown in a cold climate (i.e. reduced density and unchanged modulus of elasticity).
"• This was investigated by incubating resonance wood specimens of Norway spruce (Picea abies) and sycamore (Acer pseudoplatanus) with fungal species that can reduce wood density, but lack the ability to degrade the compound middle lamellae, at least in the earlier stages of decay.
"• Microscopic assessment of the incubated specimens and measurement of five physical properties (density, modulus of elasticity, speed of sound, radiation ratio, and the damping factor) using resonance frequency revealed that in the wood of both species there was a reduction in density, accompanied by relatively little change in the speed of sound. Thus, radiation ratio was increased from 'poor' to 'good', on a par with 'superior' resonance wood grown in a cold climate."
According to Internet World Stats, in June 2008 1,463,632,361 people used the Internet, out of a total world population of 6,676,120,288.
Google announced in its blog that it was indexing over one trillion (1,000,000,000,000) unique URLs.
Apple opened its online iTunes App Store. At launch it contained 522 Apps for the iPhone, including 135 free programs.
Petr Sojka of the Department of Computer Graphics and Design, Faculty of Informatics, Masaryk University, Czech Republic, organized DML 2008: Towards a Digital Mathematics Library, the first conference of its kind, held at the University of Birmingham as part of the Conferences on Intelligent Computer Mathematics (CICM) and Mathematical Knowledge Management (MKM).
"Mathematicians dream of a digital archive containing all peer-reviewed mathematical literature ever published, properly linked and validated/verified. It is estimated that the entire corpus of mathematical knowledge published over the centuries does not exceed 100,000,000 pages, an amount easily manageable by current information technologies.
"The workshop's objectives are to formulate the strategy and goals of a global mathematical digital library and to summarize the current successes and failures of ongoing technologies and related projects, asking such questions as:
"* What technologies, standards, algorithms and formats should be used and what metadata should be shared?
"* What business models are suitable for publishers of mathematical literature, authors and funders of their projects and institutions?
"* Is there a model of sustainable, interoperable, and extensible mathematical library that mathematicians can use in their everyday work?
"* What is the best practice for
"o retrodigitized mathematics (from images via OCR to MathML and/or TeX);
"o retro-born-digital mathematics (from existing electronic copy in DVI, PS or PDF to MathML and/or TeX);
"o born-digital mathematics (how to make needed metadata and file formats available as a side effect of publishing workflow [CEDRAM model])?"
The United States District Court, District of Massachusetts, in Boston indicted Albert Gonzalez, a/k/a cumbajohny, a/k/a cj, a/k/a UIN 20167996, a/k/a UIN 476747, a/k/a soupnazi, a/k/a segvec, a/k/a klngchilli, a/k/a stanozololz, for masterminding a crime ring that used malware to steal and sell more than 170,000,000 credit card and ATM numbers from retail stores from 2005 to 2007.
"On August 28, 2009, his [Gonzalez's] attorney filed papers with the United States District Court for the District of Massachusetts in Boston indicating that he would plead guilty to all 19 charges in the U.S. v. Albert Gonzalez, 08-CR-10223, case (the TJ Maxx case). According to reports this plea bargain would "resolve" issues with the New York case of U.S. v. Yastremskiy, 08-CR-00160 in United States District Court for the Eastern District of New York (the Dave and Busters case).
"Gonzalez could serve a term of 15 years to 25 years. He would forfeit more than $1.65 million, a condominium in Miami, a blue 2006 BMW 330i automobile, IBM and Toshiba laptop computers, a Glock 27 firearm, a Nokia cell phone, a Tiffany diamond ring and three Rolex watches. "
"His sentence would run concurrent with whatever comes out of the case in the United States District Court for the District of New Jersey (meaning that he would serve the longest of the sentences he receives)" (Wikipedia article on Albert Gonzalez, accessed 01-18-2010).
On March 26, 2010 U.S. District Court Judge Douglas P. Woodlock sentenced Gonzalez to twenty years in prison, with three twenty-year sentences running concurrently.
"The sentence imposed by U.S. District Court Judge Douglas P. Woodlock was for Gonzalez's role in a hacking ring that broke into computer networks of Heartland Payment Systems, which processed credit and debit card transactions for Visa and American Express, Hannaford Supermarkets and 7-Eleven. The sentence is actually 20 years and one day, owing to the need to deal with peculiarities in sentencing statutes, because Woodlock had to take into account that Gonzalez was on pretrial release for an unrelated crime when he took up with the international network of hackers responsible for the security breaches. He was at the time supposed to be serving as an informant for the U.S. Secret Service, but he double-crossed the agency, supplying a co-conspirator with information obtained as part of those investigations" (http://www.sfgate.com/cgi-bin/article.cgi?f=/g/a/2010/03/26/urnidgns852573C400693880002576EF004839D0.DTL, accessed 03-27-2010).
According to a Netcraft survey in September 2008 there were 181,277,835 active websites on the Internet.
Craigslist, the leading classified advertising service, provided free local classifieds and forums for more than 550 cities in over 50 countries, generating more than 12 billion page views per month and serving more than 50 million people each month. Craigslist users self-published more than 30 million new classified ads each month and more than 2 million new job listings each month. Each month craigslist also hosted more than 100 million user postings in more than 100 topical forums. All of this it did with only 25 employees.
Because craigslist did not charge for classified advertising it replaced a large portion of the classified advertising that historically was placed in print newspapers. By doing so it substantially reduced the significant revenue that print newspapers historically generated from classified advertising. This contributed to an overall reduction of profits for many print newspapers. Similarly, craigslist's policy of charging below-market rates for job listings impacted that traditional source of newspaper revenue, and impacted profits at physical employment agencies, and the more expensive online employment agencies.
On September 23, 2008 T-Mobile, headquartered in Bonn, Germany, announced the first cell phone powered by the Android operating system, developed by Google in association with the Open Handset Alliance.
The Optical Society of America (OSA) launched Interactive Science Publishing (ISP):
" 'ISP' represents a new direction for OSA publications. The ISP articles, which appear in OSA journals, link out to large 2D and 3D datasets—such as a CT scan of the human chest—that can be viewed interactively with special software developed by OSA in cooperation with Kitware, Inc., and the National Library of Medicine."
Gordon Cheers of Millennium House, North Narrabeen, Australia, published a world atlas called Earth. The World Atlas. Containing 576 pages with 154 maps and 800 photographs, the volume measured 610 x 469 millimeters and weighed over 30 kilos. The publishers described it as the largest atlas ever published as a printed book.
"The book also includes four monster-sized gatefolds which, unfurled, measure six x four feet (1.82 x 1.21 meters) and reveal pinpoint sharp satellite images including shots of the earth and sky at night" (http://www.cnn.com/2008/TECH/science/10/16/earth.atlas/index.html#cnnSTCText, accessed 12-05-2008).
You could take a virtual tour of a few pages of the atlas on the Millennium House website at http://www.millenniumhouse.com.au/title-earth.html, accessed 10-2009.
The book was offered for sale in two versions: "Royal Blue," limited to 2000 copies, and available in bookstores, and "Imperial Gold," limited to 1000 copies and for sale only by Millennium House. In October 2009 Amazon.com offered a copy of an unspecified version for about $7200 plus $3.99 shipping and handling. There was also a regular trade edition, in a 325 x 250 mm format, called Earth Condensed.
When I revisited the Millennium House website in March 2012 I noticed that the publishers had surpassed their previous size record by producing the Platinum edition of their atlas in an enormous 6 foot x 4.5 foot format (1.8m x 1.4m), in an edition limited to 31 copies at a price of $100,000 USD per copy.
"Once in a lifetime, the opportunity comes along to acquire something truly exquisite and unique—a piece of history, a rare collectible, a masterpiece... EARTH Platinum Edition is such an acquisition. With only 31 individually numbered copies of this immense, limited edition atlas available, this beautifully presented book will be sought after by fine institutions and discerning collectors. Superb cartography is displayed on the massive pages when opened: each spread measures a breathtaking 6 feet x 9 feet (1.8m x 2.7m), presenting an unsurpassed view of the world. . . ." (http://www.millenniumhouse.com.au/title-earth-plat.html, accessed 03-24-2012).
Though I was unsure whether the original 2008 version of Earth. The World Atlas was, as the publishers claimed, "the largest atlas ever published as a printed book," we can safely say that the enormous Platinum edition knocks out any previous competition in the size category.
Thirteen universities of the Committee on Institutional Cooperation and the University of California founded the HathiTrust, a very large scale collaborative repository of digital content from research libraries, including content digitized via the Google Books project and Internet Archive digitization initiatives, as well as content digitized locally by member libraries. The name came from the Hindi word for elephant, as in "an elephant never forgets."
♦ As of January 2011 over 50 academic research libraries were members of the HathiTrust. Its website published the following statistics:
7,909,950 total volumes, 4,057,969 book titles, 189,013 serial titles, 2,768,482,500 pages, 355 terabytes, 94 miles, 6,427 tons, 1,972,865 volumes (~25% of total) in the public domain.
♦ In March the HathiTrust website published the following statistics:
10,109,695 total volumes, 5,371,919 book titles, 266,508 serial titles, 3,538,393,250 pages, 453 terabytes, 120 miles, 8,214 tons, 2,802,045 volumes (~28% of total) in the public domain.
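The public-domain percentages quoted in the statistics follow directly from the volume counts; a quick check in Python (figures as published):

```python
# HathiTrust volume counts as published on its website.
stats_2011 = {"total": 7_909_950, "public_domain": 1_972_865}    # January 2011
stats_later = {"total": 10_109_695, "public_domain": 2_802_045}  # March update

def pd_share(s):
    """Percentage of total volumes in the public domain."""
    return 100 * s["public_domain"] / s["total"]

print(f"{pd_share(stats_2011):.1f}%")   # ~25% of total
print(f"{pd_share(stats_later):.1f}%")  # ~28% of total
```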
"This morning, the Obama-Biden campaign announced that it has launched Facebook Connect integration at My.BarackObama.com, the grassroots organizing social network set up by the Obama campaign many months ago. The integration will allow users to find their Facebook friends who are also on the site, and will automatically publish users’ activity on the site (like signing up for a campaign event or to make phone calls) on their Facebook wall feed. In some ways it comes as no surprise that the Obama campaign would launch Facebook Connect support early on, as Facebook co-founder Chris Hughes now runs many of Obama’s social media efforts. It will be interesting to see how much of an impact the integration will have in the final 2 weeks of the campaign season, and potentially beyond" (InsideFacebook.com).
Apple's iTunes App Store reported on October 21, 2008 that it had recorded 200,000,000 downloads since its opening on July 10, 2008. By this time the iTunes App Store had 5,500 apps available for purchase.
The conversion of the From Gutenberg to the Internet timeline, begun in 2005, from its old format to this interactive database format was completed on October 24, 2008. Reflecting its coverage of the history of information since the beginning of records, I renamed it From Cave-Paintings to the Internet.
By the end of the conversion there were 1535 timeline entries, nearly all of which had one or more hyperlinks to reference sources. There were also more than sixty themes by which the timeline could be searched. Timeline items were indexed by up to six themes.
In the process of converting from the old list format to the new interactive database I checked all hyperlinks, corrected mistakes, added new hyperlinks, and added numerous new entries.
The database remained a work in progress.
"There are more than 75,000 active contributors working on more than 10,000,000 articles in more than 250 languages. As of today, there are 2,603,373 articles in English; every day hundreds of thousands of visitors from around the world make tens of thousands of edits and create thousands of new articles to enhance the knowledge held by the Wikipedia encyclopedia."
After 100 years of publishing in print, The Christian Science Monitor announced in Boston that in April 2009 it would become "the first newspaper with a national audience to shift from a daily print format to an online publication that is updated continuously each day.
"The changes at the Monitor will include enhancing the content on CSMonitor.com, starting weekly print and daily e-mail editions, and discontinuing the current daily print format."
The Authors Guild, New York, the Association of American Publishers (AAP) Washington, D.C., and New York, and Google announced a groundbreaking settlement agreement "on behalf of a broad class of authors and publishers worldwide that would expand online access to millions of in-copyright books and other written materials in the U.S. from the collections of a number of major U.S. libraries participating in Google Book Search. The agreement, reached after two years of negotiations, would resolve a class-action lawsuit brought by book authors and the Authors Guild, as well as a separate lawsuit filed by five large publishers as representatives of the AAP’s membership. The class action is subject to approval by the U.S. District Court for the Southern District of New York.
"If approved by the court, the agreement would provide:
Raphael's masterpiece, Madonna of the Goldfinch, which survived the collapse of a palace and more than four centuries of decay, emerged from a ten-year restoration and was returned to the Uffizi gallery with a strengthened panel and its colors restored to their original radiance.
"Raphael painted this work around 1505 for the wedding of his friend Lorenzo Nasi, a rich merchant in Florence. When Nasi’s palace collapsed in 1548, the painting was shredded into 17 pieces. The work was first put together with pieces of wood and long nails. The work later developed a yellowish opaque color. Restorers feared touching it because it was very fragile."
"The painting features a seated Mary with John the Baptist passing on a goldfinch to Jesus as a forewarning of his violent death. The bird has been associated in art with Christ's crucifixion.
"The restoration work began in 1999 using X-rays, microscopes, and lasers to find and seal the ancient fractures."
Apart from the historic election of Barack Obama, the first African American President of the United States, from the standpoint of the history of information and media, one element of this election and the campaign that preceded it was the blending of its coverage by broadcast media and the rapidly evolving interactive media on the Internet. Television networks repeatedly referred viewers to their websites for interactive news stories and additional information. While we watched the election on television or listened to radio we received information in emails, from websites, and from blogging and microblogging sites like Twitter. Within minutes after the election was decided I received an email from the Obama campaign signed by Barack Obama. Online newspapers updated election results in real time. Perhaps most remarkably, even the Wikipedia article on the United States presidential election 2008 was updated in real time on the web as election results were available. This I learned from reading a blog in The New York Times online—an online newspaper blogging about an article in an online encyclopedia. From the standpoint of the history of media this represents a blurring or blending of the historic distinctions that evolved over centuries between news media writing about the moment, and traditionally more static works of reference such as encyclopedias.
An email from firstname.lastname@example.org received 11-04-08 8:18PM PST, 18 minutes after polls closed on the West coast and news media computers declared an Obama victory. Presumably this email was sent to the millions of people who donated to Obama's campaign:
I'm about to head to Grant Park to talk to everyone gathered there, but I wanted to write to you first.
We just made history.
And I don't want you to forget how we did it.
You made history every single day during this campaign -- every day you knocked on doors, made a donation, or talked to your family, friends, and neighbors about why you believe it's time for change.
I want to thank all of you who gave your time, talent, and passion to this campaign.
We have a lot of work to do to get our country back on track, and I'll be in touch soon about what comes next.
But I want to be very clear about one thing...
All of this happened because of you.
The day after the presidential election President-Elect Barack Obama launched the website Change.gov to communicate details of the transition to the presidency.
Timothy J. Ley and numerous collaborators from different countries published "DNA sequencing of a cytogenetically normal acute myeloid leukaemia genome" in the journal Nature.
This was the first time that researchers decoded all the genes of a person with cancer and found a set of mutations that might have caused the disease or aided its progression. The New York Times online reported:
"Using cells donated by a woman in her 50s who died of leukemia, the scientists sequenced all the DNA from her cancer cells and compared it to the DNA from her own normal, healthy skin cells. Then they zeroed in on 10 mutations that occurred only in the cancer cells, apparently spurring abnormal growth, preventing the cells from suppressing that growth and enabling them to fight off chemotherapy.
"The findings will not help patients immediately, but researchers say they could lead to new therapies and would almost certainly help doctors make better choices among existing treatments, based on a more detailed genetic picture of each patient’s cancer. Though the research involved leukemia, the same techniques can also be used to study other cancers."
Google.org unveiled Google Flu Trends, using aggregated Google search data to estimate flu activity up to two weeks faster than traditional flu surveillance systems.
"Each week, millions of users around the world search for online health information. As you might expect, there are more flu-related searches during flu season, more allergy-related searches during allergy season, and more sunburn-related searches during the summer. You can explore all of these phenomena using Google Trends. But can search query trends provide an accurate, reliable model of real-world phenomena?
"We have found a close relationship between how many people search for flu-related topics and how many people actually have flu symptoms. Of course, not every person who searches for "flu" is actually sick, but a pattern emerges when all the flu-related search queries from each state and region are added together. We compared our query counts with data from a surveillance system managed by the U.S. Centers for Disease Control and Prevention (CDC) and discovered that some search queries tend to be popular exactly when flu season is happening. By counting how often we see these search queries, we can estimate how much flu is circulating in various regions of the United States.
"During the 2007-2008 flu season, an early version of Google Flu Trends was used to share results each week with the Epidemiology and Prevention Branch of the Influenza Division at CDC. Across each of the nine surveillance regions of the United States, we were able to accurately estimate current flu levels one to two weeks faster than published CDC reports" (Google Flu Trends website).
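The approach Google describes amounts to correlating weekly query counts with CDC surveillance counts, with the query series leading by a week or two. A toy illustration with invented weekly figures (the numbers and the one-week lead are assumptions for illustration, not Google's data or model):

```python
# Hypothetical weekly counts: flu-related search queries vs. CDC
# influenza-like-illness (ILI) reports for one region. Here the CDC
# series simply trails the query series by one week, mimicking the
# reporting lag Google Flu Trends exploited.
searches = [120, 180, 260, 400, 520, 480, 350, 240, 160, 110]
cdc_ili  = [100, 120, 180, 260, 400, 520, 480, 350, 240, 160]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Same week: moderate correlation. Shifting the query series one week
# earlier aligns it with the CDC data, i.e. the queries anticipate it.
same_week = pearson(searches, cdc_ili)
lead_one = pearson(searches[:-1], cdc_ili[1:])
print(round(same_week, 3), round(lead_one, 3))
```

In this contrived example the one-week-lead correlation is (by construction) perfect; real query data is noisier, which is why Google aggregated many queries across regions.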
NASA and the Lawrence Livermore National Laboratory announced the first-ever pictures of extrasolar planets taken in the visible spectrum. The images were captured by the Gemini North and Keck telescopes on the Mauna Kea mountaintop in Hawaii.
"British and American researchers snapped the first ever visible-light pictures of three extrasolar planets orbiting the star HR8799. HR8799 is about 1.5 times the size of the sun, located 130 light-years away in the Pegasus constellation. Observers can probably see this star through binoculars, scientists said.
"To identify the planets, researchers compared images of the system, known to contain planets HR8799b, HR8799c, and HR8799d. In each image faint objects were detected, and by comparing images from over the years, it was confirmed that these were the planets in their expected positions and that they orbit their star in a counterclockwise direction.
"NASA's Hubble Space Telescope at about the same time picked up images of a fourth planet, somewhat unexpectedly. The new planet, Fomalhaut b, orbits the bright southern star Fomalhaut, part of the constellation Piscis Australis (Southern Fish), and is relatively massive -- about three times the size of Jupiter. The planet orbits 10.7 billion miles from its home star and is approximately 25 light-years from Earth" (quotations from Daily Tech, November 16, 2008).
The Getty Museum and website opened an exhibition entitled Tango with Cows: Book Art of the Russian Avant-Garde 1910-1917.
On the website of the show you could turn the pages of virtual copies of the rare art books exhibited, view English translations, and hear readings of the text in Russian. (I last accessed the site on 01-27-2009.)
PC Magazine announced that the January 2009 issue (Volume 28, Issue 1) would be the last printed edition of this "venerable publication," after which it moved to an online only format.
"While most magazines make most of their money from print advertising, PC Magazine derives most of its profit from its Web site. More than 80 percent of the profit and about 70 percent of the revenue come from the digital business, Mr. Young said, and all of the writers and editors have been counted as part of the digital budget for two years." quoted from NY Times online 11-19-08).
Scientists from the Mammoth Genome Project at Pennsylvania State University, University Park, reported the genome-wide sequence of the woolly mammoth, an extinct species of elephant that was adapted to living in the cold environment of the northern hemisphere. This was the first sequence of the genome of an extinct animal, and it opened up the possibility of reconstructing species from the last ice age.
"They sequenced four billion DNA bases using next-generation DNA-sequencing instruments and a novel approach that reads ancient DNA highly efficiently."
" 'Previous studies on extinct organisms have generated only small amounts of data,' said Stephan C. Schuster, Penn State professor of biochemistry and molecular biology and the project's other leader. 'Our dataset is 100 times more extensive than any other published dataset for an extinct species, demonstrating that ancient DNA studies can be brought up to the same level as modern genome projects' " (quoted from Genetic Engineering and Biotechnology News, accessed 11-21-2008).
" 'By deciphering this genome we could, in theory, generate data that one day may help other researchers to bring the woolly mammoth back to life by inserting the uniquely mammoth DNA sequences into the genome of the modern-day elephant,' Stephan Schuster of Pennsylvania State University, who helped lead the research, said in a statement." (quoted from Reuters 11-19-2008, accessed 11-21-2008)
Europeana, the European digital library, museum and archive, was launched, giving users direct access to some 2 million digital objects, including film material, photos, paintings, sounds, maps, manuscripts, books, newspapers and archival papers.
"The digital content will be selected from that which is already digitised and available in Europe's museums, libraries, archives, and audio-visual collections. The prototype aims to have representative content from all four of these cultural heritage domains, and also to have a broad range of content from across Europe."
"We launched the Europeana.eu site on 20 November and huge use - 10 million hits an hour - meant it crashed. We are doing our best to reopen Europeana.eu in a more robust version" (Europeana website accessed 11-21-2008).
Note: the site re-opened on or before January 1, 2009 after quadrupling server capacity.
Atlantic Records, a unit of Warner Music Group, New York, reported that more than half its revenue came from downloads and ringtones sold over the Internet, rather than from CDs. It was the first major record label to report this shift.
Stanford University Libraries' HighWire Press announced over the DIGLIB newsgroup that it
"reached a significant milestone this week with the posting of the five millionth article on its e-Publishing platform. HighWire, a division of the Stanford University Libraries, provides technology and customized online services to 140 publishing partners ranging from independent non-profit societies and associations, to university presses and large commercial publishers.
"The milestone occurred while loading a substantial amount of journal backfiles on behalf of the American Medical Association. Bringing the HighWire total article count over the 5 million mark was an article dating from 1884, “Dermatitis Herpetiformis” by Louis A. Duhring, MD1, published in JAMA: The Journal of the American Medical Association. The JAMA & Archives Journals Backfiles Collection will ensure that 125 years of high quality medical research will be available online at the journals’ Web sites on the HighWire platform."
At this time HighWire Press
"hosts the largest repository of high impact, peer-reviewed content, with 1186 journals and 5,006,835 full text articles from over 140 scholarly publishers. HighWire-hosted publishers have collectively made 2,015,269 articles free. With our partner publishers we produce 71 of the 200 most-frequently-cited journals."
The day after the U.S. government officially declared the U.S. in recession, visitors to the New York Public Library viewed the book, Michelangelo: La Dotta Mano (The Learned Hand) published in Italy by FMR (Franco Maria Ricci), Milan, Italy, and donated to the library by the FMR Foundation.
Limited to 99 copies on hand-made paper, with a cover incorporating a marble relief, and offered at a list price of 100,000 Euros per copy, this may be the most expensive, and also possibly the most over-priced, single volume printed edition ever issued. According to The New York Times online, 33 copies were produced by this date, of which 20 were sold.
In January 2009 you could take a virtual tour of the book at FMR online. This site appeared to be down in March 2012.
The Center for Strategic and International Studies (CSIS) Commission on Cybersecurity for the 44th Presidency issued its report:
"The Commission’s three major findings are: cybersecurity is now one of the major national security problems facing the United States; decisions and actions must respect American values related to privacy and civil liberties; and only a comprehensive national security strategy that embraces both the domestic and international aspects of cybersecurity will improve the situation."
According to the New York Times online:
"A government and technology industry panel on cyber-security is recommending that the federal government end its reliance on passwords and enforce what the industry describes as “strong authentication.”
"Such an approach would probably mean that all government computer users would have to hold a device to gain access to a network computer or online service. The commission is also encouraging all nongovernmental commercial services use such a device.
“' We need to move away from passwords,' said Tom Kellermann, vice president for security awareness at Core Security Technologies and a member of the commission that created the report." (http://www.nytimes.com/2008/12/09/technology/09security.html?_r=1, accessed 12-09-2008).
"The Pulitzer Prizes in journalism, which honor the work of American newspapers appearing in print, have been expanded to include many text-based newspapers and news organizations that publish only on the Internet, the Pulitzer Prize Board announced today.
"The Board also has decided to allow entries made up entirely of online content to be submitted in all 14 Pulitzer journalism categories" (http://www.pulitzer.org/new_eligibility_rules, accessed 04-23-2010).
"A WOMAN in a deep sleep sent emails to friends asking them over for wine and caviar in what doctors believe is the first reported case of 'zzz-mailing' - using the internet while asleep.
"The case of the 44-year-old woman is reported by researchers from the University of Toledo in the latest edition of the medical journal Sleep Medicine.
"They said the woman went to bed about 10pm but got up two hours later and walked to her computer in the next room, Britain's Daily Mail newspaper reports.
"She turned it on, connected to the internet, and logged on before composing and sending three emails.
"Each was in a random mix of upper and lower cases, not well formatted and written in strange language, the researchers said.
"One read: "Come tomorrow and sort this hell hole out. Dinner and drinks, 4pm,. Bring wine and caviar only."
"Another said simply, "What the…".
"The new variation of sleepwalking has been described as "zzz-mailing".
"We believe writing an email after turning the computer on, connecting to the internet and remembering the password displayed by our patient is novel," the researchers said.
"To our knowledge this type of complex behaviour requiring co-ordinated movements has not been reported before in sleepwalking" (http://www.news.com.au/technology/story/0,28348,24802639-5014239,00.html, accessed 12-30-2008).
According to the New York Times online, 2.5 trillion text messages, generally limited to 160 characters per message, were sent worldwide in 2008, up 32% from 2007.
American educator Cathy N. Davidson of Duke University and David Theo Goldberg of the University of California at Irvine, with the support of the John D. and Catherine T. MacArthur Foundation grant-making initiative on Digital Media and Learning, published The Future of Learning Institutions in a Digital Age.
Readability was launched by Arc90 in New York. This service automatically stripped websites of advertising and other distractions, providing a customized reading view and a way to save articles for future reading.
According to the United States Postal Service’s Household Diary Study (HDS) for Fiscal Year (FY) 2009, U.S. households sent and received 145 billion of the 176.3 billion pieces of physical mail handled in FY 2009, not including international mail:
"Table E.1: Mail Received and Sent by Households (Billions of Pieces)

Classification                   Received   Sent
First-Class Mail                     53.1   18.3
Standard Regular Mail                58.2     —
Standard Nonprofit Mail              12.5     —
Periodicals                           6.0     —
Package & Shipping Services           1.8    0.5
Total                               131.6   18.8
Household to Household                5.4
Total Mail Received and Sent
  by Households                     145.0
FY 2009 RPW Total*                  176.3
Non-household Residual               31.3
Unaddressed                           1.6     —

Source: HDS Diary Sample, FY 2009. *Does not include international mail."
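The totals in the table reconcile once one notes that household-to-household pieces are counted in both the received and the sent columns, so they are subtracted once from the combined total. A quick arithmetic check of the published figures:

```python
# Figures (billions of pieces) from HDS Table E.1, FY 2009.
received = {
    "First-Class Mail": 53.1,
    "Standard Regular Mail": 58.2,
    "Standard Nonprofit Mail": 12.5,
    "Periodicals": 6.0,
    "Package & Shipping Services": 1.8,
}
sent = {"First-Class Mail": 18.3, "Package & Shipping Services": 0.5}
household_to_household = 5.4  # appears in both columns above

total_received = round(sum(received.values()), 1)  # 131.6
total_sent = round(sum(sent.values()), 1)          # 18.8

# Subtract the double-counted household-to-household mail once.
total_household = round(total_received + total_sent - household_to_household, 1)

non_household_residual = 31.3
rpw_total = round(total_household + non_household_residual, 1)

print(total_received, total_sent, total_household, rpw_total)
# 131.6 18.8 145.0 176.3
```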
Reflective of the economic realities of small circulation print magazines, Fine Books & Collections, notably a magazine about information in physical form, published in Durham, North Carolina, converted from bi-monthly print publication to monthly electronic publication, retaining in printed form only an annual Fine Books & Collections Compendium.
Having sold over a billion songs through the iTunes store in 2008, Apple announced that it reached agreements with record companies to remove anticopying restrictions on all tunes in the iTunes store. It also allowed record companies to set a range of prices for the songs.
"BEIJING, China (CNN) -- China surpassed the United States in 2008 as the world's top user of the Internet, according to a government-backed research group.
"The number of Web surfers in the country grew by nearly 42 percent to 298 million, according to the China Internet Network Information Center's January report. And there's plenty of room for growth, as only about 1 in every 4 Chinese has Internet access.
"The rapid growth in China's Internet use can be tied to its swift economic gains and the government's push for the construction of telephone and broadband lines in the country's vast rural areas, the report says.
"The Chinese government wants phone and broadband access in each village by 2010.
"Nearly 91 percent of China's Internet users are surfing the Web with a broadband connection -- an increase of 100 million from 2007. Mobile phone Internet users totaled 118 million by the end of 2008" (http://www.cnn.com/2009/TECH/01/14/china.internet/index.html, accessed 01-13-2010).
"The BBC is to put every one of the 200,000 oil paintings in public ownership in the UK on the internet as well as opening up the Arts Council's vast film archive online as part of a range of initiatives that it has pledged will give it a 'deeper commitment to arts and music'."
"A partnership with the Public Catalogue Foundation charity will see all the UK's publicly owned oil paintings – 80% of which are not on public display – placed on the internet by 2012. 'The BBC said it wanted to establish a new section of its bbc.co.uk website, called Your Paintings, where users could view and find information on the UK's national collection.
"The Public Catalogue Foundation, launched in 2003, is 30% of the way through cataloguing the UK's collection of oil paintings.
"In addition the BBC said it was talking to the Arts Council about giving the public free online access to its archive for the first time, including its wide-ranging film collection dating back to the 1950s" (quotations from http://www.guardian.co.uk/media/2009/jan/28/bbc-digitalmedia)
Ipoque, based in Leipzig, Germany, estimated that in February 2009 BitTorrent, based in San Francisco, California, was responsible for between 45% and 78% of all P2P traffic and between 27% and 55% of all Internet traffic, depending on geographical location.
Google launched Google Earth 5.0. Among the most significant features were Historical Imagery, Touring, and 3D Mars.
" ♦ Historical Imagery: Until today, Google Earth displayed only one image of a given place at a given time. With this new feature, you can now move back and forth in time to reveal imagery from years and even decades past, revealing changes over time. Try flying south of San Francisco in Google Earth and turning on the new time slider (click the "clock" icon in the toolbar) to witness the transformation of Silicon Valley from a farming community to the tech capital of the world over the past 50 years or so.
" ♦ Touring: One of the key challenges we have faced in developing Google Earth has been making it easier for people to tell stories. People have created wonderful layers to share with the world, but they have often asked for a way to guide others through them. The Touring feature makes it simple to create an easily sharable, narrated, fly-through tour just by clicking the record button and navigating through your tour destinations.
" ♦ 3D Mars: This is the latest stop in our virtual tour of the galaxies, made possible by a collaboration with NASA. By selecting "Mars" from the toolbar in Google Earth, you can access a 3D map of the Red Planet featuring the latest high-resolution imagery, 3D terrain, and annotations showing landing sites and lots of other interesting features" (Official Google Blog, http://googleblog.blogspot.com/2009/02/dive-into-new-google-earth.html, accessed 11-29-2010).
Cultural historian, book historian, and librarian Robert Darnton, of Harvard University, published the insightful and critical article, "Google and the Future of Books" in the New York Review of Books.
"How can we navigate through the information landscape that is only beginning to come into view? The question is more urgent than ever following the recent settlement between Google and the authors and publishers who were suing it for alleged breach of copyright. For the last four years, Google has been digitizing millions of books, including many covered by copyright, from the collections of major research libraries, and making the texts searchable online. The authors and publishers objected that digitizing constituted a violation of their copyrights. After lengthy negotiations, the plaintiffs and Google agreed on a settlement, which will have a profound effect on the way books reach readers for the foreseeable future. What will that future be?
"No one knows, because the settlement is so complex that it is difficult to perceive the legal and economic contours in the new lay of the land. But those of us who are responsible for research libraries have a clear view of a common goal: we want to open up our collections and make them available to readers everywhere. How to get there? The only workable tactic may be vigilance: see as far ahead as you can; and while you keep your eye on the road, remember to look in the rearview mirror." (quotations from the beginning of Darnton's longish article, accessed 01-28-2009).
Italian researchers reported the discovery of a previously unknown self-portrait by Leonardo da Vinci drawn when the artist was a young man. The faint pencil sketch was recognized underneath writing on a sheet of the “Codex on the Flight of Birds”, written between 1490 and 1505 and preserved in the Biblioteca Reale, Torino, Italy.
Piero Angela, an Italian scientific journalist, studying the document noticed the faint outline of a human nose hidden underneath lines of ink handwriting. It struck him as being similar in shape and drawing style to a later self-portrait of Leonardo. It is thought that Leonardo first made the drawing during the 1480s and reused the sheet for his manuscript on bird flight.
"Over months of micro-pixel work, graphic designers gradually 'removed' the text by making it white instead of black, revealing the drawing beneath. "What emerged was the face of a young to middle-aged man with long hair, a short beard and a pensive gaze.
"Mr Angela was struck by similarities to a famous self-portrait of Leonardo, made when the artist was an old man around 1512. The portrait, in red chalk, is kept in Turin’s Biblioteca Reale, or Royal Library.
"The research team used criminal investigation techniques to digitally correlate the newly-discovered sketch with the well-known portrait.
"They employed facial reconfiguration technology to age the drawing of the younger man, hollowing the cheeks, darkening the eyes and furrowing the brow.
"The two portraits were so similar 'that we may regard the hypothesis that the images portray the same person as reasonable', police photo-fit experts declared.
"To make doubly sure, the ageing process was reversed, with researchers using a digital 'facelift' to rejuvenate the older self-portrait.
"After removing the older Leonardo’s wrinkles and filling out his cheeks, the image that emerged was almost identical to the newly discovered sketch.
" 'When I actually tried to age the face [of the newly discovered portrait], and to put the hair and the beard of the famous self-portrait around it, a shiver ran down my spine,' said Mr Angelo. 'It resembled Leonardo like a twin brother. To uncover a new Leonardo drawing was astonishing.'
"The similarities were also studied by a facial reconstruction surgeon in Rome. '[He] said the two faces could well belong to the same man at different times in his life', said Mr Angelo.
"A world expert on Leonardo, Carlo Pedretti from the University of California, described the sketch as 'one of the most important acquisitions in the study of Leonardo, in the study of his image, and in the study of his thought too' (http://www.telegraph.co.uk/news/worldnews/europe/italy/4884789/Leonardo-da-Vinci-self-portrait-discovered-hidden-in-manuscript.html, accessed 02-28-2009).
The building containing the Historic Archive of the City of Cologne (Historisches Archiv der Stadt Köln), apparently constructed in 1971, collapsed into a pile of rubble.
"Fortunately, staffers, researchers, and onsite construction workers inside the building were alarmed by strange noises and left immediately before the structure collapsed earlier today. However, at the time of this writing, three [people who were in buildings adjacent to the archives are still missing.
"At present, the cause of the building's collapse is unknown. A new subway line is being built under the street in front of the facility, but the section of the tunnel adjacent to the building is apparently complete. The building may also have had structural problems.
"Until today, the repository in Cologne was the largest municipal archives in Germany. It held 500,000 photographs and 65,000 documents dating back to 922, including manuscripts by Karl Marx and Friedrich Engels and materials relating to 20th-century writer Heinrich Böll. Government officials have promised to help salvage the archives' records, but street-level and aerial photographs of the building's remains suggest that many of the records are beyond recovery" (http://larchivista.blogspot.com/2009/03/collapse-of-historic-archive-of-city-of.html).
As of March 4, 2009, two people from an adjacent building were thought to be missing; the staff of the Historic Archive of the City of Cologne had been successfully evacuated before the building collapsed.
News stories were referenced at http://archiv.twoday.net/stories/5558898/.
A detailed story in Spiegel Online International was available at http://www.spiegel.de/international/germany/0,1518,611311,00.html (links accessed 03-04-2009).
Johan Bollen of Los Alamos National Laboratory and six co-authors published "Clickstream Data Yields High Resolution Maps of Science" in the open-access online journal PLoS ONE. The map was based on clickstream data collected when online readers switched from one journal to another, yielding about one billion data points -- a far larger sample, and presumably one more reflective of actual reading patterns, than the prior method of citation analysis developed by the Institute for Scientific Information (now Thomson Scientific's Web of Science), which traces the relationships among footnotes in scholarly journals.
"Maps of science derived from citation data visualize the relationships among scholarly publications or disciplines. They are valuable instruments for exploring the structure and evolution of scholarly activity. Much like early world charts, these maps of science provide an overall visual perspective of science as well as a reference system that stimulates further exploration. However, these maps are also significantly biased due to the nature of the citation data from which they are derived: existing citation databases overrepresent the natural sciences; substantial delays typical of journal publication yield insights in science past, not present; and connections between scientific disciplines are tracked in a manner that ignores informal cross-fertilization.
"Scientific publications are now predominantly accessed online. Scholarly web portals provide access to publications in the natural sciences, social sciences and humanities. They routinely log the interactions of users with their collections. The resulting log datasets have a set of attractive characteristics when compared to citation datasets. First, the number of logged interactions now greatly surpasses the volume of all existing citations. This is illustrated by Elsevier's announcement, in 2006, of 1 billion (1×109) article downloads since the launch of its Science Direct portal in April 1999. In contrast, around the time of Elsevier's announcement, the total number of citations in Thomson Scientific's Web of Science from the year 1900 to the present does not surpass 600 million (6×108). Second, log datasets reflect the activities of a larger community as they record the interactions of all users of scholarly portals, including scientific authors, practitioners of science, and the informed public. In contrast, citation datasets only reflect the activities of scholarly authors. Third, log datasets reflect scholarly dynamics in real-time because web portals record user interactions as soon as an article becomes available at the time of its online publication. In contrast, a published article faces significant delays before it eventually appears in citation datasets: it first needs to be cited in a new article that itself faces publication delays, and subsequently those citations need to be picked up by citation databases.
"Given the aforementioned characteristics of scholarly log data, we investigated a methodological issue: can valid, high resolution maps of science be derived from clickstream data and can clickstream data be leveraged to yield meaningful insights in the structure and dynamics of scholarly behavior? To do this we first aggregated log datasets from a variety of scholarly web portals, created and analyzed a clickstream model of journal relationships from the aggregate log dataset, and finally visualized these journal relationships in a first-ever map of science derived from scholarly log data" (http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0004803#pone.0004803-Brody1, accessed 03-19-2009).
The Seattle Post-Intelligencer newspaper issued its last printed edition and became an internet-only news source, seattlepi.com.
"The Seattle Post-Intelligencer will print its final edition Tuesday and become the nation's largest daily newspaper to shift to an entirely digital news product "(http://www.seattlepi.com, accessed 03-16-2009).
It is thought that one human brain may store roughly one petabyte. While the quantity of information on the Internet and the quantity stored in a human brain may be broadly comparable, the similarity ends there: people and computers store and process information in entirely different ways.
Sandra Aamodt and Sam Wang, "Guest Column: Computers vs. Brains," New York Times Blogs, 03-31-2009.
Michael Schmidt and Hod Lipson of Cornell University published "Distilling Free-Form Natural Laws from Experimental Data," Science 324, no. 5923 (3 April 2009): 81-85, doi:10.1126/science.1165893. The paper described a computer program that sifted raw and imperfect data to uncover fundamental laws of nature.
"For centuries, scientists have attempted to identify and document analytical laws that underlie physical phenomena in nature. Despite the prevalence of computing power, the process of finding natural laws and their corresponding equations has resisted automation. A key challenge to finding analytic relations automatically is defining algorithmically what makes a correlation in observed data important and insightful. We propose a principle for the identification of nontriviality. We demonstrated this approach by automatically searching motion-tracking data captured from various physical systems, ranging from simple harmonic oscillators to chaotic double-pendula. Without any prior knowledge about physics, kinematics, or geometry, the algorithm discovered Hamiltonians, Lagrangians, and other laws of geometric and momentum conservation. The discovery rate accelerated as laws found for simpler systems were used to bootstrap explanations for more complex systems, gradually uncovering the "alphabet" used to describe those systems" (Abstract from Science)
Ross D. King, Jem Rowland and 11 co-authors from the Department of Computer Science at Aberystwyth University, Aberystwyth, Wales, and the University of Cambridge, published "The Automation of Science," Science 324, no. 5923 (3 April 2009): 85-89, doi:10.1126/science.1165620.
They described a Robot Scientist which the researchers believed was the first machine to have independently discovered new scientific knowledge. The robot, called Adam, was a computer system that fully automated the scientific process.
"Prof Ross King, who led the research at Aberystwyth University, said: 'Ultimately we hope to have teams of human and robot scientists working together in laboratories'. The scientists at Aberystwyth University and the University of Cambridge designed Adam to carry out each stage of the scientific process automatically without the need for further human intervention. The robot has discovered simple but new scientific knowledge about the genomics of the baker's yeast Saccharomyces cerevisiae, an organism that scientists use to model more complex life systems. The researchers have used separate manual experiments to confirm that Adam's hypotheses were both novel and correct" (http://www.eurekalert.org/pub_releases/2009-04/babs-rsb032709.php).
"The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge" (Abstract in Science).
According to the New York Times, the government of Australia said that it
"would create a publicly owned company to build a national high- speed broadband network, spending 43 billion Australian dollars in one of the largest state-sponsored Internet infrastructure upgrades in the world.
"Prime Minister Kevin Rudd said the eight-year, $31 billion project would create up to 37,000 jobs at the peak of construction, giving a lift to the economy as retail spending slumps and mining companies cut workers amid weakening demand for Australian metals. The plan is 'the most ambitious, far-reaching and long-term nation-building infrastructure project ever undertaken by an Australian government,' Mr. Rudd told reporters.
"The government’s announcement was a surprise rebuff to five private telecommunications firms, including Optus of Singapore and Axia NetMedia of Canada, that had been bidding to build a slower, less expensive network, with fiber-optic cables reaching as far as local nodes, worth around 10 billion dollars.
"But Mr. Rudd scrapped those proposals in favor of a superior but more expensive network that will deliver broadband speeds of up to 100 megabits per second — fast enough to download multiple movies simultaneously — to 90 percent of Australian buildings through fiber-optic cables that extend directly to the premises. The remaining 10 percent will receive upgraded wireless access."
The YouTube Symphony Orchestra, under the direction of San Francisco Symphony conductor Michael Tilson Thomas, debuted at Carnegie Hall in New York. Considered the first collaborative online orchestra, it was promoted on YouTube, auditioned entirely through YouTube videos, and sponsored by Google, the owner of YouTube.
"The YouTube Symphony Orchestra's show features soloists, chamber groups, chamber orchestra, large orchestra, electronica and multi-media, and samples diverse periods and styles of classical music, including works by Gabrieli, Bach, Mozart, Brahms, Villa-Lobos, John Cage and Tan Dun’s Internet Symphony No. 1 'Eroica.'
"It could be described as something between a summit conference, scout jamboree or musical get-together. It'll be the first time that people from so many different countries will have had a chance to discover one another online and then actually meet up and make music together." - Michael Tilson Thomas on NPR’s All Things Considered" (Carnegie Hall website, accessed 04-11-2009).
UNESCO, Paris, France, and 32 partner institutions launched the World Digital Library, a web site that featured unique cultural materials from libraries and archives around the world. The site included manuscripts, maps, rare books, films, sound recordings, and prints and photographs.
"The WDL will function in Arabic, Chinese, English, French, Portuguese, Russian and Spanish, and will include content in a great many other languages. Browse and search features will facilitate cross-cultural and cross-temporal exploration on the site. Descriptions of each item and videos with expert curators speaking about selected items will provide context for users, and are intended to spark curiosity and encourage both students and the general public to learn more about the cultural heritage of all countries. The WDL was developed by a team at the Library of Congress. Technical assistance was provided by the Bibliotheca Alexandrina of Alexandria, Egypt. Institutions contributing content and expertise to the WDL include national libraries and cultural and educational institutions in Brazil, Egypt, China, France, Iraq, Israel, Japan, Mali, Mexico, Morocco, the Netherlands, Qatar, the Russian Federation, Saudi Arabia, Serbia, Slovakia, Sweden, Uganda, the United Kingdom, and the United States" (http://portal.unesco.org/ci/en/ev.php-URL_ID=28484&URL_DO=DO_TOPIC&URL_SECTION=201.html)
David Ferrucci, leader of the Semantic Analysis and Integration Department at IBM's T. J. Watson Research Center, Yorktown Heights, New York, Eric Nyberg, and several co-authors published IBM Research Report: Towards the Open Advancement of Question Answering Systems.
Section 4.2.3 of the report includes an analysis of why the television game show Jeopardy! provides a good model of the semantic analysis and integration problem.
"IBM is working to build a computing system that can understand and answer complex questions with enough precision and speed to compete against some of the best Jeopardy! contestants out there.
"This challenge is much more than a game. Jeopardy! demands knowledge of a broad range of topics including history, literature, politics, film, pop culture and science. What's more, Jeopardy! clues involve irony, riddles, analyzing subtle meaning and other complexities at which humans excel and computers traditionally do not. This, along with the speed at which contestants have to answer, makes Jeopardy! an enormous challenge for computing systems. Code-named "Watson" after IBM founder Thomas J. Watson, the IBM computing system is designed to rival the human mind's ability to understand the actual meaning behind words, distinguish between relevant and irrelevant content, and ultimately, demonstrate confidence to deliver precise final answers.
"Known as a Question Answering (QA) system among computer scientists, Watson has been under development for more than three years. According to Dr. David Ferrucci, leader of the project team, 'The confidence processing ability is key to winning at Jeopardy! and is critical to implementing useful business applications of Question Answering.
"Watson will also incorporate massively parallel analytical capabilities and, just like human competitors, Watson will not be connected to the Internet, or have any other outside assistance.
"If we can teach a computer to play Jeopardy!, what could it mean for science, finance, healthcare and business? By drastically advancing the field of automatic question answering, the Watson project's ultimate success will be measured not by daily doubles, but by what it means for society" (http://www.research.ibm.com/deepqa/index.shtml, accessed 06-16-2010).
On June 16, 2010, The New York Times Magazine published a long article by Clive Thompson on Watson's challenge to human Jeopardy! contestants, entitled, in the question-as-response format of Jeopardy!, "What is I.B.M.'s Watson?"
♦ Link to FAQs concerning Watson and Jeopardy! on IBM's website, accessed 02-08-2011: http://www.research.ibm.com/deepqa/faq.shtml.
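The "confidence processing ability" Ferrucci emphasizes can be caricatured in a few lines: merge evidence scores for each candidate answer, normalize across candidates, and answer (buzz in) only when the top candidate clears a threshold. This is an illustrative sketch, not Watson's architecture; the clue, candidates, and scores are invented.

```python
def best_answer(candidates, threshold=0.5):
    """candidates: answer -> list of evidence scores in [0, 1].
    Combine evidence per answer, normalize across answers, and return
    the top answer only if its confidence clears the threshold."""
    merged = {ans: sum(scores) for ans, scores in candidates.items()}
    total = sum(merged.values())
    confidence = {ans: s / total for ans, s in merged.items()}
    top = max(confidence, key=confidence.get)
    if confidence[top] >= threshold:
        return top, confidence[top]
    return None, confidence[top]      # not confident enough: stay silent

# Hypothetical clue: "This Moby-Dick captain hunts a white whale."
candidates = {"Ahab": [0.9, 0.8, 0.7], "Ishmael": [0.4], "Queequeg": [0.2]}
answer, conf = best_answer(candidates)
```

The point of the gate is that a QA system loses by guessing: declining to answer when confidence is low is itself part of the strategy.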
On April 28, 2009 Perry Chen, Yancey Strickler, and Charles Adler launched Kickstarter.com, originally under the URL KickStartr.com. The company was based in New York City.
"One of a number of fundraising platforms dubbed 'crowd funding,' Kickstarter facilitates gathering monetary resources from the general public, a model which circumvents many traditional avenues of investment. Project creators choose a deadline and a goal minimum of funds to raise. If the chosen goal is not gathered by the deadline, no funds are collected (this is known as a provision point mechanism). Money pledged by donors is collected using Amazon Payments. The platform is open to backers from anywhere in the world and to creators from the US or the UK.
"Kickstarter takes 5% of the funds raised. Amazon charges an additional 3–5%. Unlike many forums for fundraising or investment, Kickstarter claims no ownership over the projects and the work they produce. However, projects launched on the site are permanently archived and accessible to the public. After funding is completed, projects and uploaded media cannot be edited or removed from the site" (Wikipedia article on Kickstarter, accessed 02-21-2013).
Psychologist Adena Schachner of Harvard University and co-authors published "Spontaneous Motor Entrainment to Music in Multiple Vocal Mimicking Species," Current Biology (30 April 2009) doi:10.1016/j.cub.2009.03.061.
Basing their research on the examination of more than 1,000 YouTube videos of dancing animals, the researchers found 14 parrot species and one elephant genuinely capable of keeping time, showing that "an ability to appreciate music and keep a rhythm is not unique to humans.
"Schachner analyzed the videos frame-by-frame, comparing the animals' movements with the speed of the music and the alignment of individual beats. The group also studied another bird, Alex, an African grey parrot, which had exhibited similar abilities to Snowball, nodding its head appreciatively to a series of drum tracks.
" 'Our analyses showed that these birds' movements were more lined up with the musical beat than we'd expect by chance,' says Schachner. 'We found strong evidence that they were synchronizing with the beat, something that has not been seen before in other species.'
"Aniruddh Patel of The Neurosciences Institute in San Diego, who led another study of Snowball's performance, said that the bird had demonstrated an ability to adjust the tempo of his dancing to stay synchronized to the beat.
"Scientists had previously thought that 'moving to a musical beat might be a uniquely human ability because animals are not commonly seen moving rhythmically in the wild,' Patel said.
"Schachner said there was no evidence to suggest that animals such as apes, dogs or cats could recognize music, despite their extensive experience of humans. That leads researchers to believe that an ability to process musical sounds may be linked to an ability to mimic sounds -- something that each of the parrots studied by researchers was able to do excellently, she said.
"Other 'vocal-learning species' include dolphins, elephants, seals and walruses.
" 'A natural question about these results is whether they generalize to other parrots, or more broadly, to other vocal-learning species,' Schachner said.
"Researchers believe a possible link between vocal mimicry and an ability to hear music may explain the development of music in human societies. advertisement
" 'The question of why music is found in every known human culture is a longstanding puzzle. Many argue that it is an adaptive behaviour that helped our species to evolve. But equally plausible is the possibility that it emerged as a by-product of other abilities -- such as vocal learning,' music psychologist Lauren Stewart of Goldsmiths, University of London told CNN.
" 'Parrots and humans both have the ability to imitate sounds that they hear, unlike our closer simian relatives. Once a species has the neural machinery in place for coupling the perception and production of vocal sounds, it may be only a small step to use the same circuits for synchronizing movements to a beat.' " ( http://www.cnn.com/2009/TECH/science/05/01/dancing.parrots/?iref=hpmostpop#cnnSTCText )
You can watch one of the most popular videos of Snowball, the dancing cockatoo, at this link: http://www.youtube.com/watch?v=N7IZmRnAo6s, accessed 05-04-2009.
Dirk Brockmann and the epidemic modeling team at the Northwestern Institute on Complex Systems used air traffic and commuter traffic patterns for the entire country, along with data from the American currency-tracking website Where's George?, to predict the spread of the H1N1 "swine flu" across the United States.
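The style of model involved can be sketched as a minimal metapopulation SIR system, in which a travel matrix (of the kind estimated from air-traffic, commuter, and Where's George? dollar-bill data) couples otherwise separate city-level epidemics. All rates, populations, and the two-city travel matrix below are illustrative assumptions, not Brockmann's actual parameters.

```python
def step(S, I, R, travel, beta=0.3, gamma=0.1, dt=1.0):
    """One Euler step of a metapopulation SIR model.
    travel[i][j]: fraction of city i's population mixing into city j per day."""
    n = len(S)
    newS, newI, newR = S[:], I[:], R[:]
    for i in range(n):
        pop = S[i] + I[i] + R[i]
        # infected individuals effectively present in city i, including visitors
        eff_I = sum(travel[j][i] * I[j] for j in range(n))
        infections = beta * S[i] * eff_I / pop * dt
        recoveries = gamma * I[i] * dt
        newS[i] -= infections
        newI[i] += infections - recoveries
        newR[i] += recoveries
    return newS, newI, newR

# Two cities: an outbreak seeded in city 0, weakly coupled to city 1.
travel = [[0.95, 0.05],
          [0.05, 0.95]]
S, I, R = [9990.0, 10000.0], [10.0, 0.0], [0.0, 0.0]
for _ in range(100):
    S, I, R = step(S, I, R, travel)
```

Even a weak coupling term carries the outbreak into the second city, which is the basic mechanism mobility data lets such models capture.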
"By most measurements, digital books are a mere page in the novel of publishing, which hovers annually around $25 billion. But in the last year, what was a budding niche market has had a major growth spurt.
"The Association of American Publishers (AAP), the industry’s primary trade group, has tracked digital book sales since 2003, when wholesale revenues amounted to $20 million. By 2007, that number had ambled up to $67 million. But in 2008, the figure nearly doubled to some $113 million.
"This year is off to an equally heady start, says Ed McCoyd, director of digital policy for AAP, pointing to the whopping 173 percent jump in sales from January 2008."
"9.7-inch display with auto-rotation, high-speed wireless access to 275,000 books, 3.3 gigabytes of storage, or room for up to 3,500 books. Native support for PDF documents, with no panning, zooming or scrolling necessary" (http://bits.blogs.nytimes.com/2009/05/06/live-blogging-the-kindle-fest/).
The initial list price of the DX was $489, or $130 more than the previous model, the Kindle 2. The DX was available for sale in the summer of 2009.
Stephen Wolfram and Wolfram Research, Champaign, Illinois, launched Wolfram|Alpha, a computational knowledge engine with a new approach to knowledge extraction, based on natural language processing, a large library of algorithms, and an NKS (New Kind of Science) approach to answering queries.
The Wolfram|Alpha engine differs from traditional search engines in that it does not simply return a list of results based on a query, but instead computes an answer.
The Ministry of Industry and Information Technology of the People's Republic of China issued a directive that, as of July 1, 2009, Green Dam Youth Escort (simplified Chinese: 绿坝-花季护航) must be pre-installed on, or shipped on a compact disc with, all personal computers sold in the mainland of the People's Republic of China, including those imported from abroad.
Using the Golden Shield Project, sometimes called the "Great Firewall of China," China regularly restricted access to certain Internet sites and information that the government deemed sensitive.
"Critics fear this new software could be used by the government to enhance internet censorship. The Computer and Communications Industry Association said the development was 'very unfortunate'. Ed Black, CCIA president criticised the move as 'clearly an escalation of attempts to limit access and the freedom of the internet, [...with] economic and trade as well as cultural and social ramifications.' Black said the Chinese were attempting to 'not only control their own citizens' access to the internet but to force everybody into being complicit and participate in a level of censorship'.
"On 8 June, Microsoft said that appropriate parental control tools was 'an important societal consideration'. However, 'we agree with others in industry and around the world that important issues such as freedom of expression, privacy, system reliability and security need to be properly addressed.'
"A spokesman for the Foreign ministry said the software would filter out pornography or violence. "The Chinese government pushes forward the healthy development of the internet. But it lawfully manages the internet," he added.
"On 11 June, a BBC News article reported that potential faults in the software could lead to a large-scale disaster: The report included comments by Isaac Mao, who said that there were 'a series of software flaws', including the unencrypted communications between the software and the company's servers, which could allow hackers access to people's private data or place malicious script on machines on the network to "affect [a] large scale disaster' " (Wikipedia article on Green Dam Youth Escort, accessed 06-11-2009).
In an interview in the Financial Times, Google CEO Eric Schmidt
"reveals that Google seriously considered either buying a newspaper as a for-profit enterprise or hiring a pack of smart lawyers to reconfigure the paper as a nonprofit venture. He doesn't name which paper, of course, but the Financial Times reporters pointedly remind their readers that the hedge fund Harbinger Capital Partners offered Google its twenty percent stake in the New York Times. Ultimately, however, the company decided that going so far as owning an outlet that actually produced copy, rather than simply aggregating and organizing it, would be 'crossing the line' between a content company and a technology company. Wall Street Journal writer Jessica Vascellaro argues that this position is growing increasingly flimsy. After all, she writes, both YouTube and Google's Book Search project are awfully close to resembling content production.
"The real reason may be twofold. First, as Schmidt readily concedes, the targeted papers are either far too expensive or burdened with too much debt and liabilities. Second, the advertising model for general news reporting is obsolete, and Google's execs have decided instead to work with papers such as the Washington Post . . .to come up with a new model that can subsidize serious general news gathering. The days when general display ads would float on the page, contextually disconnected from the substance of the stories, are over. But who wants their ads tied to stories of Gitmo torture? Unless the business model radically changes, there will be no revenue stream that props up the most serious and important news stories.
"So what does Schmidt have in mind for the Washington Post? 'It seems to me that the newspaper that I read online should remember what I read. It should allow me to go deeper into the stories. It's that kind of a discussion that we're having.' In other words, the paper will store and archive a catalogue of the stories you read, steer more stories along those lines to your eyeballs, and keep you coming back for more by knowing what you're most interested in. Google already remembers what you search for, in order to more accurately match ads to your search screen. Now, it seems, Schmidt would like to apply this technique to news gathering" (http://www.thebigmoney.com/blogs/feeling-lucky/2009/05/21/google-almost-bought-paper, accessed 05-22-2009).
At the Google IO Developers Conference in San Francisco, Google demonstrated Google Wave, "an ambitious, if incomplete, attempt to reinvent email and Internet communication in general," developed by Lars and Jens Rasmussen, who previously developed Google Maps. The open-source program would be available to developers worldwide.
The Google Wave demonstration is available in a 1.5-hour video on YouTube. When I accessed the video on June 1, 2009, it had already been viewed 1,173,600 times and had received 3,225 ratings.
At the BookExpo convention in New York, Google announced its intention to sell ebooks directly to consumers through its Google Books service. In contrast to Amazon, which sold ebooks at the fixed price of $9.99 per title and only through its proprietary Kindle ebook reader, Google allowed publishers to set the prices of ebook titles and made them available across browsers, cell phones, and other platforms.
"Around 14.9 million U.S. households regularly buy books online. Among that group, 48 percent earn more than $70,000 a year and spend $28 a month on books, half of them online" (http://news.cnet.com/8301-1023_3-10253199-93.html, accessed 06-01-2009).
The International Internet Preservation Consortium (IIPC), netpreserve.org, published the WARC file format as an international standard: ISO 28500:2009, Information and documentation—WARC file format.
"For many years, heritage organizations have tried to find the most appropriate ways to collect and keep track of World Wide Web material using web-scale tools such as web crawlers. At the same time, these organizations were concerned with the requirement to archive very large numbers of born-digital and digitized files. A need was for a container format that permits one file simply and safely to carry a very large number of constituent data objects (of unrestricted type, including many binary types) for the purpose of storage, management, and exchange. Another requirement was that the container need only minimal knowledge of the nature of the objects.
"The WARC format is expected to be a standard way to structure, manage and store billions of resources collected from the web and elsewhere. It is an extension of the ARC format, which has been used since 1996 to store files harvested on the web. WARC format offers new possibilities, notably the recording of HTTP request headers, the recording of arbitrary metadata, the allocation of an identifier for every contained file, the management of duplicates and of migrated records, and the segmentation of the records. WARC files are intended to store every type of digital content, either retrieved by HTTP or another protocol" (http://netpreserve.org/press/pr20090601.php).
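The record layout the press release describes can be sketched in a few lines of Python: a version line, named header fields (including the per-record identifier the format requires), a blank line, the payload, and a blank-line terminator. This is an illustration of the general structure only, not a validated implementation of ISO 28500; the function name is the author's invention.

```python
# A minimal sketch of a WARC record, per the structure described above.
# Illustrative only -- not a validated ISO 28500 implementation.
import uuid
from datetime import datetime, timezone

def build_warc_record(warc_type: str, payload: bytes,
                      content_type: str = "application/octet-stream") -> bytes:
    """Serialize one WARC record: version line, headers, blank line, payload."""
    headers = [
        "WARC/1.0",
        f"WARC-Type: {warc_type}",
        f"WARC-Date: {datetime.now(timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')}",
        # Every contained file gets its own identifier, as the format requires.
        f"WARC-Record-ID: <urn:uuid:{uuid.uuid4()}>",
        f"Content-Type: {content_type}",
        f"Content-Length: {len(payload)}",
    ]
    head = "\r\n".join(headers).encode("utf-8")
    # Headers and payload are separated by a blank line; a record ends with one too.
    return head + b"\r\n\r\n" + payload + b"\r\n\r\n"

record = build_warc_record("resource", b"hello, web archive")
print(record.decode("utf-8").splitlines()[0])  # WARC/1.0
```

Because records are self-describing and simply concatenated, one WARC file can safely carry very large numbers of constituent objects of any binary type, which is the container property the consortium set out to standardize.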
"By August 2009, Bing had gained 9.3 percent of the United States Internet search market. However, by September, StatCounter stated that Bing's share of the US search market in September had fallen by over one percentage point to 8.51%. Comscore claimed otherwise, stating that Bing's growth had held steady in September 2009, gaining 0.1 percent of the total United States Internet Search Market representing a market share of 9.4 percent" (Wikipedia article on Bing, accessed 11-12-2009).
"It has been widely reported that my drawings are now made on an iPhone... Considering all the sketches and watercolors and photographs I have done in the USA for the past twenty years, my output in the Brushes app since I bought a 3G last February is still rather small. It has attracted more attention than anything else I have done: it seems people can't resist a nice tech story. But it's a happy affair. As much as I enjoy and admire other media, drawing on a screen that's always bright even on a dark street, with no paint to carry, no brushes to wash, and countless levels of 'undo', seems to agree with me. I always work on location, drawing everything from scratch, with no use of photography whatsoever. (The app churns out QuickTime movies that detail each brushstroke, as seen in The New Yorker's website; it mercifully ignores all the trial-and-errors and failed attempts, making my progression look uncannily flawless. That's so not true.) I could carry a pad or even an easel around. But drawing on a phone is so discreet, so casual" (http://www.drawger.com/jorgecolombo/?section=articles&article_id=9154, accessed 01-07-2010).
♦ On January 7, 2010, you could watch a series of QuickTime movies of Jorge Colombo creating iPhone paintings on the New Yorker website at this link: http://www.newyorker.com/video?videoID=40951183001.
"Regarding storage costs -- again it's unhelpful to be vague, but equally unhelpful to be too specific. The cost of a 1 TB [terabyte] hard drive from the local IT hyperstore is NOT a useful number for estimating the cost of reliable storage. Unfortunately the 'price of reliability' is equally hard to determine.
"The 'rule of thumb' most quoted now is 'one million dollars per year per petabyte' for 'managed server' storage eg disc-based storage from a well-run data centre that does good redundancy and backups. That means of course one thousand dollars per terabyte (per year) and that's a good estimate, in my view, to use for funding request and planning purposes. It can be done more cheaply -- up to ten times cheaper -- but that introduces various risks and requirements that you may or may not want to get into. In the BBC where we know that archive content is, on average, used once per four years, we're happy to put datatape on shelves and go for a much lower cost per terabyte" (Richard Wright, Sr Research Engineer, Research & Development, BBC Future Media & Technology, from: email@example.com, 06-04-2009).
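Wright's rule of thumb reduces to simple unit arithmetic, which can be checked in a few lines (the figures below are his stated estimates, not vendor prices):

```python
# Wright's rule of thumb: $1,000,000 per petabyte per year for
# "managed server" storage in a well-run data centre.
COST_PER_PB_YEAR = 1_000_000  # USD per petabyte per year
TB_PER_PB = 1_000             # terabytes in a petabyte

cost_per_tb_year = COST_PER_PB_YEAR / TB_PER_PB
print(cost_per_tb_year)       # 1000.0 -- the $1,000/TB/year planning figure

# His "up to ten times cheaper" lower bound, e.g. datatape on shelves:
print(cost_per_tb_year / 10)  # 100.0
```

The tenfold spread between the two figures is the "price of reliability" he describes: the cheaper end trades away redundancy, backups, and immediate access.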
"SAN FRANCISCO — As the newspaper industry and its classified advertising business wither, one company appears to be doing extraordinarily well: Craigslist.
"The Internet classified ads company, which promotes its “relatively noncommercial nature” and “service mission” on its site, is projected to bring in more than $100 million in revenue this year, according to a new study from Classified Intelligence Report, a publication of AIM Group, a media and Web consultant firm in Orlando, Fla.
"That is a 23 percent jump over the revenue the firm estimated for 2008 and a huge increase since 2004, when the site was projected to bring in just $9 million. 'This is a down-market for just about everyone else but Craigslist,' said Jim Townsend, editorial director of AIM Group. The firm counted the number of paid ads on the site for a month and extrapolated an annual figure. It said its projections were conservative.
"By contrast, classified advertising in newspapers in the United States declined by 29 percent last year, its worst drop in history, according to the Newspaper Association of America" (http://www.nytimes.com/2009/06/10/technology/internet/10craig.html?hpw, accessed 06-10-2009).
The United States converted from analog to digital television broadcasting.
"The switch from analog to digital broadcast television is referred to as the digital TV (DTV) transition. In 1996, the U.S. Congress authorized the distribution of an additional broadcast channel to each broadcast TV station so that they could start a digital broadcast channel while simultaneously continuing their analog broadcast channel. Later, Congress set June 12, 2009 as the final date that full power television stations can broadcast analog signals. As of June 13, 2009, full power television stations will only broadcast digital, over-the-air signals. Your local broadcasters may make the transition before then, and some already have.
"The digital transition is underway. Prepare now! On Feb. 17, some full-power broadcast television stations in the United States may stop broadcasting on analog airwaves and begin broadcasting only in digital. The remaining stations may stop broadcasting analog sometime between April 16 and June 12. June 12 is the final deadline for terminating analog broadcasts under legislation passed by Congress.
"Why are we switching to DTV?
"An important benefit of the switch to all-digital broadcasting is that it will free up parts of the valuable broadcast spectrum for public safety communications (such as police, fire departments, and rescue squads). Also, some of the spectrum will be auctioned to companies that will be able to provide consumers with more advanced wireless services (such as wireless broadband).
"Consumers also benefit because digital broadcasting allows stations to offer improved picture and sound quality, and digital is much more efficient than analog. For example, rather than being limited to providing one analog program, a broadcaster is able to offer a super sharp “high definition” (HD) digital program or multiple “standard definition” (SD) digital programs simultaneously through a process called “multicasting.” Multicasting allows broadcast stations to offer several channels of digital programming at the same time, using the same amount of spectrum required for one analog program. So, for example, while a station broadcasting in analog on channel 7 is only able to offer viewers one program, a station broadcasting in digital on channel 7 can offer viewers one digital program on channel 7-1, a second digital program on channel 7-2, a third digital program on channel 7-3, and so on. This means more programming choices for viewers. Further, DTV can provide interactive video and data services that are not possible with analog technology" (http://dtv.gov/whatisdtv.html, accessed 06-12-2009).
Solid Oak Software Inc., developer of CyberSitter, alleged that Green Dam Youth Escort, the Internet-filtering program produced in China and mandated by the Chinese government, contained stolen portions of the company's code.
"Solid Oak Software, the developer of CyberSitter, claims that the look and feel of the GUI used by Green Dam mimics the style of CyberSitter. But more damning, chief executive Brian Milburn said, was the fact that the Green Dam code uses DLLs identified with the CyberSitter name, and even makes calls back to Solid Oak's servers for updates" (http://www.pcmag.com/article2/0,2817,2348705,00.asp, accessed 06-13-2009).
Solid Oak Software Inc. said it would try to stop PC makers from shipping computers with the software.
"Solid Oak said Friday that it found pieces of its CyberSitter filtering software in the Chinese program, including a list of terms to be blocked, instructions for updating the software, and an old news bulletin promoting CyberSitter. Researchers at the University of Michigan who have been studying the Chinese program also said they found components of CyberSitter, including the blacklist of terms.
"Jinhui Computer System Engineering Co., the Chinese company that made the filtering software, denied stealing anything. 'That's impossible,' said Bryan Zhang, Jinhui's founder, in response to Solid Oak's charges.
"The allegations come as PC makers such as Dell Inc. and Hewlett-Packard Co. are sorting through a mandate by the Chinese government requiring that all PCs sold in China as of July come with the filtering software. Representatives of the two big U.S. companies said they are working with trade associations to monitor new developments related to the Chinese software" (http://online.wsj.com/article/SB124486910756712249.html, accessed 06-13-2009).
"As employment headlines go from grim to grimmer, it’s appropriate that one job category with expanding demand involves helping people avoid reality. Designers of computer simulations are sought in many fields to help understand complex, multifaceted phenomena that are too expensive or perilous to study in real life."
Bill Waite, chairman of the AEgis Technologies Group, a Huntsville, Ala., company that creates simulations for various military and civilian applications, "estimates that 400,000 people make a living in the United States in one aspect or another of simulation" (http://www.nytimes.com/2009/06/14/jobs/14starts.html?8dpc, accessed 06-22-2009).
"At one time, authoritarian regimes could draw a shroud around the events in their countries by simply snipping the long-distance phone lines and restricting a few foreigners. But this is the new arena of censorship in the 21st century, a world where cellphone cameras, Twitter accounts and all the trappings of the World Wide Web have changed the ancient calculus of how much power governments actually have to sequester their nations from the eyes of the world and make it difficult for their own people to gather, dissent and rebel.
"Iran’s sometimes faltering attempts to come to grips with this new reality are providing a laboratory for what can and cannot be done in this new media age — and providing lessons to other governments, watching with calculated interest from afar, about what they may be able to get away with should their own citizens take to the streets.
"One early lesson is that it is easier for Iranian authorities to limit images and information within their own country than it is to stop them from spreading rapidly to the outside world. While Iran has severely restricted Internet access, a loose worldwide network of sympathizers has risen up to help keep activists and spontaneous filmmakers connected.
"The pervasiveness of the Web makes censorship 'a much more complicated job,' said John Palfrey, a co-director of Harvard’s Berkman Center for Internet and Society.
"The Berkman Center estimates that about three dozen governments — as widely disparate as China, Cuba and Uzbekistan — extensively control their citizens’ access to the Internet. Of those, Iran is one of the most aggressive. Mr. Palfrey said the trend during this decade has been toward more, not less, censorship. 'It’s almost impossible for the censor to win in an Internet world, but they’re putting up a good fight,' he said.
"Since the advent of the digital age, governments and rebels have dueled over attempts to censor communications. Text messaging was used to rally supporters in a popular political uprising in Ukraine in 2004 and to threaten activists in Belarus in 2006. When Myanmar sought to silence demonstrators in 2007, it switched off the country’s Internet network for six weeks. Earlier this month, China blocked sites like YouTube to coincide with the 20th anniversary of the Tiananmen Square crackdown.
"In Iran, the censorship has been more sophisticated, amounting to an extraordinary cyberduel. It feels at times as if communications within the country are being strained through a sieve, as the government slows down Web access and uses the latest spying technology to pinpoint opponents. But at least in limited ways, users are still able to send Twitter messages, or tweets, and transmit video to one another and to a world of online spectators.
"Because of the determination of those users, hundreds of amateur videos from Tehran and other cities have been uploaded to YouTube in recent days, providing television networks with hours of raw — but unverified — video from the protests.
"The Internet has 'certainly broken 30 years of state control over what is seen and is unseen, what is visible versus invisible,' said Navtej Dhillon, an analyst with the Brookings Institution" (http://www.nytimes.com/2009/06/23/world/middleeast/23censor.html?hp).
The death of American entertainer Michael Jackson had a dramatic impact on the Internet:
"The news of Jackson's death spread quickly online, causing websites to crash and slow down from user overload. Both TMZ and the Los Angeles Times, two websites that were the first to confirm the news, suffered outages. Google believed the millions of people searching 'Michael Jackson' meant it was under attack. Twitter reported a crash, as did Wikipedia at 3:15 PDT. The Wikimedia Foundation reported nearly one million visitors to the article Michael Jackson within one hour, which they said may be the most visitors in a one-hour period to any article in Wikipedia's history. AOL Instant Messenger collapsed for 40 minutes. AOL called it 'a seminal moment in Internet history,' adding, 'We've never seen anything like it in terms of scope or depth.' Around 15 percent of Twitter posts (or 5,000 tweets per minute) mentioned Jackson when the news broke, compared to topics such as the 2009 Iranian election and swine flu, which never rose above 5 percent of total tweets. Overall, web traffic was 11 percent higher than normal" (Wikipedia article on Death of Michael Jackson, accessed 07-04-2009).
In November 2009 more than 100,000 apps were available for download from Apple's App Store, making it the largest such retailer in the world.
"The App Store launched in July 2008 with just 500 applications. The store is now available in 77 countries, which has contributed to what Apple said Wednesday is well over 2 billion downloads" (http://news.cnet.com/8301-13579_3-10390454-37.html).
The Human Connectome Project, a five-year project sponsored by sixteen components of the National Institutes of Health (NIH) and divided between two consortia of research institutions, was launched as the first of three Grand Challenges of the NIH's Blueprint for Neuroscience Research.
The project was described as "an ambitious effort to map the neural pathways that underlie human brain function. The overarching purpose of the Project is to acquire and share data about the structural and functional connectivity of the human brain. It will greatly advance the capabilities for imaging and analyzing brain connections, resulting in improved sensitivity, resolution, and utility, thereby accelerating progress in the emerging field of human connectomics. Altogether, the Human Connectome Project will lead to major advances in our understanding of what makes us uniquely human and will set the stage for future studies of abnormal brain circuits in many neurological and psychiatric disorders" (http://www.humanconnectome.org/consortia/, accessed 12-28-2010).
"To mark the online launch of the reunited Codex Sinaiticus, the British Library is staging an exhibition, From Parchment to Pixel: The Virtual reunification of Codex Sinaiticus, which runs from Monday 6 July until Monday 7 September, 2009 in the Folio Society Gallery at the Library's St Pancras building. Visitors will be able to view a range of historic items and artefacts that tell the story of the Codex and its virtual reunification, along with spectacular interactive representations of the manuscript and a digital reconstruction of the changes to a specific page over the centuries. In addition, they will see on display in the Treasures Gallery, for the very first time, both volumes of Codex Sinaiticus held at the British Library.
"The virtual reunification of Codex Sinaiticus is the culmination of a four-year collaboration between the British Library, Leipzig University Library, the Monastery of St Catherine (Mount Sinai, Egypt), and the National Library of Russia (St Petersburg), each of which hold different parts of the physical manuscript.
"By bringing together the digitised pages online, the project will enable scholars worldwide to research in depth the Greek text, which is fully transcribed and cross-referenced, including the transcription of numerous revisions and corrections. It will also allow researchers into the history of the book as a physical object to examine in detail aspects of its fabric and manufacture: pages can be viewed either with standard light or with raking light which, by illuminating each page at an angle, highlights the physical texture and features of the parchment.
" 'The Codex Sinaiticus is one of the world's greatest written treasures,' said Dr Scot McKendrick, Head of Western Manuscripts at the British Library. 'This 1600-year-old manuscript offers a window into the development of early Christianity and first-hand evidence of how the text of the bible was transmitted from generation to generation. The project has uncovered evidence that a fourth scribe – along with the three already recognised – worked on the text; the availability of the virtual manuscript for study by scholars around the world creates opportunities for collaborative research that would not have been possible just a few years ago.'
"The Codex Sinaiticus Project was launched in 2005, when a partnership agreement was signed by the four partner organisations that hold extant pages and fragments. A central objective of the project is the publication of new research into the history of the Codex. Other key aims of the project were to undertake the preservation, digitisation and transcription of the Codex and thereby reunite the pages, which have been kept in separate locations for over 150 years.
"Professor David Parker from the University of Birmingham's Department of Theology, who directed the team funded by the UK's Arts and Humanities Research Council (AHRC), which made the electronic transcription of the manuscript, said: 'The process of deciphering and transcribing the fragile pages of an ancient text containing over 650,000 words is a huge challenge, which has taken nearly four years.
" 'The transcription includes pages of the Codex which were found in a blocked-off room at the Monastery of St Catherine in 1975, some of which were in poor condition,' added Professor Parker. 'This is the first time that they have been published. The digital images of the virtual manuscript show the beauty of the original and readers are even able to see the difference in handwriting between the different scribes who copied the text. We have even devised a unique alignment system which allows users to link the images with the transcription. This project has made a wonderful book accessible to a global audience.' To mark the successful completion of the project, the British Library is hosting an academic conference on 6-7 July 2009 entitled 'Codex Sinaiticus: text, Bible, book'. A number of leading experts will give presentations on the history, text, conservation, palaeography and codicology of the manuscript. See: http://www.codexsinaiticus.org/en/project/conference.aspx" (http://www.artdaily.org/index.asp?int_sec=2&int_new=31895, accessed 07-07-2009).
"In a move that angered customers and generated waves of online pique, Amazon remotely deleted some digital editions of the books [George Orwell's '1984' and 'Animal Farm'] from the Kindle devices of readers who had bought them.
"An Amazon spokesman, Drew Herdener, said in an e-mail message that the books were added to the Kindle store by a company that did not have rights to them, using a self-service function. 'When we were notified of this by the rights holder, we removed the illegal copies from our systems and from customers’ devices, and refunded customers,' he said.
"Amazon effectively acknowledged that the deletions were a bad idea. 'We are changing our systems so that in the future we will not remove books from customers’ devices in these circumstances,' Mr. Herdener said" (http://www.nytimes.com/2009/07/18/technology/companies/18amazon.html, accessed 07-25-2009).
"Books in the real world are covered by a notion of copyright called the 'first sale' doctrine, which allows a purchaser to do pretty much whatever he or she wants with the book–including reselling it or lending it to a friend.
"But digital books–especially if they’re sold as part of access to a networked system such as Amazon’s Kindle Store and Google’s online books collection–don’t necessarily fall under those same rules. 'We have not matured our understanding of copyright to work in a digital environment in a way that provides a set of protections and meets people’s expectations for how we use digital content,' said Brantley" (http://blogs.wsj.com/digits/2009/07/17/an-orwellian-moment-for-amazons-kindle/, accessed 07-25-2009).
"Starting today, USA TODAY's Best-Selling Books list becomes the first major list to include Amazon Kindle e-book sales. The move reflects both the growth of e-book sales and Kindle's role in that market. 'Since 1993, USA TODAY's Best-Selling Books list has always evolved to reflect the ways our readers buy books,' says Susan Weiss, managing editor of the Life section. 'Adding Kindle to our group of contributors makes sense given the growth in the e-book platform.' E-books, for all devices, claimed 4.9% of sales in May, according to book audience research firm Codex-Group. That's up from 3.7% in March. This week, Barnes & Noble announced the launch of its own eBookstore with 700,000 titles."
Microsoft and Yahoo! announced a 10-year deal in which the Yahoo! search engine, then second-largest in terms of query volume, would be replaced by Bing. Yahoo! would get to keep 88% of the revenue from all search ad sales on its site for the first five years of the deal, and have the right to sell advertisements on some Microsoft sites. Yahoo! Search would still maintain its own user interface, but would eventually feature "Powered by Bing" branding.
Bettina Wagner and the Bayerische Staatsbibliothek München published in print Als die Lettern laufen lernten. Medienwandel im 15. Jahrhundert (When Letters Became Mobile. The Transition of Media in the 15th Century):
"The invention of printing with movable letters by Johann Gutenberg is frequently described as a „media revolution“ and compared to the effects of the „electronic revolution“ of the past decades. While both events had far-reaching consequences on the production and distribution of texts, the exhibition intends to demonstrate that a gradual transition rather than a sudden turnover took place in the second half of the 15th century. Increasingly, printing techniques were employed for the production of books, but the oldest printed books, traditionally referred to as incunabula, still show many individual features which were created by hand. Thus, innovation and tradition overlap in many respects: the modern techniques for multiplication of texts and images in print only gradually superseded handwriting, and for a long time, printed books continued to be corrected by hand and to be decorated with coloured headlines and painted illustrations.
"About 90 items are displayed from the rich holdings of incunabula in the Bayerische Staatsbibliothek, which ranks first among all libraries world-wide with holdings of more than 20,000 15th-century books. The most famous incunabula are on show in the „Schatzkammer“ (treasury), including the Gutenberg-Bible and the ‚Türkenkalender’ of 1454, the earliest printed book in German, which survives in a single copy held at the Bayerische Staatsbibliothek. In addition to illustrated manuscripts and blockbooks, incunabula with painted miniatures and outstanding examples of 15th-century woodcuts can be seen, among them the report by the Mainz canon Bernhard von Breydenbach about his journey to Palestine, Hartmann Schedel’s personal copy of his ‚Nuremberg Chronicle’ and Sebastian Brant’s ‚Ship of Fools’, for which Albrecht Dürer may have designed illustrations. Apart from woodcuts, examples of other techniques for printing illustrations are presented, like copper engravings, metal cuts and printing with colour and gold – still at an experimental stage in the 15th century.
"In the second part of the exhibition, a range of very diverse incunabula give insight into the production and distribution of printed books – starting with the manuscript copy text used for typesetting and ending with the book arriving in the hands of a buyer and reader. Proof-sheets and printed tables of rubrics reveal how early printers organized the production of books. In the first decades of printing, modern conventions of book design like title-pages developed. Texts printed in non-Latin alphabets and unusual formats as well as evidence for 15th-century print-runs demonstrate the effectiveness and capability of early printing workshops. The new medium of the broadside reached entirely new groups of readers. In the printing press, posters and handbills could be produced in large numbers and thus served to disseminate all manners of texts – from pious songs over medical advice up to current news. Early printers also used broadsides to advertise their products in order to achieve financial success. This, however, led to a rapid decrease in book prices: The exhibition ends with a note added to an incunable in 1494 by a buyer who marvels at the low cost of the book. Forty years after Gutenberg published his Bible, the technology of printing finally prevailed over older, competing forms of text reproduction. While conservative circles continued to plead for copying texts by hand, the printed book’s triumph proved unstoppable, even though some readers, like Sebastian Brant’s ‚foolish reader’ could not cope with the massive number of books available" (https://www.bsb-muenchen.de/Detailed-information.403+M56017d4e158.0.html, accessed 09-18-2009).
In "What's a Big City Without a Newspaper?", published in The New York Times Magazine, Michael Sokolove wrote:
"Many working journalists in the country regularly check a Web site known to most as “Romenesko” (after its creator, Jim Romenesko), which aggregates industry news and these days consists mainly of layoffs and other dire news. It can be excruciating to read. Just this year, The Rocky Mountain News perished. The Seattle Post-Intelligencer became a Web-only publication with a tiny staff. Detroit’s daily newspapers are now delivered just three days a week. The Boston Globe, owned by the New York Times Company, and The San Francisco Chronicle, owned by Hearst, each went through near-death experiences as their owners won labor concessions after threatening to shutter the papers.
"Smaller newspapers, those with circulations under 50,000, are considered the healthiest part of the industry. “They’re not making 30 percent profit margins like they once did, but most of them are doing fine,” John Morton, a newspaper analyst who has followed the industry for decades, told me. Most analysts predict that the papers with a national profile and brand — The New York Times, The Washington Post, The Wall Street Journal and USA Today — will find a way to survive and stay in print. (It must be noted that few can say exactly how this will happen.)"
Bioengineer Stephen R. Quake of Stanford University invented a new technology for decoding DNA that could sequence a human genome at a cost of $50,000.
"Dr. Quake’s machine, the Heliscope Single Molecule Sequencer, can decode or sequence a human genome in four weeks with a staff of three people. The machine is made by a company he founded, Helicos Biosciences, and costs 'about $1 million, depending on how hard you bargain,' he said.
"Only seven human genomes have been fully sequenced. They are those of J. Craig Venter, a pioneer of DNA decoding; James D. Watson, the co-discoverer of the DNA double helix; two Koreans; a Chinese; a Yoruban; and a leukemia victim. Dr. Quake’s seems to be the eighth full genome, not counting the mosaic of individuals whose genomes were deciphered in the Human Genome Project."
"For many years DNA was sequenced by a method that was developed by Frederick Sanger in 1975 and used to sequence the first human genome in 2003, at a probable cost of at least $500 million. A handful of next-generation sequencing technologies are now being developed and constantly improved each year. Dr. Quake’s technology is a new entry in that horse race.
"Dr. Quake calculates that the most recently sequenced human genome cost $250,000 to decode, and that his machine brings the cost to less than a fifth of that.
“ 'There are four commercial technologies, nothing is static and all the platforms are improving by a factor of two each year,' he said. 'We are about to see the floodgates opened and many human genomes sequenced.'
"He said the much-discussed goal of the $1,000 genome could be attained in two or three years. That is the cost, experts have long predicted, at which genome sequencing could start to become a routine part of medical practice" (Nicholas Wade, NY Times, http://www.nytimes.com/2009/08/11/science/11gene.html?8dpc).
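The cost trajectory quoted above can be sanity-checked with simple arithmetic. The sketch below takes the roughly $50,000 per-genome cost reported for Quake's machine and the "improving by a factor of two each year" pace he describes, and asks how long a strict annual halving would take to reach the $1,000 genome; the halving model is an assumption made here for illustration, not a claim from the article.

```python
# How many annual halvings take a $50,000 genome to the $1,000 target?
# The strict factor-of-two-per-year model is an illustrative assumption.
import math

start_cost = 50_000   # per-genome cost reported for Quake's machine (2009)
target_cost = 1_000   # the long-predicted "routine medicine" threshold

halvings_needed = math.log2(start_cost / target_cost)
print(f"Annual halvings needed: {halvings_needed:.1f}")  # about 5.6

cost = start_cost
for year in range(1, 4):
    cost /= 2
    print(f"After year {year}: ${cost:,.0f}")
```

Under pure annual halving the $1,000 goal would take closer to six years than the "two or three" quoted, which suggests the projection in the article assumes improvement faster than a factor of two per year.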
Google announced in its blog that it was displaying crowdsourced congestion data from GPS-enabled cell phones on Google Maps.
". . . When you choose to enable Google Maps with My Location, your phone sends anonymous bits of data back to Google describing how fast you're moving. When we combine your speed with the speed of other phones on the road, across thousands of phones moving around a city at any given time, we can get a pretty good picture of live traffic conditions. We continuously combine this data and send it back to you for free in the Google Maps traffic layers. It takes almost zero effort on your part — just turn on Google Maps for mobile before starting your car — and the more people that participate, the better the resulting traffic reports get for everybody.
"This week we're expanding our traffic layer to cover all U.S. highways and arterials when data is available. We're able to do this thanks in no small part to the data contributed by our users. This is exactly the kind of technology that we love at Google because it's so easy for a single person to help out, but can be incredibly powerful when a lot of people use it together. Imagine if you knew the exact traffic speed on every road in the city — every intersection, backstreet and freeway on-ramp — and how that would affect the way you drive, help the environment and impact the way our government makes road planning decisions. This idea, which we geeks call 'crowdsourcing,' isn't new. Ever since GPS location started coming to mainstream devices, people have been thinking of ways to use it to figure out how fast the traffic is moving. But for us to really make it work, we had to solve problems of scale (because you can't get useful traffic results until you have a LOT of devices reporting their speeds) and privacy (because we don't want anybody to be able to analyze Google's traffic data to see the movement of a particular phone, even when that phone is completely anonymous)" (http://googleblog.blogspot.com/2009/08/bright-side-of-sitting-in-traffic.html, accessed 12-18-2011).
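The aggregation idea in the quoted post can be sketched in a few lines: many phones anonymously report their speed on a stretch of road, and averaging the reports per road segment yields a live speed estimate. This is purely an illustrative model, not Google's actual pipeline, and the segment identifiers and numbers are invented for the example.

```python
# Illustrative sketch of crowdsourced traffic estimation: average the
# anonymously reported speeds for each road segment. Not Google's real
# pipeline; segment IDs and speeds are made up.
from collections import defaultdict
from statistics import mean

def estimate_traffic(reports):
    """reports: iterable of (segment_id, speed_mph) pairs."""
    by_segment = defaultdict(list)
    for segment, speed in reports:
        by_segment[segment].append(speed)
    # More participating phones means more samples per segment --
    # the post's point that reports improve as more people join in.
    return {seg: mean(speeds) for seg, speeds in by_segment.items()}

reports = [("US-101-N-42", 61), ("US-101-N-42", 58),
           ("US-101-N-42", 64), ("I-280-S-12", 24)]
print(estimate_traffic(reports))
```

The privacy problem the post mentions is harder than the averaging itself: the estimates must be computed so that no sequence of reports can be linked back to one phone's movements.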
IBM Research – Zurich scientists Leo Gross, Fabian Mohn, Nikolaj Moll and Gerhard Meyer, in collaboration with Peter Liljeroth of Utrecht University, published "The Chemical Structure of a Molecule Resolved by Atomic Force Microscopy," Science 325, no. 5944 (2009): 1110. doi:10.1126/science.1176210.
Using an atomic force microscope operated in an ultrahigh vacuum and at very low temperatures (−268°C or −451°F), the scientists imaged the chemical structure of individual pentacene molecules. For the first time ever, they were able to look through the electron cloud and see the atomic backbone of an individual molecule.
The abstract of the article is:
"Resolving individual atoms has always been the ultimate goal of surface microscopy. The scanning tunneling microscope images atomic-scale features on surfaces, but resolving single atoms within an adsorbed molecule remains a great challenge because the tunneling current is primarily sensitive to the local electron density of states close to the Fermi level. We demonstrate imaging of molecules with unprecedented atomic resolution by probing the short-range chemical forces with use of noncontact atomic force microscopy. The key step is functionalizing the microscope’s tip apex with suitable, atomically well-defined terminations, such as CO molecules. Our experimental findings are corroborated by ab initio density functional theory calculations. Comparison with theory shows that Pauli repulsion is the source of the atomic resolution, whereas van der Waals and electrostatic forces only add a diffuse attractive background."
♦ You can watch a video of the scientists discussing and explaining this discovery at IBM's Press Room (http://www-03.ibm.com/press/us/en/pressrelease/28267.wss, accessed 09-12-2009).
Swiss scientist Francis Schwarze of Empa, St. Gallen, and the Swiss violin maker Michael Rhonheimer of Baden received confirmation that the violin they had created using wood treated with a specially selected fungus compared favorably in a blind test against an instrument made in 1711 by the master violin maker of Cremona, Antonio Stradivari.
"In the test, the British star violinist Matthew Trusler played five different instruments behind a curtain, so that the audience did not know which was being played. One of the violins Trusler played was his own strad, worth two million dollars. The other four were all made by Rhonheimer – two with fungally-treated wood, the other two with untreated wood. A jury of experts, together with the conference participants, judged the tone quality of the violins. Of the more than 180 attendees, an overwhelming number – 90 persons – felt the tone of the fungally treated violin "Opus 58" to be the best. Trusler’s stradivarius reached second place with 39 votes, but amazingly enough 113 members of the audience thought that "Opus 58" was actually the strad! "Opus 58" is made from wood which had been treated with fungus for the longest time, nine months.
"Skepticism before the blind test
"Judging the tone quality of a musical instrument in a blind test is, of course, an extremely subjective matter, since it is a question of pleasing the human senses. Empa scientist Schwarze is fully aware of this, and as he says, 'There is no unambiguous scientific way of measuring tone quality.' He was therefore, understandably, rather nervous before the test. Since the beginning of the 19th century, violins made by Stradivarius have been compared to instruments made by others in so-called blind tests, the most serious of all probably being that organized by the BBC in 1974. In that test the world-famous violinists Isaac Stern and Pinchas Zukerman together with the English violin dealer Charles Beare were challenged to identify blind the 'Chaconne' Stradivarius made in 1725, a 'Guarneri del Gesu' of 1739, a 'Vuillaume' of 1846 and a modern instrument made by the English master violin maker Roland Praill. The result was rather sobering – none of the experts was able to correctly identify more than two of the four instruments, and in fact two of the jurors thought that the modern instrument was actually the 'Chaconne' Stradivarius.
"Biotech wood, a revolution in the art of violin making
"Violins made by the Italian master Antonio Giacomo Stradivarius are regarded as being of unparalleled quality even today, with enthusiasts being prepared to pay millions for a single example. Stradivarius himself knew nothing of fungi which attack wood, but he received inadvertent help from the 'Little Ice Age' which occurred from 1645 to 1715. During this period Central Europe suffered long winters and cool summers which caused trees to grow slowly and uniformly – ideal conditions in fact for producing wood with excellent acoustic qualities.
"Horst Heger of the Osnabruck City Conservatory is convinced that the success of the 'fungus violin' represents a revolution in the field of classical music. 'In the future even talented young musicians will be able to afford a violin with the same tonal quality as an impossibly expensive Stradivarius,' he believes. In his opinion, the most important factor in determining the tone of a violin is the quality of the wood used in its manufacture. This has now been confirmed by the results of the blind test in Osnabruck. The fungal attack changes the cell structure of the wood, reducing its density and simultaneously increasing its homogeneity. 'Compared to a conventional instrument, a violin made of wood treated with the fungus has a warmer, more rounded sound,' explains Francis Schwarze" (http://www.sciencedaily.com/releases/2009/09/090914111418.htm, accessed 10-08-2009).
"This fall, DePaul University journalism alumnus Craig Kanalley will teach what is believed to be the first college-level journalism course focused solely on Twitter and its applications. Kanalley is a digital intern at the Chicago Tribune.
"It is one of several innovative courses offered by DePaul’s College of Communication to help prepare students to work in the burgeoning digital landscape. Other journalism courses include niche journalism, reporting for converged newsrooms, backpack reporting and entrepreneurial journalism.
"Kanalley said his course, 'Digital Editing: From Breaking News to Tweets,' is really about learning how to make sense of the clutter of the Web, particularly in situations of breaking news or major developing stories, and how to evaluate and verify the authenticity of reports by citizen journalists.
“ 'Thousands share information about these stories and how they’re affected through Twitter every day, and there’s a need to sift through this data to find relevant information that provides story tips and additional context for these events,' Kanalley said.
"Students will especially focus on the social networking platform Twitter and apply concepts discussed in class to Kanalley’s live journalism Web site Breaking Tweets (www.breakingtweets.com), which integrates news and relevant Twitter feedback to create a one-of-a-kind Web experience for readers by providing eyewitness accounts of breaking news stories from around the world" (http://media-newswire.com/release_1098001.html, accessed 09-01-2009).
"Researchers in Israel say they have developed a computer program that can decipher previously unreadable ancient texts and possibly lead the way to a Google-like search engine for historical documents.
"The program uses a pattern recognition algorithm similar to those law enforcement agencies have adopted to identify and compare fingerprints.
"But in this case, the program identifies letters, words and even handwriting styles, saving historians and liturgists hours of sitting and studying each manuscript.
"By recognizing such patterns, the computer can recreate with high accuracy portions of texts that faded over time or even those written over by later scribes, said Itay Bar-Yosef, one of the researchers from Ben-Gurion University of the Negev.
" 'The more texts the program analyses, the smarter and more accurate it gets,' Bar-Yosef said.
"The computer works with digital copies of the texts, assigning number values to each pixel of writing depending on how dark it is. It separates the writing from the background and then identifies individual lines, letters and words.
"It also analyses the handwriting and writing style, so it can 'fill in the blanks' of smeared or faded characters that are otherwise indiscernible, Bar-Yosef said.
"The team has focused their work on ancient Hebrew texts, but they say it can be used with other languages, as well. The team published its work, which is being further developed, most recently in the academic journal Pattern Recognition due out in December but already available online. A program for all academics could be ready in two years, Bar-Yosef said.

"And as libraries across the world move to digitize their collections, they say the program can drive an engine to search instantaneously any digital database of handwritten documents.

"Uri Ehrlich, an expert in ancient prayer texts who works with Bar-Yosef's team of computer scientists, said that with the help of the program, years of research could be done within a matter of minutes. 'When enough texts have been digitized, it will manage to combine fragments of books that have been scattered all over the world,' Ehrlich said" (http://www.reuters.com/article/newsOne/idUSTRE58141O20090902, accessed 09-02-2009).
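The pixel-darkness procedure described in the article resembles standard document-image analysis: score each pixel by darkness, threshold to separate ink from background, then locate text lines from the rows that contain ink. The following is a minimal sketch of those two steps using the generic technique, not the Ben-Gurion team's actual code; the threshold value and the tiny synthetic page are arbitrary choices for illustration.

```python
# Two steps from the article's description: threshold the grayscale
# image to separate writing from background, then find text lines from
# the rows that contain ink. Generic sketch, not the researchers' code.
import numpy as np

def binarize(gray, threshold=128):
    """gray: 2-D array of 0-255 intensities; True where ink is."""
    return gray < threshold  # darker pixels are treated as writing

def find_text_lines(ink):
    """Return (start_row, end_row) spans of rows containing any ink."""
    row_has_ink = ink.any(axis=1)
    lines, start = [], None
    for i, has in enumerate(row_has_ink):
        if has and start is None:
            start = i
        elif not has and start is not None:
            lines.append((start, i - 1))
            start = None
    if start is not None:
        lines.append((start, len(row_has_ink) - 1))
    return lines

# Tiny synthetic "page": two dark strokes separated by blank rows.
page = np.full((6, 8), 255)
page[1, 2:6] = 30   # first text line
page[4, 1:7] = 40   # second text line
print(find_text_lines(binarize(page)))  # [(1, 1), (4, 4)]
```

The harder parts the researchers describe, such as recognizing letters and handwriting styles well enough to fill in faded characters, sit on top of this kind of segmentation.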
"In The Case for Books: Past, Present, and Future, Robert Darnton, a pioneer in the field of the history of the book, offers an in-depth examination of the book from its earliest beginnings to its changing—some even say threatened—place in culture, commerce and the academy. But to predict the death of the book is to ignore its centuries-long history of survival. The following are some of Darnton's observations.
"1. The Future. Whatever the future may be, it will be digital. The present is a time of transition, when printed and digital modes of communication coexist and new technology soon becomes obsolete. Already we are witnessing the disappearance of familiar objects: the typewriter, now consigned to antique shops; the postcard, a curiosity; the handwritten letter, beyond the capacity of most young people, who cannot write in cursive script; the daily newspaper, extinct in many cities; the local bookshop, replaced by chains, which themselves are threatened by Internet distributors like Amazon. And the library? It can look like the most archaic institution of all. Yet its past bodes well for its future, because libraries were never warehouses of books. They have always been and always will be centers of learning. Their central position in the world of learning makes them ideally suited to mediate between the printed and the digital modes of communication. Books, too, can accommodate both modes. Whether printed on paper or stored in servers, they embody knowledge, and their authority derives from a great deal more than the technology that went into them.
"2. Preservation. Bits become degraded over time. Documents may get lost in cyberspace, owing to the obsolescence of the medium in which they are encoded. Hardware and software become extinct at a distressing rate. Unless the vexatious problem of digital preservation is solved, all texts “born digital” belong to an endangered species. The obsession with developing new media has inhibited efforts to preserve the old. We have lost 80% of all silent films and 50% of all films made before World War II. Nothing preserves texts better than ink imbedded in paper, especially paper manufactured before the 19th century, except texts written in parchment or engraved in stone. The best preservation system ever invented was the old-fashioned, pre-modern book.
"3. Reading… and Writing. Time was when readers kept commonplace books. Whenever they came across a pithy passage, they copied it into a notebook under an appropriate heading, adding observations made in the course of daily life. The practice spread everywhere in early modern England, among ordinary readers as well as famous writers like Francis Bacon, Ben Jonson, John Milton, and John Locke. It involved a special way of taking in the printed word. Unlike modern readers, who follow the flow of a narrative from beginning to end (unless they are digital natives and click through texts on machines), early modern Englishmen read in fits and starts and jumped from book to book. They broke texts into fragments and assembled them into new patterns by transcribing them in different sections of their notebooks. Then they reread the copies and rearranged the patterns while adding more excerpts. Reading and writing were therefore inseparable activities. They belonged to a continuous effort to make sense of things, for the world was full of signs: you could read your way through it, and by keeping an account of your readings, you made a book of your own, one stamped with your personality.
"4. Piracy. Voltaire toyed with his texts so much that booksellers complained. As soon as they sold one edition of a work, another would appear, featuring additions and corrections by the author. Customers protested. Some even said that they would not buy an edition of Voltaire's complete works—and there were many, each different from the others—until he died, an event eagerly anticipated by retailers throughout the book trade. Piracy was so pervasive in early modern Europe that bestsellers could not be blockbusters as they are today. Instead of being produced in huge numbers by one publisher, they were printed simultaneously in many small editions by many publishers, each racing to make the most of a market unconstrained by copyright. Few pirates attempted to produce accurate counterfeits of the original editions. They abridged, expanded, and reworked texts as they pleased, without worrying about the authors' intentions.
"5. E-Books. I want to write an electronic book. Here is how my fantasy takes shape. An “e-book,” unlike a printed codex, can contain many layers arranged in the shape of a pyramid. Readers can download the text and skim the topmost layer, which will be written like an ordinary monograph. If it satisfies them, they can print it out, bind it (binding machines can now be attached to computers and printers), and study it at their convenience in the form of a custom-made paperback. If they come upon something that especially interests them, they can click down a layer to a supplementary essay or appendix. They can continue deeper through the book, through bodies of documents, bibliography, historiography, iconography, background music, everything I can provide to give the fullest possible understanding of my subject. In the end, they will make the subject theirs, because they will find their own paths through it, reading horizontally, vertically, or diagonally, wherever the electronic links may lead.
"6. Authorship. Despite the proliferation of biographies of great writers, the basic conditions of authorship remain obscure for most periods of history. At what point did writers free themselves from the patronage of wealthy noblemen and the state in order to live by their pens? What was the nature of a literary career, and how was it pursued? How did writers deal with publishers, printers, booksellers, reviewers, and one another? Until those questions are answered, we will not have a full understanding of the transmission of texts. Voltaire was able to manipulate secret alliances with pirate publishers because he did not depend on writing for a living. A century later, Zola proclaimed that a writer's independence came from selling his prose to the highest bidder. How did this transformation take place?
"7. The Book Trade. It may seem hopeless to conceive of book history as a single subject, to be studied from a comparative perspective across the whole range of historical disciplines. But books themselves do not respect limits either linguistic or national. They have often been written by authors who belonged to an international republic of letters, composed by printers who did not work in their native tongue, sold by booksellers who operated across national boundaries, and read in one language by readers who spoke another. Books also refuse to be contained within the confines of a single discipline when treated as objects of study. Neither history nor literature nor economics nor sociology nor bibliography can do justice to all aspects of the life of a book. By its very nature, therefore, the history of books must be international in scale and interdisciplinary in method. But it need not lack conceptual coherence, because books belong to circuits of communication that operate in consistent patterns, however complex they may be. By unearthing those circuits, historians can show that books do not merely recount history; they make it" (http://www.publishersweekly.com/article/CA6696290.html).
In a paper entitled "Material Degradomics: On the Smell of Old Books," published in the journal Analytical Chemistry, Matija Strlic of University College London and associates at the Tate art museum (U.K.), the University of Ljubljana, and Morana RTD in Ivančna Gorica (both in Slovenia) introduced a new method for linking a book’s physical state to its corresponding VOC (volatile organic compound) emissions pattern. The goal was to “diagnose” decomposing historical documents noninvasively as a step toward protecting them.
“Ordinarily, traditional analytical methods are used to test paper samples that have been ripped out,” Strlic says. “The advantage of our method is that it’s nondestructive” (http://pubs.acs.org/doi/full/10.1021/ac902143z?cookieSet=1).
"The test is based on detecting the levels of volatile organic compounds. These are released by paper as it ages and produce the familiar 'old book smell'.
"The international research team, led by Matija Strlic from University College London's Centre for Sustainable Heritage, describes that smell as 'a combination of grassy notes with a tang of acids and a hint of vanilla over an underlying mustiness'.
" 'This unmistakable smell is as much part of the book as its contents,' they wrote in the journal article. Dr Strlic told BBC News that the idea for new test came from observing museum conservators as they worked.
" 'I often noticed that conservators smelled paper during their assessment,' he recalled. 'I thought, if there was a way we could smell paper and tell how degraded it is from the compounds it emits, that would be great.'
"The test does just that. It pinpoints ingredients contained within the blend of volatile compounds emanating from the paper.
"That mixture, the researchers say, 'is dependent on the original composition of the... paper substrate, applied media, and binding' " (http://news.bbc.co.uk/2/hi/science/nature/8355888.stm).
According to Internetworldstats.com, there were about 1,733,993,000 Internet users on September 30, 2009, compared with about 360,985,000 on December 31, 2000.
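The two figures imply roughly a fivefold increase in Internet users over the decade; a quick check:

```python
# Growth implied by the two Internetworldstats.com figures above.
users_dec_2000 = 360_985_000
users_sep_2009 = 1_733_993_000

growth = users_sep_2009 / users_dec_2000
print(f"{growth:.1f}x growth between the two dates")  # 4.8x
```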
Oxford University Press published as a printed book the Historical Thesaurus of the Oxford English Dictionary with Additional Material from A Thesaurus of Old English, edited by Christian Kay, Jane Roberts, Michael Samuels, and Irene Wotherspoon.
Forty years in the making, this 4448-page work was the first historical thesaurus to be compiled for any language, and the first to include almost the entire vocabulary of English, from Old English to the present. It was also the largest thesaurus resource in the world, covering more than 920,000 words and meanings, based on the Oxford English Dictionary.
The Historical Thesaurus listed synonyms with their dates of first recorded use in English, arranged chronologically with the earliest synonyms first. For obsolete words, it also included the date of last recorded use.
The work used a specially devised thematic system of classification. Its comprehensive index enabled complete cross-referencing of nearly one million words and meanings. It contained a comprehensive sense inventory of Old English and a fold-out color chart which showed the top levels of the classification structure.
The following are quotations from Google CEO Eric Schmidt, selected from his interview on October 3, 2009 with Danny Sullivan of searchengineland.com, representing Schmidt's view of present problems and possible future solutions for newspapers and journalism impacted by the Internet:
"The number of readers for newspapers is declining. The market is becoming more specialized. There will always be a market for people who read the newspaper on a train going into New York City. There will always be a market for people who sit in the afternoon in a cafe in the city and read the newspaper in the sunshine. The term “killing” is a bit over[blown]. Newspapers face a long-term secular decline because of the shift in user habits due to the Internet."
"In the case of the newspapers, they have multiple problems which are hard to solve. If you think about it there are three fundamental problems. One is that the physical cost of things is going up, physical newsprint. Another one has been the loss of classifieds. And a third one has been essentially the difficulty in selling traditional print ads. So, all of them have online solutions. And we’ve come to the conclusion that the right thing to do is to help them with the online."
"We think that over a long enough period of time, most people will have personalized news-reading experiences on mobile-type devices that will largely replace their traditional reading of newspapers. Over a decade or something. And that that kind of news consumption will be very personal, very targeted. It will remember what you know. It will suggest things that you might want to know. It will have advertising. Right? And it will be as convenient and fun as reading a traditional newspaper or magazine.
"So one way to think about it is that the newspaper or magazine industry do a great job of the convenience of scanning and looking and understanding. And we have to get the web to that point, or whatever the web becomes. So we just announced, the official name is Google Fast Flip. And that’s an example of the kind of thing we’re doing. And we have a lot more coming."
"I specifically am talking about investigative journalism when I talk about this. There’s no lack of bloggers and people who publish their opinions and faux editorial writers and people with an opinion. And I think that one of the great things about the internet is that we can hear them. We can also choose to ignore them. So it’s not correct to say that the internet is decreasing conversation. The internet is clearly increasing conversation at an incredibly rapid pace. The cacophony of voices is overwhelming as you know.
"Well-funded, targeted, professionally managed investigative journalism is a necessary precondition in my view to a functioning democracy. And so that’s what we worry about. And as you know, that was always subsidized in the newspaper model by the other things that they did. You know, the story about the scandal in Iraq or Afghanistan was difficult to advertise against. But there was enough revenue that it allowed the newspaper to fulfill its mission" (http://searchengineland.com/google-ceo-eric-schmidt-on-newspapers-journalism-27172).
"According to a report being released Wednesday by Forrester Research, Cambridge, Massachusetts, e-reader sales will total an estimated 3 million this year, with Amazon selling 60 percent of them and Sony Corp. 35 percent."
"According to the Association of American Publishers, e-books accounted for just 1.6 percent of all book sales in the first half of the year. But the market is growing fast. E-book sales totaled $81.5 million in the first half, up from $29.8 million in the first six months of 2008.
"And [Jeff] Bezos said Amazon sells 48 Kindle copies for every 100 physical copies of books that it offers in both formats. Five months ago it was selling 35 Kindle copies per 100 physical versions.
"Bezos said that increase is happening faster than he expected.
" 'I think that ultimately we will sell more books in Kindle editions than we do in physical editions,' Bezos said in the interview, which was held in the Cupertino offices of Lab126, the Amazon subsidiary that developed the Kindle" (http://www.nytimes.com/aponline/2009/10/07/business/AP-US-TEC-Amazon-Kindle.html)
Sergey Brin, co-founder and technology president of Google published an Op-Ed piece regarding the Google Book Search program in The New York Times entitled, perhaps overly optimistically, "A Library to Last Forever," from which I quote without implied endorsement:
". . .the vast majority of books ever written are not accessible to anyone except the most tenacious researchers at premier academic libraries. Books written after 1923 quickly disappear into a literary black hole. With rare exceptions, one can buy them only for the small number of years they are in print. After that, they are found only in a vanishing number of libraries and used book stores. As the years pass, contracts get lost and forgotten, authors and publishers disappear, the rights holders become impossible to track down.
"Inevitably, the few remaining copies of the books are left to deteriorate slowly or are lost to fires, floods and other disasters. While I was at Stanford in 1998, floods damaged or destroyed tens of thousands of books. Unfortunately, such events are not uncommon - a similar flood happened at Stanford just 20 years prior. You could read about it in The Stanford-Lockheed Meyer Library Flood Report, published in 1980, but this book itself is no longer available.
"Because books are such an important part of the world's collective knowledge and cultural heritage, Larry Page, the co-founder of Google, first proposed that we digitize all books a decade ago, when we were a fledgling startup. At the time, it was viewed as so ambitious and challenging a project that we were unable to attract anyone to work on it. But five years later, in 2004, Google Books (then called Google Print) was born, allowing users to search hundreds of thousands of books. Today, they number over 10 million and counting.
"The next year we were sued by the Authors Guild and the Association of American Publishers over the project. While we have had disagreements, we have a common goal - to unlock the wisdom held in the enormous number of out-of-print books, while fairly compensating the rights holders. As a result, we were able to work together to devise a settlement that accomplishes our shared vision. While this settlement is a win-win for authors, publishers and Google, the real winners are the readers who will now have access to a greatly expanded world of books.
"There has been some debate about the settlement, and many groups have offered their opinions, both for and against. I would like to take this opportunity to dispel some myths about the agreement and to share why I am proud of this undertaking. This agreement aims to make millions of out-of-print but in-copyright books available either for a fee or for free with ad support, with the majority of the revenue flowing back to the rights holders, be they authors or publishers.
"Some have claimed that this agreement is a form of compulsory license because, as in most class action settlements, it applies to all members of the class who do not opt out by a certain date. The reality is that rights holders can at any time set pricing and access rights for their works or withdraw them from Google Books altogether. For those books whose rights holders have not yet come forward, reasonable default pricing and access policies are assumed. This allows access to the many orphan works whose owners have not yet been found and accumulates revenue for the rights holders, giving them an incentive to step forward.
"Others have questioned the impact of the agreement on competition, or asserted that it would limit consumer choice with respect to out-of-print books. In reality, nothing in this agreement precludes any other company or organization from pursuing their own similar effort. The agreement limits consumer choice in out-of-print books about as much as it limits consumer choice in unicorns. Today, if you want to access a typical out-of-print book, you have only one choice - fly to one of a handful of leading libraries in the country and hope to find it in the stacks." (http://www.nytimes.com/2009/10/09/opinion/09brin.html?scp=2&sq=sergey%20brin&st=cse, accessed 10-09-2009).
"The ghost of a fingerprint in the top left corner of an obscure portrait appears to have confirmed one of the most extraordinary art discoveries. The 33 x 23cm (13 x 9in) picture, in chalk, pen and ink, appeared at auction at Christie’s, New York, in 1998, catalogued as 'German school, early 19th century'. It sold for $19,000 (£11,400). Now a growing number of leading art experts agree that it is almost certainly by Leonardo da Vinci and worth about £100 million.
"Carbon dating and infra-red analysis of the artist’s technique are consistent with such a conclusion, but the most compelling evidence is that fragment of a fingerprint.
"Peter Paul Biro, a Montreal-based forensic art expert, found it while examining images captured by the revolutionary multispectral camera from the Lumière Technology company, Antiques Trade Gazette reports today.
"Mr Biro has pioneered the use of fingerprint technology to help to resolve art authentication disputes. Multispectral analysis reveals each layer of colour, and enables the pigment mixtures of each pixel to be identified without taking physical samples. The fingerprint corresponds to the tip of the index or middle finger, and is 'highly comparable' to one on Leonardo’s St Jerome in the Vatican. Importantly, St Jerome is an early work from a time when Leonardo was not known to have employed assistants, making it likely that it is his fingerprint.
"Martin Kemp, Emeritus Professor of History of Art at the University of Oxford, is convinced and recently completed a book about the find (as yet unpublished). He said that his first reaction was that 'it sounded too good to be true — after 40 years in the business, I thought I’d seen it all'. But gradually, “all the bits fell into place.”
"Professor Kemp has rechristened the picture, sold as Young Girl in Profile in Renaissance Dress, as La Bella Principessa after identifying her, 'by a process of elimination', as Bianca Sforza, daughter of Ludovico Sforza, Duke of Milan (1452-1508), and his mistress Bernardina de Corradis. He described the profile as 'subtle to an inexpressible degree', as befits the artist best known for the Mona Lisa.
"If it is by Leonardo, it would be the only known work by the artist on vellum although Professor Kemp points out that Leonardo asked the French court painter Jean Perréal about the technique of using coloured chalks on vellum in 1494.
"The picture was bought in 1998 by Kate Ganz, a New York dealer, who sold it for about the same sum to the Canadian-born Europe-based connoisseur Peter Silverman in 2007. Ms Ganz had suggested that the portrait 'may have been made by a German artist studying in Italy ... based on paintings by Leonardo da Vinci'.
"When Mr Silverman first saw it, in a drawer, 'my heart started to beat a million times a minute,' he said. 'I immediately thought this could be a Florentine artist. The idea of Leonardo came to me in a flash.'
"Carbon-14 analysis of the vellum gave a date range of 1440-1650. Infra-red analysis revealed stylistic parallels to Leonardo’s other works, including a palm print in the chalk on the sitter’s neck 'consistent ... to Leonardo’s use of his hands in creating texture and shading', according to Mr Biro" (http://entertainment.timesonline.co.uk/tol/arts_and_entertainment/visual_arts/article6872019.ece, accessed 10-14-2009).
Bonhams auctioneers announced that they had identified a Roman cameo glass vase which may be the most important of its kind in the world. Strikingly similar to the Portland Vase, it is larger, in better condition, and has superior decoration.
"The vase dates from between late First Century B.C. to early First Century A.D and stands 13in (33.5cm) high. Only 15 other Roman cameo glass vases and plaques are known to exist today. These very rare vessels were highly artistic, luxury items, produced by the Roman Empire’s most skilled craftsmen. They are formed from two layers of glass – cobalt blue with a layer of white on top – which is cut down after cooling to create the cameo-style decoration.
"Items of this kind were produced, it is thought, within a period of only two generations. They would have been owned by distinguished Roman families.
"Until now, the most famous example has been the Portland Vase, held by the British Museum. This is smaller, standing at only 9in (24cm) high. It is also missing its base and has been restored three times.
"The recently identified vase is also more complex than others of its kind, being decorated with around 30 figures and a battle scene around the lower register. By comparison, the Portland vase has just seven figures. Bonhams’ experts believe that this magnificent artefact could rewrite the history books on cameo vases. Unlike the Portland Vase, it still has its base and lower register and will therefore add significantly to the archaeological understanding of these vessels.
"The vase is thought to have resided in a private European collection for some time. The collector is a long-term client of Bonhams.
"In co-operation with leading experts in the field and with the present owner of the vase, Bonhams say they will be carrying out detailed research over the coming months into the historical background of the vase and its miraculous survival as well as into its more recent history and chain of ownership.
"The vase was presented publicly for the first time at a the 18th Congress of the International Association for the History of Glass at Thessaloniki in Greece in September, where it was viewed by around 200 of the world’s leading glass specialists" (http://www.antiquestradegazette.com/news/7312.aspx).
The Association of Research Libraries in Washington, D.C. presented a symposium entitled An Age of Discovery: Distinctive Collections in the Digital Age.
The full audio text of the symposium and PDFs of some of the presentations were available from the ARL website at the link provided.
Arbor Networks, Chelmsford, Massachusetts, the University of Michigan, and Merit Network presented the findings of the Internet Observatory Report at the North American Network Operators Group meeting (NANOG 47) in Dearborn, Michigan:
"• The report is believed to be the largest study of global Internet traffic since the start of the commercial Internet in the mid-1990s. The report offers analysis of two years worth of detailed traffic statistics from 110 large and geographically diverse cable operators, international transit backbones, regional networks and content providers.
"• At its peak, the study monitored more than 12 terabits-per-second and a total of more than 256 exabytes of Internet traffic over the two-year life of the study.
"• The Internet Observatory Report includes a discussion around significant changes in Internet topology and commercial inter-relationships between providers; analysis of changes in Internet protocols and applications; and a concluding analysis of Internet growth trends and predictions of future trends.
"• Evolution of the Internet Core: Over the last five years, Internet traffic has migrated away from the traditional Internet core of 10 to 12 Tier-1 international transit providers. Today, the majority of Internet traffic by volume flows directly between large content providers, datacenter / CDNs and consumer networks. Consequently, most Tier-1 networks have evolved their business models away from IP wholesale transit to focus on broader cloud / enterprise services, content hosting and VPNs.
"• Rise of the ‘Hyper Giants’: Five years ago, Internet traffic was proportionally distributed across tens of thousands of enterprise managed web sites and servers around the world. Today, most content has increasingly migrated to a small number of very large hosting, cloud and content providers. Out of the 40,000 routed end sites in the Internet, 30 large companies – “hyper giants” like Limelight, Facebook, Google, Microsoft and YouTube – now generate and consume a disproportionate 30% of all Internet traffic.
"• Applications Migrate to the Web: Historically, Internet applications communicated across a panoply of application specific protocols and communication stacks. Today, the majority of Internet application traffic has migrated to an increasingly small number of web and video protocols, including video over web and Adobe Flash. Other mechanisms for video and application distribution like P2P (peer-to-peer) have declined dramatically in the last two years.
"• A New Internet Ecosystem: Over the last five years, macroeconomic forces have radically transformed the global Internet commercial ecosystem. Economic changes, including the collapse of wholesale IP transit and the dramatic growth in advertisement-supported service, reversed decade-old business dynamics between transit providers, consumer networks and content providers. A wave of innovation is ongoing, with service providers now offering everything from triple play services to managed security services, VPNs and increasingly, CDNs. This change in the Internet business ecosystem has significant ongoing implications for backbone engineering, design of Internet scale applications and research."
According to Arbor Networks' 2009 Atlas Observatory Report, Google accounted for 6 percent of all Internet traffic of every type.
"And how many would have heard of a company called Carpathia Hosting? Its MegaUpload, MeaErotik, MegaClick and MegaVideo services have turned it into a company that now accounts for 1 percent of all Internet traffic, says Arbor, and this will doubtless grow. The important takeaway is that few of these companies had even been heard of two years ago, and very few of them are big telcos. To put all this into perspective, in 2007 Arbor found that the overwhelming majority of Internet traffic was accounted for by 30,000 entities, with fifty percent of traffic accounted for by around 10,000 companies.
"Only two years later that same fifty percent now runs through only 150 top 'content delivery networks' (CDNs), an astonishing consolidation made more remarkable by the fact that Internet traffic has grown significantly during that time.
" 'Up to 2007, The Internet meant connecting to lots of servers and data centres around the world,' notes Arbor's chief scientist, Craig Labovitz. Now there are barely 100 companies that matter. Traffic patterns tend to be hidden, mainly because the companies losing out - the traditional telcos and ISPs - don't exactly have an interest in advertising their waning status. The reason for their decline in importance is that Internet traffic is being driven by huge providers with access to content such as video.
" 'For 150 years, they [BT and other telcos] have had the same business model. Now everyone is trying to get away from being a dumb pipe.' Arbor's Atlas Internet Observatory report crunched traffic from 100 of the Internet's largest entities, accounting for 12 Terabytes of peak throughput, equivalent to about a quarter of the Internet's total at any one moment, said Labovitz.The importance of this is not simply that a small number of companies will account for a lot of traffic, but that these companies are increasingly what the Internet actually is. The Internet up to around 2007 was dominated by a hierarchy of companies, co-operating with one another to allow traffic to be passed from one to the other, regardless of size. The new Internet superpowers, in stark contrast, bypass a lot of this and use direct connections from one to the other. If a company is not part of this new core, it could find itself increasingly passed to the 'long tail', a polite way of saying they will be shoved to the fringe.
"Video, including video that runs over web/http, now accounts for an estimated 10 percent of all Internet traffic, and is one reason all these direct connections between large data centres are now necessary. IPv6 traffic remains tiny at only 0.03 percent of traffic, but is showing sudden and possibly rapid growth in recent months thanks to deployments by named hosters.
"Interestingly, P2P is in rapid decline, falling from around 3 percent of all traffic in 2007 to only half a percent now. Again, downloaders appear to prefer direct connectivity for downloads, mostly through port 80 and the web" (http://www.thestandard.com/news/2009/10/14/internet-now-dominated-traffic-superpowers)
Hockney had a history of exploiting new technologies in his art:
"Hockney continued to explore other media besides painting, most notably photography. From 1982-86, he created some of his best-known and most iconographic work — his “joiners,” large composite landscapes and portraits made up of hundreds or thousands of individual photographs. Hockney initially used a Polaroid camera for the photos, switching to a 35 mm camera as the works grew larger and more complex. In interviews, Hockney related the “joiners” to cubism, pointing out that they incorporate elements that a traditional photograph does not possess — namely time, space, and narrative.
"Always willing to adopt new techniques, in 1986 Hockney began producing art with color photocopiers. He has also incorporated fax machines (faxing art to an exhibition in Brazil, for example) and computer-generated images (most notably Quantel Paintbox, a computer system often used to make graphics for television shows) into his work" (http://www.pbs.org/wnet/americanmasters/episodes/david-hockney/the-colors-of-music/103/, accessed 01-09-2010).
The Internet Corporation for Assigned Names and Numbers (ICANN) voted to allow Web addresses written completely in Chinese, Arabic, Korean and other languages using non-Latin alphabets.
"The decision is a 'historic move toward the internationalization of the Internet,' said Rod Beckstrom, Icann’s president and chief executive. 'We just made the Internet much more accessible to millions of people in regions such as Asia, the Middle East and Russia.'
"This change affects domain names — anything that comes after the dot, including .com, .cn or .jp. Domain names have been limited to 37 characters — 26 Latin letters, 10 digits and a hyphen. But starting next year, domain names can consist of characters in any language. In some Web addresses, non-Latin scripts are already used in the portion before the dot. Thus, Icann’s decision Friday makes it possible, for the first time, to write an entire Internet address in a non-Latin alphabet.
"Initially, the new naming system will affect only Web addresses with 'country codes,' the designators at the end of an address name, like .kr (for Korea) or .ru (for Russia). But eventually, it will be expanded to all types of Internet address names, Icann said.
"Some security experts have warned that allowing internationalized domain names in languages like Arabic, Russian and Chinese could make it more difficult to fight cyberattacks, including malicious redirects and hacking. But Icann said it was ready for the challenge. 'I do not believe that there would be any appreciable difference,' Mr. Beckstrom said in an interview. 'Yes, maybe some additional potential but at the same time, some new security benefits may come too. If you look at the global set of cybersecurity issues, I don’t see this as any significant new threat if you look at it on an isolated basis.'
"The decision, reached after years of testing and debate, clears the way for Icann to begin accepting applications for non-Latin domain names Nov. 16. People will start seeing them in use around mid-2010, particularly in Arabic, Chinese and other scripts in which demand for the new 'internationalized' domain name system has been among the strongest, Icann officials say. Internet addresses in non-Latin scripts could lead to a sharp increase in the number of global Internet users, eventually allowing people around the globe to navigate much of the online world using their native language scripts, they said.
"This is a boon especially for users who find it cumbersome to type in Latin characters to access Web pages. Of the 1.6 billion Internet users worldwide, more than half use languages that have scripts that are not based on the Latin alphabet." (http://www.nytimes.com/2009/10/31/technology/31net.html?hp)
The Cray XT5 supercomputer, known as Jaguar, at the National Center for Computational Sciences at Oak Ridge National Laboratory in Oak Ridge, Tennessee, became the world's fastest supercomputer by operating at 1.75 petaflop/s (1.75 quadrillion floating point operations per second), according to the Top500 Linpack benchmark.
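The Linpack benchmark behind that ranking times the solution of a dense n-by-n system of linear equations, conventionally counted at about (2/3)n³ + 2n² floating point operations. A back-of-envelope sketch of what a sustained 1.75 petaflop/s implies; the desktop rate below is an assumed round figure, and real runs only approach a machine's peak:

```python
# Estimate how long a dense linear solve takes at a given sustained rate,
# using the LU-factorization operation count the Linpack benchmark assumes.

def linpack_flops(n: int) -> float:
    """Approximate flop count for solving a dense n x n linear system."""
    return (2.0 / 3.0) * n**3 + 2.0 * n**2

JAGUAR_RATE = 1.75e15    # 1.75 petaflop/s, as benchmarked
DESKTOP_RATE = 1.0e10    # ~10 gigaflop/s, an assumed 2009-era desktop figure

n = 1_000_000            # a one-million-unknown dense system
flops = linpack_flops(n)

print(f"{flops:.2e} flops")                            # 6.67e+17 flops
print(f"Jaguar:  {flops / JAGUAR_RATE:.0f} s")         # ~381 s
print(f"Desktop: {flops / DESKTOP_RATE / 86400:.0f} days")  # ~772 days
```

The cubic growth in the operation count is why such problems remain the province of machines like Jaguar: the same solve that takes minutes at petaflop rates would occupy an ordinary processor for years.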
"The company said that the deal will allow users to take advantage of the Wolfram Alpha algorithms and search tools within Bing queries.
"The initial partnership, which is expected to bear fruit within a few days, will focus on providing nutritional information to users as well as certain mathematical tools. When users search for foods or recipes, the engine will display a small tab containing nutritional information.
"Along with increasing traffic to the Bing service, Microsoft hopes that the features will allow users to better monitor their diet and exercise plans.
" 'This notion of creating and presenting computational knowledge in search results is one of the more exciting things going on in search (and beyond) today, and the team at Bing is incredibly fired up to bring some of this amazing work to our customers,' " programme managers Tracey Yao and Pedro Silva said in a blog posting.
"The Wolfram Alpha partnership is one of several campaigns Microsoft has embarked on to drum up traffic for Bing. Other recent additions include visual search results and the ability to search within a user's Hotmail archives" (http://www.v3.co.uk/v3/news/2253013/microsoft-gives-further-updates)
NASA announced that the Lunar CRater Observation and Sensing Satellite (LCROSS), managed by Ames Research Center, Moffett Field, California, and its companion rocket, which impacted in crater Cabeus near the Moon's south pole on October 9, 2009, revealed a "significant amount" of water.
This discovery had significant implications for supporting a manned base on the moon and for generating rocket fuel to further space exploration.
"From 3AM on Wednesday November 25, 2009, until 3AM the following day (US east coast time), WikiLeaks released half a million US national text pager intercepts. The intercepts cover a 24 hour period surrounding the September 11, 2001 attacks in New York and Washington.
"The messages were broadcasted 'live' to the global community — sychronized to the time of day they were sent. The first message was from 3AM September 11, 2001, five hours before the first attack, and the last, 24 hours later.
"Text pagers are usualy carried by persons operating in an official capacity. Messages in the archive range from Pentagon, FBI, FEMA and New York Police Department exchanges, to computers reporting faults at investment banks inside the World Trade Center
"The archive is a completely objective record of the defining moment of our time. We hope that its entrance into the historical record will lead to a nuanced understanding of how this event led to death, opportunism and war" (http://911.wikileaks.org/, accessed 11-26-2009).
According to BBC.com, the number of text messages published may have been as high as 573,000.
Among the numerous things I collect are DVDs and high-definition Blu-ray Discs. Toward the end of 2009 I noticed that certain classic films were being re-issued as Blu-ray discs packaged in the back of short hardcover books concerning the films. These were not books that happened to include a disc as supplementary material; in those cases the electronic data is often secondary to the physical book. What I bought was the Blu-ray disc, packaged inside a full-color book of 30 to 50 pages that was issued in the same size as the normal plastic Blu-ray clamshell boxes. The book is clearly secondary to the data—an excellent, informative way of packaging and storing the data.
Two Blu-ray discs that I purchased in December 2009 packaged in hardcover books were Robert Redford's film, A River Runs Through It, based on the elegantly written novella by Norman Maclean, and the 50th Anniversary edition of Alfred Hitchcock's North by Northwest. The back of each book contains a thick plastic insert attached to the inside of the rear cover to protect the disc. Both books contain full-color content that is well-presented and informative.
Why do I include these details in this database? To me, selling Blu-ray discs inside a book represents a notable physical example of the convergence of the book and electronic media. To a book collector this format is also superior to, and of greater interest than, the standard Blu-ray plastic clamshell box.
"Like many other well-known organizations, we face cyber attacks of varying degrees on a regular basis. In mid-December, we detected a highly sophisticated and targeted attack on our corporate infrastructure originating from China that resulted in the theft of intellectual property from Google. However, it soon became clear that what at first appeared to be solely a security incident--albeit a significant one--was something quite different.
"First, this attack was not just on Google. As part of our investigation we have discovered that at least twenty other large companies from a wide range of businesses--including the Internet, finance, technology, media and chemical sectors--have been similarly targeted. We are currently in the process of notifying those companies, and we are also working with the relevant U.S. authorities.
"Second, we have evidence to suggest that a primary goal of the attackers was accessing the Gmail accounts of Chinese human rights activists. Based on our investigation to date we believe their attack did not achieve that objective. Only two Gmail accounts appear to have been accessed, and that activity was limited to account information (such as the date the account was created) and subject line, rather than the content of emails themselves.
"Third, as part of this investigation but independent of the attack on Google, we have discovered that the accounts of dozens of U.S.-, China- and Europe-based Gmail users who are advocates of human rights in China appear to have been routinely accessed by third parties. These accounts have not been accessed through any security breach at Google, but most likely via phishing scams or malware placed on the users' computers.
"We have already used information gained from this attack to make infrastructure and architectural improvements that enhance security for Google and for our users. In terms of individual users, we would advise people to deploy reputable anti-virus and anti-spyware programs on their computers, to install patches for their operating systems and to update their web browsers. Always be cautious when clicking on links appearing in instant messages and emails, or when asked to share personal information like passwords online. You can read more here about our cyber-security recommendations. People wanting to learn more about these kinds of attacks can read this Report to Congress (PDF) by the U.S.-China Economic and Security Review Commission (see p. 163-), as well as a related analysis (PDF) prepared for the Commission, Nart Villeneuve's blog and this presentation on the GhostNet spying incident.
"We have taken the unusual step of sharing information about these attacks with a broad audience not just because of the security and human rights implications of what we have unearthed, but also because this information goes to the heart of a much bigger global debate about freedom of speech. In the last two decades, China's economic reform programs and its citizens' entrepreneurial flair have lifted hundreds of millions of Chinese people out of poverty. Indeed, this great nation is at the heart of much economic progress and development in the world today.
"We launched Google.cn in January 2006 in the belief that the benefits of increased access to information for people in China and a more open Internet outweighed our discomfort in agreeing to censor some results. At the time we made clear that 'we will carefully monitor conditions in China, including new laws and other restrictions on our services. If we determine that we are unable to achieve the objectives outlined we will not hesitate to reconsider our approach to China.'
"These attacks and the surveillance they have uncovered--combined with the attempts over the past year to further limit free speech on the web--have led us to conclude that we should review the feasibility of our business operations in China. We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China" (http://googleblog.blogspot.com/2010/01/new-approach-to-china.html, accessed 01-16-2010).
"First, we're introducing new features that bring your search results to life with a dynamic stream of real-time content from across the web. Now, immediately after conducting a search, you can see live updates from people on popular sites like Twitter and FriendFeed, as well as headlines from news and blog posts published just seconds before. When they are relevant, we'll rank these latest results to show the freshest information right on the search results page.
"Try searching for your favorite TV show, sporting event or the latest development on a recent government bill. Whether it's an eyewitness tweet, a breaking news story or a fresh blog post, you can find it on Google right after it's published on the web. . .
"Our real-time search enables you to discover breaking news the moment it's happening, even if it's not the popular news of the day, and even if you didn't know about it beforehand. For example, in the screen shot, the big story was about GM's stabilizing car sales, which shows under 'News results.' Nonetheless, thanks to our powerful real-time algorithms, the 'Latest results' feature surfaces another important story breaking just seconds before: GM's CEO stepped down.
"Click on 'Latest results' or select 'Latest' from the search options menu to view a full page of live tweets, blogs, news and other web content scrolling right on Google. You can also filter your results to see only 'Updates' from micro-blogs like Twitter, FriendFeed, Jaiku and others. Latest results and the new search options are also designed for iPhone and Android devices when you need them on the go, be it a quick glance at changing information like ski conditions or opening night chatter about a new movie — right when you're in line to buy tickets.
"And, as part of our launch of real-time on Google search, we've added 'hot topics' to Google Trends to show the most common topics people are publishing to the web in real-time. With this improvement and a series of other interface enhancements, Google Trends is graduating from Labs.
"Our real-time search features are based on more than a dozen new search technologies that enable us to monitor more than a billion documents and process hundreds of millions of real-time changes each day. Of course, none of this would be possible without the support of our new partners that we're announcing today: Facebook, MySpace, FriendFeed, Jaiku and Identi.ca — along with Twitter, which we announced a few weeks ago" (http://googleblog.blogspot.com/2009/12/relevance-meets-real-time-web.html, accessed 05-06-2010).
Google announced the Living Stories project, which provided a new, experimental way to consume news, developed by a partnership between Google, the New York Times, and the Washington Post.
"The announcement of the 'living stories' project shows Google collaborating with newspapers at a time when some major publishers have characterized the company as a threat. Google has also taken steps recently to project an image of itself as a friend to the industry.
"Living stories is a much-enhanced version of what some newspaper Web sites already do by grouping material by subject matter. In the case of The Times, the paper’s Web site has thousands of “topic pages.” But those efforts have not yielded heavy reader traffic or much advertising.
"The Google project, presented without ads, is now at livingstories.googlelabs.com, part of Google Labs, where the company tries out experimental products. If it is judged a success, it would eventually reside on the site of any publisher that wanted to use it. Those publishers could also sell ads on those pages.
"Google’s dominant search engine sells ads alongside search results that often include news articles, leading some newspaper industry leaders — particularly executives of the News Corporation, led by Rupert Murdoch — to cry foul. Other publishers say that, on the contrary, they owe much of their Internet traffic and revenue to search engines.
"Google executives argue that the tools their company has developed, including search, make them the papers’ ally, a case made by Eric E. Schmidt, Google’s chairman and chief executive, in an opinion piece published last week in The Wall Street Journal. Also last week, Google announced changes in the way its search function interacts with news sites, giving publishers more flexibility in limiting the material readers can see before encountering demands for payment or registration. The changes were relatively minor, but reinforced the message that the company wanted to help news sites.
" 'There’s been a series of steps to work with and mollify news publishers, to improve the P.R., and you can see the living page in that same vein,' said Ken Doctor, a media analyst with the analysis firm Outsell. The project is a genuine step forward, he said, because 'on most news sites, site search, looking for a lot on one subject, is awful.'
"Google worked for months on the project with journalists and Web staffs at The Times and The Post. For now, it covers just eight broad topics, like health care reform and the Washington Redskins. At the top of each subject page is a summary, a timeline of major events and pictures, followed by the opening sections of a series of articles, in reverse chronological order. A set of buttons allows the reader to narrow the topic. 'It’s an experiment with a different way of telling stories,' said Martin A. Nisenholtz, senior vice president for digital operations of The New York Times Company. 'I think in it, you can see the germ of something quite interesting.'
"A reader can call up an entire article without navigating away from the subject page, reading one piece after another without using the 'forward' and 'back' buttons. Josh Cohen, business product manager for Google News, said that having all the material appear on a single page would help the page rank higher in Internet searches than newspapers’ subject pages do now.
"In various ways, the experiment duplicates or improves on what can now be done on publishers’ own sites, through a search engine’s news function or even on Wikipedia. Mr. Cohen said that if it worked well, Google would make the software available free to publishers, much as those publishers now use Google Maps and YouTube functions on their sites" (http://www.nytimes.com/2009/12/09/technology/companies/09google.html?hpw).
If you photographed certain types of individual objects, the program would recognize them and automatically display links to relevant information on the Internet. If you pointed your phone at a building, the program would identify it using GPS; clicking on the building's name would then bring up relevant Internet links.
♦ On May 7, 2010 you could watch a video describing the features of Google Goggles online.
Avatar, an American science fiction epic film written and directed by the film director, producer, screenwriter, editor, and inventor James Cameron, and starring Sam Worthington, Zoe Saldana, Sigourney Weaver, Michelle Rodriguez, and Stephen Lang, was first released in London. It was distributed by Twentieth Century Fox, headquartered in Century City, Los Angeles.
"The film is set in the year 2154 on Pandora, a moon in the Alpha Centauri star system. Humans are engaged in mining Pandora's reserves of a precious mineral, while the Na'vi—a race of indigenous humanoids—resist the colonists' expansion, which threatens the continued existence of the Na'vi and the Pandoran ecosystem. The film's title refers to the genetically engineered bodies used by the film's characters to interact with the Na'vi.
"Avatar had been in development since 1994 by Cameron, who wrote an 80-page scriptment for the film. Filming was supposed to take place after the completion of Titanic, and the film would have been released in 1999, but according to Cameron, 'technology needed to catch up' with his vision of the film. In early 2006, Cameron developed the script, as well as the language and culture of the Na'vi. He said sequels would be possible if Avatar was successful, and in response to the film's success, confirmed that there will be another two.
"The film was released in traditional 2-D, as well as 3-D, RealD 3D, Dolby 3D, and IMAX 3D formats. Avatar is officially budgeted at $237 million; other estimates put the cost at $280–310 million to produce and $150 million for marketing. The film is being touted as a breakthrough in terms of filmmaking technology, for its development of 3D viewing and stereoscopic filmmaking with cameras that were specially designed for the film's production.
"Avatar premiered in London, UK on December 10, 2009, and was released on December 18, 2009 in the US and Canada to critical acclaim and commercial success. It grossed $27 million on its opening day domestically (in the United States and Canada) and $77 million domestically on its opening weekend. It opened two days earlier internationally and grossed $232 million worldwide in its first five days of international release. Within three weeks of its release, with a worldwide gross of over $1 billion, Avatar became the second highest-grossing film of all time worldwide, exceeded only by Cameron's previous film, Titanic" (Wikipedia article on Avatar (2009 film), accessed 01-16-2010).
♦ From my perspective the most significant aspect of Avatar, apart from its breathtaking computer graphic animation, and the fascinating artificial culture and language of the Na'vi, was the convincing portrayal of a total virtual reality experience, and the interplay between virtual reality, the reality of earth-born humans, some of whom animated the avatars, and the different reality of the Na'vi. The film presented visions of a reality that I could not have imagined before viewing. In its presentation of new views of reality it is reminiscent of the 1982 film, Blade Runner, directed by Ridley Scott.
Another timely aspect of the film was its depiction of the conflict between destructive exploitation of natural resources and living in harmony with nature.
Jean-Pierre Gérault, president of i2S, Pessac, France, announced the formation of a French consortium to scan the contents of French libraries. The project is called "Polinum," a French acronym that stands for "Operating Platform for Digital Books."
"French President Nicolas Sarkozy has made catching up on France's digital delay one of the national priorities by earmarking €750 million of a €35 billion ($51 billion) spending plan announced earlier this week for digitizing France's libraries, film and music archives and other repositories of the nation's recorded heritage. These funds will mainly go to French libraries, universities and museums, who will use them to develop their own plans for digitizing their holdings.
"The consortium, meanwhile, intends to be the technological choice for those institutions, Gerault said. He declined to estimate what part of the €750 million the consortium thinks it can capture.
"France's culture ministry has been in difficult negotiations with Google, which would like to help digitize France's archives but has met resistance in France over fears of giving the internet search giant too much control over the nation's cultural heritage, as well as over how it would protect the interests of authors and other copyright holders" (http://www.businessweek.com/ap/financialnews/D9CL4M480.htm, accessed 12-17-2009).
The Amazon Kindle was hacked, allowing for all purchased content to be transferred off the device via a PDF file.
"Kindle e-books are sold as .AZW files which have DRM that stops users from transferring the purchased books to other devices that are not Kindles.
"That should no longer be a problem thanks to Israeli hacker "Labba" who has cracked the DRM. A second hacker, 'I <3 cabbages,' has released the 'Unswindle' program, which will reformat digital content downloaded and stored on the Kindle for PC app, converting it to easily movable formats, such as PDF.
" 'Cabbages' did note that Amazon's DRM process was tough to crack, although ultimately Amazon's work was in vain. 'Amazon actually put a bit of effort behind the DRM obfuscation in their Kindle for PC application. And they seem to have done a reasonable job on the obfuscation. Way to go Amazon! It's good enough that I got bored unwinding it all and just got lazy with the Windows debugging APIs instead,' he said" (http://www.afterdawn.com/news/archive/20989.cfm#comments, accessed 01-02-2010).
According to Amazon.com, the company sold more Kindle books (ebooks) for Christmas than physical books. At this time the company had over 390,000 titles available for wireless download on the Kindle. The Kindle 2, which weighed 10.2 ounces, could store up to 1,500 books. The larger Kindle DX could store approximately 3,500 non-illustrated books.
Since the company did not release specific statistics, stating only that the Kindle was its best-selling product that season, it is unclear whether the number of books "sold" included the vast number of free titles available for the Kindle. It is also understandable that, since the Kindle was the company's best-selling product, buyers would have ordered multiple titles for each Kindle.
"Two interesting factoids emerge from the marketing verbiage: First, Kindle books outsold paper books on Christmas Day, the first time that has ever happened; Second, the Kindle is the 'most gifted item ever in our history,' according to Bezos. The first may not mean much, since Christmas Day isn’t necessarily a normal shopping day, though the volume of Kindle books sold suggests that on that day a lot of new Kindle users started stocking up on e-books. The second, an aggregate figure that appears to reflect all gifted items over all time, may be very significant or mean absolutely nothing at all, as the increase in online shopping and gifting continues to dwarf previous 'record-setting' gift sales by the law of large(r) numbers.
"Nevertheless, it is clear that this was the Kindle Christmas. During the third quarter of 2009, I estimated that Amazon sold 289,000 Kindles on sales growth of 60 percent year over year. We can assume, given the disappointing availability of most competitors, that Kindle grabbed a very large percentage of e-book reader sales this holiday season. However, it was also a poor Christmas overall, in terms of retail sales, even if Amazon did sell more stuff than ever before.
"So, how many more Kindles sold between the end of the Q3 and Christmas Day? Extrapolating from previous quarters, and assuming this was a break-out sales season for Kindle, meaning that it more than doubled over the previous quarter, factoring in the sales of Kindle books versus paper books as Christmas gift cards were redeemed yesterday, I estimate Amazon sold 419,000 Kindles in the fourth quarter, or 145 percent of the sales in Q3.
"That would make the total number of Kindles sold to date 1,491,000. Kindle now represents approximately 65 percent of the hardware reader market despite the appearance of Barnes & Noble’s Nook, which may reach 30,000 units in the quarter because of delays" (http://blogs.zdnet.com/Ratcliffe/?p=486, accessed 01-02-2010).
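The quoted estimate hangs together arithmetically. A minimal sketch in Python, reconstructing the analyst's stated figures (this is my reconstruction of the quoted numbers, not the analyst's actual model):

```python
# Figures quoted from the ZDNet estimate above.
q3_units = 289_000        # estimated Kindle sales in Q3 2009
q4_units = 419_000        # estimated Kindle sales in Q4 2009
total_to_date = 1_491_000 # estimated cumulative Kindles sold

# Q4 as a share of Q3: 419,000 / 289,000 is roughly 1.45,
# i.e. "145 percent of the sales in Q3" as the quote states.
q4_vs_q3 = q4_units / q3_units
print(f"Q4 vs Q3: {q4_vs_q3:.0%}")

# The cumulative total implies about 1,072,000 Kindles
# had been sold before Q4 2009.
prior_units = total_to_date - q4_units
print(f"Sold before Q4 2009: {prior_units:,}")
```

The "145 percent" figure is thus consistent with the two quarterly estimates, and subtracting Q4 from the running total shows what the analyst implicitly assumed about all prior sales.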