On my last trip to India, I met Srinath Srinivasa, a faculty member at the International Institute of Information Technology (IIIT-B) and author of an interesting book called “The Power Law of Information”.
An excerpt from the book, related to information overload, is here…
So how big is the increase? The School of Information Management and Systems (SIMS) at the University of California at Berkeley, USA, recently conducted a study to estimate the amount of information generated and exchanged using print and “online” media like the internet and the telephone. In the year 2002, we seem to have generated 5 exabytes of new information on magnetic media, print, film and optical storage. To get the mathematics straight, one exabyte is 10 to the power of 18 bytes (that is, 1,000,000,000,000,000,000 bytes). This is just the amount of new information that is added to an already existing pool.
Similarly, an estimated 18 exabytes of new information was exchanged through electronic channels like TV, radio and the internet in 2002. It is also estimated that the amount of new information generated doubles every three years, creating an equivalent of Moore’s Law for information growth.
To understand how much one exabyte is, let us try counting up to one exabyte. An average desktop computer can count up to 10,000 in one second, running an application program written in a high-level programming language. If we ask this computer to simply count up to one exabyte and do nothing else, it will take more than 3 million years.
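The arithmetic behind that claim is easy to verify. A minimal sketch in Python, assuming the counting rate of 10,000 per second quoted above:

```python
# How long would it take to count to one exabyte (10^18)
# at 10,000 increments per second, as quoted in the excerpt?
EXABYTE = 10**18                  # the target count
COUNTS_PER_SECOND = 10_000        # assumed counting rate
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = EXABYTE / COUNTS_PER_SECOND / SECONDS_PER_YEAR
print(f"{years:,.0f} years")      # roughly 3.2 million years
```

Dividing 10^18 by 10^4 gives 10^14 seconds, which works out to a little over 3 million years, consistent with the excerpt’s figure.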
Today, information access is not the problem; there is plenty of information around us. What is needed is access to relevant and trusted information.