Meeting notice: The 02.Sept.03 meeting will be held at 7:30 p.m. at the Royal East (782 Main St., Cambridge), a block down from the corner of Main St. and Mass Ave. If you're new and can't recognize us, ask the manager. He'll probably know where we are. More details below.

<-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><->

Suggested topic: The surveillance society and technological innovation

We can all see with painful clarity the open path leading to a world in which isolated individuals, let alone small sects, will be able to wreak quite impressive levels of damage on society. There seem to be only two ways of reacting to this situation: altering brains so as to make sociopathic acts impossible even to imagine, or imposing a level of surveillance that is ubiquitous and continuous, in which either "they" watch "us" all the time or everyone watches everyone (the Brin/Steve Mann variant). The completely transparent society seems marginally less odious than the brain rebuild. The point is sometimes made that it is at least provably compatible with human existence, since we as a species have spent 99% of our history in societies in which everyone knew everything about everyone (and remembered it for generations).

The point is well taken, but it is also true that for that same 99% of our history there was next to no technological innovation. Probably 99.99999% of the ideas humans have had regarding the design and improvement of tools (and much else, like science) have occurred during less than .1% of our history. What stands out about the last thousand years or so that can explain this association?

Perhaps the most common suggestion is that innovation requires a certain level of population -- perhaps to support a sufficiently intense degree of interaction -- and world populations did not reach those levels until recently. However, the reverse might also be true: that innovation depends on isolation and a relative attenuation of social relationships (which is why nerds are the way they are). In the tribal context, where everyone is being constantly watched, the earliest and most tentative deviations from group practice and thought can be instantly caught and "corrected," by ridicule if by nothing stronger. Hunter-gatherer groups by their nature are unlikely to tolerate the asocial absorption in, and the undirected, ungoverned exploration of, materials that innovation requires. On this theory, only when it became possible to move out from under the spotlight of clan monitoring did innovation start to become psychologically possible.

Does it follow that technical innovation and a transparent society are similarly incompatible in this century? It is easy to imagine a society inflamed with fear becoming quite heavy-handed about explorations that will lead it knows not where. In a world where such investigations can be monitored by anyone, that hand could get quite heavy indeed. Imagine a world in which lawyers for established interests could dash off cease-and-desist letters, or technophobic public interest groups mount campaigns, or law enforcement agencies learn of technologies they think might threaten their powers or responsibilities, just a few hours after a new idea has occurred to someone (when their investment, and therefore their incentive to persist, is at a minimum). In such a world the rate of innovation would be much lower than it is today.
As that rate collapses, innovation will become increasingly marginalized, and therefore subject to the social hostility that is often directed at the marginal, let alone at people trying to force change on a society that has learned, perhaps from bitter experience, to fear it. In a perfectly transparent society that level of hostility might be sufficient to bring innovation to a halt.

Perhaps this is another solution to the Fermi paradox: societies tolerate innovation until they develop powers frightening enough to make them stop. This point is the opposite of the singularity. Call it 'the exit'. If the exit happens before the singularity, societies may never develop technology sufficient for space exploration on any scale.

<-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><->

Simon's new contact data:
Home: Simon D. Levy, 3 Lexington Ave., Lexington, VA 24450, 540-458-5041
Office: Computer Science Department, Washington & Lee University, Lexington, VA 24450, 540-458-8419 (voice), levys@wlu.edu

<-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><->

In twenty years half the population of Europe will have visited the moon.
-- Jules Verne, 1865

<-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><->

Announcement Archive: http://www.pobox.com/~fhapgood/nsgpage.html

<-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><-><->

Legend: "NSG" expands to Nanotechnology Study Group. The Group meets on the first and third Tuesdays of each month at the above address, which refers to a restaurant located in Cambridge, Massachusetts. The NSG mailing list carries announcements of these meetings and little else. If you wish to subscribe to this list (perhaps having received a sample via a forward), send the string 'subscribe nsg' to majordomo@world.std.com. Unsubs follow the same model. Discussion should be sent to nsg-d@world.std.com, which must be subscribed to separately. You must be subscribed to nsg-d to post to it, and must post from the address from which you subscribed (an anti-spam thing). Comments, petitions, and suggestions re list management to: nsg@pobox.com.