Signs of the Singularity and Why Chris Anderson and Nicholas Carr Won’t Make the Next Cut
I noticed a similarity recently in posts from Chris Anderson and Nicholas Carr. Over the past few months both of these widely read authors published a thought-provoking post that calls into question humanity’s stewardship of knowledge in today’s 2.0 world. And each post contains signs of the singularity. Read on, brave traveler, but don’t forget to bring your towel!
Anderson, in The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, postulates a world of technological utopianism without realism. Throughout his post Anderson challenges the scientific method with citations from authorities like Box and Turing. Despite the strength of each of his premises, the absurdity of Anderson’s challenge to the scientific method is surpassed only by the weakness of his reasoning. Anderson would do well to watch The Matrix again, where he’ll find Neo reading Baudrillard’s Simulacra and Simulation, and hopefully recognize that he is advocating a technological utopianism that follows the precession of the simulacra. That is dangerous not only for Anderson, but also for those whose fascination with technology overwhelms their ability to think clearly.
As I mentioned in my previous post, The State of the Semantic Web: Representation and Realism, despite its fragile foundation, model theory implies realism. The relation between a model and the world may be only one of approximation, but without realism, technological utopianism quickly proceeds to simulacra and simulation. For those who are interested, John Sowa, in Process and Causality, provides a very useful visualization in Figure 12 of the relation between the world, a model, and a theory that Anderson would do well to better understand. Anderson’s claim that “[...] faced with massive data, this approach to science – hypothesize, model, test – is becoming obsolete” cannot be correct. Although he leads the reader to believe Google’s success is based solely on statistical induction, Google, a company that measures everything, has a well-defined mechanism to validate the realism on which the models it derives from statistical induction are based: its income statement and its stock price. Currently Google’s PageRank approach is holding up in the short term, but I recently had lunch with Vint Cerf and owe him an email about semantics. Semantics are a pressing issue for Google, and competition in semantic search is increasing with Microsoft’s acquisition of Powerset.
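For readers unfamiliar with how PageRank works, here is a minimal power-iteration sketch. This is an illustrative toy, not Google’s production algorithm; the function name, the toy link graph, and the parameter values are my own assumptions for the example.

```python
# Toy PageRank via power iteration. Illustrative only; Google's actual
# system is far more elaborate than this sketch.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Every page gets a baseline share from the random-jump term.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: "c" is linked from both "a" and "b".
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Notice that the ranks always sum to one and that the most-linked-to page accumulates the largest share, which is the statistical-induction flavor of inference Anderson celebrates: structure extracted from data, with no hypothesis in sight.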
Anderson’s claim that statistical induction on large data sets will replace the scientific method is simply absurd. Induction, deduction, and abduction all imply a scientific method through which either observer or participant embraces reality. Drew Conway’s The Hubris of the End of Theory provides useful insights on Anderson’s claims from a statistician’s perspective. It’s no wonder that Nicholas Carr believes it essential to serve as a skeptic against technological utopians like Anderson.
Carr, in Is Google Making Us Stupid?, postulates that there’s a behavior evolving in society: widely available information expressed in binary relations without transitive closure, with the Internet as the medium through which it is conveyed, is leading unwitting individuals to engage in habits that build cognitive pathways which reduce their attention span. And we can’t stop. Carr describes his own experience succumbing to this pernicious affectation, as well as his unsettling feeling that he can neither control nor reverse the process already underway in his own life. Ultimately, Carr concludes: “That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.” According to Carr, our wills have somehow been overcome by a force stronger than reason or survival.
I’ll admit that I spend a good deal of time engaged in the behavior Carr describes. I call it surveying, and I’ll claim that I’ve discovered some amazing things that I would not otherwise have known: Enterprise Architecture and the Information Flow Framework are just two examples. Today, services like StumbleUpon propose to automate that process. Possibly Carr would benefit from a hobby like transcendental meditation, or from enlisting in the military, where he might develop the discipline to walk away from the machine when he feels himself losing control. But most importantly, Carr can overcome his condition by developing a complete theory to guide his surveying. And from this complete theory, possibly using one that Anderson has jettisoned, Carr will develop an intuitive sense of closure and put his conscience at ease.
Carr could also develop a sense of being in the long now. By being in the long now I mean a patience that values the experience of knowledge gained over time, without fear of loss or the limitation implied by immediacy. A long time ago we called that wisdom. This sense of being in the long now gives someone the confidence to undertake a multi-year project. Mick Goodrick, an avid follower of Gurdjieff and a guitar teacher of mine in what now seems so long ago, understood well what it means to be in the long now. Faced with a lifetime of mastering techniques to communicate emotion through sound, Goodrick advocated well-defined projects with a bounded subject, a fixed time period, and no fixed outcome. In music theory there’s no shortage of subjects from which to choose, and the ordering of subjects into patterns is just another project. And part of Goodrick’s technique is retrospective: look back on what you accomplish and allow that experience to further shape your practice. Carr could design a project in which he developed his own theory based on what he surveyed over a two-year period, then retrospectively analyze his theory in the context of information theory, starting with Shannon’s A Mathematical Theory of Communication, followed by Barwise and Seligman’s Information Flow: The Logic of Distributed Systems, and finally Goguen’s theory of institutions.
By building on his premise that behavior builds habit reinforced by cognitive pathways, Carr perpetuates the myth of a technological dystopia: the myth that our intelligence is becoming subservient to that of machines. Carr says: “Still, their easy assumption that we’d all ‘be better off’ if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.”
Those who have invested decades in the advancement of artificial intelligence will attest that we’re not so close to the singularity that we can’t avoid a technological dystopia. And no matter how much money Larry Page and Sergey Brin have in the bank, the proofs left to us by Turing and Gödel remain a considerable challenge to scientists and programmers alike, despite the science fiction of Vernor Vinge or the optimism of Ray Kurzweil. And despite the ongoing work in synthetic biology, the works of Alan Watts stand as a testament to understanding the fundamental challenge of modeling organism with mechanism.
So the similarities in the posts by Anderson and Carr are signs of the singularity: Anderson, a technological utopian, who claims that the scientific method no longer has a place in technology; and Carr, a technological dystopian, who has no theory at the foundation of his surveying. This is not the singularity of Vinge or Kurzweil, but a utilitarian singularity that is here today: present in the use of technology grounded in scientific discipline, not in the rejection of reason, and in the technology that shapes our daily lives and sets us free through signs that we sometimes understand and sometimes don’t. Through deeper study, we’ll better understand the signs of the singularity.