Kevin Kelly, co-founder of Wired magazine and The WELL and author of the blog Cool Tools, gave a talk called "The Next 100 Years of Science: Long-term Trends in the Scientific Method" at the Long Now Foundation Lecture Series. I was not really familiar with Kelly’s writing, but I attended because of my interest in the topic and because of Kelly’s reputation as a respected commentator. Needless to say, his Cool Tools blog did not prepare me for what to expect from the talk, nor does it do his talents justice.
Kelly is a self-proclaimed scientist groupie: a college drop-out who has never worked in technology as a scientist or engineer. He contributes as a cultural commentator, which is how he approached this lecture. Kelly said that he is more interested in the process of science than in science itself, and noted that most scientists are “clueless” about the topic. His interest in the future of science lies in how the process will evolve rather than in what actual breakthroughs will be made. So there was no speculation on the forthcoming prevalence of jetpacks, flying cars, or replicants (those would be technological advances rather than scientific ones, anyway).
Despite the forward-looking title, Kelly spent much of his talk detailing key developments in the history of science. To predict future developments in the scientific method, he looked for patterns in the scientific process over the past 2000 years.
Kelly’s abbreviated timeline of the scientific process went like this:
- 2000 BC: First bibliography
- 250 BC: First catalog
- 200 BC: First library with an index
- 1000 AD: First collaborative encyclopedia
- 1590: First controlled experiment
- 1600: Introduction of laboratories
- 1609: Introduction of scopes
- 1650: Society of experts created
- 1665: Concept of necessary repeatability introduced
- 1665: First scholarly journal published
- 1675: Peer review introduced
- 1687: Concept of hypothesis/prediction introduced
- 1920: Falsifiability introduced
- 1926: Randomized design created
- 1937: Controlled placebo approach developed
- 1946: First computer simulation
- 1950: Double-blind refinement
- 1962: Kuhn’s study of the scientific method
All of these are changes to the process of how we know something. The introduction of falsifiability, for example, affected what we would consider a scientific theory: if a theory cannot, even in principle, be proven wrong, then it isn’t a scientific theory at all (and is better categorized as a belief).
After detailing his view of how the scientific method has evolved up until now, Kelly then went on to present five predictions of how science and the scientific method would change over the next century:
- Science will change in the next 50 years as much as it changed in the last 400. No doubt. Everything is accelerating, although we are highly unlikely to achieve a singularity as Ray Kurzweil suggests.
- It will be a Bio century. Kelly presented data showing that biology is already the biggest scientific field today and suggested that what we stand to learn about it over the next several decades will overshadow developments in every other field.
- Computers will lead the Third Way of Science. Kelly argued that the general methods for making scientific progress have so far been Measurement and Hypothesis, and that Computer Simulation will become just as important a tool in the scientist’s arsenal for advancing our knowledge and understanding. Don’t know how something works? Run simulations over every possible parameter set and permutation until you accurately model the behavior of the process you are observing (a toy sketch of this kind of parameter sweep appears after this list). I see this already in my field, and simulations certainly play a significant role in our understanding of many different systems today, from economics to physiology.
- Science will create new ways of knowing. Kelly (I think) is talking about tools here. He mentioned wikis, distributed computing, journals of negative results, and triple-blind experiments as examples of recent changes in how information is developed and shared. Distributed computing splits a parallel-processed problem across many connected computers, as is already done for SETI and for cancer research (a toy sketch of this idea also follows the list). Triple-blind experiments refer to gathering massive amounts of data and storing it for future experiments that haven’t yet been specified, with a swath so broad that the control data can also be extracted from the same database.
- Science will create a new level of meaning. Here Kelly extrapolated the concept of distributed computing by speculating on the power of all the computers on the internet acting as a single computing machine, and he drew analogies between this massive system and the structure of the brain. I have to admit, my notes are sketchy on this section, but they include discussion of both science and religion as consisting of infinite games and recursive loops, and proclamations that Science Is Holy and that the long-term future of science is a divine trip. I guess you’ll have to wait for his book for an explanation of these concepts.
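To make the simulation-sweep idea a bit more concrete, here is a minimal sketch (not anything Kelly presented) of brute-forcing a parameter space: a toy model is run for every combination of parameters and scored against observed data, and the best-fitting combination wins. The toy_model, the parameter grid, and the "observed" values are all invented for illustration.

```python
# Toy illustration of simulation as a "third way": sweep every combination of
# model parameters and keep the one whose output best matches observations.
from itertools import product

observed = [0.0, 0.8, 1.5, 2.1, 2.6]  # made-up measurements of some process

def toy_model(growth_rate, damping, steps=5):
    """Stand-in simulation: damped growth over discrete time steps."""
    value, trajectory = 0.0, []
    for _ in range(steps):
        value += growth_rate - damping * value
        trajectory.append(value)
    return trajectory

def error(simulated, observed):
    """Sum of squared differences between simulation and observation."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed))

param_grid = {
    "growth_rate": [0.5, 0.8, 1.0, 1.2],
    "damping": [0.0, 0.1, 0.2, 0.3],
}

best = None
for growth_rate, damping in product(param_grid["growth_rate"], param_grid["damping"]):
    score = error(toy_model(growth_rate, damping), observed)
    if best is None or score < best[0]:
        best = (score, growth_rate, damping)

print(f"best fit: growth_rate={best[1]}, damping={best[2]}, error={best[0]:.3f}")
```

Real simulation studies are obviously far more sophisticated, but the pattern is the same: enumerate parameters, simulate, and compare against observation.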
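Similarly, since distributed computing gets only a one-line definition above, here is an equally hypothetical miniature of the idea: a coordinator slices one large, embarrassingly parallel job into independent work units and farms them out to workers, the way volunteer projects split a search over telescope data or candidate molecules across thousands of home PCs. The chunking scheme and the crunch function are invented placeholders, and local processes stand in for remote volunteer machines.

```python
# Toy model of volunteer/distributed computing: a coordinator splits one big
# job into independent work units, workers crunch them, and results are merged.
from concurrent.futures import ProcessPoolExecutor

def crunch(work_unit):
    """Stand-in for the expensive analysis each remote machine would run."""
    start, end = work_unit
    return sum(n * n for n in range(start, end))  # placeholder computation

def make_work_units(total, chunk):
    """Coordinator side: slice one large problem into independent chunks."""
    return [(i, min(i + chunk, total)) for i in range(0, total, chunk)]

if __name__ == "__main__":
    units = make_work_units(total=1_000_000, chunk=100_000)
    with ProcessPoolExecutor() as pool:  # local processes stand in for remote machines
        partial_results = list(pool.map(crunch, units))
    print("combined result:", sum(partial_results))
```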
The Q&A session after the talk was perhaps the most interesting part of the seminar. Kelly has clearly spent a lot of time thinking about these issues, and his thoughts are entertaining and intellectually stimulating even if you think he has completely missed the boat or take issue with his non-scholarly approach.
Kevin Kelly seems like he would be an interesting guy to meet at a party for a memorable night of discussion.