This is my first blog post for a course at Creighton University entitled “Technology and Leadership.” The course is a part of the Interdisciplinary Doctor of Education program, in which I am a student (clearly). Throughout the semester, I will be posting more thoughts and reflections from the course… to my classmates who are reading along this semester, I look forward to sharing these ideas with you and learning from you as always.
Is the world “flat” or “spiky”? And what happens when machines become smarter than we are?
In The World Is Flat: A Brief History of the Twenty-first Century, Friedman (2007) suggests that the earth (in the social, economic, and technological sense) is “flattening” because of improvements to communication and travel technology. The assertion is that technological advances since 1989 (para. 10) have allowed individuals and businesses to dramatically increase productivity, expand markets, and enhance innovation through broader talent networks. In many ways, Friedman is correct. Technology has decreased the time it takes to do complicated tasks and increased our ability to connect with one another. The internet’s beginnings as a government/university research and collaboration network speak to the genetics of how we use network technology.
Taken from the technological determinist perspective (e.g., McLuhan, 1964), Friedman might have us believe that the seemingly ubiquitous nature of technology and the internet is responsible for major shifts in human consciousness and subsequent economic shifts – on par with the effects written language had on society (e.g., Ong, 1982; Shlain, 1998). The claim is that it pervades our lives in such a way as to change the way we think, what we perceive as real, how we do business, and even how we conceive of human relationships. If that were truly the case – or when it inevitably becomes the case – I would agree that networked technology will indeed fundamentally alter human communication and consciousness. We have already seen its effects in countries with advanced communication infrastructures like the US. In other countries, however, where even literacy hasn’t yet completely transformed society, technology will not thrive in the same ways it has elsewhere. (I’d be interested to explore further the notion of skipping literacy in the shift to technology…)
What Friedman may miss, and what Florida (2005) asserts, is that the technology (no matter how good) is not sufficient for the complete “flattening” of the world. The social construction of technology approach (e.g., Bijker, Hughes, & Pinch, 1987) would suggest that a technology is only as influential as its host environment allows it to be. That is, technologies (in all forms) emerge and thrive because of economics, culture, and people’s readiness to adopt the innovation. The printing press, for example, became a pivotal technological innovation in human history because of the climate of late medieval Europe at the time of its invention. Scientific work was blooming, and intellectual curiosity was creating a demand for information. Economics and trade were calling for standards of language and measure and were pushing innovation from the financial side. Exploration and travel meant that more people across Europe, Asia, and Africa were being exposed to written language in the form of mass replications of text via the printing press. I believe that literacy has indeed fundamentally changed many societies, yet may still be a novelty in some others. If we plotted it, I think we’d see a trend of technological hotspots across medieval Europe similar to those presented in Florida’s (2005) article.
Does superintelligence eliminate the social constructivist narrative entirely? In some senses, I think that if the human condition is eliminated from the equation, and AI is given free rein over further development and adoption of “technologies,” we may see a very different topology. As Bostrom (2015) discussed in his talk, when machines begin to learn and adapt, the potential exists for them to move beyond even the limitations of the physical (or sociocultural) environments in which they operate. If self-preservation becomes a value of learning machines, it could follow that they would devise ways of guaranteeing technological adoption and subsequent control of societies through a kind of determinism. If, as Shlain (1998) argues, the invention of writing systems led to the rise of hegemonic masculinity and the subjugation of women, who’s to say that a similarly nefarious plan couldn’t be hatched by the machines to subjugate humanity? I don’t know if I believe this (or just don’t want to), but I recognize the power of technology as both a shaper of and a respondent to humanity. I sincerely hope that we figure out, as Bostrom (2015) suggests, how to manage our continued exploration of technology!
I’ve failed to address how this practically applies to my work… as a teacher, the influence of networked life on students is profound. Simple examples like language fluency show how changing communication technology shapes learning. The creation of unrealistic social expectations through MMORPGs and other gaming systems impacts how students interact with one another in the classroom. Children who develop cognitively in fixed-rules gaming/app environments become college students with a fixed understanding of what’s possible in the world. I think this last example (anecdotally) creates myriad problems in creative fields such as graphics, web design, and advertising; an area I plan to continue exploring!
Can you tell I love this stuff?!
Bijker, W. E., Hughes, T. P., & Pinch, T. J. (Eds.). (1987). The social construction of technological systems: New directions in the sociology and history of technology. Cambridge, MA: MIT Press.
Bostrom, N. (2015, March). What happens when our computers get smarter than we are? [Video file]. Retrieved from https://www.ted.com/talks/nick_bostrom_what_happens_when_our_computers_get_smarter_than_we_are
Florida, R. (2005, October). The world is spiky. Atlantic Monthly, 48–51.
Friedman, T. (2007). The world is flat 3.0: A brief history of the twenty-first century. London: Picador.
McLuhan, M. (1964). Understanding media: The extensions of man. New York: McGraw Hill.
Ong, W. (1982). Orality and literacy: The technologizing of the word. New York: Methuen.
Shlain, L. (1998). The alphabet versus the goddess: The conflict between word and image. London: Penguin Books.