Lee Rainie and Janna Anderson, Imagining the Digital Future
Warren Yoder: People have changed before. "The hard work of adaptation will continue as we learn to use AI tools to create lives for ourselves and selves for our lives. Change comes quickly. Wisdom comes slowly."
"Every age has its terrors. The terror for early moderns was electricity, a new and previously unthinkable force that dominated both their imaginations and their nightmares. Mary Wollstonecraft Shelley made this terror visible when she created Frankenstein, a monstrous technologist. She helped early moderns domesticate their fear, making it possible to imagine both dangers to avoid and possibilities for electricity to improve their everyday lives.
"We are now grappling with a level of artificial intelligence previously imagined only in science fiction. The initial reaction of the intellectual class was epistemic panic. But people adapt.
"AI enters a world dominated by human culture, a vast super-intelligence to which every human contributes their minuscule part. The first to define the new reality were members of the informal Silicon Valley Central Committee, tech leaders united by their common debts and desires. Now, world culture is catching up. Merriam-Webster contributed to the domestication of AI when it made 'slop' the word of the year.
"Our adaptation will accelerate at the same time that AI slop takes over advertising, social media and much of our digital communication. We are moving quickly to develop new ethics and legal responses to counterbalance the Silicon Valley Central Committee's defining vision. The hard work of adaptation will continue as we learn to use AI tools to create lives for ourselves and selves for our lives. Change comes quickly. Wisdom comes slowly.
"Philosophers are already finding their place in AI alignment. Artists must be next. We need artists who can make the AI terror of our age visible, much as Mary Shelley brought electricity to life so that we could vicariously experience the monsters we did not want to become."
Lee Rainie and Janna Anderson, Imagining the Digital Future
Warren Yoder: The valorization of science fiction has opened the way for tech leaders to recast puffery as serious prediction, thus boosting hype cycles; 'humans are more than intelligence.'
"Philosophy may be the discipline most transformed in the next decade by the exploding interaction between humans and AIs. Now that we are not the only beings who can ask what kind of beings we are, old questions will be reframed and new questions asked.
"What does it mean to be human? Are we fundamentally thinking stuff, as René Descartes ('I think, therefore I am') proposed, or is there more to being human than just intelligence? When AI is roughly as intelligent as a human individual, will capitalism inevitably drive AGI to subjugate human culture? Is there a better way? Many of the answers we have now do not serve us well. The task of philosophy, both professional and popular, is to make sense of the sense we make. Engineers can think of philosophy as a stress test for ideas. Until we cooperatively come up with better ideas, let us avoid these four simple misconceptions:
"Naive communication theory: When we communicate, we are trying to understand something someone somewhere created to express their own understanding. When we query an AI, we create all the understanding ourselves. The public large language models today are correlation engines that do not have human-level understanding. Querying an AI, in a real sense, is communicating with the Zeitgeist. The biases, fabrications and incitements to violence of raw AI are all-too-honest reflections of the spirit of our times. Thank goodness for the heavy overlay of human engineering that teaches AI the social mores required for polite company. Expect this human engineering, including your own query engineering, to become ever more essential.
"Exponential expectations: Exponential functions are a delightful part of pure mathematics. They don't exist in the natural world. Any exponential function let loose in the natural world would soon turn the whole universe into its output. Paper clips, say. That obviously hasn't happened. Instead, rapid growth is usually driven by sigmoidal S curves: exponential growth followed by exponential slowing. Continued growth can be achieved by stacking sigmoidal functions, but that runs into its own constraints. Anyone using exponential language to describe artificial intelligence isn't thinking clearly.
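The contrast Yoder draws can be sketched numerically. The snippet below compares unbounded exponential growth with logistic (sigmoidal) growth, which starts out exponentially and then saturates at a carrying capacity. The growth rate `r` and capacity `K` are illustrative values chosen for this sketch, not figures from the text.

```python
import math

def exponential(t, x0=1.0, r=0.5):
    """Unbounded exponential growth: x(t) = x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, K=100.0, x0=1.0, r=0.5):
    """Sigmoidal (logistic) growth: tracks the exponential at first,
    then flattens out as it approaches the carrying capacity K."""
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

# Early on the two curves are nearly indistinguishable; later the
# logistic curve saturates near K while the exponential explodes.
for t in (0, 2, 10, 20, 40):
    print(f"t={t:>2}  exponential={exponential(t):>14.2f}  logistic={logistic(t):>8.2f}")
```

Running this shows the two curves agreeing closely at small `t`, after which the logistic curve levels off just below 100 while the exponential passes a hundred million — the "stacked sigmoids" pattern Yoder describes is what real-world growth looks like once constraints bind.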
"Pure puffery: Smart phones aren't actually 'smart.' The neural nets in AI models only superficially resemble the living neural connectomes in our brains. These neologisms are puffery: exaggerated statements not amenable to disproof. Marketing puffery is allowed by the commercial legal code, but it is always the enemy of clear thought. The valorization of science fiction has opened the way for tech leaders to recast puffery as serious prediction, thus boosting hype cycles to support their venture capital. Think through big claims, step by step, for yourself.
"Crumbling assumptions: Ideas we use to explain our world were all created in other times for other uses. We are constantly repurposing old ideas as we struggle to understand our rapidly changing reality. Some of these ideas cannot bear the added weight of new meaning. Intelligence is a good example. It had one meaning in Latin, another in the Middle Ages, only to be deprecated as unusable by early modernists. Intelligence was repurposed in the early 1900s by newly minted psychologists, first for the military, then academia, now for the rest of the world. We know higher scores on intelligence tests are correlated with success in some tasks and professions. But we have never agreed what intelligence means exactly. Some try to shoehorn social and emotional intelligence into the idea. We could even describe human culture as a super intelligence transcending generations and geographies.
"The creative intelligentsia obviously prize intelligence, and their work trained and named early AI. But humans are clearly more than intelligence. We are only now realizing what it means to repurpose a concept we never clearly defined to describe a thing we barely understand. How we think of intelligence is falling apart in our hands, too vague to help us decide if we have achieved artificial general intelligence. Honesty requires us to frankly acknowledge the inherent limits of our assumptions.
"The next 10 years will be a contentious time as we think through what it means to rely on AI. There will be countless misleading, thoughtless and even impossible claims made by people who should know better. Philosophy, the love of wisdom, will be essential as we struggle to understand our new realities."
Lee Rainie and Janna Anderson, Imagining the Digital Future
Warren Yoder: The path to 2040 will be a jumble of unanticipated developments in tech, culture and policy
"The next 15 years will be a time of confusion, partly because of the initial misdirection and partly because the results of generative machine learning expose how little we know about ourselves. The path to 2040 will be a disordered jumble, full of unanticipated developments in technology, culture and public policy. Will machine learning make human life better or worse? Yes. Both. And many other things besides. Machine learning is capital- and expertise-intensive. Those who develop and finance machine learning have demonstrated over and over again that they have remarkably limited understanding of the complexity of both human individuals and society. This is most obvious in the names chosen for the new field. The basic technology was described as neural networks, even though neurons are far, far more complex. The field was called artificial intelligence, even though intelligence is a poor representation of humanity's culture-based capabilities. No one objected when these names were mere marketing puffery. Now that machine learning has developed modest capabilities, these misleading definitions are a serious misdirect."
Janna Anderson and Lee Rainie, Elon University and Pew Research Center
Warren Yoder, longtime director at Public Policy Center of Mississippi, now an executive coach, said, "As the 21st century picks up speed, we are moving beyond a focus on the protocol-mediated computation of the Internet. The new focus is on computation that acts upon itself, not yet with autonomous agency, but certainly moving in that direction. Three beneficial changes stand out for the medium-term promise they offer: machine learning, synthetic biology and the built world.
"ChatGPT and other large language models command most of the attention at the moment because they speak our languages. Text, images and music are how we communicate with each other and now, with computation. But machine learning offers much more. It promises to revolutionize math and science, disrupt the economy and change the way we produce and engage with information. Educators are rethinking how they teach. Many of the rest of us will realize soon that we must do the same. COVID-19 vaccines arrived in the nick of time, a popular introduction to the potential of synthetic biology. Drug discovery, mRNA treatments for old diseases, modifying the immune system to treat autoimmune disorders and many other advances in synthetic biology promise dramatically improved treatments. Adding computation to the built environment is generally called the Internet of Things. But that formulation does not at all prepare the imagination for the computational changes we are now experiencing in our physical world. Transportation, manufacturing, even the normal tasks of everyday life will see profound gains in efficiency.
"Haunting each of these beneficial changes are the specters of gross misuse, both for the entrepreneur class's vanity and for big-business profit. We could lose not only our privacy, but also our freedom of voice and of exit. Our general culture is already adapting. Artists quickly protested the appropriation of their freely shared work to create the machine learning tools that could replace them. We do not generally acknowledge the speed of culture change, which happens even faster than technology change. Culture slurps tech with its morning coffee.
"Governance, on the other hand, is a messy business. The West delegates initial governance to the businesses that own the tech. Only later do governments try to regulate the harmful effects of tech. The process works poorly, but authoritarian regimes are even worse. In the medium-term, how well we avoid the most harmful effects of machine learning, synthetic biology and the built world depends on how well we cobble together a governance regime. The pieces are there to do an adequate job in the United States and the European Union. Success is anyone's guess."
Janna Anderson and Lee Rainie, Pew Research Center
Warren Yoder encouraged humanity to scrutinize its overall transition, writing: "Postmodernity interrogated modern power and knowledge. It was useful, back then. Now meta-modernity recognizes the existence of multiple modes of the real and prompts one's imagination to take bits and pieces from useful practices wherever we find them."
Warren Yoder, longtime director at Public Policy Center of Mississippi, now an executive coach, wrote, "The smartphone became part of daily life because it commanded foveal vision to present engaging social media stories. Its 1D sound was adequate for 2D storytelling. It was good enough then; it is tedious now. Today's rudimentary metaverse commands binocular vision and surround sound, with some attempts at haptic touch but little development on balance, position, smell or the other senses. Meta makers will have to do better for the sensory experience of the metaverse to command sustained attention.
"An immersive metaverse will also have to command more of the human imperatives that drive our attention. We have decidedly mixed examples for the three positive imperatives. Social media has shown us how to capture the social imperative for nefarious purposes. Porn and sex toy makers are working with the sexual imperative. Education meta makers are exploring ways to truly engage our innate curiosity. Still untouched are the two aversive imperatives of homeostasis and pain. They may not seem natural candidates for the metaverse. But what humans have done in the past, meta makers will redo in the metaverse. Physical challenges have a long and storied history. Will meta makers create desert marathons for participants to run to exhaustion? Will metagroups create painful, scarifying initiations?
"Before horrific developments overtake us again, we need deeper conversations about this new mode of being. Fortunately, key philosophers are doing useful work. Not the philosophers arguing for and against transhumanism. Look instead to those exploring the transition from postmodernity to metamodernity. Postmodernity interrogated modern power and knowledge. Useful back then. Now metamodernity recognizes the existence of multiple modes of the real and prompts one's imagination to take bits and pieces from useful practices wherever we find them.
"We have already begun constructing a new metamorphic reality not limited by old binary contradictions. The metaverse will develop in a world with a metamodern imagination. It is time for an astute foundation to bring together meta makers and metamodern philosophers to deepen this conversation."
Lee Rainie, Janna Anderson, and Emily A. Vogels, Pew Research Center
Warren Yoder, longtime director of the Public Policy Center of Mississippi, now an executive coach, responded, "Widespread adoption of real, consequential ethical systems that go beyond window dressing will not happen without a fundamental change in the ownership structure of big tech. Ethics limit short-term profit opportunities by definition. I don't believe big tech will make consequential changes unless there is either effective regulation or competition. Current regulators are only beginning to have the analytic tools to meet this challenge. I would like to believe that there are enough new thinkers like Lina Khan (U.S. House Judiciary – antitrust) moving into positions of influence, but the next 12 months will tell us much about what is possible in the near future."
Janna Anderson and Lee Rainie, Elon University and Pew Research Center
"Much will change in the practice of representative democracy by 2030. Democracy is an ideal that must be substantiated in a particular practice. Representative democracy is the predominant practice now, but it is inherently fragile and must be re-formed every political generation. Winning political power in a representative democracy requires skills and resources that elites learn to control. But elites are prone to gradually isolating themselves in self-referential communities. The politicians, operatives and supporters all have much the same education, experiences and life chances. As times change, they lose the ability to create compelling accounts that represent the new reality. The Great Recession, several foolish wars and growing inequality created such a generational change. The digital world allows many new actors to participate in forming new accounts and competing for power. We are at a low point in the changeover, with populist leaders using digital media to command the political narrative. But this has happened many times in the past with pamphleteers, muckraking newspapers, radio, deregulated television. Each time the political world reformed itself with new elites that mastered the new world. The changeover is already happening. From the current low point things will get better, just in time for a new generational crisis beginning soon after 2030."