When the World Changes Underneath Us: Leading for the Common Good in a Time of Political/Technological Upheaval
Using the See-Judge-Act Method with Catholic Social Teaching
Historians call moments like ours a phase change—a term from physics that describes when water doesn’t just get hotter but becomes something else entirely: steam, ice, a new state.
We are living through a societal phase change right now—a fundamental reshaping of how we work, relate, make decisions, raise children, find community, and understand ourselves. The shift is driven by politics, and politics in turn is being driven by artificial intelligence, social media, algorithmic governance, automation, surveillance technology, and the decline of the local, face-to-face institutions that once held communities together. We used to say that all politics is local. If that were still true, why is so much of the money now spent on mass media?
This isn’t a reason for panic. But it is a reason for seriousness, because phase changes don’t wait for us to be ready, and they don’t distribute their costs and benefits evenly.
What does it mean to lead for the common good when the world is being reshaped beneath us? The Catholic social tradition offers five principles that speak directly to this moment. Joined to the wisdom of three prophetic thinkers — Thomas Merton, Marshall McLuhan, and Mortimer Adler — they form a framework not to predict the future, but to help us navigate it faithfully.
Human Dignity
The most fundamental claim in Catholic social thought is also the most countercultural one right now: every human being is created in the image of God. The theological term is imago Dei. What it means, practically and politically, is that no person can ever be reduced to a number, a category, a data profile, or a prediction.
Emerging technology creates enormous pressure to do exactly that. Credit scores shape what families can borrow. Hiring algorithms screen résumés before any human reads them. Recommendation engines decide what news we see, what products we’re shown, what ideas we’re exposed to — based not on what’s true or good or important, but on what will keep us engaged. In the criminal justice system, algorithmic tools predict the likelihood of reoffending and influence sentencing. In schools, software flags students’ emotional states and academic trajectories before a teacher has had a chance to know them.
These systems aren’t usually malicious. Most are built by people trying to do useful work. But there is a structural tendency embedded in how they operate: they treat human beings as outputs. You are the sum of your patterns. You are, in the language of machine learning, a prediction.
Thomas Merton — writing in the 1960s, long before any of this existed in its current form — warned against what he called the “enormous mass of mental and emotional rubbish that clutters our minds.” Today, some of that rubbish is the assumption that if a system runs on data, it must be neutral — that efficiency and fairness are the same thing. They are not.
Marshall McLuhan, the Canadian media theorist, gave us an insight that a Jesuit priest, John Culkin, later distilled into a memorable phrase: “We shape our tools, and thereafter our tools shape us.” McLuhan’s own version of this idea runs throughout his work, including his landmark book Understanding Media, where he argued that every technology extends some human faculty — but in doing so, it also amputates another. The printing press extended the reach of the written word and diminished the oral tradition. Television extended visual storytelling and shortened the attention span required for sustained reading. What does our current wave of technology extend? Speed. Pattern recognition. Global reach. What might it amputate? Patience. Presence. The irreducible human act of looking at another person and deciding: this person matters, and I want to understand them.
Mortimer Adler, the philosopher and lifelong champion of the Great Books tradition, spent his career insisting that any system — educational, economic, or technological — that treats people as means rather than ends has failed its most basic moral test. He grounded this in the ancient Greek concept of eudaimonia: the full, flourishing human life. Not productivity. Not efficiency. Flourishing.
What leadership looks like: Consistently and publicly affirm that a person is always more than their data. When new technologies, or political movements driven by technological change, are introduced, review them and ask specific questions: What are we optimizing for? Who made this choice, and who evaluated it? If a system is wrong about someone, ensure there is a real person they can talk to and a transparent process for correcting errors.
The Common Good
The second principle is the common good — the conviction that the good of each is genuinely inseparable from the good of all. Society must be structured so that every person, not just some, can flourish.
Emerging technology tends to work against this. Wealth and capability are concentrating at a pace and scale we haven’t seen in a century. The organizations that can afford to build and deploy the most powerful technological systems gain structural advantages that compound over time — in hiring, marketing, logistics, and capital allocation. Meanwhile, entire communities are on the wrong end of what’s sometimes called the digital divide: rural areas without reliable broadband, older adults cut off from services that have moved entirely online, and lower-income families whose children lack access to devices and learning tools that wealthier peers take for granted.
But it runs deeper than access. When the systems used to make consequential decisions — who gets a loan, whose neighborhood gets investment, which students are recommended for advanced programs — are trained on data that reflects existing inequalities, those inequalities get automated. They become faster, cheaper, and nearly invisible. The technical term for this is algorithmic bias. The moral term is injustice.
Social media compounds the problem in a different direction. Platforms optimized for engagement — for keeping us scrolling — have discovered that outrage, anxiety, and tribal conflict are more engaging than nuance or solidarity. The result is communities more divided, families more strained, and public discourse more degraded than at any point in recent memory. This is not an accident. It is the logical output of a system optimized for the wrong thing.
McLuhan described our era as a global village — a world knit together by electronic media. He saw it as a potential blessing, but he was clear-eyed about the risks. He noted that new communication technologies have historically widened gaps and deepened conflict before societies learn to absorb and integrate them. We are living that warning in real time.
Adler stressed that equality is foundational, not optional, in a just society. Access to the conditions of flourishing — including access to the information systems that increasingly shape economic and civic life — is not a luxury. It is a justice claim.
What leadership looks like: With every new technology or political system, ask: Who benefits? Who is left out? Act on these answers, even when it’s difficult. The common good must be a standard of accountability, not just a talking point.
Subsidiarity
Now we arrive at the principle that may be the most urgently needed — and the most thoroughly violated — in our current moment: subsidiarity.
Subsidiarity holds that decisions should be made at the lowest appropriate level, the level closest to the people they affect, and elevated to higher levels only when genuinely necessary. It is a principle designed to protect local wisdom, human judgment, and the dignity of participation. It says: the people who know the situation best should be the ones making the call.
Emerging technology systematically does the opposite.
It centralizes. It takes decisions that once belonged to teachers, doctors, loan officers, local officials — people who knew the specific human being or community in front of them — and moves them to algorithms, platforms, and distant corporate headquarters that have never met the person and never will. The teacher who once said, “I know this child, and these test scores don’t tell the whole story,” now works in a system where software has already scored the child’s essay, flagged their engagement patterns, and projected their academic trajectory. The loan officer who once knew that the business owner across the desk had just weathered a family emergency has been replaced by a credit model that sees none of it. The local newspaper that once covered the school board meeting is gone, and the platform that replaced it shows residents content optimized for their existing beliefs rather than information about their actual community.
Local knowledge — irreplaceable, relational, particular — is being systematically extracted from consequential decisions. What fills the vacuum is platform control, algorithmic authority, and the assumption that aggregate data is always wiser than situated human judgment. It often isn’t.
McLuhan saw this dynamic clearly. He understood that scale changes everything. A technology that seems like a neutral tool at the individual level can, when deployed globally, erase what matters most at the local level. He called this a kind of cultural imperialism: one set of assumptions, embedded in a technology, imposed on everyone who touches it.
Merton’s entire life was an argument for the value of the local, the personal, the particular. His hermitage in the Kentucky woods was not an escape from the world. It was a witness to the idea that there are things you can only know up close — and that the contemplative, attentive mode of knowing is not less real than the analytic one. In certain respects, it is more real.
Adler spent decades defending the importance of genuine human judgment — the kind that cannot be reduced to a formula. In his Paideia framework for education, every student deserved not just instruction but Socratic dialogue: a real, responsive, irreducible encounter with another thinking human being. That encounter cannot be automated. And yet we keep trying.
What leadership looks like: Pushing decisions back down whenever possible. Designing systems that augment human judgment rather than replace it — that give the teacher, the manager, the doctor better information rather than handing them a verdict to rubber-stamp. Keeping humans meaningfully in the loop, not nominally — with real authority, real information, and real accountability. This is not anti-technology. It is pro-human. There is a difference.
Solidarity
The fourth principle is solidarity — the recognition that we are responsible for one another, and especially for the most vulnerable among us. This is not sentimentality. It is a structural claim about the nature of human society: we are bound together, and how we treat the least among us reveals who we really are.
Emerging technology can violate solidarity in ways that are both obvious and subtle.
The obvious: AI systems trained on historically biased data reproduce historical patterns of discrimination — in criminal justice, housing, healthcare, and lending. A recidivism prediction model trained on data from a system that has historically over-policed certain communities learns to over-predict risk for people from those communities. It doesn’t intend to discriminate. It has been trained to. Facial recognition software performs significantly worse on darker skin tones. Voice assistants struggle with certain accents. Medical diagnostic tools were developed on clinical data that underrepresented women and people of color, and their accuracy reflects it. The technical term for this is underrepresentation. The moral term is exclusion.
The subtler violation: technology can fragment the social bonds on which solidarity depends. Algorithms that maximize engagement tend to maximize conflict. Platforms that connect us globally can leave us more isolated locally. The bowling leagues, parish councils, neighborhood associations, and union halls that once gave ordinary people organized voices in public life have declined steadily, while the platforms that replaced them as gathering spaces are owned by a handful of corporations with no stake in the local community.
Merton, speaking in Calcutta just weeks before his death in December 1968, said something that has never been more relevant: “We are already one. But we imagine that we are not. And what we have to recover is our original unity.” The work of solidarity is not creating a connection that doesn’t exist. It is removing the illusions and structures — including technological ones — that make us act as if it doesn’t.
McLuhan’s global village was not a utopia. He saw that the same technologies that made the suffering of a stranger on the other side of the world immediate and undeniable could also produce cacophony and tribalism. More communication does not automatically mean more connection. It can mean more fragmentation.
What leadership looks like: Asking, with every system you rely on: Who is made invisible by this? Whose experience is missing? Whose outcomes are worse because they don’t fit the dominant pattern? Then, do something about it. Investing in more inclusive data. Auditing for disparate impact. Building real redress mechanisms for people who are harmed. Supporting the local institutions — including parishes — that hold communities together in ways no algorithm can replicate. Solidarity is not a feeling. It is a discipline.
Vocation and Meaning
The fifth and perhaps most personal principle is this: human work is not simply productive activity. It is participation in meaning. It is the expression of creativity, care, and calling. The theological word is vocation — the sense that our work is bound up with who we are and who we are called to become.
Emerging technology promises efficiency. The gains are real, and we shouldn’t pretend otherwise. But efficiency doesn’t answer the deeper question: What is the work for?
When you remove the human being from a task, you also remove the meaning that person was making. The teacher who grades papers by hand isn’t just checking answers. They are learning how their students think. They are noticing who is struggling, who has turned a corner, and who has something surprising to say. Automate the grading, and the throughput goes up — and something essential disappears. The same is true of the nurse who used to have time to sit with a patient, the pastor who used to make house calls, the craftsman whose pride in the work was itself a form of witness.
This isn’t nostalgia. It’s a recognition that certain kinds of human activity are valuable not only for their outputs but for what they do to the people performing them — and to the people receiving them.
Merton warned, again and again, of a civilization full of noise but empty of depth. His deepest concern wasn’t war or poverty, as important as those were to him. It was what he called the loss of the contemplative dimension — the capacity to be present, to listen, to encounter the real. He feared we were building a world that could do everything fast and nothing well.
McLuhan argued that every medium has both content and effect, and that the effect is often more significant than the content. Social media doesn’t just distribute information; it shapes how we relate to one another. AI doesn’t just automate tasks; it shapes how we think about what tasks are worth doing at all. He called this the narcotic effect — the way technology can numb us to what we’re losing even as it delivers what we think we want.
Adler spent the last decades of his life arguing that education should not be primarily about job preparation. It should be about learning to live a meaningful life. He distinguished between schooling — the transmission of skills and information — and education — the formation of a human being capable of wisdom, reflection, and genuine happiness. He worried that without a richer vision of human purpose, we would produce generations of people who were competent at tasks but adrift in meaning. That worry has aged remarkably well.
What leadership looks like: Resisting the temptation to automate everything that can be automated. Asking whether a task involves real judgment, real relationship, or real creativity — and if the answer is yes, protecting it, even when the efficiency case for replacing it is compelling. Measuring success not only by what was produced but by whether the people doing the work find it worth doing.
Putting It Together
Our tradition gives us an ancient method for navigating exactly this kind of moment: See. Judge. Act. Look honestly at what is happening. Hold it up against the light of genuine values. Then respond — concretely, locally, specifically.
See: We are not living through a series of technological updates. We are living through a transformation of the basic conditions of human life — how we work, how we know things, how we belong to one another, how decisions are made about us, and who holds power. Most of this is happening without our explicit consent, and much of it is invisible until you start looking.
Judge: Our tradition is clear. Every person bears the image of God and cannot be reduced to their data. The good of each is inseparable from the good of all. Decisions belong to those closest to the situation. We are responsible for the most vulnerable. And work must serve human flourishing — not merely human productivity. When technology, however efficient, violates these principles, it calls not necessarily for rejection, but for resistance: the kind that says we will not trade dignity for convenience, and we will not automate away what it means to be human.
Act: The action begins not at the level of global systems or national policy — though those matter — but here, in the specific roles and relationships each of us already carries.
Question the tools you are given. Ask who built them, what they were optimized for, and who they were tested on. This is responsible stewardship, not technophobia.
Advocate for those not represented. When the data doesn’t include them, when the system doesn’t see them, be the person who names it.
Keep human judgment alive. In your families, your workplaces, your parish. Don’t rubber-stamp algorithmic verdicts. Don’t confuse automation with authority.
Support the local institutions — the parish, the neighborhood, the face-to-face community — that hold something no platform can replicate.
And refuse to trade dignity for efficiency, even when the trade seems small. Because trades are rarely small and rarely temporary.
The question before us is not whether emerging technology will shape the future. It will. It already has.
The question is whether we will lead in such a way that this future remains worthy of the human person.
That begins not with grand strategy but with how we see the person in front of us. Whether we let a system define them, or insist on knowing them. Whether we ask, over and over again: Who is being left out of this story?
That is what leadership looks like in a world undergoing a phase change.
Questions for Reflection
On Dignity: Think of a time a system made a decision about you without knowing who you actually are. What was missing from that encounter? Now think of a time you reduced someone else to a category — a type, a case, a number. What would it have taken to see them more fully?
On the Common Good: What technologies does your organization, school, or parish use that you’ve never fully examined? Who benefits from those tools? Who might be quietly disadvantaged in ways you haven’t considered?
On Subsidiarity: Where in your life or work has a decision been moved away from the people closest to a situation — toward a system, a platform, or a distant authority? What was lost when that happened? And where do you have the power to push the decision back toward the human level?
On Solidarity: Whose experience is missing from the systems and structures you rely on? Who is invisible to the tools you use? Where in your community or professional life are you acting as if separation is real — and what would it take to close that gap?
On Vocation: Is there something in your work, or your life, that you’ve been tempted to automate or optimize away — something that carries a meaning you struggle to articulate? What would you lose if you never had to do it the slow way again? Is that a loss worth naming?
On Leadership: If you were to leave this reflection and take one concrete action — not a resolution, not a study group, but one specific act — what would it be? Who is the first person you need to talk to? And when will you do it?

