The Tyranny of Data

As we become ever more dependent on big data to answer our questions and solve our problems, we must consider the implications of an epistemology that rejects the abstract and the unknowable.

By Sarah Ngu

Data science has been dubbed by Harvard Business Review as “the sexiest job of the 21st century” and by The New York Times as a “hot new field that promises to revolutionize industries, from business to government, health care to academia.” As the Times article explains, technology has given us access to gargantuan amounts of data, and companies, universities, and governments are rapidly hiring data scientists with the statistical and programming skills to make sense of “big data,” a phrase coined in 1997 by two researchers to describe data whose size was too large for standard computer systems.

Evidence of this growing coronation of “data as king” is everywhere. “How big data is changing almost everything” was one of the agenda items at this year’s Bilderberg meeting, an annual semi-secretive gathering of the world’s political and financial elite to have “informal, off-the-record discussions about megatrends and the major issues facing the world.” Companies are tapping into behavioral science research and recording the behaviors of their customers in order to figure out how to strategically maneuver them. Founded in 2004, Palantir, a “big data” company that analyzes sectors such as cybersecurity, anti-fraud, and capital markets with huge amounts of data, has been valued by some investing groups at around $8 billion; by comparison, Twitter is valued at $9 billion. More recently, Edward Snowden’s leaks revealed how much our national security strategy depends on data collection from the program Prism, which accesses the information warehouses of top internet giants such as Google, Facebook, and Apple.

Snowden’s leaks have set off public debates over what principles and values are at stake with a program like Prism. Whichever side one lands on in this debate, it is clear that “data scientists” cannot be equipped solely with a calculator and a computer if they are going to confront the philosophical and ethical questions that inevitably arise in the process of data analysis and strategy. What other “tools,” then, do they need?

Perhaps a look back into history might offer insight. Although the history of data science is brief, there has always been a need for people who can interpret and strain meaning out of mysterious sets of information. Today, it is big data; in the past, it was omens, signs, and dreams. In the Bible, Joseph, a Hebrew slave, was summoned to interpret a dream that had troubled Pharaoh, the king of Egypt, and which no magician or wise man could decipher. The dream is as follows: Seven plump cows graze by the Nile only to be swallowed up by seven thin cows. Joseph interprets it to mean that Egypt will experience seven years of abundant harvest followed by seven years of severe famine. Based on this, he recommends that Egypt store up one-fifth of the harvest during the first seven years as a reserve for the subsequent seven years of famine. Because of his insight, Pharaoh appoints Joseph as administrator over Egypt’s agriculture. At this point, his skills of dream interpretation—or data analysis—cannot carry him farther, but he nevertheless flourishes in his position due to his wisdom and ethical principles.

Data has been crowned king, and anything that does not have its backing begins to lose legitimacy. If a value or principle lacks sufficient “measurable evidence,” then one might as well begin ringing the doomsday bells, as many have already done for the liberal arts, which are struggling to justify their existence. Of course, we recognize that even if we do have data for something, the data cannot interpret itself or make decisions about what to do. Data is meaningless unless it is interpreted through a model that we humans construct with our premises, values, and goals.

Recent publications such as “What Big Data Needs: A Code of Ethical Practices” and “Ethics of Big Data: Balancing Risk and Innovation” indicate that people are turning to ethics to supplement data analytics. But simply resorting to “ethics” might be too simplistic, for data’s reign has begun to shift how we talk about values and ethics. Last year, David Brooks observed a discussion among recent Stanford graduates over which career path was most justifiable to pursue: finance/consulting, nonprofits, tech start-ups, and so on. Brooks noted that people found it easier to use a utilitarian vocabulary (e.g., How can I most productively apply my talents to the problems of the world? How can I serve the greatest number of people?) than to use the “vocabulary of moral evaluation, which is less about what sort of career path you choose than what sort of person you are.” Resource allocation and quantifiable outputs were the guidelines for the conversation, not intrinsic values, daily habits, or any other intangibles that are much harder to graph and slot into tables. Programs like Prism are justified through utilitarian calculations that cannot make sense of fuzzy rights to liberty and privacy. Our models of interpretation and decision-making are gradually absorbing the quantifiable, measurable grammar of data.

That conversation does not seem to be an isolated occurrence, but rather is indicative of increasing reliance on statistics in our current media discourse. Helen Rittlemeyer, in an article last year in The American Spectator, lamented the new “cultural rules in journalism,” in which “a person can’t speak with authority without citing economics or sociology… [this] has begun to affect people’s ethical thinking.” She adds, “The idea that something might be spiritually harmful (or beneficial) in a way that can’t be demonstrated statistically has been written out of the conversation.” Journalism is one of the major arenas for public persuasion, so the rise in dominance of statistics indicates that data increasingly holds the authority to which we feel we can safely appeal.

The problem, however, is not that a data-mindset in and of itself is negative, but simply that statistics and graphs provide insufficient resources to unravel and tackle complex issues, and thus a worship of data—data-ism—is unfounded. To rephrase the original question, if what is required is a shift in mindset, how then do we go about thinking differently in a way that supplements the data-ism of our age?  

One obvious response is to turn towards the humanities, the apparent antithesis of data-ism. Interestingly enough, the American Academy of Arts and Sciences spends some time addressing the rise of big data in its report “The Heart of the Matter,” a case for the humanities and social sciences published this year. In its introduction, it describes America as a nation whose “founding [is] rooted in Enlightenment philosophy” and whose “future [is] informed by the compilation and analysis of Big Data,” implying that the significance of our data epoch might be comparable to the Enlightenment’s. While the report is not without insight, as Stanley Fish, a literary scholar, scathingly noted, it mostly consists of vague, platitudinous statements, such as that the liberal arts will promote “skills in communication, interpretation, linking and synthesizing domains of knowledge, and imbuing facts with meaning and value.” To ask how, specifically, the liberal arts cultivate these skills might be too much to expect from what seems to be a mainly slogan-heavy report.

While an intense discussion of what a liberal arts education can offer in promoting alternative ways of thinking is likely a fruitful one, perhaps part of the solution lies in tackling a fundamental driving force behind data-ism: our desire for control and mastery. Iain McGilchrist, a psychiatrist, argues in his 600-page tome The Master and His Emissary that this desire is propelled by the over-dominance of the left hemisphere of our brain.

Initially trained in English literature at Oxford, McGilchrist went on to pursue medicine and psychology; he is now a Fellow of the Royal College of Psychiatrists in the United Kingdom. His book provides an illuminating grid—the left-right brain divide—with which to understand Western history and our presently data-centric culture, and it persuasively argues that we must return to the oft-neglected right hemisphere.

While popular accounts largely overhype the differences between the hemispheres, the left and right hemispheres do work in conjunction and in distinct ways. The right hemisphere absorbs new information, paying attention to embodied particularities and interconnections; the left hemisphere takes the experience and recasts it in more manipulable form through processes of abstraction and compartmentalization. The left is largely responsible for the technological progress that enables us to gather and analyze data. McGilchrist attempts to describe what the world would look like if it were run by the left hemisphere alone:

[There would be] an increasing specialization and technicalizing of knowledge. This in turn would promote the substitution of information, and information gathering, for knowledge, which comes through experience… Skills would be reduced to algorithmic procedures… Quantity would be the only criterion which it would understand. The right hemisphere’s appreciation of How (quality) would be lost.

It is a world that uncannily resembles our data-centric reality. Because the “left hemisphere pays attention to the virtual world that it has created, which is self-consistent, but self-contained, ultimately disconnected from the Other,” it creates an “I-it world” in which everything is mastered and dissected. In contrast, the right hemisphere creates an “I-thou” world which “pays attention to the Other, whatever it is that exists apart from ourselves, with which it sees itself in profound relation” and deep connection.  

To return to our question of how we can begin thinking differently in a way that is freed from data-ism, a regimen in moral philosophy or ethical reasoning might be helpful, but if we follow McGilchrist’s arguments, it might not be enough. What is needed is not more tools to enable further dissection and mastery over something. We need, first and foremost, a fundamental shift in our brain-wiring from “I-it” to “I-thou,” which requires cultivating a humble respect and even love of what we cannot control or fully know.

In particular, McGilchrist recommends we explore three arenas primarily grounded in the right hemisphere: the body, the spirit, and the arts. First, we must stop treating our body as an instrument and embrace its material limitations, which often resist our will. Second, spirit, or religion, provides myths that allow us to approach a spiritual Other and gives us more than material values. Third, the arts move us to a love of what is beautiful, a love that has no acquisitive or utilitarian motive. All three realms are “vehicles of love” towards the Other, which the “left hemisphere does not understand and sees as an impediment to its authority.”

Christianity, which McGilchrist briefly mentions, incorporates the strengths of all three channels: the Incarnation of Christ affirms the material body, the marriage of Christ and the Church moves us towards union with a spiritual Other, and the commandment to love God and our neighbor—period—prohibits us from instrumentally loving others for the blessings that we can receive from them. At the center of Truth is not a text, but a divine Being who refuses to be understood as a sum of discrete parts and is instead fully God and fully man. Incapable of being analyzed and pinned down, he turns the tables and invites us to follow him.

None of the above is a panacea for what ails our culture. There is still much to be discussed in regards to education, values, and moral training, and relying on faith, the arts, or whatever encourages the right hemisphere will not provide a concrete arsenal of tools that we can neatly package and apply to every philosophical dilemma. But it would, if we follow McGilchrist’s arguments, begin the hard, foundational work of correcting the lopsided ways in which we have been interacting with the world and with ourselves.

Sarah Ngu is a fellow at the Trinity Forum Academy, located on the beautiful Eastern Shore of Maryland. A recent graduate of Columbia University with a double concentration in American Studies and Political Science, she occasionally pops out some thoughts at toastedideas.tumblr.com.