Yale University ran up the white flag this week in its battle to keep seniors Peter Xu and Harry Yu from creating an easier-to-use and more informative version of its online course catalog. The school’s real battle, however, was against technological change ― and so defeat was inevitable.
Xu and Yu, who are twins, invented an ingenious way to combine the catalog with course evaluations. The popular site soon had more than 1,800 student users. Yale tried to shut it down, partly on intellectual property grounds: It was, after all, Yale’s data. Another student found a workaround, and techies and campus free-speech advocates lined up on the brothers’ side.
So Yale (which happens to be my employer) gave in. Mary Miller, the dean of Yale College, conceded that no other outcome was possible: “Technology has moved faster than the faculty could foresee when it voted to make teaching evaluations available to students over a decade ago, and questions of who owns data are evolving before our very eyes.”
The dean’s words connect Yale’s surrender to recent security breaches at Target and Neiman Marcus; to President Barack Obama’s speech last week about reforming the data-gathering processes of the National Security Agency; and to last week’s federal court decision on the Federal Communications Commission’s “net neutrality” rules. The thread through them all is the difficulty of keeping a tight rein on any technology once its maker releases it into the wild.
Rules of law, no matter how carefully stated, are simply inadequate to keep technology from wriggling and writhing where it will. One is reminded of what the critic Harold C. Schonberg wrote about Tigran Petrosian, the great chess champion of the 1960s: “Playing him was like trying to put handcuffs on an eel.”
Consider one of the reforms proposed by Obama in his effort to address concerns about the privacy of data collected by the NSA. The president called for the establishment of a third party to hold the NSA’s trove of phone-call data. The idea is to make it harder for the NSA to access the information. But if indeed there is a problem, this suggestion has matters exactly backward: The trouble isn’t whether the folks at the NSA are trustworthy. The trouble is that the data exist at all.
Once such treasure exists, there will be those who seek to dig it up and use it for their own purposes. Ask Neiman Marcus and Target, which, along with other U.S. retailers, found their systems invaded by what appears to be a malware program called BlackPOS. The program intercepts credit-card numbers at the only instant when they are unencrypted, just after they are entered at the point of sale (hence “POS”), and stores the information on another server accessible to the attacker. The scheme exists because the vulnerability exists. When that vulnerability ceases to exist (if, for example, U.S. retailers adopt the embedded-chip technology common in much of the world), another will crop up, and someone will exploit it.
Because it is there.
The computer systems operated by the NSA may be the most secure in the world. One needn’t be a fan of big government to see that we will weaken the privacy of the data if we place them in some potentially less secure facility instead. Because technology is not, in the end, controllable. To exploit it, we must risk its misuse. We may be able to hold losses to some acceptable level, but we will never get them down to zero.
If the data exist, someone will be tempted. Someone at the NSA. Someone at the yet-to-be-named third-party custodian. Someone who wants to break into the yet-to-be-named third-party custodian. Every legal fix, every technological fix, every fix of any kind will yield in the end to a technological assault.
It’s all very Darwinian.
The flip side of the problem is illustrated by the debate over so-called net neutrality.
Now, I will confess to being something of an agnostic when it comes to net neutrality rules. I agree entirely with those who point out how much innovation (and how much consumer surplus) rests on the existence of an Internet that gives every byte the same priority. At the same time, I quite sympathize with those who have invested in the infrastructure and would like to behave like other capitalists, engaging in price discrimination in order to maximize profit.
The point is that no matter which side wins, the victory will likely be evanescent. More and more content providers are experimenting with building their own networks, and more and more broadband providers are experimenting with pricing systems that subvert whatever net neutrality rules ultimately emerge. The spirit of Ronald Coase is alive and well: The effect of any rule depends on the cost of getting around it. The pace of technological innovation tends to drive down the cost of circumventing the rules governing its use.
Which brings us back to Xu and Yu’s course catalog. Without attributing any particular motivation to Yale’s failed effort to intervene, it’s easy to see why a generic university might be unenthusiastic about making it cheap for students to find course-evaluation information in the process of registering. I’m not a big fan of numerical evaluations myself: They don’t necessarily capture the most important parts of teaching. But the market for college-level courses is more and more consumer-driven. If consumers want an inexpensive way to combine information about your products and services, somebody is going to give it to them. In an era of rapid technological advance, to imagine otherwise is to believe that you can put handcuffs on an eel.
By Stephen L. Carter
Stephen L. Carter is a Bloomberg View columnist and a professor of law at Yale University. He is the author of “The Violence of Peace: America’s Wars in the Age of Obama” and the novel “The Impeachment of Abraham Lincoln.” ― Ed.
(Bloomberg)