Ratcheted up a logarithm?

I really like the Atlantic magazine. Its articles are well-researched, in-depth, interesting, topical, far-ranging and generally a pleasure to read. It is a sign of my enthusiasm for the magazine that I pay the equivalent of US$15 to buy it here in New Zealand, and I take it to a local cafe/pub overlooking Waiheke (see my New Zealand blog for a couple hundred photos) to savor during one of the few periods I spend away from either work or the ever-fascinating and ever-challenging Alexander. I was reading a typically well-written and persuasive article by Andrew Sullivan on why Barack Obama is the candidate to transcend our political divides (see also his blog entry), when I came across the following:

The war on Islamist terror, after all, is two-pronged: a function of both hard power and soft power.

Consider this hypothetical. It’s November 2008. A young Pakistani Muslim is watching television and sees that this man – Barack Hussein Obama is the new face of America. In one simple image, America’s soft power has been ratcheted up not a notch, but a logarithm.

Ratcheted up a logarithm? What could that possibly mean? I have grown used to the phrasing “increased exponentially” even when the increase is quadratic (as a referee, I see this at least twice a year in professional papers; the popular press is hopeless on this). But can there be a less appropriate term than “ratcheted up logarithmically”? Here are some choices:

  • Instead of going up by one (“a notch”) it has gone up by the logarithm of one. Um, no… that would be going up by 0, no matter what base you go with.
  • Instead of increasing by one, you take the logarithm of the level. So if you start at 100, instead of going to 101, you go to the logarithm of 100. We’ll go with base 10, so that would take you to 2. Wrong way (and will be for any reasonable base)… no.
  • Instead of increasing by 1, you increase by the logarithm of the level. So if you start at 100, you go to 102, instead of 101 (base 10). Well, at least it has the right direction, but I can’t believe this is what Sullivan had in mind.
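For the record, here is what those three readings do to a hypothetical soft-power "level" of 100. Everything here is made up for illustration, and since Sullivan gives no base, I'll assume base 10:

```python
import math

level = 100   # hypothetical soft-power "level" before the image
base = 10     # no base is specified, so assume base 10

# Reading 1: go up by the logarithm of one ("a notch" = 1)
reading1 = level + math.log(1, base)      # 100 + 0: no change at all

# Reading 2: replace the level by its logarithm
reading2 = math.log(level, base)          # a huge *decrease*, for any reasonable base

# Reading 3: go up by the logarithm of the level
reading3 = level + math.log(level, base)  # barely more than "a notch"

print(reading1, reading2, reading3)
```

Whatever Sullivan meant, none of the arithmetically defensible readings gives a dramatic increase.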

I do not believe I am being overly picky here. If Sullivan had referred to “Congresswoman Clinton”, any fact-checker, editor, or casual reader would correct or castigate as appropriate. Nor would an illiteracy like “Obama is the bestest choice” possibly make it through the system. But the mathematical illiteracy of the population is sufficient that this can slip through in what has to be one of the most closely edited magazines in existence.

This reminds me of an old question. When I say “I am in Operations Research: the science of using mathematics to make better decisions” (or whatever phrase I am going with at the moment), it is quite common for people to reply “Oh, I never understood any mathematics”, expecting me to understand and applaud them for recognizing their limits. But if, when they mentioned a book, I were to say “Oh, I never understood that reading thing”, I would look like an idiot and they would move on to someone else to talk to. Why is there this difference?

If I am blogging 50 years from now, I am sure I will be complaining “Nice article, too bad it doesn’t mention operations research by name”. But, I would hope that I am not seeing lots of “ratcheted up by a logarithm”!

Operations Research and Voting Systems

Almost 20 years ago, I, along with my co-authors John Bartholdi and Craig Tovey, looked at the social choice (or voting theory) literature through the lens of operations research. Over and over again, we would read “we tried to figure out the winner under this rule, but it got too complicated”, or “we think A is the winner, but we are not sure”. Of course, to those trained in OR (or computer science or other related fields) this is a sure sign of some form of computational complexity. And sure enough, we found a number of NP-completeness results. My two favorites are:

  1. For a rule proposed by Charles Dodgson (aka Lewis Carroll of “Alice in Wonderland” fame), it is NP-complete to determine the winner.
  2. There are rules for which it is easy to determine a winner, but hard to determine how to “manipulate” your vote by misrepresenting your preferences so as to ensure a preferred outcome.
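For the curious: Dodgson's rule elects the Condorcet winner when one exists, and otherwise scores each candidate by the fewest swaps of adjacent candidates in voters' preference lists needed to make them one; it is that swap minimization that is hard. The easy half, checking for a Condorcet winner, fits in a few lines (the three-voter profile below is just a made-up illustration):

```python
def condorcet_winner(profile, candidates):
    """Return the candidate who beats every other candidate in pairwise
    majority contests, or None if no such candidate exists.
    `profile` is a list of strict preference orders, best first."""
    for a in candidates:
        if all(
            sum(ballot.index(a) < ballot.index(b) for ballot in profile)
            > len(profile) / 2
            for b in candidates if b != a
        ):
            return a
    return None

# Toy 3-voter, 3-candidate election (hypothetical data)
profile = [["A", "B", "C"], ["A", "C", "B"], ["B", "A", "C"]]
print(condorcet_winner(profile, ["A", "B", "C"]))  # A beats B 2-1 and C 3-0
```

When no Condorcet winner exists (a cycle), the function returns None, and that is exactly where Dodgson's rule sends you into the NP-hard swap-counting territory.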

These results were published in the social choice literature (John has a page detailing these and other results that gives complete references), where they languished for a decade or so before being rediscovered by a generation of computer scientists and computationally minded economists. Of course, complexity theory has moved along a lot (the Complexity Zoo lists 462 complexity classes), so the issues looked at are now much more subtle than in our work, but it is heartening to go to conferences like COMSOC and see many papers reference our early work as “foundational”. Those papers have shot up my rankings in Google Scholar!

All this started when Bartholdi came across the voting system to elect the Venetian Doge. The rules are a little complex, involving ten rounds of “voting” with some of the rounds involving choosing winners at random. From the Wikipedia article:

Thirty members of the Great Council, chosen by lot, were reduced by lot to nine; the nine chose forty and the forty were reduced by lot to twelve, who chose twenty-five. The twenty-five were reduced by lot to nine and the nine elected forty-five. Then the forty-five were once more reduced by lot to eleven, and the eleven finally chose the forty-one who actually elected the doge.
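Just to make the structure visible, here is a sketch of that sequence as a simulation. Every “chose” step is abstracted as a random draw from the full council, so this captures only the alternating lot/election pattern, not the actual voting:

```python
import random

def doge_election(council, rng=random):
    """Sketch of the protocol quoted above.  The 'chose' steps are
    abstracted as random draws from the Great Council, so this models
    only the alternating lot/election structure, not the voting itself."""
    lot = lambda pool, k: rng.sample(pool, k)    # reduction by lot
    elect = lambda k: rng.sample(council, k)     # committee election (abstracted)

    nine       = lot(lot(council, 30), 9)  # thirty by lot, reduced by lot to nine
    twelve     = lot(elect(40), 12)        # the nine chose forty, reduced to twelve
    nine_again = lot(elect(25), 9)         # the twelve chose twenty-five, reduced to nine
    eleven     = lot(elect(45), 11)        # the nine elected forty-five, reduced to eleven
    forty_one  = elect(41)                 # the eleven chose the electing forty-one
    return lot(forty_one, 1)[0]            # the forty-one's vote, as one more draw

council = ["oligarch%d" % i for i in range(200)]
print(doge_election(council))
```

Even this crude sketch makes one property evident: with so many lot-reductions interleaved, no small faction could be confident of controlling any of the later committees.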

What could be the reason for such a convoluted system? We never got back to looking at the Doge, but recently Miranda Mowbray and Dieter Gollmann have examined this in a paper presented at the IEEE Computer Security Foundations Symposium. They do a very thorough analysis of the rules and discuss their effect on minority representation and other issues. But at the end, they come up with a non-mathematical argument for the approach:

Schneier [in the book Beyond Fear] has used the phrase “security theatre” to describe public actions which do not increase security, but which are designed to make the public think that the organization carrying out the actions is taking security seriously. (He describes some examples of this in response to the 9/11 suicide attacks.) This phrase is usually used pejoratively.

However, security theatre has positive aspects too, provided that it is not used as a substitute for actions that would actually improve security. In the context of the election of the Doge, the complexity of the protocol had the effect that all the oligarchs took part in a long, involved ritual in which they demonstrated individually and collectively to each other that they took seriously their responsibility to try to elect a Doge who would act for the good of Venice, and also that they would submit to the rule of the Doge after he was elected.

The US system for electing a President is also pretty convoluted, particularly when the Supreme Court gets involved, but we have a way to go before we reach the Venetian level. Still, there is some level of ritual and even randomness involved, making the article more pertinent to current systems than it might first appear.

Happy Thanksgiving and a couple new additions

First, Happy Thanksgiving to any US readers. Since I am in New Zealand, my staying home on Thursday had nothing to do with Thanksgiving and everything to do with a beautiful day keeping me near the beach.

Second, I’ve moved all my blog reading over to Google Reader, which allows me to add a couple things to the sidebar. A new, improved blog roll is a more extensive list of OR blogs. Again, if I am missing some, please let me know. Also, I can now flag interesting things I see in either the OR blogs or in the other blogs I run across. Those postings are in the “Things I am Reading” box. Perhaps you will find something useful (or weird!) there.

Tom Cooley and the role of business schools

Business school deans, as an occupational norm, tend to write in terms acceptable to a very wide audience. After spending all day listening to students, faculty, and administrators, there is no sense adding wood to the fire by writing with undue specificity. This gets even worse when deans write for their alumni magazine. Forests of trees have been sacrificed so deans can say, essentially, “We’re doing great but we’d do even better if you gave us some money”. Part of the job.

So it comes as a surprise when Tom Cooley, Dean of NYU’s Stern School, pens an article on the importance of a research outlook for business schools, an important but controversial topic. And he pens it in the Fall/Winter issue of the Stern alumni magazine! In the article, he takes on former deans and even the AACSB (the association that accredits business schools):

Jeffrey Garten, former dean of the Yale School of Management, told The New York Times that business school education should be more “clinical” and that business school faculty should be required to have more practical experience in business. And this spring, an Association to Advance Collegiate Schools of Business (AACSB) task force issued a draft report on research. Among its conclusions: that business schools should “demonstrate the impact of faculty intellectual contribution to targeted audiences,” and that AACSB should “develop mechanisms to strengthen interaction between academics and practicing managers in the production of knowledge in areas of greatest interest.”

I’m convinced that the critics of business school education have it exactly backwards: the pressure business schools face to give in to a vocational focus means that students do not acquire the analytical and intellectual training needed to inform a leadership career, and that enables businesspeople to deal with a range of variables far greater than a purely vocational “how to” approach can address.

Cooley celebrates many of the past successes of a research approach to business education:

Between 1954 and 1966 the Ford Foundation spent $35 million to foster business education and research at five schools: Carnegie-Mellon University, the University of Chicago, and Columbia, Harvard, and Stanford Universities. Using the transformation of medicine and engineering as a model, these schools invested heavily in research and in doctoral programs. The investments have clearly produced returns. In 1955-1956, graduate business education was virtually non-existent. Now, well over 100,000 graduate business degrees are awarded annually by 650 AACSB programs. Business schools now enjoy greatly improved status as professional schools, in large measure because the intellectual value of the undertaking is recognized. The widespread adoption of the MBA degree as a qualification for future business leaders has legitimized the position the Ford Foundation and others took 50 or more years ago.

Perhaps more important, business schools have generated ideas of depth and daring that have changed business and financial markets in important ways. Professors Finn Kydland and Edward Prescott were awarded the 2004 Nobel Prize in Economics for work they did in the 1970s and 1980s at Carnegie-Mellon’s Graduate School of Industrial Administration, now called the Tepper School of Management. Their work on what is called “the inconsistency of optimal plans” established the foundation for an extensive research program on the credibility and political feasibility of economic policy, shifting the practical discussion of economic policy away from isolated policy measures toward the institutions of policy-making. Kydland and Prescott were also cited for having transformed our understanding of business cycles by integrating it with the theory of economic growth.

Finn is a friend of mine, and is truly brilliant. He is, not coincidentally, trained as an OR person. When he teaches, he is about as far from “vocational instruction” as you can possibly get, to the chagrin of some of our less open-minded MBA students (CMU MBAs are generally very open to research-oriented topics in the classroom: it is one of the hallmarks of our program).

Tom obviously likes Carnegie Mellon, even if he gives us a hyphen that we jettisoned years ago, and calls us a School of Management instead of School of Business.

Tom is an economist, and he gives a great example of how insights from economics provide the basis for serious training in management, and the harm a vocational, current-best-practices approach causes:

Would you rather have business school graduates who know what kinds of contracting structures businesses now use, or students who understand that contracts exist to solve moral hazard, asymmetric information, commitment, and agency problems? It is precisely because we don’t yet know the problems that we will be facing that practice-driven education, focused on current solutions to current problems, will always fall short.

He closes with some comments inspired by the book Moneyball:

One of my favorite business books of the past few years is Moneyball by Michael Lewis, the story of the Oakland A’s and their extremely successful general manager, Billy Beane. More important, it is the story of how baseball has been transformed by a generation of researchers (aka baseball nerds) whose major contact with the game is through data analyzed by increasingly complex computer programming. Beane understood that this apparently arcane research had the potential to create extraordinary value for his team. Indeed, under Beane’s leadership, the perennially undercapitalized A’s managed to reach the playoffs for four consecutive years. Over that period, their salary cost per victory was less than half of the next highest spending team and less than a quarter of teams like the New York Yankees.

Beane overturned the most basic principles of one of the most tradition-bound businesses in America – professional baseball – by using sophisticated statistical research in place of traditional “gut instinct.” Several major league general managers who have never played the game are now similarly schooled in the research tradition of “moneyball,” and executives in sports like basketball and football are catching on.

Moneyball is a good metaphor for what happens in academic research. You hire a bunch of bright, well-trained people with strong technical skills and a passion about what they study and turn them loose. With the right personnel, the right conditions, the right insights, and with a forward-looking rather than a backward-looking focus, exciting things can happen. And that research, applied in the right circumstances, has truly enormous potential for change.

To make the obvious OR connection: the sort of analytic skills you gain and the insights you learn from an analytic approach to business problems are what OR is all about. It is not about linear programming or dual variables: it is about thinking clearly about objectives, decisions, and constraints.

Sorry for the long post, but Tom’s words speak for themselves. For those of us who are research-oriented, this is really inspiring stuff. Thanks to my friend and colleague Chris Telmer for the pointer.

Al Roth on Market Design

Al Roth is a professor at Harvard (formerly the University of Pittsburgh: I still go to his house regularly, though he isn’t there anymore) who has done a lot of work in market design. His big success was his work on stable matchings and its application to the matching system between hospitals and medical residents. This work has a strong OR component: the heart of the system is the algorithm for finding stable matchings, matchings in which no resident–hospital pair would both prefer each other to their assigned matches. You can get lots of pointers from his game theory, experimental economics, and market design page. Al recently gave a talk at Yahoo! Research as part of their Big Thinkers series. In the talk, he contrasted markets in kidneys (which I talked about a few weeks ago) with the market that matches NY high school kids to schools:

Kidney transplants are necessary for end-stage renal disease, but there is a shortage of kidneys. Lack of compatibility and laws against kidney sales open up the possibility of a market-designed kidney exchange to increase the number of transplants. Roth explained the need for national exchanges, as opposed to regional exchanges, to increase the thickness of the market; 3-way exchanges that will add to a population of incompatible donor pairs; and opening up kidney exchanges to compatible patient-donor pairs as well.

In the example of matching students to high schools in New York City and Boston, where thickness isn’t a problem, there were too many transactions that led to congestion in the market. As many as 30,000 students in New York City were assigned to schools not on their choice list. The newly designed system incorporated a centralized clearinghouse to which students would submit their true preferences, instead of unreachable choices. The algorithm ensured that more students got accepted to their realistic first preferences and minimized the number of those rejected into their second choice pools.
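The engine behind these clearinghouses is deferred acceptance, in the Gale-Shapley tradition that Roth's designs build on. Here is a minimal one-to-one sketch (the two-resident, two-hospital instance is made up):

```python
def deferred_acceptance(proposer_prefs, receiver_prefs):
    """Gale-Shapley deferred acceptance for one-to-one matching.
    Each argument maps an agent to an ordered list of acceptable
    partners, best first.  Returns {proposer: receiver}."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)            # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                             # receiver -> proposer (tentative)

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # best partner not yet tried
        next_choice[p] += 1
        if r not in match:
            match[r] = p                   # receiver tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])          # receiver trades up; old proposer freed
            match[r] = p
        else:
            free.append(p)                 # rejected; will try further down the list

    return {p: r for r, p in match.items()}

# Toy instance: both residents prefer h1, both hospitals prefer r2
res_prefs  = {"r1": ["h1", "h2"], "r2": ["h1", "h2"]}
hosp_prefs = {"h1": ["r2", "r1"], "h2": ["r2", "r1"]}
print(deferred_acceptance(res_prefs, hosp_prefs))  # r2 gets h1; r1 settles for h2
```

In the real hospital-resident setting the hospitals have capacities, but the one-to-one version shows the key move: acceptances are only tentative, and can be displaced by a proposal the receiver prefers, which is what guarantees no blocking pair remains at the end.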

As a side note, these Yahoo! (and Google) videos are a great resource: they attract the very best researchers who all obviously work very hard on their presentations.

Canadian Algorithms, eh?

The Globe and Mail, Canada’s answer to the New York Times, has an article on “algorithms” in their technology section (thanks Dad for the pointer!).  They give a number of very good examples of work being done by Ontario researchers that illustrates the range of application:

For example, Dr. Terlaky [director of McMaster University’s School of Computational Engineering and Science in Hamilton, Ont] says his school and a team from the University of Waterloo are working with oncologists from Toronto’s Princess Margaret Hospital to find the optimal level of radiation that can be used in treating liver cancer.

“We are working with Dr. Laura Dawson, a PMH oncologist, to find out how much radiation is most effective,” he says.

“Laura Dawson doesn’t understand the math but she doesn’t need to. She knows the treatment – we know the math.”

The drivers behind all this are faster computers and more data:

What has propelled the giant leaps forward in creating mathematical formulas to increase accuracy is a happy combination of data and computing power.

The data becomes the basis for analysis; the more of it available to sift through, the greater the accuracy.

The power and speed of today’s computers means computational tasks that would have taken decades in the 1990s can be done in a matter of minutes today.

“Today, thanks to computational science we can produce huge savings for industry, more effective diagnosis and treatment for medicine, better design for products and better management of resources,” Dr. Terlaky says.

“These great advances come at a time when almost all organizations have accumulated masses of data,” says Rotman’s Dr. Baron, associate professor of operations management. “It becomes historical data which can be analyzed to predict the future.

“Mathematical models are bound to play a greater role in business and even personal decision making,” Dr. Baron predicts.

I suppose I could go off complaining that the words “operations research” are never used, but instead I will be happy that a few million Canadians know a bit more about our field today.

Operations Research and Financial Meltdowns

Aurelie Thiele has a thoughtful post, “On Quantitative Finance,” inspired by a number of articles on the August meltdown of quant funds. Computational finance has become big business, both in education and on Wall Street. Carnegie Mellon and the Tepper School have been leaders in this, in keeping with our history as a quantitatively oriented business school.

Aurelie highlights an article on the role OR people play in this (most won’t call themselves OR people, but I won’t get into that discussion yet again: they use models and mathematics to make better decisions, so call it what you want).

But I enjoyed most of all “On Quants,” a short rebuttal to “The Blow-Up” written by an MIT professor of mathematics, Dr. Daniel Stroock, who argues that “[quants’] mission is to blindly keep those stocks moving, not to pass judgment on their value, either to the buyer or to society. Thus, [the author] finds it completely appropriate that quants now prefer the euphemism ‘financial engineer.’ They are certainly not ‘financial architects.’ Nor are they responsible for the mess in which the financial world finds itself. Quants may have greased the rails, but others were supposed to man the brakes.” While the dilution of responsibility in the financial world is becoming a tad worrisome, I liked Stroock’s advocacy for the term “financial engineer.”

I too like the word engineer, but mainly because in fact it would bring into the system some true checks and balances due to an engineer’s professional, legal, and ethical obligations. I think you would be hard pressed to find a professional engineer who would blithely say “I knew it didn’t have any brakes but my job was to put in the biggest engine I could! Shame about those kids, though.” Most engineers would take it as a solemn responsibility to ensure that the nonsense generated by a damn-fool architect was actually safe and effective when put into practice.

Take the following from the National Society of Professional Engineers Code of Ethics:

Fundamental Canons

Engineers, in the fulfillment of their professional duties, shall:

  1. Hold paramount the safety, health, and welfare of the public.
  2. Perform services only in areas of their competence.

II. Rules of Practice

  1. Engineers shall hold paramount the safety, health, and welfare of the public.
    1. If engineers’ judgment is overruled under circumstances that endanger life or property, they shall notify their employer or client and such other authority as may be appropriate.
    2. Engineers shall approve only those engineering documents that are in conformity with applicable standards….

I wonder if the “financial engineers” (very few of whom are professional engineers) really understood the assumptions in the models they used, to the level necessary to communicate that to employers.

Having “financial engineers” sounds like a great thing for the market, but only if they truly are engineers with all that entails.

ISI and Conferences

ISI from Thomson Scientific might be seen as just another scientific article indexing service, like Google Scholar, Citeseer, and many others. But it has a much stronger effect: many universities only “count” ISI-indexed publications. In mainstream operations research, this doesn’t have a very strong effect. Most well-known OR journals are ISI-indexed, and those that are not strive greatly for such indexing. But in some of our newer fields, particularly those related to computer science, this is much more of an issue. In constraint programming, conferences are key outlets for research, particularly conferences such as CP and CP/AI-OR. Both of these conferences publish their papers in Springer’s Lecture Notes in Computer Science, which up until recently was ISI-indexed.

The ISI people have recently dropped LNCS and other series from their main ISI-indexing, and created a “conference” version of ISI-indexing. For some, this can have a very strong effect on promotion and tenure.

I have somewhat mixed feelings about this. On one level, I am worried that the ISI “page count” available to some of our fields has decreased greatly, which will make it harder for some of our subfields to grow and expand. On the other hand, conference volumes are just part of the proliferation of research outlets, and not ISI-indexing them seems appropriate given the sometimes quick review that some conferences provide. It is certainly true that these conferences provide reasonably thorough reviews and that the rejection rate is high (70% or more rejected), but this still seems different from journals. Perhaps it is just me: as a member of a conference committee, I feel it is my responsibility to review papers somewhat farther from my core interests than I would for journal reviews, so I can’t provide the same insight for conferences that I do for journals.

Much of OR doesn’t have this problem. Most OR conferences are “free-for-alls” with no veneer of reviewing. This makes for large conferences of highly-variable quality. It is great for talking about not-fully-formed ideas, or interesting but not publishable results. I wonder if some of the competitive conferences will change in flavor now that the “carrot” of an ISI-indexed paper is not offered. At the very least, people should be working harder to follow up the conference with a journal publication, which seems a good outcome.