Last week I took a bus from Tel Aviv to Jerusalem, and from behind I immediately recognized a fellow passenger. It was Nimrod Megiddo, from IBM Almaden: one of the very first to relate game theory to complexity theory, one of the pioneers of computational geometry, one of the leaders in optimization and linear programming, the person who (with Ehud Kalai) was the first to invite me to an international conference, and a fresh laureate of the John von Neumann Theory Prize. I had not seen Nimrod since our coincidental meeting at the Berkeley Simons Institute more than a year earlier. I called over to him, and he was as happy to see me as I was to see him; we found a place together at the back of the bus and caught up on things.

Here are some of the people, from left to right (about some of them I have already told you in other posts):

David Schmeidler (only beard visible)
Guillermo Owen
Lloyd Shapley
Sergiu Hart
Yair Tauman (only back shown)
Robert Aumann
Werner Güth (behind Aumann)
Reinhard Selten
Ehud Kalai (just to the right of Selten looking at the camera)
Gerhard Schwedihauer
Elon Kohlberg (only back shown, looking to the left)
William Lucas (in the back, looking at the wall)
Robert Weber (only hair and jacket visible)
Bezalel Peleg (looking to the right)
Joel Moulen (looking to the left)
Thomas Marschak (sunglasses and beard)
Michael Maschler
Joachim Rosenmüller (with glasses behind Maschler)
Kenjiro Nakamura

Indeed, most people got it right! Bundling sometimes increases revenues, sometimes keeps revenues the same, and sometimes decreases revenues. In fact, this is an interesting issue which has been the subject of recent research. So here are a few examples, as told in the introduction to a recent paper, Approximate Revenue Maximization with Multiple Items, by Sergiu Hart and Noam Nisan:

Example 1: Consider the distribution taking values 1 and 2, each with probability 1/2.
Let us first look at selling a single item optimally: the seller can either choose to price it
at 1, selling always and getting a revenue of 1, or choose to price the item at 2, selling it
with probability 1/2, still obtaining an expected revenue of 1, and so the optimal revenue
for a single item is 1. Now consider the following mechanism for selling both items:
bundle them together, and sell the bundle for price 3. The probability that the sum of
the buyer’s values for the two items is at least 3 is 3/4, and so the revenue is 3·3/4 = 2.25
– larger than 2, which is obtained by selling them separately.

Example 2: For the distribution taking values 0 and 1, each with probability 1/2,
selling the bundle can yield at most a revenue of 3/4, and this is less than twice the
single-item revenue of 1/2.
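Both examples are easy to check mechanically. Here is a minimal Python sketch (exact arithmetic with fractions; the helper names are mine) that reproduces the numbers in Examples 1 and 2:

```python
from fractions import Fraction
from itertools import product

def best_posted_price_revenue(dist):
    # dist: list of (value, probability) pairs; an optimal posted price
    # is always one of the support values.
    return max(v * sum(p for w, p in dist if w >= v) for v, _ in dist)

def bundle_dist(dist):
    # Distribution of the sum of two i.i.d. item values.
    sums = {}
    for (v1, p1), (v2, p2) in product(dist, dist):
        sums[v1 + v2] = sums.get(v1 + v2, Fraction(0)) + p1 * p2
    return list(sums.items())

half = Fraction(1, 2)

# Example 1: values 1 and 2, each with probability 1/2
d1 = [(1, half), (2, half)]
sep1 = 2 * best_posted_price_revenue(d1)       # selling separately
bun1 = best_posted_price_revenue(bundle_dist(d1))  # selling the bundle
print(sep1, bun1)  # bundling wins: 2 vs 9/4

# Example 2: values 0 and 1, each with probability 1/2
d2 = [(0, half), (1, half)]
sep2 = 2 * best_posted_price_revenue(d2)
bun2 = best_posted_price_revenue(bundle_dist(d2))
print(sep2, bun2)  # separate sale wins: 1 vs 3/4
```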

Example 3 (and 4): In some other cases neither selling separately nor bundling
is optimal. For the distribution that takes values 0, 1, and 2, each with probability 1/3,
the unique optimal auction turns out to offer the buyer the choice between any single
item at price 2 and the bundle of both items at a “discount” price of 3. This auction
gets a revenue of 13/9, which is larger than the revenue of 4/3 obtained either from selling the two items separately or from selling them as a single bundle. (A similar situation happens for the uniform distribution on [0, 1], for which neither bundling nor selling separately is optimal; see Alejandro M. Manelli and Daniel R. Vincent [2006].)
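The 13/9 figure can be verified by brute-force enumeration of the nine equally likely value pairs. A small sketch (the menu encoding is mine; ties are broken in the seller's favor, i.e., an indifferent buyer picks the more expensive entry, as is standard in this literature):

```python
from fractions import Fraction
from itertools import product

third = Fraction(1, 3)
values = [(0, third), (1, third), (2, third)]

# Menu entries: (allocation of item 1, allocation of item 2, price).
menu = [
    (0, 0, Fraction(0)),  # buy nothing
    (1, 0, Fraction(2)),  # item 1 alone at price 2
    (0, 1, Fraction(2)),  # item 2 alone at price 2
    (1, 1, Fraction(3)),  # the bundle at the "discount" price 3
]

revenue = Fraction(0)
for (v1, p1), (v2, p2) in product(values, values):
    # Buyer picks a utility-maximizing entry; ties go to the higher price.
    choice = max(menu, key=lambda m: (m[0] * v1 + m[1] * v2 - m[2], m[2]))
    revenue += p1 * p2 * choice[2]

print(revenue)  # 13/9
```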

Example 5: In yet other cases the optimal mechanism is not even deterministic and must offer lotteries for the items. This happens in the following example from a 2011 paper, “Revenue Maximization in Two Dimensions,” by Sergiu Hart and Phil Reny: Let F be the distribution which takes values 1, 2, and 4 with probabilities 1/6, 1/2, and 1/3, respectively. It turns out that the unique optimal mechanism offers the buyer the choice between buying any one good with probability 1/2 for a price of 1, and buying the bundle of both goods (surely) for a price of 4; any deterministic mechanism has a strictly lower revenue. See also Hart’s presentation “Two (!) good to be true”. Update: See also this paper by Hart and Nisan: How Good Are Simple Mechanisms for Selling Multiple Goods?
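This claim, too, can be probed numerically by enumerating the buyer's best response to a menu of lotteries (a sketch under my own encoding; again ties are broken in the seller's favor). The lottery menu beats both optimal separate sale and optimal pure bundling for this F:

```python
from fractions import Fraction
from itertools import product

F = [(1, Fraction(1, 6)), (2, Fraction(1, 2)), (4, Fraction(1, 3))]

def menu_revenue(menu):
    # Menu entries: (prob. of good 1, prob. of good 2, price).
    # A risk-neutral buyer picks a utility-maximizing entry,
    # breaking ties toward the higher price (seller-favorable).
    rev = Fraction(0)
    for (v1, p1), (v2, p2) in product(F, F):
        q1, q2, t = max(menu, key=lambda m: (m[0] * v1 + m[1] * v2 - m[2], m[2]))
        rev += p1 * p2 * t
    return rev

nothing = (Fraction(0), Fraction(0), Fraction(0))
half, one = Fraction(1, 2), Fraction(1)

lottery = menu_revenue([nothing, (half, 0, one), (0, half, one),
                        (one, one, Fraction(4))])       # the Hart-Reny menu
separate = menu_revenue([nothing, (one, 0, Fraction(2)), (0, one, Fraction(2)),
                         (one, one, Fraction(4))])      # optimal price 2 per good
bundle = menu_revenue([nothing, (one, one, Fraction(4))])  # optimal bundle price 4

print(lottery, separate, bundle)  # the lottery menu is strictly the largest
```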

You have one item to sell and you need to post a price for it. There is a single potential buyer and the value of the item for the buyer is distributed according to a known probability distribution.

It is quite easy to compute which posted price will maximize your revenue: you need to maximize the price multiplied by the probability that the value of the item is greater than or equal to that price.

Examples:

1) When the value of the item for the buyer is 10 with probability 1/2 and 15 with probability 1/2, the optimal price is 10 and the expected revenue is 10.

2) When the value of the item for the buyer is 10 with probability 1/2 and 40 with probability 1/2, the optimal price is 40 and the expected revenue is 20.
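For a finite value distribution this optimization is a one-liner, since some value in the support is always an optimal posted price (raising the price up to the next support point loses no sales). A small sketch (the helper name is mine):

```python
from fractions import Fraction

def optimal_posted_price(dist):
    # dist: list of (value, probability) pairs.
    def revenue(price):
        # price times the probability the buyer's value is at least the price
        return price * sum(p for v, p in dist if v >= price)
    price = max((v for v, _ in dist), key=revenue)
    return price, revenue(price)

half = Fraction(1, 2)
print(optimal_posted_price([(10, half), (15, half)]))  # price 10, revenue 10
print(optimal_posted_price([(10, half), (40, half)]))  # price 40, revenue 20
```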

Two items

Now you have two items to sell and as before a single potential buyer. For each of the items, the buyer’s value behaves according to a known probability distribution. And these distributions are statistically independent. The value for the buyer of having the two items is simply the sum of the individual values.

Now we allow the seller to post a price for the bundle of two items and he posts the price that maximizes his revenues.

In summary: The values are additive, the distributions are independent.

Test your intuition:

1) Can the revenues of a seller for selling the two items be larger than the sum of the revenues when they are sold separately?

2) Can the revenues of a seller for selling the two items be smaller than the sum of the revenues when they are sold separately?

PS: There is a new post by Tim Gowers on the cost of Elsevier’s journals in England. Elsevier (and other publishers) are famous (or infamous) for their bundling policy. The movement towards cheaper journal prices and open access to scientific papers, which Gowers largely initiated two years ago, is now referred to as the “academic spring.”

You run a single-item sealed bid auction where you sell an old camera. There are three bidders and the value of the camera for each of them is described by a certain (known) random variable: With probability 0.9 the value is 100+x where x is taken uniformly at random from the interval [-1,1]. With probability 0.1 the value is 300+x where x is as before. The 300 value represents the case that the item has a special nostalgic value for the buyer.

The values of the camera to the three bidders are thus i.i.d. random variables. (The reason for adding the small random x is to avoid ties, and you can replace the interval [-1,1] with [-ε, ε] for a small ε, if you wish.) If you don’t want to worry about ties you can simply think about the value being 100 with probability 0.9 and 300 with probability 0.1.

The basic question

The basic question for you, the seller, is:

What is the highest expected revenue you, the seller, can guarantee, and what is your optimal allocation policy?

I’d like to see the answer to this question. But let me challenge your intuition with two simpler questions.

1) Can the seller guarantee an expected revenue of 120 or more?

2) What (roughly) is the optimal allocation policy?

a) Highest bidder wins.

b) Highest bidder wins if his bid passes some reserve price.

c) The item is allocated at random to the bidders with probabilities depending on their bids.

Background: Myerson’s paper and his revenue equivalence theorem

The most relevant paper to this post is a well-known paper, Optimal auction design, by Roger Myerson. Myerson considered the case of a single-item sealed-bid auction where the bidders’ values for the item are independent, identically distributed random variables.

Note that I did not specify the price that the winning bidder is going to pay for the camera. The reason is that, according to a famous theorem by Myerson (referred to as the revenue equivalence theorem), when the bidders are strategic, the expected revenues for the seller are determined by the allocation rule and are independent of the pricing policy! (Well, you need to assume a reasonable pricing policy…)

For example, if the item is allocated to the highest bidder, then the expected price will be the second-highest value. If the price is determined by the second-highest bid (Vickrey’s auction), then no bidder has an incentive to give a bid different from his value. But even if the item is allocated to the highest bidder at the highest bid, the expected revenues will still be the same! When you analyze an auction mechanism like ours, you can assume that the payments are set in a way that gives the bidders no incentive to bid anything other than their true values for the camera. This is sometimes referred to as the revelation principle.

Myerson considered a mechanism which sometimes leads to higher revenues than allocating the item to the highest bidder: the seller sets a reserve price, and the item is allocated to the highest bidder if his bid passes this reserve price, and is not allocated at all otherwise. In the paper Myerson also considered more complicated allocation rules, important in the analysis, where the item is allocated to the bidders with probabilities based on their bids.
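For the simplified 100-or-300 model, one can already probe option (b) numerically: by revenue equivalence we may price via a second-price rule with truthful bids, and the expected revenue of any reserve-price policy follows by exact enumeration. A sketch (the helper is mine; it ignores the small ±x perturbation and, of course, does not settle what a randomized policy could earn):

```python
from fractions import Fraction
from itertools import product

# Each of 3 bidders values the camera at 100 w.p. 9/10, at 300 w.p. 1/10, i.i.d.
dist = [(100, Fraction(9, 10)), (300, Fraction(1, 10))]

def second_price_revenue(reserve, n=3):
    # Expected revenue of a second-price auction with a reserve price,
    # assuming truthful bids (Vickrey) and exact enumeration of value profiles.
    rev = Fraction(0)
    for profile in product(dist, repeat=n):
        prob = Fraction(1)
        for _, p in profile:
            prob *= p
        bids = sorted((v for v, _ in profile), reverse=True)
        if bids[0] >= reserve:  # sell only if the top bid meets the reserve
            rev += prob * max(bids[1], reserve)
    return rev

print(second_price_revenue(0))    # no reserve: 528/5 = 105.6
print(second_price_revenue(300))  # reserve 300: 813/10 = 81.3
```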

Polls

This time we have two questions and two polls:

Once again this is a game-theory question. I hope it will lead to an interesting discussion like the earlier one on tic-tac-toe.

A little more background: auctions and Vickrey’s second-price auction.

(From Wikipedia) An auction is a process of buying and selling goods or services by offering them up for bid, taking bids, and then selling the item to the highest bidder. In economic theory, an auction may refer to any mechanism or set of trading rules for exchange.

In our case, we consider an auction of a single item (the camera) and each bidder is giving a sealed bid.

(Again from Wikipedia) A Vickrey auction is a type of sealed-bid auction where bidders submit written bids without knowing the bids of the other people in the auction, and in which the highest bidder wins but the price paid is the second-highest bid. The auction was first described academically by Columbia University professor William Vickrey in 1961, though it had been used by stamp collectors since 1893. This type of auction is strategically similar to an English auction and gives bidders an incentive to bid their true value.

We are considering the stable marriage theorem. Suppose that there are n men and n women. If the preferences are random and men are proposing, what is the likely average women’s rank of their husbands, and what is the likely average men’s rank of their wives?

Boris Pittel proved that on average a man will be matched to the woman in place log n on his list. (Place one is his most preferred woman.) A woman will be matched on average to a man ranked n/log n on her list.
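Pittel’s asymptotics are easy to see experimentally with a men-proposing deferred-acceptance simulation. A quick sketch (n and the random seed are arbitrary choices of mine):

```python
import math
import random

def gale_shapley(men_prefs, women_prefs):
    # Men-proposing deferred acceptance; returns husband[w] for each woman w.
    n = len(men_prefs)
    rank = [[0] * n for _ in range(n)]  # rank[w][m]: position of m on w's list
    for w in range(n):
        for pos, m in enumerate(women_prefs[w]):
            rank[w][m] = pos
    next_prop = [0] * n   # index of the next woman each man will propose to
    husband = [None] * n
    free = list(range(n))
    while free:
        m = free.pop()
        w = men_prefs[m][next_prop[m]]
        next_prop[m] += 1
        if husband[w] is None:
            husband[w] = m
        elif rank[w][m] < rank[w][husband[w]]:
            free.append(husband[w])  # w trades up; her old husband is free again
            husband[w] = m
        else:
            free.append(m)
    return husband

random.seed(0)
n = 500
men_prefs = [random.sample(range(n), n) for _ in range(n)]
women_prefs = [random.sample(range(n), n) for _ in range(n)]
husband = gale_shapley(men_prefs, women_prefs)

# Average rank of spouses on each side (rank 1 = most preferred).
avg_women = sum(women_prefs[w].index(husband[w]) + 1 for w in range(n)) / n
avg_men = sum(men_prefs[husband[w]].index(w) + 1 for w in range(n)) / n
print(avg_men, math.log(n))        # men's average rank is near log n
print(avg_women, n / math.log(n))  # women's average rank is near n / log n
```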

We asked in the post “Test your intuition (19)” what is the situation if there is one additional man, and men are still proposing. This question is based on a conversation with Jacob Leshno, who told me about a remarkable paper, Unbalanced random matching markets, by Itai Ashlagi, Yash Kanoria, and Jacob Leshno.

Maximal lattice-free convex bodies, introduced by Herb Scarf, and the related complex of maximal lattice-free simplices (also known as the Scarf complex) are remarkable geometric constructions with deep connections to combinatorics, convex geometry, integer programming, game theory, fixed point computations, algebraic geometry, and economics. It is certainly an object I would like to think more about and tell you more about. Here is a sculpture made by artist Ann Lehman based on such a body generated by a certain 4 by 3 matrix.

Ann Lehman: A sculpture based on a maximal lattice free convex body (photograph taken by the sculptress).

The body described in this sculpture is topologically a (two-dimensional) disk, triangulated in a specific way. The boundary is a polygon embedded in 3-space consisting of 21 “blue” edges. The “black” edges are internal edges of the triangulation. The triangles of the triangulation are not part of the sculpture, but it is easy to figure out what they are, and the shape has a remarkable zigzag nature. All vertices are integral. The only interior integral point is “red”. (The “green” point is the origin; it is not in the body.)

Herb Scarf, me, and the sculpture (picture taken by Kareen Rozen)

The following post was kindly contributed by Eyal Sulganik from IDC (Interdisciplinary Center) Herzliya. Eyal was motivated by our poll on certainty “beyond a reasonable doubt,” which is related to several issues in accounting.

Mathematicians, I believe, are always looking for new areas where their models and concepts can make a difference. Physics, economics, CS, and biology are just some examples, surely not exhausting a longer list of such areas. Although the origins of accounting emanate from mathematicians (for example, L. Pacioli, and even A. Cayley found interest in it), it is a fact, though not an unexplained one, that these days (almost) no mathematicians are interested in accounting and there is no field of “mathematical accounting”. In the following few paragraphs I would like to draw attention to accounting as a possible field for mathematicians; surprisingly, a possibly “profitable” field. I believe that accounting can be subject, inter alia, to the use of theories of formal systems, information theory, voting theory, fuzzy logic, graph theory, and even catastrophe theory. In brief, accounting (financial reporting) deals with the measurement and reporting of economic events. As such, it is a measurement system interlaced with an information system. (Results such as Blackwell’s theorem on the comparison of information systems are of relevance.)

Financial reporting by firms, the lifeline of the capital markets, is dictated by “reporting standards”. Those reporting standards are determined by “standard boards” (according to voting rules and procedures which are themselves very interesting to analyze) and are interpreted and evolve over time (as is the case with other languages). The reports are audited by accounting firms. Auditing theory has become more sophisticated, but even fairly standard tools like Benford’s law are not yet routine.

Moreover, a huge debate centers on whether to adopt a rule-based system, where “every” possible scenario is prescribed in advance, or a principle-based system, which gives “freedom” to every reporting entity in reflecting the economic substance of an event.

It is well known that rule-based systems provide greater comparability (between firms) but, at the same time, being more rigid and making use of “bright lines”, can more easily be forced to reflect form over substance. Indeed, bright-line accounting rules are not continuous functions, and hence small changes in the description or design of an event can lead to enormous differences in the reported values. For example, given that the definition of “CONTROL” was based on a “50% legal test”, until recently it was the case that if company A held 50.01% of the shares of company B (the other holders each being much smaller) and sold only 1.02%, it could recognize a profit, due to “loss of control”, as though it had sold the whole holding and bought back 49.01%. Needless to say, although holding “only” 49.01%, A CONTROLs B. (Danny Ben-Shahar, Desmond Tsang, and I are in the process of demonstrating that the “accounting theory of control” is inconsistent with the Shapley value.)

Principle-based systems, on the other hand, must make sure that their principles are common knowledge. For example, if a provision for loss regarding a claim against a firm depends on the chances of loss being “probable”, “remote”, or “reasonably possible”, a question arises as to what the preparers and users of the financial statements think about those terms. Many years ago, my colleague Yossi Aharoni and I found out, through questionnaires, that different types of agents have different probabilistic interpretations of those terms, and we explained the miscommunication this can cause. I attach two simple papers (one co-authored with Danny Ben-Shahar, the other in Hebrew) that can shed some more light on the above point of view, and I dare state a wish that a new field of “mathematical accounting” will be created.

This post is authored by Michael Schapira. (It is the second in a series of two posts.)

In these two posts, I outline work on Internet routing and sketch important areas for future work, both on routing itself and, more broadly, on mechanism design, game theory, and distributed computation.

The Internet comprises administrative domains known as Autonomous Systems (ASes), which vary in size, from large (multi)national networks to small networks servicing schools or small businesses. The task of establishing routes between ASes, called interdomain routing, is currently handled by the Border Gateway Protocol (BGP)—the core routing protocol of the Internet. BGP is one of the most critical pieces of the Internet’s infrastructure and can be regarded as the glue that holds the Internet together.

We briefly mentioned two main challenges in this area: the first is the quest for network stability, and the second is giving agents incentives to “behave well”. We will discuss these challenges here in more detail.

Challenge I: The Quest for Network Stability

Informally, a “stable state” is a global routing state that, once reached by BGP, remains unchanged. Formally, a stable state is an assignment of routes R_{1},…,R_{n} to the n source nodes (d is assigned the “empty route” R_{d}=Ø) such that for every node i: (1) there is a node j such that R_{i}=(i,j)R_{j}, where (i,j)R_{j} is the route that has (i,j) as a first link and then follows R_{j} to d; and (2) for every node j such that R_{i}≠(i,j)R_{j} and (i,j)R_{j} is simple, it holds that R_{i} >_{i} (i,j)R_{j}.
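The two conditions translate directly into a check one can run on a concrete routing state. A hypothetical toy sketch (the node names, the shortest-route preference, and the encoding of routes as tuples ending at d = 0 are all illustrative choices of mine):

```python
def is_simple(route):
    # A route is simple if it visits no node twice.
    return len(set(route)) == len(route)

def is_stable(state, neighbors, rank):
    # state[i]: route currently assigned to node i; d = 0 has route (0,).
    # rank[i](route): node i's preference, higher is better (assumed strict).
    for i, Ri in state.items():
        if i == 0:
            continue
        # Condition (1): Ri must equal (i,j)Rj for some neighbor j.
        if not any(Ri == (i,) + state[j] for j in neighbors[i]):
            return False
        # Condition (2): Ri strictly preferred to every other simple (i,j)Rj.
        for j in neighbors[i]:
            alt = (i,) + state[j]
            if alt != Ri and is_simple(alt) and rank[i](alt) >= rank[i](Ri):
                return False
    return True

neighbors = {1: [0, 2], 2: [0, 1]}
shorter_is_better = lambda r: -len(r)
rank = {1: shorter_is_better, 2: shorter_is_better}

stable = {0: (0,), 1: (1, 0), 2: (2, 0)}       # both route directly to d
unstable = {0: (0,), 1: (1, 2, 0), 2: (2, 0)}  # node 1 would rather switch
print(is_stable(stable, neighbors, rank), is_stable(unstable, neighbors, rank))
```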

The issue of “what is gambling” is very interesting. In an earlier post, entitled “Chess can be a game of luck”, the interesting question regarding “games of luck” and “games of skill” was discussed. It was argued that for a betting game, if, over time, for all players (or even for most players), the expected gains are negative, then we can regard the game as primarily a game of luck. (The claim is that this is a reasonable interpretation of the current law, even if not the only possible one.)