Professor Tim Roughgarden Delivers Distinguished Lecture on How CS Solved One of the Hardest Government Auction Problems

Professor Tim Roughgarden gave the fall distinguished lecture for the School of Computer Science on Friday, Oct. 5. The theory talk, "How Computer Science Informs Modern Auction Design," engaged the many faculty and students who attended.

Roughgarden is a professor at Stanford, where he works at the intersection of computer science and economics, focusing on the design, analysis, applications, and limitations of algorithms. He used a specific auction as a case study of how computer science is increasingly influencing economics.

From 2016 to 2017, the Federal Communications Commission (FCC) hosted an auction to free up spectrum held under traditional broadcast licenses for wireless broadband development. This would enable a stronger network for computers, tablets, and smartphones.

To repurpose the wireless spectrum, the FCC hosted reverse and forward auctions. First, the FCC spent $10 billion buying spectrum licenses back from regional broadcast TV stations. Then it planned a forward auction to sell wireless broadband licenses for $20 billion. Overall, the auction would net $10 billion in revenue. Yet to make all the math work, the FCC needed computer science.

For the reverse auction, the FCC relied on a descending clock auction. Each broadcaster was offered a buyout price that decreased round by round. If a broadcaster declined the offer, it exited the auction and retained its original license with no government compensation. If it accepted, it moved on to the next round, with the expectation that the government would eventually buy back its license at the most recent, lower offer.

Yet, according to Roughgarden, the central difficulty with this auction format was a classic theory problem. The FCC needed to buy up a certain number of licenses nationwide for the auction to succeed, but the stations that remained on the air had to be reassigned to channels in the smaller remaining band without interference. This is a classic mathematical repacking problem.

What the FCC needed was a solution to one of the most famous NP-complete problems: graph coloring, which assigns labels (colors) to a graph's vertices so that no two adjacent vertices share the same color. To make it even harder, the auction required solving thousands of graph coloring problems within a second.
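To make the idea concrete, here is a minimal sketch of a greedy heuristic for graph coloring on a toy interference graph, where vertices are stations and an edge means two stations cannot share a channel. The stations and graph are hypothetical, and this simple heuristic is only an illustration of the problem, not the FCC's actual method.

```python
def greedy_coloring(adjacency):
    """Assign each vertex the smallest color not used by its neighbors."""
    colors = {}
    for vertex in adjacency:
        used = {colors[n] for n in adjacency[vertex] if n in colors}
        color = 0
        while color in used:
            color += 1
        colors[vertex] = color
    return colors

# Toy interference graph: four stations, edges mark interference.
graph = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
assignment = greedy_coloring(graph)
# Each color corresponds to a channel; adjacent stations get distinct colors.
```

Greedy coloring is fast but can use more colors than necessary, which is one reason the real repacking problem demanded heavier machinery.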

“NP isn’t a death sentence, but you have to up your game,” Roughgarden said.

Computer scientists used what are known as "satisfiability" (SAT) algorithms to check, during the auction, whether the stations remaining on the air could still be repacked into the available channels. Surprisingly, the best method for solving this, explained Roughgarden, was using reverse greedy algorithms, which delete edges from graphs.
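The feasibility question can be framed as a satisfiability check: is there any channel assignment that respects every interference constraint? The sketch below uses brute-force search over a hypothetical set of stations, allowed channels, and conflicts; in practice a real SAT solver replaces the brute force.

```python
from itertools import product

def repack_feasible(stations, allowed, conflicts):
    """Try every channel assignment; return one with no interference, else None."""
    names = list(stations)
    for choice in product(*(allowed[s] for s in names)):
        assign = dict(zip(names, choice))
        if all(assign[u] != assign[v] for u, v in conflicts):
            return assign
    return None

# Hypothetical instance: three stations, two remaining channels (14 and 15).
stations = ["S1", "S2", "S3"]
allowed = {"S1": [14, 15], "S2": [14], "S3": [14, 15]}
conflicts = [("S1", "S2"), ("S2", "S3")]
plan = repack_feasible(stations, allowed, conflicts)
# Here S2 is forced onto channel 14, pushing S1 and S3 to channel 15.
```

Brute force blows up exponentially with the number of stations, which is exactly why the auction needed highly engineered SAT solvers to answer thousands of such queries quickly.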

“It’s not like no one ever thought about reverse greedy algorithms, but it didn’t seem like anyone cared,” Roughgarden said. “Because why use them when they seem more awkward and less good? But coming from economics, there is an extrinsic reason to care about what you can do with a reverse greedy algorithm.”

For the forward auction, in which the FCC needed to sell broadband licenses, Roughgarden noted how the FCC relied on the computer science concept of the price of anarchy to show that simple auctions work well when goods are not complements. Similarly, complexity theory was able to explain why strong complements make simple auctions perform poorly.

“Without techniques from computer science, the government could not have used the auction they did.”

Contact: 

Tess Malone, Communications Officer

tess.malone@cc.gatech.edu