The KDD 2008 review process (Research track) Bing Liu & Sunita Sarawagi


Presentation Transcript


  1. The KDD 2008 review process (Research track) Bing Liu & Sunita Sarawagi

  2. Bid-based paper assignment
    • Reviewers bid on papers
      • Scale from 3 = eager to 0 = not willing
    • Initial assignment
      • Globally maximize total bids subject to load and count constraints
      • Easily solved using any LP package
    • Manual inspection and readjustments
      • Effort varies from chair to chair
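The initial optimization can be sketched as a small linear program. The following is a minimal illustration only, not the KDD-08 code (which used LpSolve): the `bids` matrix, the function name, and the default limits are hypothetical, and it assumes each paper needs a fixed number of reviewers while each reviewer has a maximum load. Because the constraint matrix is that of a transportation problem, the optimal vertices are integral, so a plain LP solve typically returns a 0/1 assignment.

```python
import numpy as np
from scipy.optimize import linprog

def assign(bids, reviewers_per_paper=3, max_load=10):
    """Maximize total bid; returns an (n_papers, n_reviewers) matrix x[p, r]."""
    n_papers, n_reviewers = bids.shape
    n_vars = n_papers * n_reviewers             # x[p, r], flattened row-major

    c = -np.asarray(bids, dtype=float).ravel()  # maximize = minimize the negative

    # Each paper gets exactly `reviewers_per_paper` reviewers.
    A_eq = np.zeros((n_papers, n_vars))
    for p in range(n_papers):
        A_eq[p, p * n_reviewers:(p + 1) * n_reviewers] = 1.0
    b_eq = np.full(n_papers, float(reviewers_per_paper))

    # Each reviewer handles at most `max_load` papers.
    A_ub = np.zeros((n_reviewers, n_vars))
    for r in range(n_reviewers):
        A_ub[r, r::n_reviewers] = 1.0
    b_ub = np.full(n_reviewers, float(max_load))

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0.0, 1.0), method="highs")
    return res.x.reshape(n_papers, n_reviewers)
```

For instance, `assign(np.random.randint(0, 4, size=(500, 190)))` mirrors the KDD-08 scale (500 papers, 190 reviewers) and is feasible as long as 500 × 3 ≤ 190 × 10.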

  3. Problems of bid-based assignment
    • Two surprising dynamics
    • Unfair to papers on hot topics
      • The top few papers had bids from 25% of the PC
      • → a random PC member ends up reading them
    • Unfair to reviewers who bid low
      • Old cynics (no eager bids) versus the young and interested (80 eager bids)
      • Random papers go to the low bidders

  4. Manual readjustments not easy
    • Scale: 500 papers, 190 reviewers
    • Difficult for chairs to be familiar with the expertise of each reviewer
    • Tightly constrained system: any change sets off a cascade of other changes

  5. Modeling reviewer-to-paper affinity
    • Reviewer profile: abstracts of past publications
    • Challenge: crawling for abstracts
      • DBLP with pointers to the electronic edition, plus some manual gathering/cleaning
      • (Thanks to IITB undergrads Ankit Gupta and Ankur Goel)
    • Paper-reviewer affinity
      • TF-IDF similarity between the paper abstract and the reviewer profile
      • Okapi BM25 and similar schemes, tuned for short queries and long documents
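A rough sketch of the TF-IDF affinity computation, assuming each reviewer profile is the concatenation of that reviewer's past abstracts. It uses scikit-learn for brevity rather than the Lucene-based setup the organizers actually used; the function and variable names are illustrative, and the Okapi BM25 variants mentioned above are not shown.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def affinity_matrix(paper_abstracts, reviewer_profiles):
    """TF-IDF cosine similarity; rows are papers, columns are reviewers."""
    vectorizer = TfidfVectorizer(stop_words="english", sublinear_tf=True)
    # Fit a single vocabulary over both collections so the vectors are comparable.
    vectorizer.fit(list(paper_abstracts) + list(reviewer_profiles))
    P = vectorizer.transform(paper_abstracts)
    R = vectorizer.transform(reviewer_profiles)
    return cosine_similarity(P, R)           # shape: (n_papers, n_reviewers)
```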

  6. The assignment
    • Maximize a weighted sum of bid and affinity, subject to load and count constraints
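One plausible formalization of this objective, with x_{pr} the (fractional) assignment of paper p to reviewer r, k reviewers per paper, and load limit L; the trade-off weight λ and the normalization of bids and affinities are assumptions, not stated on the slide:

```latex
\max_{x \in [0,1]^{P \times R}} \;
  \sum_{p,r} \bigl(\lambda\,\mathrm{bid}_{pr} + (1-\lambda)\,\mathrm{aff}_{pr}\bigr)\, x_{pr}
\quad \text{s.t.} \quad
  \sum_{r} x_{pr} = k \;\; \forall p, \qquad
  \sum_{p} x_{pr} \le L \;\; \forall r
```

In the earlier sketch this amounts to calling `assign()` with `lam * bids + (1 - lam) * affinity` as the score matrix instead of the raw bids.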

  7. Manual readjustments still needed
    • Chairs go over assignments and give input as a short list of reviewers for a paper
    • Re-invoke the LP with additional constraints
      • Chairs are spared from handling cascaded changes
      • But a stable LP solver is needed to minimize changes; the current algorithm (LpSolve) seems stable
    • We did three rounds, working 10 days non-stop!
    • Coding was easy: one week with LpSolve + Lucene
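A minimal sketch of how the re-invocation could look, assuming the chairs' short list means that only the listed reviewers should be considered for that paper; the real system encoded chair input as additional LpSolve constraints, and both the semantics and the names here (`apply_shortlists`, `shortlists`) are assumptions.

```python
import numpy as np

def apply_shortlists(scores, shortlists):
    """Restrict shortlisted papers to their allowed reviewers by making every
    other (paper, reviewer) pair prohibitively unattractive, then re-run the
    LP from the earlier sketch on the result.

    scores     : (n_papers, n_reviewers) combined bid/affinity scores
    shortlists : dict mapping paper index -> iterable of allowed reviewer indices
    """
    constrained = np.asarray(scores, dtype=float).copy()
    for p, allowed in shortlists.items():
        forbidden = np.ones(constrained.shape[1], dtype=bool)
        forbidden[list(allowed)] = False
        constrained[p, forbidden] = -1e6     # effectively rules out these pairs
    return constrained
```

Fixing the corresponding variables to zero (or adding explicit constraints) would be closer to what the slide describes; penalizing the objective is simply the smallest change that reuses the earlier function. Keeping the new solution close to the previous one, the stability concern above, would need an extra term or a warm-started solver and is not shown.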

  8. Improvements
    • Better modeling of reviewer expertise
      • Time-decaying topic models?
    • Better affinity matching
      • Citation distance?
    • Human intervention is unavoidable
      • Good interactive UI tools for paper assignment

  9. Other issues
    • Author feedback
    • Conditional accept
    • Early notification of sure rejects
    • Vice chairs select the PC and assign papers
      • KDD is homogeneous
      • Topics keep shifting
      • Load balancing across tracks is difficult
