When I have had time recently, I have been reading through the nlper blog from the beginning, and I found the post "Most Influential NLP Papers" quite useful as a reference. It was written in early 2006, so it is a little dated, but true gold fears no fire, so I am posting it here for everyone's reference!
“I conducted a mini survey recently, asking people I knew what they thought were the most influential papers in NLP from the past two decades. Here are the wholly unscientific results, sorted from most votes and subsorted by author. Note that I only got responses from 7 people. I've not listed papers that got only one vote and have not included my personal votes.”
According to the author, he conducted a small survey, asking NLP researchers he knew which papers they considered the most influential in NLP over the past two decades. In fact, he received responses from only seven people, six of whom were from the University of Southern California (where the author works) or the University of Pennsylvania. The final results are listed below, sorted by number of votes and, for ties, by the authors' names; papers that received only one vote, as well as the author's own votes, are not included:
(7 votes): Brown et al., 1993; The Mathematics of Statistical Machine Translation (statistical machine translation)
(5 votes): Collins, 1997; Three Generative, Lexicalised Models for Statistical Parsing (statistical parsing)
(4 votes): Marcus, 1993; Building a large annotated corpus of English: the Penn Treebank (corpora)
(3 votes): Berger et al., 1996; A maximum entropy approach to natural language processing (maximum entropy)
(2 votes): Bikel et al., 1997; An Algorithm that Learns What's in a Name
(2 votes): Collins, 2002; Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms
(2 votes): Lafferty et al., 2001; Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data (conditional random fields)
(2 votes): Och, 2003; Minimum Error Rate Training for Statistical Machine Translation (statistical machine translation)
(2 votes): Papineni et al., 2001; Bleu: a method for automatic evaluation of machine translation (automatic MT evaluation)
(2 votes): Ratnaparkhi, 1999; Learning to Parse Natural Language with Maximum Entropy Models
(2 votes): Yarowsky, 1995; Unsupervised Word Sense Disambiguation Rivaling Supervised Methods (word sense disambiguation)
The research areas in parentheses are my own annotations. Machine translation accounts for three of the papers, which I suspect has something to do with the votes from USC.
I wonder whether we could run a similar survey here. After all, any individual's knowledge is limited, while the community's is not; if we nlpers act together, we might end up with a decent set of results that would be of some reference value to everyone, including those who come after us.
My initial idea is this: if you are familiar with a particular area of natural language processing or computational linguistics, list a few papers in that area that you consider influential; if enough replies come in, I will compile the results into a summary similar to nlper's survey.
52nlp is far from having nlper's influence, and I do not know whether this survey will ultimately succeed, but I hope you, dear nlpers, will take action, whether you nominate one paper or two!
Note: this is an original article; when reposting, please credit the source "我爱自然语言处理" (www.52nlp.cn).
Post link: https://www.52nlp.cn/most-influential-nlp-papers
A good resource in this regard is the ACL Anthology Network (http://belobog.si.umich.edu/clair/anthology/index.cgi): you can look up not only rankings of papers but also rankings of authors.
Thanks to Liu Yang for the pointer!
Nice article, thanks!
52nlp replies: You're welcome!