IR Evaluation (信息检索的评价)
Published: 2010-03-29 09:27:00
Lecture Details

Speaker: Ms. Ruihua Song, Researcher, Microsoft Research Asia

Time: Tuesday, March 30, 2010, 6:00–7:30 PM

Venue: 4th-floor lecture hall

Speaker bio: Ms. Ruihua Song is a researcher at Microsoft Research Asia. She received her B.E. and M.E. degrees from the Department of Computer Science and Technology at Tsinghua University. Her main research interests are Web information retrieval and Web information extraction, and her recent research focuses on ranking and evaluation. She has participated in the Text REtrieval Conference (TREC) four times with excellent results, and she serves as a coordinator of the ACLIA task in NTCIR-7 and NTCIR-8. Her homepage is http://research.microsoft.com/users/rsong/.

Abstract: Evaluation is an important topic in Information Retrieval (IR). In this talk I would like to introduce some basics of IR evaluation and my recent work. First, I will discuss the problem of IR evaluation and explain some standard metrics. Second, I will briefly introduce some international workshops dedicated to IR evaluation, such as TREC (Text Retrieval Conference) and NTCIR (NII Test Collections for IR Systems). Finally, I would like to present my recent work on IR evaluation.
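As background for the "standard metrics" the abstract mentions, the following is a minimal sketch (not from the talk itself) of two widely used IR evaluation measures: precision at rank k and NDCG. The graded relevance judgments in the example are hypothetical.

```python
import math

def precision_at_k(relevances, k):
    """Fraction of the top-k results judged relevant (binary view: rel > 0)."""
    return sum(1 for r in relevances[:k] if r > 0) / k

def dcg_at_k(relevances, k):
    """Discounted cumulative gain: graded gains discounted by log2 of rank."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """DCG normalized by the DCG of the ideal (descending-sorted) ranking."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# Hypothetical graded judgments (0 = not relevant, 3 = highly relevant)
# for one query's top-5 ranked results.
judged = [3, 2, 0, 1, 2]
print(precision_at_k(judged, 5))   # 0.8: four of the five results are relevant
print(ndcg_at_k(judged, 5))        # < 1.0, since the ranking is not ideal
```

In benchmark campaigns such as TREC and NTCIR, metrics like these are averaged over a set of test topics with pooled relevance judgments to compare systems.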
