LINGUIST List 21.249

Fri Jan 15 2010

Calls: Computational Ling, Text/Corpus Ling/Italy

Editor for this issue: Kate Wu <kate@linguistlist.org>

        1.    Martin Potthast, Uncovering Plagiarism, Authorship, and Social Software Misuse

Message 1: Uncovering Plagiarism, Authorship, and Social Software Misuse
Date: 13-Jan-2010
From: Martin Potthast <martin.potthast@uni-weimar.de>
Subject: Uncovering Plagiarism, Authorship, and Social Software Misuse

Full Title: Uncovering Plagiarism, Authorship, and Social Software Misuse
Short Title: PAN

Date: 20-Sep-2010 - 23-Sep-2010
Location: Padua, Italy
Contact Person: Martin Potthast
Meeting Email: pan@webis.de
Web Site: http://pan.webis.de

Linguistic Field(s): Computational Linguistics; Text/Corpus Linguistics

Subject Language(s): English (eng)

Call Deadline: 01-Jun-2010

Meeting Description:

The 4th International Workshop on Uncovering Plagiarism, Authorship, and Social
Software Misuse (PAN-10) will be held as an evaluation lab in conjunction with
the CLEF conference in Padua, Italy, on September 20-23, 2010.

Call for Papers

Evaluation Campaign on Plagiarism Detection and Wikipedia Vandalism Detection
held in conjunction with the CLEF'10 conference
in Padua, Italy, September 20-23

About the Campaign:
Plagiarism detection in text documents is a challenging retrieval task: today's
detection systems are faced with intricate situations, such as obfuscated
plagiarism or plagiarism within and across languages. Moreover, the source of a
plagiarism case may be hidden in a large collection of documents, or it may not
be available at all. Informally, the respective CLEF-Lab task can be described
as follows:

1. Plagiarism Detection. Given a set of suspicious documents and a set of source
documents, the task is to find all plagiarized sections in the suspicious
documents and, if available, the corresponding source sections.
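To make the task concrete, a naive external detector might compare a suspicious document against each source by character n-gram overlap and rank the sources that exceed a similarity threshold. The function names, the n-gram length, and the threshold below are invented for this sketch; the official evaluation uses its own corpora and measures.

```python
# Hypothetical sketch of external plagiarism detection via character
# n-gram overlap. All names and thresholds here are illustrative only;
# they are not part of the PAN-10 evaluation.

def ngrams(text, n=8):
    """Return the set of character n-grams of a text."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def overlap_score(suspicious, source, n=8):
    """Jaccard similarity of the two documents' character n-gram sets."""
    a, b = ngrams(suspicious, n), ngrams(source, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def find_candidate_sources(suspicious, sources, threshold=0.1):
    """Rank source documents whose n-gram overlap exceeds a threshold.

    `sources` maps a document name to its text; returns (name, score)
    pairs sorted by descending similarity.
    """
    scored = [(name, overlap_score(suspicious, text))
              for name, text in sources.items()]
    return sorted([(name, s) for name, s in scored if s >= threshold],
                  key=lambda pair: pair[1], reverse=True)
```

A real system would then align the matching n-grams to recover the plagiarized sections and their source passages, and would also have to handle obfuscation and cross-language cases, which simple exact n-gram matching cannot.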

Following the success of the 2009 campaign, and based on our experience, we will
provide a revised evaluation corpus consisting of artificial and simulated
plagiarism cases.
Vandalism has always been one of Wikipedia's biggest problems. However,
vandalism detection is still done mostly manually by volunteers, and research on
automatic vandalism detection is in its infancy. Hence, solutions are needed
that aid Wikipedians in their efforts. Informally, the respective CLEF-Lab task
can be described as follows:

2. Wikipedia Vandalism Detection. Given a set of edits on Wikipedia articles,
the task is to identify all edits which are vandalism, i.e., all edits whose
editors had bad intentions.
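One common framing of this task is to extract surface features from each edit and apply a classifier or decision rule. The features and thresholds below are invented for this sketch and are not those of the PAN-10 evaluation; they merely illustrate the shape of the problem.

```python
# Hypothetical illustration of feature-based vandalism detection.
# Feature names and cutoffs are made up for this sketch.

def edit_features(old_text, new_text):
    """Compute a few simple surface features of a Wikipedia edit."""
    letters = [c for c in new_text if c.isalpha()]
    upper_ratio = (sum(c.isupper() for c in letters) / len(letters)
                   if letters else 0.0)
    return {
        # How much text the edit added or removed.
        "size_delta": len(new_text) - len(old_text),
        # Fraction of letters in ALL CAPS ("shouting" insertions).
        "upper_ratio": upper_ratio,
        # Near-total blanking of an existing article.
        "blanked": bool(old_text) and len(new_text) < 0.1 * len(old_text),
    }

def looks_like_vandalism(features):
    """Toy decision rule: flag shouting insertions or page blanking."""
    return features["blanked"] or features["upper_ratio"] > 0.8
```

In practice such hand-written rules would be replaced by a classifier trained on labeled edits, which is exactly what the training corpus released for the task enables.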

Participants are invited to submit results for one or both of the tasks.

Important Dates:
Registration: open
Mar 01, 2010: Training corpora release
(Preliminary training corpora are already available!)
May 03, 2010: Test corpora release
Jun 01, 2010: Result submission deadline
Jun 15, 2010: Notification of performance
Jul 15, 2010: Paper submission deadline
Aug 02, 2010: Notification of reviews
Sep 01, 2010: Final paper deadline
Sep 20-23, 2010: Evaluation lab at CLEF conference

Campaign Organization:
Webis, Bauhaus-Universität Weimar

NLEL, Universidad Politécnica de Valencia

University of the Aegean

Bar-Ilan University

E-mail: pan@webis.de
Campaign Web page: http://pan.webis.de