INLS 613: Text Mining

Objective: Gain experience with both the theoretical and practical aspects of text mining. Learn how to build and evaluate computer programs that generate new knowledge from natural language text.
Description: Changes in technology and publishing practices have eased the task of recording and sharing textual information electronically. This increased quantity of information has spurred the development of a new field called text mining. The overarching goal of this new field is to use computers to automatically learn new things from textual data.

The course is divided into three modules: basics, principles, and applications (see details below). The third part of the course will focus on several applications of text mining: methods for automatically organizing textual documents for sense-making and navigation (clustering and classification), methods for detecting opinion and bias, methods for detecting and resolving specific entities in text (information extraction and resolution), and methods for learning new relations between entities (relation extraction). Throughout the course, a strong emphasis will be placed on evaluation. Students will develop a deep understanding of one particular method through a course project.
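To give a flavor of the classification methods covered in the course, here is a minimal sketch of a bag-of-words Naïve Bayes text classifier (the kind of model covered in WFH Ch. 4.2). The training documents and labels below are hypothetical toy data, not course materials:

```python
from collections import Counter, defaultdict
import math

def train_nb(docs):
    """Train a multinomial Naive Bayes model from (text, label) pairs."""
    label_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in docs:
        label_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return label_counts, word_counts, vocab

def classify(text, label_counts, word_counts, vocab):
    """Pick the label maximizing log P(label) + sum of log P(word|label),
    using add-one (Laplace) smoothing for unseen word/label pairs."""
    total_docs = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            if word in vocab:
                score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical toy training data
docs = [("great movie loved it", "pos"),
        ("terrible plot boring acting", "neg"),
        ("loved the acting great fun", "pos"),
        ("boring and terrible", "neg")]
model = train_nb(docs)
print(classify("loved it great", *model))  # prints "pos"
```

Tools used in the course, such as Weka and LightSIDE, package this and many other learning algorithms behind a common interface, so students will not need to implement classifiers from scratch.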

Prerequisites: Students should have a reasonable background in programming in a structured or object-oriented programming language, such as Java or C++. "Reasonable" means either coursework or equivalent practical experience. You should be able to design, implement, debug, and test small- to medium-sized programs. If you would like to take this course but do not know whether you meet these prerequisites, please send me an email.
Time & Location: M,W 1:50-3:05, Manning 307
Instructor: Jaime Arguello (email, web)
Office Hours: T, Th 10:00-11:00am, Manning 10 (Garden Level)
Required Textbook: Data Mining: Practical Machine Learning Tools and Techniques (Third Edition). Ian H. Witten, Eibe Frank, and Mark A. Hall. 2011. Morgan Kaufmann. ISBN 978-0-12-374856-0. Available online or in the campus bookstore.
Additional Resources: Foundations of Statistical Natural Language Processing. C. Manning and H. Schütze. 1999.

Introduction to Information Retrieval. C. Manning, P. Raghavan, and H. Schütze. 2008.
Course Policies: Laptops, Attendance, Participation, Collaboration, Plagiarism & Cheating, Late Policy
Grading: 10% Class participation
20% Midterm Exam
30% Homework (10% each)
40% Final project (5% project proposal, 25% project report, 10% project presentation)
Grade Assignments: Undergraduate grading scale: A+ 97-100%, A 94-96%, A- 90-93%, B+ 87-89%, B 84-86%, B- 80-83%, C+ 77-79%, C 74-76%, C- 70-73%, D+ 67-69%, D 64-66%, D- 60-63%, F 0-59%

Graduate grading scale: H 95-100%, P 80-94%, L 60-79%, and F 0-59%.

All assignments, exams, and the final project will be graded on a curve.
Topics: Subject to change! Readings from the required textbook (Witten, Frank, and Hall) are marked WFH below.
Lecture Date Events Topic Reading Due
1 Wed. 8/24   Introduction to Text Mining: The Big Picture  
2 Mon. 8/29   Course Overview: Roadmap and Expectations WFH Ch. 1, Mitchell '06, Hearst '99
3 Wed. 8/31   Predictive Analysis: Concepts, Features, and Instances I WFH Ch. 2, Domingos '12
4 Mon. 9/5 Labor Day (No Class)    
5 Wed. 9/7 HW1 Out Predictive Analysis: Concepts, Features, and Instances II  
6 Mon. 9/12   Text Representation I  
7 Wed. 9/14   Text Representation II  
8 Mon. 9/19   Text Representation III  
9 Wed. 9/21 HW1 Due, HW2 Out Machine Learning Algorithms: Naïve Bayes WFH Ch. 4.2, Mitchell Sections 1 and 2
10 Mon. 9/26   LightSIDE Tutorial LightSIDE User's Manual
11 Wed. 9/28 Project Proposal Due Weka Tutorial WFH Ch. 10 and 11
12 Mon. 10/3   Machine Learning Algorithms: Instance-based Classification WFH Ch. 4.7
13 Wed. 10/5   Predictive Analysis: Experimentation and Evaluation I WFH Ch. 5
14 Mon. 10/10 HW2 Due Predictive Analysis: Experimentation and Evaluation II Smucker et al., '07, Cross-Validation, Parameter Tuning and Overfitting
15 Wed. 10/12   Predictive Analysis with Noisy Labels Sheng et al., '08
16 Mon. 10/17 Midterm Review Midterm Review  
17 Wed. 10/19 Midterm Midterm  
18 Mon. 10/24   Exploratory Analysis: Clustering Manning Ch. 16
19 Wed. 10/26   Sentiment Analysis I Pang and Lee, '08 (skip Section 5 and only skim Section 6), Pang and Lee, '02
20 Mon. 10/31 HW3 Out Sentiment Analysis II Somasundaran and Wiebe '10 (optional)
21 Wed. 11/2   Detecting Viewpoint and Perspective I Yano et al., '10, Wiebe '10
22 Mon. 11/7   Detecting Viewpoint and Perspective II  
23 Wed. 11/9   Predicting the Usefulness of Reviews  
24 Mon. 11/14 HW3 Due Discourse Analysis Arguello '15
25 Wed. 11/16   Text-based Forecasting O'Connor et al., '10, Lerman et al., '08
26 Mon. 11/21   Information Extraction and Relation Learning McCallum '05, Arguello '07
27 Wed. 11/23 Thanksgiving (No Class)    
28 Mon. 11/28   Bootstrapping in Information Extraction  
29 Wed. 11/30   Student Presentations  
30 Mon. 12/5   Student Presentations  
31 Wed. 12/7   Student Presentations  
32 Fri. 12/16 Project Report Due