INLS 613: Text Mining

Objective: Gain experience with both the theoretical and practical aspects of text mining. Learn how to build and evaluate computer programs that generate new knowledge from natural language text.
Description: Changes in technology and publishing practices have eased the task of recording and sharing textual information electronically. This increased quantity of information has spurred the development of a new field called text mining. The overarching goal of this new field is to use computers to automatically learn new things from textual data.

The course is divided into three modules: basics, principles, and applications (see details below). The third module will focus on several applications of text mining: methods for automatically organizing textual documents for sense-making and navigation (clustering and classification), methods for detecting opinion and bias, methods for detecting and resolving specific entities in text (information extraction and resolution), and methods for learning new relations between entities (relation extraction). Throughout the course, a strong emphasis will be placed on evaluation. Students will develop a deep understanding of one particular method through a course project.

Prerequisites: There are no prerequisites for this course. We will be using a tool called LightSIDE to train and test machine-learned models for different predictive tasks. LightSIDE has a graphical user interface that makes it easy to do this without knowing how to program. That said, knowing how to program (and manipulate text) may enable you to conduct more interesting experiments as part of your final project.
This course will involve understanding mathematical concepts and procedures. I will cover the basics needed to understand these. However, if you strongly dislike math and are unwilling to grapple with (and ultimately conquer) mathematical concepts and procedures, this may not be a good course for you.
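For the curious (and purely as an illustration, not a course requirement), the sketch below shows roughly what a programmatic text-classification experiment might look like. It assumes Python and the scikit-learn library, which are my own example choices; the course itself uses LightSIDE's graphical interface.

    # Illustrative sketch only (hypothetical example; not part of the course materials).
    # Train a bag-of-words Naive Bayes classifier on a few toy documents and
    # predict the label of a new, unseen document.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    train_docs = ["great movie, loved it", "terrible plot and acting",
                  "wonderful and fun", "boring and dull"]
    train_labels = ["positive", "negative", "positive", "negative"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())  # features + classifier
    model.fit(train_docs, train_labels)                        # train on labeled examples
    print(model.predict(["what a fun and wonderful film"]))    # e.g., ['positive']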
Time & Location: M,W 1:25-2:40pm, Manning 001 (In Person).
Instructor: Jaime Arguello (email, web)
Office Hours: By Appointment
Required Textbook: Data Mining: Practical Machine Learning Tools and Techniques (Fourth Edition). Ian H. Witten, Eibe Frank, Mark A. Hall, and Christopher J. Pal. 2017. Morgan Kaufmann. ISBN 978-0128042915. Available online
Additional Resources: Foundations of Statistical Natural Language Processing. C. Manning and H. Schütze. 1999.

Introduction to Information Retrieval. C. Manning, P. Raghavan, and H. Schütze. 2008.
Course Policies: Laptops, Attendance, Participation, Collaboration, Plagiarism & Cheating, Late Policy
Grading: 10% Class participation
20% Midterm Exam
30% Homework (10% each)
40% Final project (5% project proposal, 25% project report, 10% project presentation)
Grade Assignments: Undergraduate grading scale: A+ 97-100%, A 94-96%, A- 90-93%, B+ 87-89%, B 84-86%, B- 80-83%, C+ 77-79%, C 74-76%, C- 70-73%, D+ 67-69%, D 64-66%, D- 60-63%, F 0-59%

Graduate grading scale: H 95-100%, P 80-94%, L 60-79%, and F 0-59%.
Topics: Subject to change! Readings from the required textbook (Witten, Frank, Hall, and Pal) are marked WFH below.
Lecture | Date | Events | Topic | Reading Due
1 | Mon. 8/15 | | Introduction to Text Mining: The Big Picture |
2 | Wed. 8/17 | | Course Overview: Roadmap and Expectations | WFH Ch. 1, Mitchell '06
3 | Mon. 8/22 | | Predictive Analysis: Concepts, Features, and Instances I | WFH Ch. 2, Domingos '12
4 | Wed. 8/24 | HW1 Out | Predictive Analysis: Concepts, Features, and Instances II |
5 | Mon. 8/29 | | Text Representation I |
6 | Wed. 8/31 | | Text Representation II |
7 | Mon. 9/5 | Labor Day (No Class) | |
8 | Wed. 9/7 | HW1 Due | Machine Learning Algorithms: Naïve Bayes I | WFH Ch. 4.2, Mitchell Sections 1 and 2
9 | Mon. 9/12 | | LightSIDE Tutorial I | LightSIDE User Manual
10 | Wed. 9/14 | HW2 Out | LightSIDE Tutorial II (data) |
11 | Mon. 9/19 | | Machine Learning Algorithms: Instance-based Classification I | WFH Ch. 4.7
12 | Wed. 9/21 | Project Proposal Due | Final Project Breakout Group Discussion I |
13 | Mon. 9/26 | Well-Being Day (No Class) | |
14 | Wed. 9/28 | HW2 Due | Machine Learning Algorithms: Instance-based Classification II |
15 | Mon. 10/3 | | Machine Learning Algorithms: Linear Classifiers I | WFH Ch. 3.2 and 4.6
16 | Wed. 10/5 | | Machine Learning Algorithms: Linear Classifiers II |
17 | Mon. 10/10 | Midterm Review | Midterm Review |
18 | Wed. 10/12 | Midterm | Midterm |
19 | Mon. 10/17 | | Predictive Analysis: Experimentation and Evaluation I | WFH Ch. 5
20 | Wed. 10/19 | | Predictive Analysis: Experimentation and Evaluation II | Smucker et al. '07; Cross-Validation, Parameter Tuning, and Overfitting
21 | Mon. 10/24 | | Predictive Analysis: Experimentation and Evaluation III |
22 | Wed. 10/26 | HW3 Out | Midterm + HW3 + Final Project Breakout Group Discussion |
23 | Mon. 10/31 | | Exploratory Analysis: Clustering I | Manning Ch. 16
24 | Wed. 11/2 | | Exploratory Analysis: Clustering II |
25 | Mon. 11/7 | | Sentiment Analysis | Pang and Lee '08 (skip Section 5 and only skim Section 6), Pang and Lee '02
26 | Wed. 11/9 | HW3 Due | Discourse Analysis | Arguello '15
27 | Mon. 11/14 | | Detecting Viewpoint | Wiebe '10
28 | Wed. 11/16 | | Text-based Forecasting | Lerman et al. '08
29 | Mon. 11/21 | | Final Project Presentations I |
30 | Wed. 11/23 | Thanksgiving (No Class) | |
31 | Mon. 11/28 | | Final Project Presentations II |
32 | Wed. 11/30 | | Final Project Presentations III |
33 | Mon. 12/5 | Project Report Due | |