INLS 509:
Information Retrieval

Description: The field of information retrieval (IR) is concerned with the analysis, organization, storage, and retrieval of unstructured and semi-structured data. In this course, we will focus mostly on text. While IR systems are often associated with Web search engines (e.g., Google), IR applications also include digital library search, patent search, search for local businesses, and expert search, to name a few. Likewise, IR techniques (the underlying technology behind IR systems) are used to solve a wide range of problems, such as organizing documents into an ontology, recommending news stories to users, detecting spam, and predicting reading difficulty. This course will provide an overview of the theory, implementation, and evaluation of IR systems and IR techniques. In particular, we will explore how search engines work, how they "interpret" human language, what different users expect from them, how they are evaluated, why they sometimes fail, and how they might be improved in the future.
Prerequisites: There are no prerequisites for this course.
Expectations: Information retrieval is the study of computer-based solutions to a human problem. Thus, the first half of the course will be system-focused, while the second half will be user-focused. During the first half, you should expect to see some math (e.g., basic probability, statistics, and some linear algebra). However, we will focus on the concepts rather than the details.

Students will have an opportunity to explore their interests with an open-ended literature review.

Time & Location: M, W 1:50-3:05 pm, Manning 307
Instructor: Jaime Arguello (email, web)
Office Hours: T, Th 11:00am-12:00pm, Manning 305
Required Textbook: Search Engines - Information Retrieval in Practice, W. B. Croft, D. Metzler, and T. Strohman. Cambridge University Press. 2009. Available online.
Additional Resources: Foundations of Statistical Natural Language Processing. C. Manning and H. Schütze. 1999.

Introduction to Information Retrieval. C. Manning, P. Raghavan, and H. Schütze. 2008.
Other Readings: Selected papers and chapters from other books will sometimes be assigned for reading. These will be available online.
Course Policies: Laptops, Attendance, Participation, Collaboration, Plagiarism & Cheating, Late Policy
Grading: 30% homework (10% each)
15% midterm exam
15% final exam
30% literature review (5% proposal, 10% presentation, 15% paper)
10% participation
Grade Assignments: Letter grades will be assigned using the following scale: H 95-100%, P 80-94%, L 60-79%, and F 0-59%. All homework, exams, and the literature review will be graded on a curve.
Schedule: Subject to change! The required textbook (Croft, Metzler, and Strohman) is denoted as CMS below.
Lecture | Date | Events | Topic | Reading Due
1 | Mon. 1/11 | | Introduction to IR: The Big Picture |
2 | Wed. 1/13 | | Course Overview: Roadmap and Expectations | CMS Ch. 1
3 | Mon. 1/18 | MLK Day (No class) | |
4 | Wed. 1/20 | | Introduction to Ad-hoc Retrieval I | CMS Ch. 2, 5.3.0-5.3.3, 7.1.0-7.1.1
5 | Mon. 1/25 | HW1 Out | Introduction to Ad-hoc Retrieval II |
6 | Wed. 1/27 | | Indexing and Query Processing |
7 | Mon. 2/1 | | Statistical Properties of Text | CMS Ch. 4.1-4.2
8 | Wed. 2/3 | | Text Representation I | CMS Ch. 4.3-4.7, MRS Ch. 2
9 | Mon. 2/8 | HW1 Due | Text Representation II |
10 | Wed. 2/10 | | Retrieval Models: Vector Space I | CMS Ch. 7.0-7.1.2
11 | Mon. 2/15 | HW2 Out | Retrieval Models: Vector Space II |
12 | Wed. 2/17 | Literature Review Proposal Due | Retrieval Models: Query-likelihood I | CMS Ch. 7.3
13 | Mon. 2/22 | | Retrieval Models: Query-likelihood II |
14 | Wed. 2/24 | | Document Priors |
15 | Mon. 2/29 | HW2 Due | Evaluation Overview | CMS Ch. 8
16 | Wed. 3/2 | | Midterm Review |
17 | Mon. 3/7 | | Midterm Exam |
18 | Wed. 3/9 | | Test Collection-based Evaluation I | Robertson '08, Sanderson '10 (pages 248-298)
19 | Mon. 3/14 | Spring Break (No class) | |
20 | Wed. 3/16 | Spring Break (No class) | |
21 | Mon. 3/21 | HW3 Out | Test Collection-based Evaluation II |
22 | Wed. 3/23 | | Evaluation Metrics | Hersh et al. '00, Turpin & Hersh '01, Sanderson '10 (pages 308-350)
23 | Mon. 3/28 | | Experimentation I | Smucker et al. '07; Cross-Validation, Parameter Tuning, and Overfitting
24 | Wed. 3/30 | | Experimentation II |
25 | Mon. 4/4 | HW3 Due | Relevance | Saracevic '07
26 | Wed. 4/6 | | User Studies in Information Retrieval | Kelly '09 Ch. 10 (pgs. 99-125), Tombros et al. '05
27 | Mon. 4/11 | | Search-log Analysis | Joachims et al. '05, Dumais et al. '14
28 | Wed. 4/13 | | Federated Search |
29 | Mon. 4/18 | | Final Exam Review |
30 | Wed. 4/20 | | Student Presentations |
31 | Mon. 4/25 | | Student Presentations |
32 | Wed. 4/27 | | Student Presentations |
33 | Fri. 4/29 | Literature Review Due | |
34 | Fri. 5/6 | Final Exam, Manning 307, 8-11am | Final Exam |