HCI Seminar 310-89
Day 7 Notes
10/16/00
Syllabus and links to class notes available at:
http://ils.unc.edu/~march/courses/310_f00/syllabus.html
Class roster
Open info week events
Notes on midterm projects
Airong to discuss her eye-tracking study
Main Points
The scale of the work motivates automatic solutions (e.g., indexing)
More interactive does not equal a better interface
Focus on design elements for usability rather than the whole design
Usability testing takes time
Questions
Will we get automatic indexing of audio/video? Is it a hardware or a software problem?
There are guides/templates for programs; are there the same for UIs?
Why get away from words?
What is the difference between semantic and non-semantic in video retrieval?
Instead of verbal vs. visual, why not the best of both? Yes!
Who are the leaders in image/video indexing?
When can I see some video retrieval systems?
Are there toolkits for instrumenting websites for usability testing?
2. Readings
Hutchinson et al.
Eyes as input devices (see Bolt, 1984)
Erica for quadriplegics
Technical elements of eye-tracking described
Practical challenges: communication is tedious; technical constraints (head positioning)
500 ms dwell time
Jacob
Need to use eye tracking in conjunction with other inputs (the Midas-touch problem)
Non-intuitive (need multiple people)
Lots of noise
150-250 ms dwell time
30% faster than the mouse (but more variance)
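The dwell-time idea in both readings can be sketched in a few lines: a glance only counts as a selection once the gaze has rested on a target for longer than a threshold, which is how these systems avoid the Midas-touch problem of every look acting as a command. The fixation tuples and threshold values below are illustrative; real trackers emit noisy gaze samples that must be smoothed into fixations first.

```python
DWELL_MS = 200  # Jacob's range was 150-250 ms; Erica used about 500 ms

def select_targets(fixations, dwell_ms=DWELL_MS):
    """Return the targets the user 'clicked' by dwelling on them.

    fixations: list of (target, duration_ms) tuples, one per fixation,
    where target is None when the gaze was not on any widget.
    Fixations shorter than dwell_ms are treated as glances and ignored.
    """
    return [t for t, dur in fixations if t is not None and dur >= dwell_ms]

# The user glances at 'menu' briefly, then dwells on 'open'.
print(select_targets([("menu", 80), ("open", 240), (None, 300)]))
# -> ['open']
```

A longer dwell threshold (as in Erica) trades speed for fewer accidental selections; Jacob's shorter threshold is what made eye selection faster than the mouse, at the cost of more variance.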
4. Query sketches. No one-size-fits-all solution?
What to search (topic, terms)
Where to search (databases + fields)
How to search (Boolean, rank cutoffs, etc.)
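The three decisions above (what, where, how) can be captured as a simple query specification. This is only a sketch; the field names, database names, and the Boolean rendering are all hypothetical, not any particular system's API.

```python
from dataclasses import dataclass, field

@dataclass
class QuerySketch:
    terms: list                       # what to search (topic, terms)
    databases: list                   # where to search (which databases)
    fields: list                      # where to search (which record fields)
    operator: str = "AND"             # how to search (Boolean combination)
    rank_cutoff: int = 100            # how to search (ranked-results cutoff)

def to_boolean(q):
    """Render the sketch's terms as a Boolean query string."""
    return f" {q.operator} ".join(q.terms)

q = QuerySketch(terms=["eye tracking", "usability"],
                databases=["INSPEC"],
                fields=["title", "abstract"])
print(to_boolean(q))  # -> eye tracking AND usability
```

Keeping the three decisions in one structure makes the "no one-size-fits-all" point concrete: different search tasks fill in different combinations of the same slots.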
5.
One-minute paper
What was the big point you learned in class today?
What is the main, unanswered question you leave class with today?