CS 583 Fall 2023


Data Mining and Text Mining

Course Objective

This course has three objectives. First, to provide students with a solid background in the classic data mining and machine learning techniques and to introduce the latest research topics (e.g., out-of-distribution (novelty) detection, learning after model deployment, and lifelong/continual learning). Second, to ensure that students are able to read and critically evaluate data mining research papers. Third, to ensure that students are able to implement and use some of the important data mining and text mining algorithms.

Think and Ask! If you have questions about any topic or assignment, DO ASK me, the TA, or your classmates for help; I am here to make the course understood. DO NOT delay your questions. There is no such thing as a stupid question. The only obstacle to learning is laziness.

General Information

  • Instructor: Bing Liu
    • Email: Bing Liu
    • Office: CS 3190c, North End, 3rd Floor, Library

Section 1

  • Course Call Number: 30286
  • Lecture time slot: 12:30pm - 1:45pm Tue & Thu
  • Lecture hall: TBH 180E
  • Instructor office hours: 10:30am - 12:00noon Tuesdays

Grading

  • Final Exam: 30%
  • Midterm: 25%
  • Quizzes: 20%
  • Assignments: 25%
  • Assignments and the research project are done in groups of 2. Discussions with other students are allowed, but each group must write its own code.

Prerequisites

Teaching materials

Topics (subject to change; the reading list follows each chapter title)

    1. Introduction
    2. Data pre-processing
      • Data cleaning
      • Data transformation
      • Data reduction
      • Discretization
    3. Association rules and sequential patterns (Sections 2.1 - 2.7)
      • Apriori Algorithm
      • Mining association rules with multiple minimum supports
      • Mining class association rules
      • Sequential pattern mining
      • Summary
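For a taste of this topic, below is a minimal sketch of the Apriori algorithm (frequent itemset mining with a single minimum support). It is an illustration only, not the course's reference implementation; the transaction data and threshold in the usage example are made up.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori sketch: frequent k-itemsets generate candidate
    (k+1)-itemsets, pruned by the downward-closure property (every subset
    of a frequent itemset must itself be frequent)."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]

    # Level 1: count single items.
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
    all_frequent = dict(frequent)

    k = 2
    while frequent:
        items = sorted({i for s in frequent for i in s})
        # Only candidates whose (k-1)-subsets are all frequent survive pruning.
        candidates = [frozenset(c) for c in combinations(items, k)
                      if all(frozenset(sub) in frequent
                             for sub in combinations(c, k - 1))]
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
        all_frequent.update(frequent)
        k += 1
    return all_frequent  # maps each frequent itemset to its support

# Example: with min_support = 0.5, {a}, {b}, {c}, {a,b}, {a,c} are frequent.
result = apriori([["a", "b"], ["a", "c"], ["a", "b", "c"], ["b"]], 0.5)
```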
    4. Supervised learning, linear regression, gradient descent, and neural networks (Chapter 3)
      • Decision tree induction
      • Classifier evaluation
      • Naive-Bayesian learning
      • Naive-Bayesian learning for text classification
      • Support vector machines
      • Linear regression and gradient descent
      • Neural Networks
      • K-nearest neighbor
      • Bagging and boosting
      • Summary
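As a preview of the linear regression and gradient descent material in this topic, here is a minimal sketch of fitting a one-variable linear model by batch gradient descent; the learning rate, epoch count, and data are illustrative choices, not values prescribed by the course.

```python
def linreg_gd(X, y, lr=0.05, epochs=5000):
    """Fit y ~ w.x + b by batch gradient descent on mean squared error.
    Each epoch computes predictions for all examples, then moves (w, b)
    against the gradient (1/n) * X^T (pred - y)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        pred = [sum(wj * xj for wj, xj in zip(w, x)) + b for x in X]
        err = [p - t for p, t in zip(pred, y)]
        for j in range(d):
            w[j] -= lr * sum(e * x[j] for e, x in zip(err, X)) / n
        b -= lr * sum(err) / n
    return w, b

# Example: y = 2x exactly, so the fit should approach w = 2, b = 0.
w, b = linreg_gd([[1], [2], [3], [4]], [2, 4, 6, 8])
```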
    5. Unsupervised learning (Clustering) (Chapter 4)
      • K-means algorithm
      • Representation of clusters
      • Hierarchical clustering
      • Distance functions and data standardization
      • Cluster evaluation
      • Discovering holes and data regions
      • Summary
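For a flavor of the clustering topic, below is a small sketch of Lloyd's k-means algorithm; the example points form two obvious groups, and the random initialization (seeded for reproducibility) is an illustrative choice.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's k-means sketch: repeat (1) assign each point to its nearest
    centroid under squared Euclidean distance, (2) move each centroid to
    the mean of its assigned points, until assignments stabilize."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize with k distinct points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        new = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centroids[j]
               for j, cl in enumerate(clusters)]
        if new == centroids:  # converged: assignments no longer change
            break
        centroids = new
    return centroids, clusters

# Example: two well-separated groups near (0,0) and (10,10).
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
cents, clusters = kmeans(pts, 2)
```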
    6. Semi-supervised learning (Sections 5.1.1, 5.1.2, 5.2.1 - 5.2.4)
      • LU learning: Learning from labeled and unlabeled examples
      • PU learning: Learning from positive and unlabeled examples
      • Novelty (or out-of-distribution) detection
    7. Introduction to Information retrieval and Web search (Sections 6.1 - 6.6, and 6.8)
      • Information retrieval models
      • Basic text processing and representation
      • Cosine similarity
      • Relevance feedback and Rocchio algorithm
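To preview the vector-space ideas in this topic, here is a minimal cosine similarity sketch over a bag-of-words representation. It uses raw term counts for simplicity; a real IR system would typically apply TF-IDF weighting and proper tokenization.

```python
import math
from collections import Counter

def cosine(doc1, doc2):
    """Cosine similarity of two texts under a bag-of-words vector model:
    cos(d1, d2) = (d1 . d2) / (|d1| |d2|), using raw term counts."""
    v1 = Counter(doc1.lower().split())
    v2 = Counter(doc2.lower().split())
    dot = sum(v1[w] * v2[w] for w in v1)  # Counter returns 0 for absent terms
    norm = (math.sqrt(sum(c * c for c in v1.values()))
            * math.sqrt(sum(c * c for c in v2.values())))
    return dot / norm if norm else 0.0

# Example: identical texts score 1.0; texts sharing one of two words score 0.5.
sim = cosine("data mining", "text mining")
```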
    8. Social network analysis (Sections 7.1 - 7.4)
      • Centrality and prestige
      • Citation analysis: co-citation and bibliographic coupling
      • The PageRank algorithm (of Google)
      • The HITS algorithm: authorities and hubs
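As a taste of this topic, below is a small power-iteration sketch of PageRank; the three-page link graph in the example and the fixed iteration count are illustrative simplifications (a real implementation would iterate to a convergence tolerance).

```python
def pagerank(links, d=0.85, iters=100):
    """Power-iteration PageRank sketch:
        PR(p) = (1 - d)/N + d * sum over q linking to p of PR(q)/out(q)
    where d is the damping factor and N the number of pages. `links` maps
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for q, outs in links.items():
            if outs:
                share = d * pr[q] / len(outs)
                for p in outs:
                    new[p] += share
            else:
                for p in pages:  # dangling page: spread its rank over all pages
                    new[p] += d * pr[q] / n
        pr = new
    return pr

# Example: c is linked to by both a and b, so it ends up with the highest rank.
pr = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```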
    9. Sentiment analysis and opinion mining (Sections 11.1 - 11.6; check out my two books)
      • Sentiment analysis and emotion analysis problems
      • Document-level sentiment classification
      • Sentence-level subjectivity and sentiment classification
      • Aspect-level sentiment analysis
      • Mining comparative opinions
      • Sentiment lexicon generation
    10. Recommender systems
      • Content-based recommendation
      • Collaborative filtering-based recommendation
        • K-nearest neighbor
        • Association rules
        • Matrix factorization
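For a preview of the matrix factorization approach, here is a minimal stochastic-gradient-descent sketch that learns user and item factor vectors from observed ratings; the toy rating triples, latent dimension, learning rate, and regularization values are all illustrative assumptions.

```python
import random

def matrix_factorization(ratings, k=2, lr=0.05, reg=0.02, epochs=2000, seed=0):
    """Rating prediction by matrix factorization: learn user factors P[u]
    and item factors Q[i] so that the dot product P[u] . Q[i] approximates
    the observed rating r, trained by stochastic gradient descent with L2
    regularization on the observed (user, item, rating) triples only."""
    rng = random.Random(seed)
    users = {u for u, _, _ in ratings}
    items = {i for _, i, _ in ratings}
    P = {u: [rng.uniform(-0.1, 0.1) for _ in range(k)] for u in users}
    Q = {i: [rng.uniform(-0.1, 0.1) for _ in range(k)] for i in items}
    for _ in range(epochs):
        for u, i, r in ratings:
            e = r - sum(pu * qi for pu, qi in zip(P[u], Q[i]))  # prediction error
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (e * qi - reg * pu)
                Q[i][f] += lr * (e * pu - reg * qi)
    return P, Q

# Example: both users prefer item i1 over i2, and the learned factors
# should reproduce that ordering.
ratings = [("u1", "i1", 5), ("u1", "i2", 1), ("u2", "i1", 4), ("u2", "i2", 2)]
P, Q = matrix_factorization(ratings)
```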
    11. Lifelong and continual learning (the Lifelong Machine Learning book and research papers)
      • Introduction to lifelong/continual learning
      • Class and task continual learning
      • Out-of-Distribution Detection and Open-world learning
      • Learning after model deployment (on-the-job learning)
    12. Information integration (Section 10.8)

Rules and Policies

UIC Counseling Center

We value your mental health and emotional wellness as part of the UIC student experience. The UIC Counseling Center offers an array of services to provide additional support throughout your time at UIC, including workshops, peer support groups, counseling, self-help tools, and initial consultations to speak to a mental health counselor about your concerns. Please visit the Counseling Center website for more information (https://counseling.uic.edu/). Further, if you think emotional concerns may be impacting your academic success, please contact your faculty and academic advisers to create a plan to stay on track.

My Home Page

By Bing Liu, Aug. 4, 2023