We will begin by expanding on what we discussed in CPSC 322 about decision making under uncertainty, and spend some time on the issues that arise in multistage decision processes, including planning and learning for acting. We will then cover approximate inference in BNets and more efficient temporal inference in HMMs. Other graphical models will also be covered, including Conditional Random Fields (CRFs). We will then switch to deterministic environments and expand the treatment of logics from CPSC 322, considering First Order Logics (FOL), satisfiability, and ontologies. Finally, we will study representations that attempt to combine logics with probabilities, like Markov Logics and Probabilistic Relational Models. Throughout, we will pay special attention to understanding the state of the art, and we will discuss several applications and research papers.
 Meeting Times: Monday, Wednesday, Friday, 9:00–10:00 AM
 First Class: Wed, Sep 7, 2016
 Location: DMP 301
 Instructor: Giuseppe Carenini carenini@cs.ubc.ca
 Instructor's Office Location: ICICS (CICSR) 105
 Instructor's Office Hours: Mondays 10–11, my office ICICS (CICSR) 105.
 TAs and Office Hours:
 Jordon Johnson jordon@cs.ubc.ca ICCS X237, Mon 1–2pm
 Enamul Hoque Prince enamul.hoque.prince@gmail.com (marking only)
 Emily Chen emily404@hotmail.com Office hour: ICCS X237, Thurs 12–1pm
 Course Discussion Board: TBD; the place to submit your questions and get answers, as well as see answers given to others
 AISpace: demo applets that illustrate some of the techniques covered in class
 Prerequisites: CPSC 322
 Final exam: TBA
Grading Scheme: Evaluation will be based on a set of assignments, a midterm, and a final exam. Important: you must pass the final in order to pass the course. The instructor reserves the right to adjust this grading scheme during the term, if necessary.
 Assignments  15%
 Readings: Questions and Summaries  10%
 Midterm  30%
 Final  45%
If your grade improves substantially from the midterm to the final, defined as a final exam grade that is at least 20% higher than the midterm grade, then the following grade breakdown will be used instead.
 Assignments  15%
 Readings: Questions and Summaries  10%
 Midterm  15%
 Final  60%
The assignment grade will be computed by adding up the number of points you get across all assignments, dividing this number by the total number of possible points, and multiplying by 15 (the assignment weight). Assignments will not all be graded out of the same number of points; this means that they will not be weighted equally.
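As a rough sketch, the rule above amounts to the following (the function name and the point values in the example are made up for illustration, not part of the course materials):

```python
# Sketch of the assignment-grade rule: sum earned points across all
# assignments, divide by the total possible points, and scale by the
# 15% assignment weight from the grading scheme.

def assignment_grade(earned, possible, weight=15):
    """Combine per-assignment scores into one weighted grade."""
    return weight * sum(earned) / sum(possible)

# Hypothetical example: three assignments with different point totals,
# so the 40-point assignment counts more than the 20-point one.
print(assignment_grade([18, 35, 25], [20, 40, 30]))  # 13.0
```

Note that because the division happens over the pooled totals, a 40-point assignment influences the result twice as much as a 20-point one.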
Late Assignments: Assignments are to be handed in BEFORE the start of lecture on the due date. However, every student is allotted four "late days", which allow assignments to be handed in late without penalty on four days or parts of days during the term. The purpose of late days is to allow students the flexibility to manage unexpected obstacles to coursework that arise during the term, such as travel, moderate illness, conflicts with other courses, extracurricular obligations, job interviews, etc. Thus, additional late days will NOT be granted except under truly exceptional circumstances. If an assignment is submitted late and a student has used up all of her/his late days, 20% will be deducted for every day the assignment is late. (E.g., an assignment 2 days late and graded out of 100 points will be awarded a maximum of 60 points.)

How late does something have to be to use up a late day? A day is defined as a 24-hour block of time beginning at 10AM on the day an assignment is due. To use a late day, write the number of late days claimed on the first page of your assignment.
Examples:
 Handing in an assignment at the end of lecture on the day it is due consumes one late day.
 Handing in an assignment at 9:15 the morning after it is due consumes one late day.
 Handing in an assignment at 10:30 the day after an assignment is due consumes two late days.
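The late-day and penalty rules above can be sketched as follows (the function name is made up; it simply mirrors the stated policy: each full or partial day past the deadline consumes one late day, and once late days run out, 20% of the maximum points is deducted per day):

```python
import math

def max_score(points_possible, days_late, late_days_left):
    """Maximum achievable score after the 20%-per-day late penalty.

    A partial day counts as a full day (hence the ceil), matching the
    examples above; remaining late days absorb penalty days first.
    """
    penalized_days = max(0, math.ceil(days_late) - late_days_left)
    return max(0.0, points_possible * (1 - 0.2 * penalized_days))

# The syllabus example: 2 days late, no late days left, out of 100.
print(max_score(100, 2, 0))  # 60.0
```

For instance, the 9:15-the-morning-after example is `days_late` slightly under 1, which `ceil` rounds up to one consumed late day, so with a late day remaining no points are lost.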
Assignments can be handed in electronically using ..........; this is the only way to hand in late assignments over a weekend. Written work can also be put in Giuseppe's mailbox in the main CS office (room 201); ask the secretary to timestamp it.
Missing Deadlines or Exams: In truly exceptional circumstances, when accompanied by a note from Student Health Services or a Department Advisor, the following arrangements will be made.
 If an assignment cannot be completed, the assignment grade will be computed based on the remaining assignments. Note that such an arrangement is extremely unusual; the late day system is intended to allow students to accommodate disruptions from moderate illness without contacting the instructor.
 If the midterm is missed, its weight will be shifted to the final. This means the final will count for 75% of the final grade, and assignments + readings will count for the remaining 25%.
 If the final is missed, a makeup final will be scheduled. This makeup final will be held as soon as possible after the regularly scheduled final.
Academic Conduct: Submitting the work of another person as your own (i.e. plagiarism) constitutes academic misconduct, as does communication with others (either as donor or recipient) in ways other than those permitted for homework and exams. Such actions will not be tolerated. Specifically, for this course, the rules are as follows:
 The written part of assignments is to be done alone. You may not, under any circumstances, submit any solution not written by yourself, look at another student's solution (this includes the solutions from assignments completed in the past), or previous sample solutions, and you may not share your own work with others. All work for this course is required to be new work and cannot be submitted as part of an assignment in another course without the approval of all instructors involved.
 You may, however, discuss your solutions and design decisions with your fellow students. In other words, you can talk about the assignments, but you cannot look at or copy other people's answers.
 The programming part of assignments is to be done either alone, or working with one other student. If you work with another student, each of you must hand in a copy of your work separately. You may not submit any solution not written by yourself and this one other student, look at other students' solutions (this includes the solutions from assignments completed in the past), or previous sample solutions, and you may not share your own work with others. All work for this course is required to be new work and cannot be submitted as part of an assignment in another course without the approval of all instructors involved.
Violations of these rules constitute very serious academic misconduct, and they are subject to penalties ranging from a grade of zero on the current and *all* previous assignments to indefinite suspension from the University. More information on procedures and penalties can be found in the Department's Policy on Plagiarism and Collaboration and in UBC's regulations on student discipline. If you are in any doubt about the interpretation of any of these rules, consult the instructor or a TA!
Selected chapters of Artificial Intelligence: Foundations of Computational Agents by David Poole and Alan Mackworth, Cambridge University Press, 2010 (complete book available online), and possibly selections from Russell and Norvig's Artificial Intelligence: A Modern Approach (third edition) [webpage]. I've arranged for a copy to be put on reserve in the CS reading room. Although these texts will be our main reference for the class, it must be stressed that you will need to know all the material covered in class, whether or not it is included in the readings or available online. Likewise, you are responsible for all the material in assigned readings, whether or not it is covered in class.

Here is where you can find the course schedule and the PPT and PDF files from lectures. The dates will change throughout the term, but this schedule will be kept up to date. Assignment due dates are provided to give you a rough sense; however, they are also subject to change. I will try to always post the slides in advance (by 9am). After class, I will post the same slides inked with the notes I added in class.
Date  Lecture  Notes 
1 Wed, Sep 7  Course Overview [pdf ]  Assignment 0 (you'll do this on a Google Form, available on Connect) 
2 Fri, Sep 9  Value of Info and Control  start Markov Decision Processes (MDPs) [pdf ]  "322" Slides on Decision Networks and on Markov Chains 
3 Mon, Sep 12  MDP example start Value Iteration [pdf ]  Assignment 0 due, Practice Ex 9.C 
Wed, Sep 14  NO LECTURE  
4 Fri, Sep 16  finish MDPs Value Iteration [pdf ]  FYI (not a required reading!) Planning with Markov Decision Processes: An AI Perspective, Synthesis Lectures on Artificial Intelligence and Machine Learning, June 2012, 210 pages 
5 Mon, Sep 19  Partially Observable MDPs (POMDPs) [ pdf]  
6 Wed, Sep 21  POMDPs (cont') [pdf ]  Assignment 1 out; Blackjack.xml; see http://www.cs.uwaterloo.ca/~ppoupart/software.html for code and sample problems for the Symbolic Perseus algorithm for factored POMDPs 
7 Fri, Sep 23  Reinforcement Learning (RL) [pdf ]  Practice Ex 11.A 
8 Mon, Sep 26  Reinforcement Learning (RL) (cont') [pdf ]  
9 Wed, Sep 28  Paper Discussion MDP for scheduling (Medicine) [ ppt ] [ pdf ] YOUR QUESTIONS  A Markov decision process approach to multicategory patient scheduling in a diagnostic facility, Artificial Intelligence in Medicine Journal, 2011 [pdf]; MDPs vs. Heuristic Methods 
10 Fri, Sep 30  Finish RL  SARSA [ pdf ]  Ex 11.B 
11 Mon, Oct 3  Recap BNets  Start Approximate Inference in BNets [pdf ]  Practice Ex 6.E, BN Company NorSys; Assignment 1 due / Assignment 2 out hmw1.zip 
12 Wed, Oct 5  Approx. Inference  Likelihood Weighting, MCMC (Gibbs Sampling) [pdf ]  BNet tool (with approx. inference algorithms) GeNIe 
13 Fri, Oct 7  Paper Discussion (ITS)  Paper on application of a relatively large BNet (where approx. inference is needed), slides [ pdf ]  Using Bayesian Networks to Manage Uncertainty in Student Modeling. Journal of User Modeling and User-Adapted Interaction, 2002 [pdf]; Dynamic BN (required only up to page 400); Carnegie Learning; Workshop on ill-defined domains; Conf. on Educational Data Mining 
Mon, Oct 10  Thanksgiving  
14 Wed, Oct 12  Temporal Inference  HMM (Filtering, Prediction) [ pdf ]  
15 Fri, Oct 14  HMM (Smoothing, just start Viterbi) [ pdf ]  
16 Mon, Oct 17  Finish Viterbi  Approx. Inference in Temporal Models (Particle Filtering) [pdf ]  
17 Wed, Oct 19  Intro Graphical Models Undirected Graphical Models  Markov Networks [pdf ] 

18 Fri, Oct 21  Inference in Markov Networks, Conditional Random Fields (CRFs), Naive Markov [ pdf ]  Assignment 2 due; FYI (not a required reading!) An Introduction to Conditional Random Fields. Charles Sutton, Andrew McCallum. Foundations and Trends in Machine Learning 4 (4), 2012. 
19 Mon, Oct 24  Linear Chain CRFs  NLP applications [ pdf ]  MALLET 
Wed, Oct 26  Midterm exam (55 mins, same room DMP 301)  WE START AT 9am SHARP 
20 Fri, Oct 28  Full Propositional Logics, Language and Inference [ pdf ]  
21 Mon, Oct 31  Finish Resolution, Satisfiability, WalkSAT [ pdf ]  
22 Wed, Nov 2  SAT encoding example  First Order Logics (FOL) [ pdf ]  Assignment 3 out 
23 Fri, Nov 4  Ontologies/Description Logics: WordNet, UMLS, YAGO, Probase..... [ pdf ]  WordNet and YAGO (Wikipedia + WordNet + GeoNames); see also MS Research Probase, Google Knowledge Graph and Freebase, and MS Concept Graph; (domain-specific thesaurus) Medical Subject Headings (MeSH) 
24 Mon, Nov 7  Similarity Measures: Concepts in Ontologies and Distributional for Words [ pdf ]  
25 Wed, Nov 9  NLP: Context-Free Grammars and Parsing [ pdf ]  SKIP THIS PAPER THIS YEAR: Paper Discussion (NLP) [ ppt ] [ pdf ] [DEMO] Carenini G., Ng R., Zwart E., Extracting Knowledge from Evaluative Text, Third International Conference on Knowledge Capture (KCAP 2005), Banff, Canada, October 2–5, 2005. [pdf] 
Fri, Nov 11  Remembrance Day  
26 Mon, Nov 14  Probabilistic Context-Free Grammars (1) [ pdf ]  
27 Wed, Nov 16  Probabilistic Context-Free Grammars (2) [ pdf ]  Berkeley Parser with demo 
28 Fri, Nov 18  Paper Discussion (NLP) on PCFG and CRFs [ ppt ] [ pdf ]  Portions of CL paper: CODRA: A Novel Discriminative Framework for Rhetorical Analysis. Computational Linguistics (2015), Vol. 41, No. 3: 385–435, MIT Press; only sections 1, 3 and 4 are mandatory; DEMO 
29 Mon, Nov 21  Markov Logics (1) Representation [ pdf ]  Assignment 3 due / Assignment 4 out; Markov Logic: An Interface Layer for Artificial Intelligence, P. Domingos (University of Washington) and D. Lowd (University of Oregon) 
30 Wed, Nov 23  Markov Logics (2) Inference [ pdf ]  Alchemy is a software package providing a series of algorithms for statistical relational learning and probabilistic logic inference, based on the Markov logic representation. 
31 Fri, Nov 25  Finish Markov Logics Inference + Applications [ pdf ]  (only if we cover plate notation) Practice Ex. 14.A 
32 Mon, Nov 28  Probabilistic Relational Models (1) Representation [ pdf ]  
33 Wed, Nov 30  Probabilistic Relational Models (2) Parameters and Inference [ pdf ]  
34 Fri, Dec 2  Beyond 322/422, AI research, Watson etc. [ pdf ]  Assignment 4 due; some relevant papers from IUI 2015 on modeling users' interests 
Fri, Dec 9, 7:00 PM  Final Exam  Room: MCLD 228 