TSRT14 Sensor fusion
Course Information VT2, 2021
IMPORTANT INFORMATION
Given the current COVID-19 restrictions in place, the course will be given in distance mode in 2021 as well. This implies a number of changes to how the course is run. Throughout this page, these changes are highlighted in boxes with a red background.
In short, all teaching activities will be performed online. Instead of regular campus lectures, a number of videos have been produced for each lecture, covering the material usually presented in the lectures. Like the lectures, these should be considered a way into the textbook, not a substitute for reading the textbook. A Q&A session in Zoom will be arranged in conjunction with each lecture. Lessons will take place in Zoom. Links to the Zoom sessions will be made available here, so make sure you are registered for the course to be able to access the relevant information. More detailed information about the different activities is available below.
Goal:
After the course, the student should be able to describe the most important methods and algorithms for sensor fusion, and to apply these to sensor network, navigation, and target tracking applications. More specifically, after the course the student should be able to:
 Understand the fundamental principles in estimation and detection theory.
 Implement algorithms for parameter estimation in linear and nonlinear models.
 Implement algorithms for detection and estimation of the position of a target in a sensor network.
 Apply the Kalman filter to linear state space models with a multitude of sensors.
 Apply nonlinear filters (extended Kalman filter, unscented Kalman filter, particle filter) to nonlinear or non-Gaussian state space models.
 Implement basic algorithms for simultaneous localization and mapping (SLAM).
 Describe and model the most common sensors used in sensor fusion applications.
 Implement the most common motion models in target tracking and navigation applications.
 Understand the interplay of the above in a few concrete real applications.
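To give a flavor of the kind of algorithm covered by these goals, the measurement filtering idea behind the Kalman filter can be sketched in a few lines. This is a minimal illustration only, not course material: the scalar random-walk model, the noise variances, and the function name are all assumptions made for the example, and the course labs use Matlab rather than Python.

```python
import numpy as np

def kalman_1d(y, q=1e-2, r=1.0):
    """Scalar Kalman filter for a random-walk state x[k] = x[k-1] + w[k],
    observed as y[k] = x[k] + e[k], with Var(w) = q and Var(e) = r."""
    xhat, p = 0.0, 1.0                   # assumed prior: estimate and variance
    estimates = []
    for yk in y:
        p = p + q                        # time update: predicted variance
        k = p / (p + r)                  # Kalman gain
        xhat = xhat + k * (yk - xhat)    # measurement update of the estimate
        p = (1.0 - k) * p                # measurement update of the variance
        estimates.append(xhat)
    return np.array(estimates)

# Noisy measurements of a constant level: the filtered estimate settles
# near the true value as measurements accumulate.
rng = np.random.default_rng(0)
y = 5.0 + rng.standard_normal(100)
xhat = kalman_1d(y)
```

The same time-update/measurement-update structure carries over to the vector-valued Kalman filter and its nonlinear approximations (EKF, UKF) treated later in the course.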
The course comprises:
Lectures: 10
Exercise sessions: 8
Laboratory exercises: 2

Localization in acoustic sensor networks. A network of microphones detects, localizes, and tracks a target. Matlab files are provided. An introduction to the lab, its execution, and the data is available here.
The lab contains one data collection part in our lab (RT3, Laboteket, which is located in house B, entrance 27, corridor C) and one data processing part where algorithms will be developed and applied to the data. An example of collected measurements is available here for those not attending the course, but still interested in performing the lab on their own.
All data collection sessions are canceled. All groups are instead asked to use this prerecorded data when performing the task described in the lab instructions.
An introductory Q&A session dedicated to Lab 1 will be hosted on Wednesday April 7, 2021, at 15:00 in Zoom.
Note: You should still sign up for the lab in Lisam as these groups will be used to handle the reports and reviews.
The participants will be examined via a lab report, which will be peer-reviewed by other students attending the course. Each lab group will review one report. The report is due on Thursday May 6, 2021, at 23:59; the review report is due on Friday May 14, 2021, at 23:59. The updated report based on the review feedback is due on Friday May 21, 2021, at 23:59. Feedback on the report will be provided by Friday May 28, 2021, at 23:59, and the resubmission of the lab report (if required) is due on Sunday June 6, 2021, at 23:59. The report should be written in English.
Lab reports: The lab and review report should be submitted (as a PDF file) via Lisam.

Orientation estimation using smartphone sensors. In this lab, an orientation filter will be implemented using measurements from the gyroscope, accelerometer, and magnetometer in a smartphone. The lab is compatible with any Android phone containing these sensors (which most modern smartphones do). The students can either use their own phones or use a phone provided by the course. Matlab files are provided, as well as the Sensor Fusion Android app, which is needed to stream sensor data from the phone to Matlab. This video introduces what should be done and how to connect your smartphone to your computer. An example of collected measurements, which can be used for debugging purposes or if you lack an Android device, can be found here.
The lab will consist of a 4 hour lab session in our computer rooms. The participants will be examined during the session and no written report will be required.
This lab will be performed from home. We assume that you have access to at least one Android smartphone in each lab group; if not, contact us! Instead of the regular lab sessions with on-site examination, you are expected to perform the lab at home and produce a short lab report of the findings, followed by a short interactive session. The deadline for this report is May 18, 2021, at 12:00. A report template, labreport.m, has been added to the lab files. Follow the instructions there and hand in the report in Lisam. An introductory Q&A session (Zoom) will be provided on May 12, 2021, at 13:15 to help you get started with the lab work. More details about the procedure will be provided in due time.
The course assistant responsible for the lab schedule is Anton Kullberg (anton.kullberg@liu.se).
Email list
Information during the course will be sent to the course mailing list. The list name, together with further information and signup, can be found on "Studentsidan".
Toolbox
The toolbox that will be used during the course can be downloaded here. The toolbox manual can be downloaded here. A video introduction to the toolbox by the author himself is provided here.
To activate the toolbox, run the included command initSigSys in Matlab. To use the latest version of the toolbox in the Linux computer labs, run
module add course/TSRT14
in a terminal prior to opening Matlab, or install a current version of the toolbox in your home directory as you would at home.
Literature
 Statistical Sensor Fusion. Fredrik Gustafsson. Studentlitteratur, 2018, Third Edition.
 Statistical Sensor Fusion – Exercises. Christian Lundquist, Zoran Sjanic, and Fredrik Gustafsson. Studentlitteratur, 2015.
 Statistical Sensor Fusion – Laborations. Available from the homepage.
 Statistical Sensor Fusion – Matlab Toolbox Manual. Available here. (Introductory video.)
Examination
Written examination with Matlab.
Organizers
Lecturer and examiner: Gustaf Hendeby, email: gustaf.hendeby@liu.se.
 Anton Kullberg email: anton.kullberg@liu.se.
 Carl Hynén, email: carl.hynen@liu.se.
Preliminary lecture plan
Lectures
Slides will be linked from the lecture number in advance.
Nr.  Content  Suggested reading 

1 (slides, slides4up)  Course overview. Estimation theory for linear models.  Chapters 1, 2 
2 (slides, slides4up)  Estimation theory for nonlinear models.  Chapter 3 
3 (slides, slides4up)  Cramér-Rao lower bound (CRLB). Models for sensor network applications.  Chapter 4 
4 (slides, slides4up)  Detection theory. Filter theory.  Chapters 5, 6 
5 (slides, slides4up)  Modeling and motion models.  Chapters 12–14 
6 (slides, slides4up)  Kalman filter. Kalman filter approximations for nonlinear models (EKF, UKF).  Chapters 7, 8 
7 (slides, slides4up)  The point-mass filter and the particle filter.  Chapter 9 
8 (slides, slides4up)  The particle filter theory. The marginalized particle filter.  Chapter 9 
9 (slides, slides4up)  Simultaneous localization and mapping (SLAM).  Chapter 11 
10 (slides, slides4up)  Sensors and sensor-near signal processing. Filter and model validation. Case study from industry.  Chapters 14, 15 
In distance mode, the lecture series is replaced by a number of pre-recorded videos covering all the topics usually treated in the lectures. The table below indicates which videos replace each of the lectures.
The recorded material can be found on sensorfusion.se. The material there matches the lectures as follows.
Nr.  Content  Suggested entries 

0  Course introduction.  slides, slides4up 
1  Estimation theory for linear models.  1–5 
2  Estimation theory for nonlinear models.  6–9 
3  Cramér-Rao lower bound (CRLB). Models for sensor network applications.  10–11 
4  Detection theory. Filter theory.  12–15, 25 
5  Modeling and motion models.  16–19 
6  Kalman filter. Kalman filter approximations for nonlinear models (EKF, UKF).  20–24, 26 
7  The point-mass filter and the particle filter.  27–28 
8  The particle filter theory. The marginalized particle filter. Filter banks.  29–33 
9  Simultaneous localization and mapping (SLAM).  34–37 
10  Sensors and sensor-near signal processing. Filter and model validation.  
Exam info  Information about the exam.  slides, slides4up 
Lab 1  Lab Work: Localization Using a Microphone Network.  38 
Lab 2  Lab Work: Orientation Estimation using Smartphone Sensors.  39 
SigSys  Introduction to Signals and Systems Toolbox.  40 
Preliminary exercise plan
Lessons
Nr.  Ex  Content 

1  2.1, 2.4, 3.1, 3.2, 3.6b, 2.3, 2.5  Estimation. 
2  4.10, 4.2, 4.3, 16.1  Sensor networks. 
3  2.10, 3.10, 3.7, 4.7, 4.8, 4.9, 5.4  Computer-based estimation and detection. 
4  6.1, 6.2, 12.1, 12.3, 13.1, 13.2, (12.2)  Filter theory and models. 
5  7.1, 7.2, 7.3, 8.1, 8.2, 8.4  Kalman filters and Kalman filter approximations. 
6  7.10, 8.6, 9.5, (16.3)  Computer-based filtering. 
7  11.1, 11.3  Computer-based SLAM. 
8  9.1, 9.2, 9.3, 14.2  Particle filtering and sensors. 
The physical lessons will be replaced by open Zoom sessions, where you can pop in and ask questions. Please be prepared when asking questions, to allow us to answer as many questions as possible in the limited time.
A list of links to upcoming Zoom meetings is available here.
Responsible for information: Gustaf Hendeby
Last updated: 2021-10-21