DeepLearn 2023 Winter
8th International School
on Deep Learning
Bournemouth, UK · January 16-20, 2023
Marco Duarte

University of Massachusetts, Amherst

[introductory/intermediate] Explainable Machine Learning

Summary

Machine learning (ML) methods have been remarkably successful at extracting essential information from data across a wide range of application areas. An exciting and relatively recent development is the uptake of ML in the natural sciences, where the major goal is to obtain, from observational or simulated data, novel scientific insights and discoveries that make sense to the specialized practitioner. In a different vein, the increasing pervasiveness of applied machine learning has drawn attention to safety and ethics in the ML decision-making process: can the practitioner be confident that the ML decisions are safe for the user and free of ethical shortfalls? In this short course, we review explainable machine learning in view of these aspects and discuss three core elements that we identify as broadly relevant: transparency, interpretability, and explainability. With respect to these core elements, we survey recent scientific works that incorporate machine learning and examine how explainable machine learning is used in combination with domain knowledge and application requirements.
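
As a concrete, purely illustrative example of post-hoc explainability of ML outputs (one common approach, not material taken from the lectures), the sketch below estimates permutation feature importance: each feature of a held-out set is shuffled in turn, and the resulting drop in accuracy indicates how strongly the trained model relies on that feature. The dataset and model are assumptions made only for the example.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative setup: a standard benchmark dataset and an off-the-shelf classifier.
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
baseline = model.score(X_test, y_test)

# Permutation importance: shuffle one feature at a time and record the accuracy drop.
rng = np.random.default_rng(0)
drops = []
for j in range(X_test.shape[1]):
    X_perm = X_test.copy()
    X_perm[:, j] = X_test[rng.permutation(len(X_test)), j]  # break the feature/label link
    drops.append(baseline - model.score(X_perm, y_test))

# Features whose permutation hurts accuracy most are those the model relies on.
for j in np.argsort(drops)[::-1][:5]:
    print(f"{data.feature_names[j]:<25}  accuracy drop = {drops[j]:.3f}")

Averaging over several shuffles per feature gives more stable estimates; scikit-learn's sklearn.inspection.permutation_importance implements this.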

Syllabus

  • Introduction and Motivation: ML and Science, Safety, and Ethics
  • Terminology and Definitions
  • Explainability: ML Outputs
  • Explainability: ML Model Structure and Design
  • Explainability: ML Model Parameters (see the parameter-level sketch after this list)
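
To complement the output-level example above, the sketch below (again illustrative only, not drawn from the lectures) touches on the last syllabus item: for a transparent model such as logistic regression on standardized features, the fitted parameters themselves act as the explanation, with coefficient magnitude indicating strength of influence and sign indicating direction. The dataset is an assumption made only for the example.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative setup: standardize features so the learned weights are comparable.
data = load_breast_cancer()
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
clf.fit(data.data, data.target)

# The model parameters are the explanation: largest-magnitude weights matter most,
# and the sign gives the direction in which each feature pushes the decision.
coefs = clf.named_steps["logisticregression"].coef_.ravel()
for j in np.argsort(np.abs(coefs))[::-1][:5]:
    print(f"{data.feature_names[j]:<25}  weight = {coefs[j]:+.3f}")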

References

  • F. Doshi-Velez and B. Kim, “Towards a rigorous science of interpretable machine learning,” 2017, arXiv:1702.08608.
  • R. Guidotti, A. Monreale, S. Ruggieri, F. Turini, F. Giannotti, and D. Pedreschi, “A survey of methods for explaining black box models,” ACM Comput. Surv., vol. 51, no. 5, pp. 1–42, Aug. 2018.
  • Z. C. Lipton, “The mythos of model interpretability,” Commun. ACM, vol. 61, no. 10, pp. 36–43, Sep. 2018.
  • G. Montavon, W. Samek, and K.-R. Müller, “Methods for interpreting and understanding deep neural networks,” Digit. Signal Process., vol. 73, pp. 1–15, Feb. 2018.
  • W. J. Murdoch, C. Singh, K. Kumbier, R. Abbasi-Asl, and B. Yu, “Definitions, methods, and applications in interpretable machine learning,” Proc. Nat. Acad. Sci., vol. 116, no. 44, pp. 22071–22080, 2019.

Pre-requisites

Familiarity with basic machine learning problems (detection, classification, estimation) and approaches (hypothesis testing, support vector machines, neural networks).

Short bio

Marco F. Duarte received the B.Sc. degree (Hons.) in computer engineering and the M.Sc. degree in electrical engineering from the University of Wisconsin-Madison, Madison, WI, USA, in 2002 and 2004, respectively, and the Ph.D. degree in electrical and computer engineering from Rice University, Houston, TX, USA, in 2009.

He was an NSF/IPAM Mathematical Sciences Postdoctoral Research Fellow of the Program of Applied and Computational Mathematics, Princeton University, Princeton, NJ, USA, from 2009 to 2010, and the Department of Computer Science, Duke University, Durham, NC, USA, from 2010 to 2011. He is currently an Associate Professor with the Department of Electrical and Computer Engineering, University of Massachusetts Amherst, MA, USA. His research interests include machine learning, compressed sensing, sensor networks, and computational imaging.

Dr. Duarte received the Presidential Fellowship and the Texas Instruments Distinguished Fellowship in 2004 and the Hershel M. Rich Invention Award in 2007, all from Rice University. He received the IEEE Signal Processing Society Overview Paper Award (with Y. C. Eldar) in 2017 and the IEEE Signal Processing Magazine Best Paper Award (with M. Davenport, D. Takhar, J. Laska, T. Sun, K. Kelly, and R. Baraniuk) in 2020. He is also an Associate Editor of the IEEE Transactions on Signal Processing.

Other Courses

Yi Ma
Daphna Weinshall
Eric P. Xing
Matias Carrasco Kind
Nitesh Chawla
Sumit Chopra
Luc De Raedt
João Gama
Claus Horn
Zhiting Hu & Eric P. Xing
Nathalie Japkowicz
Gregor Kasieczka
Karen Livescu
David McAllester
Dhabaleswar K. Panda
Fabio Roli
Bracha Shapira
Kunal Talwar
Tinne Tuytelaars
Lyle Ungar
Bram van Ginneken
Yu-Dong Zhang


CO-ORGANIZERS

Bournemouth University
Department of Computing and Informatics

Universitat Rovira i Virgili, Tarragona

Institute for Research Development, Training and Advice – IRDTA, Brussels/London
