Start time: 9.00 am GMT
End time: 11.00 am GMT
Announcements from the European Actuarial Academy: In an increasingly data-driven world, both the data available to actuaries in their daily work and the toolbox of algorithms they can apply are growing significantly. New sources of data are opening up for actuarial work, such as granular geo-risk data, data from sensors and all kinds of IoT devices, health trackers, telematics data etc. On the algorithmic side, methods from analytics, predictive modelling and machine learning, as well as applications of artificial intelligence ("ML" in the following), are becoming available. These methods also extend the field of activity of actuaries, e.g. into customer behaviour analytics.
Within this brave new world, actuaries must maintain the highest professional standards of data and algorithmic quality. This means that both input data and algorithmic quality must be managed systematically throughout actuarial procedures.
We start the web session with a look at the legal and regulatory side. What are the most relevant dimensions of data and algorithmic quality, and how do they relate to the explainability and fairness of ML? What are relevant measures of quality in this respect?
We proceed with a look at existing European regulation and legislation, including actuarial quality standards, which govern the use of these new methodologies. We also cover current discussions underway in the insurance, actuarial, regulatory and ML communities.
Turning to the practical side, we show and discuss various approaches for creating transparency and managing risk around ML. This covers both the selection of suitable ML approaches and additional measures for interpreting ML results (such as LIME and SHAP). We also point to the most useful R/Python libraries.
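To give a flavour of the interpretability tooling referred to above, the sketch below uses scikit-learn's permutation importance as a simple stand-in for LIME/SHAP-style explanations; the data and model are purely illustrative, not taken from the session material:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Synthetic data: only feature 0 actually drives the target
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Permutation importance: how much the score drops when a feature is shuffled
result = permutation_importance(model, X, y, n_repeats=5, random_state=0)
importances = result.importances_mean
```

A model-agnostic check like this reveals which inputs drive the predictions; LIME and SHAP refine the idea to per-prediction (local) explanations.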
Finally, we will look at a typical use case and present how key quality challenges can be tackled in this specific setting. This should enable participants to transfer practical measures of data and algorithmic quality management to their own work.
The web session is open to all interested persons, who ideally (but not necessarily) have a basic knowledge of the ML algorithmic universe.
This web session has been developed for practitioners at all levels who want to gain a deeper understanding of the requirements and state-of-the-art techniques for data and algorithmic quality management. It has a special focus on new types of algorithms such as predictive modelling, machine learning and artificial intelligence, and includes a case study showing practical applications in a real-world actuarial context.
Please check with your IT department if your firewall and computer settings support web session participation (the programme GotoTraining/GotoWebinar is used for the web session).
To make a reservation, click here. The registration fee is € 100.00 plus 16% VAT.
- Why is it important to look at data and algorithmic quality?
- Dimensions of quality – for data and for actuarial models
- Let’s have a look at current regulation – what do professional standards and supervisors tell us?
- Predictive modelling, Analytics, Machine Learning, AI (ML) – a short overview of algorithmic approaches
- Transparency, explainability and fairness – why are they even more important for these approaches?
- Let’s have a look at the further regulatory discussion – what do professional associations and supervisors think about ML?
- A practical approach to increase explainability and manage risk around ML – including the discussion of a real-world case study
Dr Clemens Frey
Dr Clemens Frey is Partner of EMEIA Financial Services Consulting at Ernst & Young (EY) in Germany, leading the Data & Analytics activities of EY in Germany and Switzerland. He has 20 years of experience in consulting and the financial services industry. He is active in various committees of the German Actuarial Association (DAV) and the IAA, including as Vice Chair of the General Insurance Committee and member of the Actuarial Data Science Committee. He holds a Diploma in Mathematics and received his doctorate for work on self-learning agents.