Keynote at #fusion24: Thomas Schön (Uppsala univ.) on "Modelling and Generating Data via Deep Probabilistic Representations".

Combining probabilistic modelling and deep learning is more interesting than most people think.

Application example: predict abnormalities in ECGs (automated inference is useful for telehealth). 2.3M ECGs for learning (even more now). Reached human-level performance (2020).

But one can also discover new knowledge, by modelling myocardial infarction (2022).

Tomorrow, at the #fusion24 conference, at 11:10 (in room 3A), I will present how to create #knowledgegraphs at scale from tabular data and a simple mapping declaration, using the #OntoWeaver software.

At #fusion24, Anne-Laure Jousselme, on "A model for an imperfect knowledge base for high-level information fusion experiments".

Evaluation of high-level information fusion (HLIF) algorithms. HLIF benchmark datasets are smaller and need low-level fusion tasks. This involves high heterogeneity of sources and of their qualities, many uncertainty types, and a focus on the global level. There are semantic upgrades of attributes [objects?]. Formalization of HLIF at different levels of abstraction (tactical → situation). Thus we need an ontology.

At #fusion24, "From tactical picture to situation assessment evaluation: A Countering Uncrewed Aerial Systems illustration", presented by Benjamin Pannetier (CS group).

About the evaluation of fusion systems. Six quality dimensions for the tactical picture. Situation assessment in this study is qualified by three of them.

Introduces semantic elevation: a transfer function projecting input variables in the "assessment universe".

Various performance metrics for both tactical and situation assessment.

At #fusion24, Franck Mignet (Thales) on "Qualitative Causal Approach to Determining Adequate Training Data Quantity for Machine Learning".

How complex should an ML model be? How much training data? Look at the data generation process (DGP).

DGP as a causal process (as in Pearl, with probabilistic graphical models).

Model the process, then reduce it using knowledge of the relevant variables. The graph encodes a joint probability distribution; discrete variables lead to a "local context combination".
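My reading of the "local context combination" idea, as a minimal sketch (the graph, node names, and cardinalities below are invented for illustration, not from the talk): in a discrete causal graph, each node's conditional table must be estimated for every combination of its parents' values, which gives a rough lower bound on the training data needed.

```python
from math import prod

# Hypothetical discrete causal graph: node -> (cardinality, parents).
graph = {
    "Weather":  (3, []),
    "Traffic":  (2, ["Weather"]),
    "Accident": (2, ["Weather", "Traffic"]),
}

def local_contexts(node):
    """Number of distinct parent-value combinations ("local contexts") for a node."""
    _, parents = graph[node]
    return prod(graph[p][0] for p in parents) if parents else 1

def cpt_parameters(node):
    """Free parameters in the node's conditional probability table."""
    card, _ = graph[node]
    return (card - 1) * local_contexts(node)

total = sum(cpt_parameters(n) for n in graph)
print(total)  # 2 + 3 + 6 = 11
```

Each extra discrete parent multiplies the number of local contexts, so the data requirement grows fast unless irrelevant variables are pruned from the graph first.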

At #fusion24, Thomas Wodko (Ulm univ. & MRM) presents his work on "Conflict Handling in Time-dependent Subjective Networks"

Motivation: observing the state of the weather.
But someone else (trustworthy to some degree) also observes. There are several observations. A predictor predicts the weather.

Uses subjective logic (binomial opinions): a weighted vector in belief/disbelief/uncertainty space. Example operator: cumulative fusion.

Then models trust with trusted fusion and trust revision (change trust after observations)

At #fusion24, Simone Severini has a keynote about "Quantum Technologies: Dream or Reality?"

Future of computation is about time (deploy, run) and energy (production, optimization).

Hardware: semiconductor constraints dictate new approaches, end-to-end hardware/software co-design is essential.

Algorithms: between 1940 and 2019, they sped up drastically.

Experimental physics is now a branch of computer science as well.

At #fusion24, Markus Walker talks about "Trustworthy Bayesian Perceptrons".

Classical NNs use point estimation, which lacks confidence intervals; they are thus prone to overconfident predictions.

One solution: quantify uncertainty, so the NN outputs a density or moments. But such uncertainty estimates are often not trustworthy either. Here, the weights are random variables, which enables quantification of prediction uncertainty.
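As an illustration of the general idea only (not the paper's method): if each weight is a Gaussian rather than a point estimate, Monte Carlo sampling over weights yields a predictive mean plus a spread that quantifies uncertainty. All numbers below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "Bayesian perceptron": each weight is a Gaussian (mean, std)
# instead of a single point estimate.
w_mean = np.array([0.8, -0.5])
w_std  = np.array([0.1, 0.2])
b_mean, b_std = 0.1, 0.05

def predict(x, n_samples=5000):
    """Monte Carlo predictive distribution: sample weight sets, push the
    input through each, return mean and std of the sigmoid output."""
    w = rng.normal(w_mean, w_std, size=(n_samples, 2))
    b = rng.normal(b_mean, b_std, size=n_samples)
    probs = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    return probs.mean(), probs.std()

mean, std = predict(np.array([1.0, 2.0]))
print(f"p = {mean:.2f} +/- {std:.2f}")  # std quantifies prediction uncertainty
```

Inputs far from the training data tend to produce a larger std, which is exactly the confidence signal that a point-estimate network cannot provide.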

At #fusion24, "Fusion of Individual and Population Graphs in a GNN Brain Disease Network", presented by a colleague who had his visa.

Subject: we have graphs representing the brain's regions of interest and their relationships.

Problem: we want to fuse individual graphs into a population graph.

Method is named IPGNN.

The presenter doesn't know how it's done. But it performs better and detects abnormal brain regions.


Danut-Vasile Giurgi at #fusion24 on "Fusion of Semantic Segmentation Models for Vehicle Perception Tasks".

Combining fusion with deep learning. A CNN does the segmentation; uncertainties are modeled with Shannon entropy; cross-fusion & late fusion use evidence theory.

Two input pipelines (lidar and images) on NN, then "PCR6+" fusion on the output (compared to Dempster-Shafer).

PCR6+ is more complicated, but has "neutrality of vacuous BBA" and no counter-intuitive behavior.
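For contrast (I don't reproduce the talk's "PCR6+"): a sketch of textbook Dempster's rule versus PCR5, which coincides with PCR6 when there are only two sources. Dempster normalizes the conflicting mass away; PCR5 redistributes each partial conflict back to the elements that produced it, proportionally to their masses. The toy BBAs below are made up.

```python
from itertools import product

# Toy frame {A, B}; a BBA maps focal elements (frozensets) to mass.
A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.8, B: 0.2}
m2 = {A: 0.3, B: 0.7}

def conjunctive(m1, m2):
    """Unnormalized conjunctive combination; also returns total conflict."""
    out, conflict = {}, 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        inter = x & y
        if inter:
            out[inter] = out.get(inter, 0.0) + mx * my
        else:
            conflict += mx * my
    return out, conflict

def dempster(m1, m2):
    """Dempster's rule: normalize the conflicting mass away."""
    out, k = conjunctive(m1, m2)
    return {x: m / (1 - k) for x, m in out.items()}

def pcr5(m1, m2):
    """PCR5: give each partial conflict mx*my back to x and y,
    proportionally to mx and my."""
    out, _ = conjunctive(m1, m2)
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        if not (x & y):
            out[x] = out.get(x, 0.0) + mx * mx * my / (mx + my)
            out[y] = out.get(y, 0.0) + my * my * mx / (mx + my)
    return out

print(dempster(m1, m2))
print(pcr5(m1, m2))
```

With highly conflicting sources, Dempster's normalization can inflate a minority hypothesis, which is the kind of counter-intuitive behavior the proportional redistribution rules are designed to avoid.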