Four-dimensional (4D) topographic point clouds contain information on surface change processes and their spatial and temporal characteristics, such as the duration, location, and extent of mass movements. To automatically extract and analyze changes and patterns in surface activity from these data, methods considering the spatial and temporal properties are required. The commonly used model-to-model cloud comparison (M3C2) point cloud distance reduces uncertainty through spatial averaging for bitemporal analysis. To extend this concept into the full spatiotemporal domain, we use a Kalman filter for change analysis in point cloud time series. The filter incorporates M3C2 distances together with uncertainties obtained through error propagation as Bayesian priors in a dynamic model. The Kalman filter yields a smoothed estimate of the change time series for each spatial location in the scene, again associated with an uncertainty. Through the temporal smoothing, the Kalman filter uncertainty is generally lower than the individual bitemporal uncertainties, which therefore allows more changes to be detected as significant. We apply our method to a dataset of tri-hourly terrestrial laser scanning point clouds of around 90

Near-continuous time series of 3D topographic point clouds have recently become readily available through applications in research

However, with any measurement taken in the real world, uncertainties need to be considered. In the case of topographic laser scanning, uncertainty may result in estimated change values that seemingly correspond to a change in the topography of the involved surfaces despite no real change having occurred. For example, erosion or accumulation with a low velocity can only be detected with confidence after a certain period, when the change magnitude exceeds the random effects introduced by the measurements.

Two approaches can be combined to handle uncertainty: statistical tests, such as a

The other approach, alleviating uncertainty, takes advantage of the fact that no two measurements are completely uncorrelated. Generally, the closer
they are to each other, the more they are alike. In space, this has been described in Tobler's First Law of Geography

Spatial smoothing, i.e., aggregating points spatially before change analysis, reduces the spatial resolution at which change can be detected. In the
widely employed multiscale model-to-model cloud comparison (M3C2) algorithm, a method to compare surfaces represented by two point clouds, a search
cylinder is used to select and aggregate points of the two epochs before measuring the distance between them

In the time domain, measurement series are often interpreted as signals. Signal smoothing is widely used, and a multitude of methods have been
established. In many approaches, a moving window is employed to aggregate multiple consecutive measurements or samples, removing or reducing
outliers. Depending on the aggregation function, different filters are established and may be mathematically described as 1D convolutions

To smooth time series, (B-)splines are commonly employed

The geostatistical prediction method of Kriging

Four-dimensional (4D) point cloud analyses have employed spatial and temporal smoothing separately to increase the signal-to-noise ratio of the change signal

In our case, the observations are bitemporal point cloud distances, which we refer to as “displacements” from here on. In a Bayesian sense, each
observation provides prior information on the system. The Kalman filter combines this information in a joint probability distribution to obtain
estimates for the target variables that are, in general, more accurate (less uncertain) than the original observations. When estimates of position,
velocity, and acceleration have been made, they can even be propagated into the future beyond the newest measurement

We use the Kalman filter on change values between each epoch and a fixed reference epoch to obtain a smoother, less uncertain time series of change
for each spatial location. Accurate uncertainty estimates for the change values are obtained by applying M3C2-EP

To show the applicability of our method, we analyze a synthetic scene and a dense (tri-hourly) time series of terrestrial laser scanning (TLS) point
clouds acquired in Vals, Tyrol

The contribution of our research is twofold. First, we combine the existing method of M3C2-EP point cloud change quantification, including the quantification of associated uncertainty, with a Kalman filter to take advantage of the temporal domain, resulting in lower detection thresholds and less noise in the change extracted from the 3D time series. Second, we show how different smoothing methods for topographic point cloud time series influence derived change patterns in the observed scene obtained via clustering.

We investigate the performance of our method using two different datasets: a real scene featuring surface erosion and snow cover changes on a debris-covered slope and a synthetic scene created from a 3D surface mesh model with known deformation properties.

For a real use case, we use TLS data acquired over approx. 3 months, totaling 674 epochs from 17 August 2021 at 12:00 LT to 15 November
2021 at 18:00 LT (all times are in local time) in Vals, Tyrol, Austria (WGS84: 47

As the point clouds are not perfectly aligned with each other

During this period, both natural and anthropogenic surface changes were captured. To investigate the benefits of full 4D point cloud
analysis, we focus on relatively small-magnitude and long-duration changes. We therefore select an area of interest consisting of the debris-covered
slopes, excluding the valley in which excavator works lead to sudden and high-magnitude changes (Fig.

The dataset presented here is part of a continuous monitoring campaign, which was operated in three subsequent setups, one in 2020 and two in 2021,
and of which we use data from the third setup. It was designed to collect data for various research and development activities regarding the
deployment of long-range laser scanners within a remotely controlled, web-based monitoring system from an engineering geodetic perspective. In
addition to the laser scanner, inclination sensors on the pillars (PC-IN1-1

In addition to dataset alignment, preprocessing consisted of the removal of outliers and vegetation points using the statistical outlier filter

In these data, we quantified bitemporal change and uncertainties using M3C2-EP (presented in detail in Sect.

A 3D view of the number of points found in the M3C2 search cylinders for the epoch 15 November 2021 at 18:00 LT. The search radius of 0.5

To estimate the ranging uncertainty and its variation over time, we also used the prisms installed in the scene. After extracting them from the full
high-resolution scans using thresholding on the returned amplitude and approximate locations of the prisms, a planar fit was carried out. The variance
of the orthogonal distances to this plane was then extracted for each prism and averaged for each epoch. The resulting precision measure, ranging
from 0.004 to 0.006

To validate and compare different methods of 4D point cloud processing, we created a synthetic 4D point cloud dataset. A mesh model of a
100

Workflow depicting the generation of the synthetic dataset. First, a planar object is created and scanned from a range of 300

We applied displacements orthogonally to the mesh surface and rotated the mesh to represent a slope of 60

In every epoch, a different transformation was then applied to the point cloud. Subsequently, M3C2-EP was used to quantify bitemporal surface changes
and associated uncertainties, where the same normal distribution parameters were used as covariance information for the transformation. The full point
cloud of the null epoch (no deformations) was used as core points, and the normal vector was defined to be the plane-normal vector of the original
mesh for all points. For M3C2-EP and M3C2 distance calculations, a search radius of 1

Flowchart of the workflow undertaken in this research. The novel method is highlighted using bold arrows and boxes. Three time-series-based methods and one bitemporal method are compared. Additionally, we use

In this section, we

show how measurement uncertainties can be propagated to bitemporal change values using M3C2-EP (Sect.

present a baseline method of time series smoothing using a temporal median filter (Sect.

introduce the Kalman filter smoother and the corresponding equations (Sect.

use clustering to identify areas of similar change patterns (Sect.

The full processing workflow is shown in Fig.

To enable analysis of the time series, we convert the 4D point cloud into a series of change values at selected locations (the core points). As we
want to rigorously consider uncertainties in order to separate noise from change signal, we employ multiscale model-to-model cloud comparison using
error propagation

The error propagation is carried out by taking the mathematical model of how point cloud coordinates are obtained from transforming measured
quantities (range, azimuth angle, and elevation angle) and computed quantities (transformation parameters). This model is then linearized by a Taylor approximation. Following

While M3C2 itself also quantifies the uncertainty in the estimated bitemporal displacements, this estimate is inferred from the data distribution and
influenced by non-orthogonal incidence angles and object roughness within the M3C2 search cylinder

The M3C2-EP point cloud distance measure hence allows transferring uncertainty attributed to each of the original measurements, i.e., laser ranges and angular measurements, to uncertainty in point cloud displacement for every individual core point. Additionally, uncertainties from the alignment of the two datasets are considered. These uncertainties are highly correlated between points of the same epoch. Thereby, the obtained M3C2-EP distance and its spatially heterogeneous uncertainty represent our knowledge of the point cloud displacement itself, not of the measurements. This property allows us to use the displacement in subsequent analyses.
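As a simplified illustration of this principle, the first-order (Taylor) propagation of a single polar measurement's uncertainty to Cartesian coordinates can be sketched as follows. This is generic error propagation code, not the full M3C2-EP implementation: it omits the registration covariances and the aggregation over the search cylinder, and all variable names are ours.

```python
import numpy as np

def spherical_to_cartesian_cov(rng_m, az, el, cov_meas):
    """First-order propagation of (range, azimuth, elevation) uncertainty
    to a 3x3 Cartesian coordinate covariance: C_xyz = J C_meas J^T,
    where J is the Jacobian of the polar-to-Cartesian transformation
    x = r*cos(el)*cos(az), y = r*cos(el)*sin(az), z = r*sin(el)."""
    sa, ca = np.sin(az), np.cos(az)
    se, ce = np.sin(el), np.cos(el)
    J = np.array([
        [ce * ca, -rng_m * ce * sa, -rng_m * se * ca],
        [ce * sa,  rng_m * ce * ca, -rng_m * se * sa],
        [se,       0.0,              rng_m * ce     ],
    ])
    return J @ cov_meas @ J.T
```

The sketch makes the range dependence explicit: the angular variances enter the coordinate covariance scaled by the squared range, so distant points carry a larger lateral positional uncertainty than nearby ones.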

We resample the time series to a regular dataset by using linear interpolation to fill in missing data points, e.g., caused by temporary occlusion in the observed scene.
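This resampling step can be sketched in a few lines (a generic NumPy illustration; the time step and function name are ours and not part of the original processing chain):

```python
import numpy as np

def resample_regular(t_obs, d_obs, dt):
    """Resample an irregular displacement time series to a fixed step dt,
    filling gaps (e.g., from temporary occlusion) by linear interpolation.
    t_obs: observation times; d_obs: displacements at those times."""
    t_reg = np.arange(t_obs[0], t_obs[-1] + dt / 2, dt)
    d_reg = np.interp(t_reg, t_obs, d_obs)
    return t_reg, d_reg
```

The regular spacing obtained here is what later allows a constant state transition matrix in the Kalman filter and a fixed-size window in the median baseline.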

As a baseline method to compare the Kalman filter result with, we use a temporal median filter for smoothing the time series, as presented by

When applying error propagation to the median filter, the output value can be seen as a linear combination of the input values multiplied by weights

Furthermore, a window size needs to be chosen. If the chosen size is too large, temporary surface alterations, such as a deposition of material followed by erosion, are smoothed out. If the window is too small, the benefit of smoothing in terms of outlier elimination becomes negligible. To account for this, the window must be chosen to be smaller than the expected change rates
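A minimal sketch of such a temporal median baseline (generic NumPy code; the shrinking-window edge handling shown here is one possible choice and not necessarily that of the cited implementation):

```python
import numpy as np

def median_smooth(series, window):
    """Sliding temporal median over a displacement time series.
    window should be odd; at the series edges the window is truncated."""
    half = window // 2
    out = np.empty(len(series), dtype=float)
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out[i] = np.median(series[lo:hi])
    return out
```

A single-epoch outlier inside the window is removed entirely, which illustrates both the strength of the median (robustness) and its limitation (a real short-lived change of the same duration would be removed as well).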

We present the use of a Kalman filter, which can be used to incorporate multiple observations (in our case the change values for each epoch,
quantified along the local 3D surface normals using M3C2-EP; cf. Sect.

While the Kalman filter is an assimilation method, which allows updates by adding new data points, we consider an a posteriori analysis and assume that all measurements are available at the time of analysis. This allows us not only to consider previously observed change values at a given location but also to incorporate future observations. In this way, we make use of the full 4D domain of the dataset.

The Kalman filter can be seen as a temporal extension of adjustment computation. It allows the integration of measurements over time into a
state vector

For the propagation from one state to the next (the “prediction” step), the state transition matrix

We use the
nomenclature of the Python package “FilterPy” and the accompanying book

Here, the next position (at

An observation (a single measurement)

When considering long time series of topographic point cloud data, the local direction of the surface (here calculated as the normal vector of the null epoch) might change. A way to incorporate this into the Kalman filter would be to project the quantified changes onto the original change direction, e.g., by altering

The step size of the update

As in adjustment computation, every observation

The process noise matrix

A common approach to model process noise is discrete white noise. Here, the variance of the highest-order element (e.g., the acceleration) is defined
as

The exact choice of this process noise model, especially the choice of the value of

To initialize the iterative Kalman filter algorithm, starting values for the state and its uncertainty are required. As we start the time series at
the null epoch with zero change, we define the initial state vector to be

Running the Kalman filter then results in estimates of the state and its uncertainty for each point in time, based on all previous states and measurements. This is referred to as a “forward pass”, as the calculation starts with the first measurement and then continues forward
in time

First, we predict the future state

Next, we predict the future covariance by using the law of error propagation (cf. Eq.

As a consequence, the state of the system becomes less certain over a longer time and can be made more certain by introducing a new observation with adequate uncertainty. For example, in the case of near-continuous TLS, change can be estimated 1 d into the future after having acquired 1 week of hourly observations. This allows for estimating whether a larger interval between the observations still fully represents the expected changes.
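This growth of uncertainty under prediction follows directly from the covariance propagation P' = F P Fᵀ + Q: without a new observation, the process noise Q is added at every step. A minimal numerical sketch for a constant-velocity model (all variance values are illustrative, not those of the study):

```python
import numpy as np

# Constant-velocity model: state x = [displacement, velocity], time step dt
dt = 1.0
F = np.array([[1.0, dt],
              [0.0, 1.0]])                    # state transition matrix
Q = 1e-4 * np.array([[dt**4 / 4, dt**3 / 2],
                     [dt**3 / 2, dt**2]])     # discrete white noise (illustrative)
P = np.diag([0.01, 0.001])                    # current state covariance

def predict_cov(P, steps):
    """Propagate the state covariance `steps` epochs ahead
    without incorporating any new observation."""
    for _ in range(steps):
        P = F @ P @ F.T + Q
    return P
```

The displacement variance `predict_cov(P, k)[0, 0]` grows monotonically with `k`, which is exactly why a prediction one day ahead is still usable after a week of hourly observations, but degrades the further it is extrapolated.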

If an observation at a given point in time

Using the predicted covariance, the measurement function

Finally, the Kalman gain

Similarly, the state covariance is updated through the Kalman gain. Note that the term

Repeating Eqs. (

We can, however, also include consecutive states and measurements in the estimation of the state, which decreases uncertainty and leads to a better estimate; for example, outliers are detected much more easily than with a forward pass alone. The Rauch–Tung–Striebel (RTS) smoother
is a linear Gaussian method (such as the Kalman filter itself) used to consider consecutive states of the system
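The forward pass followed by the RTS backward pass can be sketched in plain NumPy for a single core point (a minimal illustration of the technique, mirroring the FilterPy nomenclature; the constant-velocity model, the process noise level, and all variable names are illustrative choices, not the exact configuration of the study):

```python
import numpy as np

def kalman_rts(z, r, dt=1.0, q_var=1e-4):
    """Forward Kalman pass followed by a Rauch-Tung-Striebel (RTS)
    backward pass for one core point's displacement time series.

    z : observed displacements per epoch (e.g., M3C2-EP distances)
    r : per-epoch measurement variances (from error propagation)
    Constant-velocity model, state x = [displacement, velocity].
    Returns the smoothed displacements and their variances."""
    F = np.array([[1.0, dt], [0.0, 1.0]])         # state transition
    H = np.array([[1.0, 0.0]])                    # only displacement is observed
    Q = q_var * np.array([[dt**4 / 4, dt**3 / 2],
                          [dt**3 / 2, dt**2]])    # discrete white noise
    n = len(z)
    xs = np.zeros((n, 2)); Ps = np.zeros((n, 2, 2))   # filtered states
    xp = np.zeros((n, 2)); Pp = np.zeros((n, 2, 2))   # predicted states
    x = np.zeros(2)                # zero change at the null epoch ...
    P = np.diag([0.0, 1e-2])       # ... with zero displacement uncertainty
    for k in range(n):
        if k > 0:                                 # prediction step
            x = F @ x
            P = F @ P @ F.T + Q
        xp[k], Pp[k] = x, P
        y = z[k] - H @ x                          # innovation
        S = H @ P @ H.T + r[k]                    # innovation variance
        K = P @ H.T / S                           # Kalman gain
        x = x + (K * y).ravel()                   # update step
        P = (np.eye(2) - K @ H) @ P
        xs[k], Ps[k] = x, P
    for k in range(n - 2, -1, -1):                # RTS backward pass
        C = Ps[k] @ F.T @ np.linalg.inv(Pp[k + 1])
        xs[k] = xs[k] + C @ (xs[k + 1] - xp[k + 1])
        Ps[k] = Ps[k] + C @ (Ps[k + 1] - Pp[k + 1]) @ C.T
    return xs[:, 0], Ps[:, 0, 0]
```

Because each epoch carries its own variance `r[k]`, observations with large bitemporal uncertainty are automatically down-weighted, and the returned smoothed variances are what yield the reduced level of detection compared to the individual bitemporal estimates.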

It is important to note that the choice of a null or reference epoch influences the results, as all change detections and quantifications relate to this epoch. The Kalman-filtered smooth trajectory and its corresponding uncertainty also signify the change related to the null epoch, and a choice of a different reference epoch would likely result in different detected changes.

To represent the information contained in the time series in a static map, we use a clustering approach. Here, data points with similar features are
aggregated into groups or clusters. Due to its unsupervised nature, no training data are required, which would often be lacking in the case of
topographic monitoring of scenes typically featuring variable, a priori unknown surface dynamics. Instead, the resulting clusters can be analyzed
with respect to their size, location, and magnitude, as well as visually by their shape in 3D space, and ultimately attributed to certain process
types. We use

In

We first present the impact of different model and parameter choices on the clustering and change detection results (Sect.

We tested three different model choices for the Kalman filter and a number of parameters for each model. The different models increase in complexity
and dimensions of their state vector. The first model simply tracks the displacement value itself. The assumption for this model is that the allowed
variance (

Retrieved Kalman smoother estimates and the raw time series (black) for three different models:

For each of the three models, we experimented with the process noise variance

To find appropriate values for the state variance in each of the models, the following options were investigated. For the displacement-only model
(order 0), values of

Comparison of Kalman smoother trajectories, temporal median smoothing, and the raw time series for three selected locations, corresponding to labels “i”

For the velocity-based model (order 1), we investigated values of

For sudden changes that result in a trajectory similar to a “step function”, smaller values of

With increasing model order, the smoothed trajectories oscillate more. For the model of order 2 (Fig.

To investigate the performance of the Kalman filter within the field of 4D change analysis methods, we compared our results to other methods using the
same dataset (cf. Fig.

Percentage of time over the full time span where change was detected (displacement value larger than the respective level of detection). The results from the bitemporal detection

An important result of the Kalman filter is the quantification of the level of detection, which we compare to the level of detection of the bitemporal
M3C2 with error propagation. In Fig.

The exploitation of temporal autocorrelation, i.e., aggregating multiple measurements from multiple points in time, yields much smaller levels of detection compared to the bitemporal approach. Consequently, smaller-magnitude changes – as long as they persist long enough for the Kalman filter to pick them up – can be detected as significant. In the case of a slope movement, this is especially important, as small movements over long time spans add up to larger displacements. Through continuous observation, surface displacement can be detected and the change rate quantified earlier and with higher precision.
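The underlying significance test can be sketched as a comparison of each displacement against its level of detection (a generic sketch; the 95 % two-sided confidence level and the numbers below are assumptions for illustration only):

```python
import numpy as np

def detected(displacement, variance, z_crit=1.96):
    """Flag change as significant where |d| exceeds the level of detection
    LoD = z_crit * sigma (two-sided test, 95% confidence by default)."""
    lod = z_crit * np.sqrt(variance)
    return np.abs(displacement) > lod
```

With illustrative numbers: a 1.5 cm displacement with a bitemporal standard deviation of 1 cm stays below its LoD of about 2 cm, while the same displacement with a temporally smoothed standard deviation of 0.5 cm exceeds its LoD of about 1 cm and is flagged as significant.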

In Fig.

We show the locations of points where change is detected with only the bitemporal approach, where it is detected with multitemporal Kalman filtering,
and where it is detected with both in Fig.

Sum of squared residuals (estimated – true) for the synthetic change aggregated for all core points over the full time series. The true displacement is calculated by using the

For the real dataset, there are no validation data or other area-wide reference data with a much higher accuracy available, as TLS is considered to be
the “gold standard”. Local, point-based validation can be achieved with total station measurements if such measurements are available within the
area of interest. In our case study in Vals, total station measurements were only available for reflectors installed in stable parts outside of the
area of interest. This means that we cannot investigate whether the detected change is actual change for this dataset. We therefore employed a
synthetic scene with exactly known displacement to study the behavior of our method. For the analysis, we followed the same approach as with the real
data, i.e., selecting a proper value of

Timelines for different models representing change values in the synthetic scene. Panels

In addition, we show the raw time series compared to temporal median smoothing for three locations (zero displacement and large positive or negative
displacement) in Fig.

Comparison of detected synthetic change at the end of the 40

The detected change at the end of the simulated 40

Comparison of different numbers of cluster centroids used in

Comparison of clustering results from time series estimated using different state-of-the-art methods:

To assess the influence of filtering on subsequent analyses, we use the estimated time series of change values to cluster the core points following
the

Comparing the results from the temporal median model (using a window size of 96

Kalman filtering is an alternative method for time series analysis of 3D point clouds, which, compared to the raw time series or moving median
windows, rigorously considers uncertainties. As such, each observation input to the Kalman filter is attributed with an uncertainty, e.g., stemming
from bitemporal change quantification using M3C2 with error propagation. This uncertainty is combined with a system state variance, a measure of how
much change is expected in subsequent time periods, resulting in (i) a smoothed time series and (ii) associated uncertainties. These uncertainties are quantified not only for the observation points but also for interpolated and extrapolated displacement values. Quantification of uncertainties allows
for statistical tests of significance to separate change from noise. By analyzing the full time series instead of epoch-wise bitemporal analyses, we
were able to increase the number of points where change was detected confidently at a given point in time, e.g., at the end of the time series. At our study site, the number of core points attributed with significant change was almost doubled (cf. Fig.

We compare different models by visually inspecting the estimated trajectories at sample locations (Figs.

A major challenge in the application of our method for different geographic settings is the choice of the model order (i.e., the physical basis) and the state variance. As no control data were available for the real dataset, we chose models by visual interpretation. We selected models that effectively reduce daily patterns, which in our data can be attributed to remaining nonlinear atmospheric effects, yet do not smooth out real surface changes too much. In this study area, we select a model of order 1 for further investigations. The exact choice of model and state variance depends on the types of change processes that are being investigated. Even a spatially and temporally varying state variance could be applicable and is possible within the presented mathematical framework. This would, however, require a priori knowledge of the processes acting on the surface.

In comparing the estimated Kalman trajectories to ones obtained from temporal median smoothing and to the raw time series, we demonstrate that,
especially with data gaps, the Kalman filter estimates often provide a more realistic interpolation trajectory (e.g., Fig.

Higher-order models, especially the order 2 model, tend to overfit on step functions, resulting in ringing artifacts (blue line in
Fig.

As an application example, we showed how the smoothed Kalman time series can be used in

Future research could investigate how discrete change events can be identified and modeled appropriately by re-initializing the Kalman filter just after such an event. Such a re-initialization resets the estimated displacement, velocity, and acceleration (depending on the chosen order of the model), increasing the uncertainty until more observations become available and the filter converges again. In line with this consideration is the choice of uncertainty at the beginning of the process. At the start of the time series, the displacement must be zero by definition, and we therefore assign an uncertainty of zero to this initialization. This also ensures that all trajectories pass through the origin at the beginning of the time span. For subsequent initializations, this argument does not hold, and a larger uncertainty (e.g., derived from the bitemporal comparison) should be assumed.

We presented a novel method for the analysis of 4D point clouds, supporting the monitoring of Earth surface dynamics. The application of a Kalman filter allows informed temporal smoothing, which decreases uncertainty and enables interpolation of the time series. As M3C2-EP is used to compute point cloud change values, which spatially aggregates and smooths data, the full 4D domain is exploited to find optimal estimates for change values, velocities, and accelerations. Our work can be used to detect locations and points in time where significant change occurs throughout the near-continuous 3D observation and to group these locations into areas or subsets with similar properties. The extraction of the smoothed time series then allows the interpretation of individual trajectories where the influence of random noise is largely suppressed, which in turn allows more precise statements about the significance of quantified change values and the properties of this change. The 4D point cloud analysis using a Kalman filter and clustering techniques facilitates interpretation and allows extraction of the relevant information from the topographic point cloud time series.

The rigorous treatment of uncertainty follows a statistical approach to identify significant change and to separate it from noise resulting from sensing uncertainty and processing steps. The use of the Kalman filter thereby allows propagating uncertainties from bitemporal differencing into the time series and reduces the associated level of detection.

Many real-world time series datasets contain gaps or are irregular by design. With our approach, the time series can be both temporally interpolated and resampled. The regularity can subsequently be utilized by algorithms relying on a constant time step in the time series. We showed this by performing clustering of the spatial locations using the estimated change values as a feature vector, yielding groups of similar surface change behavior.

Overall, smoothing time series while fully considering associated uncertainties is an important tool for the interpretation of topographic 4D point clouds, especially for small-magnitude changes. Such changes become especially important with increasing observation frequencies, a trend in recent near-continuous laser scanning survey setups.

The code used for processing the point clouds, including M3C2-EP and the Kalman filter, is available on GitHub and indexed with Zenodo (v0.0.4,

LW: conceptualization, methodology, formal analysis, writing – original draft, writing – review and editing, visualization; KA: methodology, formal analysis, data curation, writing – review and editing; DCS: resources, data curation, writing – review and editing; BH: conceptualization, writing – review and editing, supervision, funding acquisition.

The contact author has declared that none of the authors has any competing interests.

Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

We thank the Tyrol State Government – Department of Geoinformation for their support in conducting the experimental study. We also thank RIEGL Laser Measurement Systems GmbH for the technical support and exchange of information during the research work. We further wish to thank the reviewers, Roderik Lindenbergh, Mieke Kuschnerus, Dimitri Lague, and Giulia Sofia for their valuable input and critical discussion of our manuscript, as well as Fabio Crameri for his work on scientific color maps, which we have used throughout this paper

This research has been supported by the European Union Research Fund for Coal and Steel (RFCS project no. 800689).

This paper was edited by Niels Hovius and Giulia Sofia and reviewed by Roderik Lindenbergh, Dimitri Lague, and one anonymous referee.