Coverage for src/ts_stat_tests/regularity/tests.py: 100% (34 statements)
# ============================================================================ #
#                                                                              #
#     Title: Regularity Tests                                                  #
#     Purpose: Convenience functions for regularity algorithms.                #
#                                                                              #
# ============================================================================ #


# ---------------------------------------------------------------------------- #
#                                                                              #
#     Overview                                                              ####
#                                                                              #
# ---------------------------------------------------------------------------- #


# ---------------------------------------------------------------------------- #
#  Description                                                              ####
# ---------------------------------------------------------------------------- #
21"""
22!!! note "Summary"
23 This module contains convenience functions and tests for regularity measures, allowing for easy access to different entropy algorithms.
24"""
# ---------------------------------------------------------------------------- #
#                                                                              #
#     Setup                                                                 ####
#                                                                              #
# ---------------------------------------------------------------------------- #


# ---------------------------------------------------------------------------- #
#  Imports                                                                  ####
# ---------------------------------------------------------------------------- #


# ## Python StdLib Imports ----
from typing import Union

# ## Python Third Party Imports ----
import numpy as np
from numpy.typing import ArrayLike, NDArray
from typeguard import typechecked

# ## Local First Party Imports ----
from ts_stat_tests.regularity.algorithms import (
    VALID_KDTREE_METRIC_OPTIONS,
    approx_entropy,
    permutation_entropy,
    sample_entropy,
    spectral_entropy,
    svd_entropy,
)
from ts_stat_tests.utils.errors import generate_error_message


# ---------------------------------------------------------------------------- #
#  Exports                                                                  ####
# ---------------------------------------------------------------------------- #


__all__: list[str] = ["entropy", "regularity", "is_regular"]
# ---------------------------------------------------------------------------- #
#                                                                              #
#     Tests                                                                 ####
#                                                                              #
# ---------------------------------------------------------------------------- #


@typechecked
def entropy(
    x: ArrayLike,
    algorithm: str = "sample",
    order: int = 2,
    metric: VALID_KDTREE_METRIC_OPTIONS = "chebyshev",
    sf: float = 1,
    normalize: bool = True,
) -> Union[float, NDArray[np.float64]]:
83 """
84 !!! note "Summary"
85 Test for the entropy of a given data set.
87 ???+ abstract "Details"
88 This function is a convenience wrapper around the five underlying algorithms:<br>
89 - [`approx_entropy()`][ts_stat_tests.regularity.algorithms.approx_entropy]<br>
90 - [`sample_entropy()`][ts_stat_tests.regularity.algorithms.sample_entropy]<br>
91 - [`spectral_entropy()`][ts_stat_tests.regularity.algorithms.spectral_entropy]<br>
92 - [`permutation_entropy()`][ts_stat_tests.regularity.algorithms.permutation_entropy]<br>
93 - [`svd_entropy()`][ts_stat_tests.regularity.algorithms.svd_entropy]
95 Params:
96 x (ArrayLike):
97 The data to be checked. Should be a `1-D` or `N-D` data array.
98 algorithm (str, optional):
99 Which entropy algorithm to use.<br>
100 - `sample_entropy()`: `["sample", "sampl", "samp"]`<br>
101 - `approx_entropy()`: `["app", "approx"]`<br>
102 - `spectral_entropy()`: `["spec", "spect", "spectral"]`<br>
103 - `permutation_entropy()`: `["perm", "permutation"]`<br>
104 - `svd_entropy()`: `["svd", "svd_entropy"]`<br>
105 Defaults to `"sample"`.
106 order (int, optional):
107 Embedding dimension.<br>
108 Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
109 Defaults to `2`.
110 metric (VALID_KDTREE_METRIC_OPTIONS):
111 Name of the distance metric function used with [`sklearn.neighbors.KDTree`](https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KDTree.html#sklearn.neighbors.KDTree). Default is to use the [Chebyshev distance](https://en.wikipedia.org/wiki/Chebyshev_distance).<br>
112 Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
113 Defaults to `"chebyshev"`.
114 sf (float, optional):
115 Sampling frequency, in Hz.<br>
116 Only relevant when `algorithm=spectral`.<br>
117 Defaults to `1`.
        normalize (bool, optional):
            If `True`, divide by $\log_2(psd.size)$ to normalize the spectral entropy to a value between $0$ and $1$. Otherwise, return the spectral entropy in bits.<br>
            Only relevant when `algorithm=spectral`.<br>
            Defaults to `True`.

    Raises:
        (ValueError):
            When the given value for `algorithm` is not valid.

    Returns:
        (Union[float, NDArray[np.float64]]):
            The calculated entropy value.

    ??? success "Credit"
        All credit goes to the [`AntroPy`](https://raphaelvallat.com/antropy/) library.

    ???+ example "Examples"

        ```pycon {.py .python linenums="1" title="Setup"}
        >>> from ts_stat_tests.regularity.tests import entropy
        >>> from ts_stat_tests.utils.data import data_normal
        >>> normal = data_normal

        ```

        ```pycon {.py .python linenums="1" title="Example 1: Sample Entropy"}
        >>> print(entropy(x=normal, algorithm="sample"))
        2.2374...

        ```

        ```pycon {.py .python linenums="1" title="Example 2: Approx Entropy"}
        >>> print(entropy(x=normal, algorithm="approx"))
        1.6643...

        ```

        ```pycon {.py .python linenums="1" title="Example 3: Spectral Entropy"}
        >>> print(entropy(x=normal, algorithm="spectral", sf=1))
        0.9329...

        ```
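
        The permutation and SVD variants follow the same calling pattern. The block below is an illustrative sketch only; outputs are omitted because they depend on the input data.

        ```pycon {.py .python linenums="1" title="Example 4: Permutation and SVD Entropy (illustrative)"}
        >>> perm_value = entropy(x=normal, algorithm="perm", order=3)
        >>> svd_value = entropy(x=normal, algorithm="svd", order=3)

        ```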

    ??? question "References"
        - Richman, J. S. et al. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.
        - https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.DistanceMetric.html
        - Inouye, T. et al. (1991). Quantification of EEG irregularity by use of the entropy of the power spectrum. Electroencephalography and clinical neurophysiology, 79(3), 204-210.
        - https://en.wikipedia.org/wiki/Spectral_density
        - https://en.wikipedia.org/wiki/Welch%27s_method

    ??? tip "See Also"
        - [`regularity()`][ts_stat_tests.regularity.tests.regularity]
        - [`approx_entropy()`][ts_stat_tests.regularity.algorithms.approx_entropy]
        - [`sample_entropy()`][ts_stat_tests.regularity.algorithms.sample_entropy]
        - [`spectral_entropy()`][ts_stat_tests.regularity.algorithms.spectral_entropy]
        - [`permutation_entropy()`][ts_stat_tests.regularity.algorithms.permutation_entropy]
        - [`svd_entropy()`][ts_stat_tests.regularity.algorithms.svd_entropy]
    """
    options: dict[str, tuple[str, ...]] = {
        "sampl": ("sample", "sampl", "samp"),
        "approx": ("app", "approx"),
        "spect": ("spec", "spect", "spectral"),
        "perm": ("perm", "permutation"),
        "svd": ("svd", "svd_entropy"),
    }
    if algorithm in options["sampl"]:
        return sample_entropy(x=x, order=order, metric=metric)
    if algorithm in options["approx"]:
        return approx_entropy(x=x, order=order, metric=metric)
    if algorithm in options["spect"]:
        return spectral_entropy(x=x, sf=sf, normalize=normalize)
    if algorithm in options["perm"]:
        return permutation_entropy(x=x, order=order, normalize=normalize)
    if algorithm in options["svd"]:
        return svd_entropy(x=x, order=order, normalize=normalize)
    raise ValueError(
        generate_error_message(
            parameter_name="algorithm",
            value_parsed=algorithm,
            options=options,
        )
    )


@typechecked
def regularity(
    x: ArrayLike,
    algorithm: str = "sample",
    order: int = 2,
    metric: VALID_KDTREE_METRIC_OPTIONS = "chebyshev",
    sf: float = 1,
    normalize: bool = True,
) -> Union[float, NDArray[np.float64]]:
211 """
212 !!! note "Summary"
213 Test for the regularity of a given data set.
215 ???+ abstract "Details"
216 This is a pass-through, convenience wrapper around the [`entropy()`][ts_stat_tests.regularity.tests.entropy] function.
218 Params:
219 x (ArrayLike):
220 The data to be checked. Should be a `1-D` or `N-D` data array.
221 algorithm (str, optional):
222 Which entropy algorithm to use.<br>
223 - `sample_entropy()`: `["sample", "sampl", "samp"]`<br>
224 - `approx_entropy()`: `["app", "approx"]`<br>
225 - `spectral_entropy()`: `["spec", "spect", "spectral"]`<br>
226 - `permutation_entropy()`: `["perm", "permutation"]`<br>
227 - `svd_entropy()`: `["svd", "svd_entropy"]`<br>
228 Defaults to `"sample"`.
229 order (int, optional):
230 Embedding dimension.<br>
231 Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
232 Defaults to `2`.
233 metric (VALID_KDTREE_METRIC_OPTIONS):
234 Name of the distance metric function used with [`sklearn.neighbors.KDTree`](https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KDTree.html#sklearn.neighbors.KDTree). Default is to use the [Chebyshev distance](https://en.wikipedia.org/wiki/Chebyshev_distance).<br>
235 Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
236 Defaults to `"chebyshev"`.
237 sf (float, optional):
238 Sampling frequency, in Hz.<br>
239 Only relevant when `algorithm=spectral`.<br>
240 Defaults to `1`.
        normalize (bool, optional):
            If `True`, divide by $\log_2(psd.size)$ to normalize the spectral entropy to a value between $0$ and $1$. Otherwise, return the spectral entropy in bits.<br>
            Only relevant when `algorithm=spectral`.<br>
            Defaults to `True`.

    Returns:
        (Union[float, NDArray[np.float64]]):
            The calculated regularity (entropy) value.

    ??? success "Credit"
        All credit goes to the [`AntroPy`](https://raphaelvallat.com/antropy/) library.

    ???+ example "Examples"

        ```pycon {.py .python linenums="1" title="Setup"}
        >>> from ts_stat_tests.regularity.tests import regularity
        >>> from ts_stat_tests.utils.data import data_normal
        >>> normal = data_normal

        ```

        ```pycon {.py .python linenums="1" title="Example 1: Sample Entropy"}
        >>> print(regularity(x=normal, algorithm="sample"))
        2.2374...

        ```

        ```pycon {.py .python linenums="1" title="Example 2: Approx Entropy"}
        >>> print(regularity(x=normal, algorithm="approx"))
        1.6643...

        ```

        ```pycon {.py .python linenums="1" title="Example 3: Spectral Entropy"}
        >>> print(regularity(x=normal, algorithm="spectral", sf=1))
        0.9329...

        ```
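
        Because `regularity()` simply delegates to `entropy()`, the two functions return identical values for identical arguments. A minimal sketch, reusing the setup data above:

        ```pycon {.py .python linenums="1" title="Example 4: Equivalence with entropy() (illustrative)"}
        >>> from ts_stat_tests.regularity.tests import entropy
        >>> regularity(x=normal, algorithm="sample") == entropy(x=normal, algorithm="sample")
        True

        ```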

    ??? question "References"
        - Richman, J. S. et al. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.
        - https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.DistanceMetric.html
        - Inouye, T. et al. (1991). Quantification of EEG irregularity by use of the entropy of the power spectrum. Electroencephalography and clinical neurophysiology, 79(3), 204-210.
        - https://en.wikipedia.org/wiki/Spectral_density
        - https://en.wikipedia.org/wiki/Welch%27s_method

    ??? tip "See Also"
        - [`entropy()`][ts_stat_tests.regularity.tests.entropy]
        - [`approx_entropy()`][ts_stat_tests.regularity.algorithms.approx_entropy]
        - [`sample_entropy()`][ts_stat_tests.regularity.algorithms.sample_entropy]
        - [`spectral_entropy()`][ts_stat_tests.regularity.algorithms.spectral_entropy]
        - [`permutation_entropy()`][ts_stat_tests.regularity.algorithms.permutation_entropy]
        - [`svd_entropy()`][ts_stat_tests.regularity.algorithms.svd_entropy]
    """
    return entropy(x=x, algorithm=algorithm, order=order, metric=metric, sf=sf, normalize=normalize)


@typechecked
def is_regular(
    x: ArrayLike,
    algorithm: str = "sample",
    order: int = 2,
    sf: float = 1,
    metric: VALID_KDTREE_METRIC_OPTIONS = "chebyshev",
    normalize: bool = True,
    tolerance: Union[str, float, int, None] = "default",
) -> dict[str, Union[str, float, bool]]:
308 """
309 !!! note "Summary"
310 Test whether a given data set is `regular` or not.
312 ???+ abstract "Details"
313 This function implements the given algorithm (defined in the parameter `algorithm`), and returns a dictionary containing the relevant data:
314 ```python
315 {
316 "result": ..., # The result of the test. Will be `True` if `entropy<tolerance`, and `False` otherwise
317 "entropy": ..., # A `float` value, the result of the `entropy()` function
318 "tolerance": ..., # A `float` value, which is the tolerance used for determining whether or not the `entropy` is `regular` or not
319 }
320 ```

    Params:
        x (ArrayLike):
            The data to be checked. Should be a `1-D` or `N-D` data array.
        algorithm (str, optional):
            Which entropy algorithm to use.<br>
            - `sample_entropy()`: `["sample", "sampl", "samp"]`<br>
            - `approx_entropy()`: `["app", "approx"]`<br>
            - `spectral_entropy()`: `["spec", "spect", "spectral"]`<br>
            - `permutation_entropy()`: `["perm", "permutation"]`<br>
            - `svd_entropy()`: `["svd", "svd_entropy"]`<br>
            Defaults to `"sample"`.
        order (int, optional):
            Embedding dimension.<br>
            Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
            Defaults to `2`.
        metric (VALID_KDTREE_METRIC_OPTIONS):
            Name of the distance metric function used with [`sklearn.neighbors.KDTree`](https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KDTree.html#sklearn.neighbors.KDTree). Default is to use the [Chebyshev distance](https://en.wikipedia.org/wiki/Chebyshev_distance).<br>
            Only relevant when `algorithm=sample` or `algorithm=approx`.<br>
            Defaults to `"chebyshev"`.
        sf (float, optional):
            Sampling frequency, in Hz.<br>
            Only relevant when `algorithm=spectral`.<br>
            Defaults to `1`.
        normalize (bool, optional):
            If `True`, divide by $\log_2(psd.size)$ to normalize the spectral entropy to a value between $0$ and $1$. Otherwise, return the spectral entropy in bits.<br>
            Only relevant when `algorithm=spectral`.<br>
            Defaults to `True`.
        tolerance (Union[str, float, int, None], optional):
            The tolerance value used to determine whether the result is `regular`.<br>
            - If `tolerance` is of type `int` or `float`, then this value will be used.<br>
            - If `tolerance` is either `"default"` or `None`, then `tolerance` will be derived from `x` using the calculation:
            ```python
            tolerance = 0.2 * np.std(a=x)
            ```
            - If any other value is given, then a `ValueError` will be raised.<br>
            Defaults to `"default"`.

    Raises:
        (ValueError):
            If the given `tolerance` parameter is invalid.

            Valid options are:

            - A number with type `float` or `int`, or
            - A string with value `default`, or
            - The value `None`.

    Returns:
        (dict[str, Union[str, float, bool]]):
            A dictionary containing the test results:

            - `result` (bool): `True` if `entropy < tolerance`.
            - `entropy` (float): The calculated entropy value.
            - `tolerance` (float): The threshold used for regularity.

    ??? success "Credit"
        All credit goes to the [`AntroPy`](https://raphaelvallat.com/antropy/) library.

    ???+ example "Examples"

        ```pycon {.py .python linenums="1" title="Setup"}
        >>> from ts_stat_tests.regularity.tests import is_regular
        >>> from ts_stat_tests.utils.data import data_normal
        >>> normal = data_normal

        ```

        ```pycon {.py .python linenums="1" title="Example 1: Sample Entropy"}
        >>> print(is_regular(x=normal, algorithm="sample"))
        {'result': False, 'entropy': 2.23743099781426, 'tolerance': 0.20294652904313437}

        ```

        ```pycon {.py .python linenums="1" title="Example 2: Approx Entropy"}
        >>> print(is_regular(x=normal, algorithm="approx", tolerance=0.5))
        {'result': False, 'entropy': 1.6643808251518548, 'tolerance': 0.5}

        ```
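
        The default threshold is derived as `0.2 * np.std(a=x)`; the same value can also be passed explicitly through the `tolerance` parameter. A minimal sketch, reusing the setup data above (the exact threshold depends on the data, so only the boolean result is shown):

        ```pycon {.py .python linenums="1" title="Example 3: Explicit tolerance (illustrative)"}
        >>> import numpy as np
        >>> custom_tolerance = float(0.2 * np.std(a=np.asarray(normal)))
        >>> check = is_regular(x=normal, algorithm="sample", tolerance=custom_tolerance)
        >>> check["result"]
        False

        ```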

    ??? question "References"
        - Richman, J. S. et al. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.
        - https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.DistanceMetric.html
        - Inouye, T. et al. (1991). Quantification of EEG irregularity by use of the entropy of the power spectrum. Electroencephalography and clinical neurophysiology, 79(3), 204-210.
        - https://en.wikipedia.org/wiki/Spectral_density
        - https://en.wikipedia.org/wiki/Welch%27s_method

    ??? tip "See Also"
        - [`entropy()`][ts_stat_tests.regularity.tests.entropy]
        - [`regularity()`][ts_stat_tests.regularity.tests.regularity]
        - [`approx_entropy()`][ts_stat_tests.regularity.algorithms.approx_entropy]
        - [`sample_entropy()`][ts_stat_tests.regularity.algorithms.sample_entropy]
        - [`spectral_entropy()`][ts_stat_tests.regularity.algorithms.spectral_entropy]
        - [`permutation_entropy()`][ts_stat_tests.regularity.algorithms.permutation_entropy]
        - [`svd_entropy()`][ts_stat_tests.regularity.algorithms.svd_entropy]
    """
    if isinstance(tolerance, (float, int)):
        tol = tolerance
    elif tolerance in ["default", None]:
        tol = 0.2 * np.std(a=np.asarray(x))
    else:
        raise ValueError(
            f"Invalid option for `tolerance` parameter: {tolerance}.\n"
            f"Valid options are:\n"
            f"- A number with type `float` or `int`,\n"
            f"- A string with value `default`,\n"
            f"- The value `None`."
        )
    value = regularity(x=x, order=order, sf=sf, metric=metric, algorithm=algorithm, normalize=normalize)
    result = value < tol
    return {
        "result": bool(result),
        "entropy": float(value),
        "tolerance": float(tol),
    }
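

# ---------------------------------------------------------------------------- #
#  Usage sketch                                                             ####
# ---------------------------------------------------------------------------- #


# Illustrative only: a minimal smoke-test of the three public functions in this
# module, assuming the `data_normal` sample series referenced by the docstring
# examples above is available from `ts_stat_tests.utils.data`.
if __name__ == "__main__":
    from ts_stat_tests.utils.data import data_normal

    print(entropy(x=data_normal, algorithm="sample"))
    print(regularity(x=data_normal, algorithm="spectral", sf=1))
    print(is_regular(x=data_normal, algorithm="approx", tolerance=0.5))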