seg1d.algorithm.combine_corr

seg1d.algorithm.combine_corr(x, w, method='m', scale=True)[source]

Combine weighted correlations.

Takes the correlated data results, multiplies each feature's array of correlations by that feature's weighting value, and combines the weighted features into a single array per scale.

Parameters:
x : Dict[int, Dict[string, numpy.array]]

    { scale: { feature: array([correlations]) } }

w : Dict[string, float]

    { feature: weight }

method : {‘m’, ‘w’, ‘s’}, optional

    Keyword for aggregating the feature correlations (default ‘m’). Options: ‘m’ = mean, ‘w’ = weighted mean, ‘s’ = sum.

scale : bool, optional

    If True, scale each correlated feature before applying the aggregation method.

Returns:
Dict[int, numpy.array]

    { scale: array([weighted correlations]) }
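The combination can be pictured as stacking the weighted per-feature arrays and reducing along the feature axis. The sketch below is a hypothetical `combine` helper, not the library implementation: the ‘m’ branch reproduces the example output in this page, while the ‘w’ and ‘s’ branches are readings of the parameter description above, and the `scale` option is omitted entirely.

```python
import numpy as np

def combine(x, w, method='m'):
    """Sketch: combine weighted per-feature correlation arrays (no scaling)."""
    out = {}
    for size, feats in x.items():
        # multiply each feature's correlation array by its weight
        weighted = np.array([w[f] * arr for f, arr in feats.items()])
        if method == 'm':    # mean of the weighted arrays
            out[size] = weighted.mean(axis=0)
        elif method == 'w':  # weighted mean: divide by the total weight
            out[size] = weighted.sum(axis=0) / sum(w.values())
        elif method == 's':  # plain sum of the weighted arrays
            out[size] = weighted.sum(axis=0)
    return out
```

Because the features are weighted before the reduction, ‘m’ and ‘s’ differ only by a constant factor (the number of features), while ‘w’ normalizes by the total weight instead.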

See also

rolling_corr
(input for this function)
get_peaks
(takes the return of this function)

Examples

>>> import numpy as np
>>> import seg1d.algorithm as alg
>>> # make a convenience function to get a wave for sample data
>>> def s(f1, f2, f3): return np.sin(np.linspace(f1, f2, f3))
>>> x = {
...     10: {'a': s(-np.pi*0.8, 0, 10), 'b': s(0, np.pi*0.8, 10)},
...     20: {'a': s(-np.pi*0.7, 0, 10), 'b': s(0, np.pi*0.7, 10)}
... }

Assign some weights and find the averaged value

>>> w = { 'a': 0.5, 'b': 0.9 }
>>> a = alg.combine_corr(x, w)
>>> for k, v in a.items(): print(k, v)
10 [-0.14694631 -0.07296588  0.00666771  0.0857847   0.15825538  0.21846498
  0.26174865  0.28475292  0.2856955   0.26450336]
20 [-0.20225425 -0.12293111 -0.03630481  0.0524783   0.13814375  0.21560229
  0.2802522   0.32825274  0.35675226  0.36405765]

Change the weight values and see the weighted scores change

>>> w = { 'a': 0.9, 'b': 0.2 }
>>> a = alg.combine_corr(x, w)
>>> for k, v in a.items(): print(k, v)
10 [-0.26450336 -0.3270411  -0.36424081 -0.37322037 -0.35328408 -0.30597655
 -0.23496298 -0.14574528 -0.04523573  0.05877853]
20 [-0.36405765 -0.39304054 -0.39867347 -0.38062179 -0.33995792 -0.27909765
 -0.20165658 -0.1122354  -0.01614647  0.0809017 ]
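With the default mean method, each output sample is simply the average of the weighted feature samples, so the values above can be checked by hand. For the second example, the first sample of the scale-10 result comes from feature ‘a’ alone (feature ‘b’ starts at sin(0) = 0):

```python
import numpy as np

# first samples of the two scale-10 features from the example above
a0 = np.sin(-np.pi * 0.8)  # feature 'a'
b0 = np.sin(0.0)           # feature 'b' (zero)

# default method 'm': mean of the weighted samples, weights a=0.9, b=0.2
v = (0.9 * a0 + 0.2 * b0) / 2
print(round(v, 8))  # -0.26450336, matching the first value printed above
```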