I am looking for a method to flag runs of at least n successive constant values in a pd.DataFrame (e.g. df).
I have written some code by which a value is flagged if its differences with the n/2 next and n/2 previous data points are zero.
import numpy as np
import pandas as pd

n = 5  # the minimum number of sequential constant values
# to create a dataframe example
df = pd.DataFrame(np.random.randn(25),
                  index=pd.date_range(start='2010-1-1', end='2010-1-2', freq='H'),
                  columns=['value'])
# to modify the dataframe to have several sets of constant values
df[1:10] = 23
df[20:26] = 10
df[15:17] = 15
# to hold the pairwise differences
df_diff = pd.DataFrame(index=df.index)
for i in np.arange(1, n // 2 + 1):
    # to calculate the difference between each value and the ith previous value
    df_diff['delta_' + str(i)] = df['value'].diff(periods=i).abs()
    # to calculate the difference between each value and the ith next value
    df_diff['delta_' + str(-i)] = df['value'].diff(periods=-i).abs()
# to filter the results (e.g. as a boolean)
result_1 = (df_diff <= 0).all(axis=1)
result_2 = (df_diff <= 0).any(axis=1)
result_1 and result_2 in this example do not provide the right answer.
What I expect is:
2010-01-01 00:00:00 False
2010-01-01 01:00:00 True
2010-01-01 02:00:00 True
2010-01-01 03:00:00 True
2010-01-01 04:00:00 True
2010-01-01 05:00:00 True
2010-01-01 06:00:00 True
2010-01-01 07:00:00 True
2010-01-01 08:00:00 True
2010-01-01 09:00:00 True
2010-01-01 10:00:00 False
2010-01-01 11:00:00 False
2010-01-01 12:00:00 False
2010-01-01 13:00:00 False
2010-01-01 14:00:00 False
2010-01-01 15:00:00 False
2010-01-01 16:00:00 False
2010-01-01 17:00:00 False
2010-01-01 18:00:00 False
2010-01-01 19:00:00 False
2010-01-01 20:00:00 True
2010-01-01 21:00:00 True
2010-01-01 22:00:00 True
2010-01-01 23:00:00 True
2010-01-02 00:00:00 True
IIUC, use DataFrame.groupby with the grouper being Series.diff followed by .ne(0) and .cumsum:
df.groupby(df.value.diff().ne(0).cumsum())['value'].transform('size').ge(n)
[out]
2010-01-01 00:00:00 False
2010-01-01 01:00:00 True
2010-01-01 02:00:00 True
2010-01-01 03:00:00 True
2010-01-01 04:00:00 True
2010-01-01 05:00:00 True
2010-01-01 06:00:00 True
2010-01-01 07:00:00 True
2010-01-01 08:00:00 True
2010-01-01 09:00:00 True
2010-01-01 10:00:00 False
2010-01-01 11:00:00 False
2010-01-01 12:00:00 False
2010-01-01 13:00:00 False
2010-01-01 14:00:00 False
2010-01-01 15:00:00 False
2010-01-01 16:00:00 False
2010-01-01 17:00:00 False
2010-01-01 18:00:00 False
2010-01-01 19:00:00 False
2010-01-01 20:00:00 True
2010-01-01 21:00:00 True
2010-01-01 22:00:00 True
2010-01-01 23:00:00 True
2010-01-02 00:00:00 True
Freq: H, Name: value, dtype: bool
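As a usage sketch, you can keep the flags alongside the data (the 'flag' column name here is my own choice, not part of the original):
df['flag'] = df.groupby(df.value.diff().ne(0).cumsum())['value'].transform('size').ge(n)  # 'flag' is an arbitrary column name
To see how this works, break the expression down step by step.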
The series we group by labels the contiguous groups of equal values:
s = df.value.diff().ne(0).cumsum()
2010-01-01 00:00:00 1
2010-01-01 01:00:00 2
2010-01-01 02:00:00 2
2010-01-01 03:00:00 2
2010-01-01 04:00:00 2
2010-01-01 05:00:00 2
2010-01-01 06:00:00 2
2010-01-01 07:00:00 2
2010-01-01 08:00:00 2
2010-01-01 09:00:00 2
2010-01-01 10:00:00 3
2010-01-01 11:00:00 4
2010-01-01 12:00:00 5
2010-01-01 13:00:00 6
2010-01-01 14:00:00 7
2010-01-01 15:00:00 8
2010-01-01 16:00:00 8
2010-01-01 17:00:00 9
2010-01-01 18:00:00 10
2010-01-01 19:00:00 11
2010-01-01 20:00:00 12
2010-01-01 21:00:00 12
2010-01-01 22:00:00 12
2010-01-01 23:00:00 12
2010-01-02 00:00:00 12
Freq: H, Name: value, dtype: int32
When you group by these group ids, using transform with the 'size' aggregation to return an object the same shape as the original DataFrame, you get:
s.groupby(s).transform('size')
2010-01-01 00:00:00 1
2010-01-01 01:00:00 9
2010-01-01 02:00:00 9
2010-01-01 03:00:00 9
2010-01-01 04:00:00 9
2010-01-01 05:00:00 9
2010-01-01 06:00:00 9
2010-01-01 07:00:00 9
2010-01-01 08:00:00 9
2010-01-01 09:00:00 9
2010-01-01 10:00:00 1
2010-01-01 11:00:00 1
2010-01-01 12:00:00 1
2010-01-01 13:00:00 1
2010-01-01 14:00:00 1
2010-01-01 15:00:00 2
2010-01-01 16:00:00 2
2010-01-01 17:00:00 1
2010-01-01 18:00:00 1
2010-01-01 19:00:00 1
2010-01-01 20:00:00 5
2010-01-01 21:00:00 5
2010-01-01 22:00:00 5
2010-01-01 23:00:00 5
2010-01-02 00:00:00 5
Freq: H, Name: value, dtype: int64
From here, it's a simple Series.ge (>=) comparison with your value n.
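Spelling that last step out with the s defined above:
s.groupby(s).transform('size').ge(n)
which reproduces the boolean series shown at the top of this answer.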