
How to convert a DataFrame using map, reduce, apply or other functions in Python (in this example)?

I have a DataFrame like this:

df = pd.DataFrame({'id': [111, 222],
                   'CycleOfRepricingAnchorTime': ['27.04.2018', '09.06.2018'],
                   'CycleOfRepricing': ['3M', '5M']})
df['CycleOfRepricingAnchorTime'] = pd.to_datetime(df['CycleOfRepricingAnchorTime'])
df

I need to get a DataFrame like this:

The resulting DataFrame: the first column is id, the second is a date column generated at the frequency given by that id's CycleOfRepricing. The maximum date is 31.12.2019.

I have tried to solve this with apply, map, etc., but without success, since the result I get only contains objects:

df.apply(lambda x: pd.date_range(start=x.CycleOfRepricingAnchorTime,
                                 end=pd.to_datetime('31.12.2019'),
                                 freq=x.CycleOfRepricing), axis=1)
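
Printing the result makes the problem visible: each row of the apply output holds an entire DatetimeIndex, so the Series has object dtype instead of one flat datetime column (a minimal, self-contained reproduction of the setup above):

```python
import pandas as pd

df = pd.DataFrame({'id': [111, 222],
                   'CycleOfRepricingAnchorTime': ['27.04.2018', '09.06.2018'],
                   'CycleOfRepricing': ['3M', '5M']})
df['CycleOfRepricingAnchorTime'] = pd.to_datetime(df['CycleOfRepricingAnchorTime'])

# Each row's result is a whole DatetimeIndex, not a scalar date,
# so the resulting Series is object-dtyped rather than datetime64.
out = df.apply(lambda x: pd.date_range(start=x.CycleOfRepricingAnchorTime,
                                       end=pd.to_datetime('31.12.2019'),
                                       freq=x.CycleOfRepricing), axis=1)
print(out.dtype)          # object
print(type(out.iloc[0]))  # <class 'pandas.core.indexes.datetimes.DatetimeIndex'>
```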

I will be grateful for the help.

Roman asked Mar 15 '26 14:03
1 Answer

Update: to keep each period on the anchor date's day of the month, generate month starts ('MS') and add the anchor's day offset:

df.assign(ReportingTime=df.apply(
    lambda x: pd.date_range(start=x.CycleOfRepricingAnchorTime,
                            end=pd.to_datetime('31.12.2019'),
                            freq=x.CycleOfRepricing + 'S')
              + pd.Timedelta(days=x.CycleOfRepricingAnchorTime.day - 1),
    axis=1)).explode('ReportingTime').to_markdown()

Output:

|    |   id | CycleOfRepricingAnchorTime   | CycleOfRepricing   | ReportingTime       |
|---:|-----:|:-----------------------------|:-------------------|:--------------------|
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-05-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-08-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-11-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-02-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-05-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-08-27 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-11-27 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2018-10-06 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2019-03-06 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2019-08-06 00:00:00 |
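
The 'MS' trick works because a month-start frequency anchors every period on the 1st, and adding the anchor's `day - 1` as a Timedelta shifts each period back onto the anchor's day of the month. A standalone sketch, using the anchor date from the example above:

```python
import pandas as pd

anchor = pd.Timestamp('2018-04-27')

# '3MS' generates month *starts* every 3 months; adding (day - 1) days
# moves each generated date onto the anchor's day of the month (the 27th).
dates = (pd.date_range(start=anchor, end=pd.Timestamp('2019-12-31'), freq='3MS')
         + pd.Timedelta(days=anchor.day - 1))
print([d.strftime('%Y-%m-%d') for d in dates])
# ['2018-05-27', '2018-08-27', '2018-11-27', '2019-02-27',
#  '2019-05-27', '2019-08-27', '2019-11-27']
```

One caveat: for anchors late in the month, the fixed-day offset can roll into the next month (e.g. a day offset of 30 added onto a February month start), so it matches the anchor day exactly only for days that exist in every month.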

The original approach, for pandas 0.25.0+ (which introduced DataFrame.explode); note that a plain 'M' frequency is anchored to month ends, so the generated dates fall on the last day of each month:

df.assign(ReportingTime=df.apply(
    lambda x: pd.date_range(start=x.CycleOfRepricingAnchorTime,
                            end=pd.to_datetime('31.12.2019'),
                            freq=x.CycleOfRepricing),
    axis=1)).explode('ReportingTime')

Output:

|    |   id | CycleOfRepricingAnchorTime   | CycleOfRepricing   | ReportingTime       |
|---:|-----:|:-----------------------------|:-------------------|:--------------------|
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-04-30 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-07-31 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2018-10-31 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-01-31 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-04-30 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-07-31 00:00:00 |
|  0 |  111 | 2018-04-27 00:00:00          | 3M                 | 2019-10-31 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2018-09-30 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2019-02-28 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2019-07-31 00:00:00 |
|  1 |  222 | 2018-09-06 00:00:00          | 5M                 | 2019-12-31 00:00:00 |
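
For pandas older than 0.25.0 (no DataFrame.explode), a sketch of the same result built row by row with a list comprehension and pd.concat — the per-row helper frames here are my own construction, not part of the original answer:

```python
import pandas as pd

df = pd.DataFrame({'id': [111, 222],
                   'CycleOfRepricingAnchorTime': ['27.04.2018', '09.06.2018'],
                   'CycleOfRepricing': ['3M', '5M']})
df['CycleOfRepricingAnchorTime'] = pd.to_datetime(df['CycleOfRepricingAnchorTime'])

# Build one small frame per input row (its id paired with its full date
# range), then stack them back together -- no explode() required.
result = pd.concat(
    [pd.DataFrame({'id': row.id,
                   'ReportingTime': pd.date_range(start=row.CycleOfRepricingAnchorTime,
                                                  end=pd.to_datetime('31.12.2019'),
                                                  freq=row.CycleOfRepricing)})
     for row in df.itertuples(index=False)],
    ignore_index=True)
print(result)
```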
Scott Boston answered Mar 18 '26 03:03

