
Out of bounds nanosecond timestamp: 1-01-01 00:00:00

I import data from GitHub using the code below:

series = read_csv('shampoo-sales.csv', header=0, index_col=0, squeeze=True)

I would like to convert its index to a DatetimeIndex. I use

series.index = pd.to_datetime(series.index)

But Python gives me the following error:

Out of bounds nanosecond timestamp: 1-01-01 00:00:00

I have no idea how to fix this error. The full code is:

series = read_csv('shampoo-sales.csv', header=0, index_col=0, squeeze=True)

series.index = pd.to_datetime(series.index)
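
For anyone who wants to reproduce this without downloading the CSV, here is a minimal sketch; the index values '1-01' through '3-12' are an assumption about what the shampoo-sales Month column contains, inferred from the error message:

import pandas as pd

# Hypothetical stand-in for the CSV: the Month column is assumed to
# hold strings like '1-01' (year 1, month 01).
idx = ['%d-%02d' % (y, m) for y in (1, 2, 3) for m in range(1, 13)]
series = pd.Series(range(36), index=pd.Index(idx, name='Month'))

# This reproduces the error on pandas versions from the question's era:
# the format is guessed as YYYY-MM, and year 1 cannot be represented
# as a nanosecond timestamp.
pd.to_datetime(series.index)  # raises OutOfBoundsDatetime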


UPDATE: Thanks to EdChum for pointing out a way to convert the index to a DatetimeIndex. However, I encounter another problem now. Consider the following code.

X = series.rename("actual").to_frame() 
X = X.loc[~X.index.duplicated(keep='last')].asfreq('d', 'ffill') 

Now, when I run this with X built from series as above, it returns an error stating that the index must be monotonic increasing or decreasing.
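
For reference, that requirement can be checked directly; a minimal sketch, assuming X is the frame built above:

# asfreq (and reindexing in general) needs a sorted, duplicate-free index.
print(X.index.is_monotonic_increasing)  # False after a bad date parse
print(X.index.has_duplicates)           # duplicate timestamps also break it

# Sorting silences the message, but the real fix (see the answer below)
# is to parse the dates correctly in the first place.
X = X.sort_index()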

asked Oct 30 '25 by Idonknow

1 Answer

You need to pass a format string as a parameter to to_datetime:

In[20]:
series.index = pd.to_datetime(series.index, format='%d-%m')
series.index

Out[20]: 
DatetimeIndex(['1900-01-01', '1900-02-01', '1900-03-01', '1900-04-01',
               '1900-05-01', '1900-06-01', '1900-07-01', '1900-08-01',
               '1900-09-01', '1900-10-01', '1900-11-01', '1900-12-01',
               '1900-01-02', '1900-02-02', '1900-03-02', '1900-04-02',
               '1900-05-02', '1900-06-02', '1900-07-02', '1900-08-02',
               '1900-09-02', '1900-10-02', '1900-11-02', '1900-12-02',
               '1900-01-03', '1900-02-03', '1900-03-03', '1900-04-03',
               '1900-05-03', '1900-06-03', '1900-07-03', '1900-08-03',
               '1900-09-03', '1900-10-03', '1900-11-03', '1900-12-03'],
              dtype='datetime64[ns]', name='Month', freq=None)

By default, to_datetime tries to infer the format; here it assumes YYYY-MM-DD, so a string like 1-01 is read as year 1, month 1, which is out of bounds for a nanosecond-precision timestamp.
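
The bounds themselves are easy to inspect; nanosecond datetime64 only covers roughly 1677 to 2262:

import pandas as pd

# The representable window for datetime64[ns] values:
print(pd.Timestamp.min)  # 1677-09-21 00:12:43.145224193
print(pd.Timestamp.max)  # 2262-04-11 23:47:16.854775807

# If some values are genuinely expected to fall outside this window,
# errors='coerce' turns them into NaT instead of raising:
pd.to_datetime(['1-01-01'], errors='coerce')  # DatetimeIndex(['NaT'], ...)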

If you want a monotonically increasing index, which is what your data actually represents, you can just prepend the string '200' to the index and then convert:

In[24]:
series.index = '200' + series.index
series.index

Out[24]: 
Index(['2001-01', '2001-02', '2001-03', '2001-04', '2001-05', '2001-06',
       '2001-07', '2001-08', '2001-09', '2001-10', '2001-11', '2001-12',
       '2002-01', '2002-02', '2002-03', '2002-04', '2002-05', '2002-06',
       '2002-07', '2002-08', '2002-09', '2002-10', '2002-11', '2002-12',
       '2003-01', '2003-02', '2003-03', '2003-04', '2003-05', '2003-06',
       '2003-07', '2003-08', '2003-09', '2003-10', '2003-11', '2003-12'],
      dtype='object')

In[25]:
series.index = pd.to_datetime(series.index, format='%Y-%m')
series

Out[25]: 
2001-01-01    266.0
2001-02-01    145.9
2001-03-01    183.1
2001-04-01    119.3
2001-05-01    180.3
2001-06-01    168.5
2001-07-01    231.8
2001-08-01    224.5
2001-09-01    192.8
2001-10-01    122.9
2001-11-01    336.5
2001-12-01    185.9
2002-01-01    194.3
2002-02-01    149.5
2002-03-01    210.1
2002-04-01    273.3
2002-05-01    191.4
2002-06-01    287.0
2002-07-01    226.0
2002-08-01    303.6
2002-09-01    289.9
2002-10-01    421.6
2002-11-01    264.5
2002-12-01    342.3
2003-01-01    339.7
2003-02-01    440.4
2003-03-01    315.9
2003-04-01    439.3
2003-05-01    401.3
2003-06-01    437.4
2003-07-01    575.5
2003-08-01    407.6
2003-09-01    682.0
2003-10-01    475.3
2003-11-01    581.3
2003-12-01    646.9

Then your code will work:

In[28]:
X = series.rename("actual").to_frame() 
X = X.loc[~X.index.duplicated(keep='last')].asfreq('d', 'ffill')
X

Out[28]: 
            actual
2001-01-01   266.0
2001-01-02   266.0
2001-01-03   266.0
2001-01-04   266.0
2001-01-05   266.0
2001-01-06   266.0
2001-01-07   266.0
2001-01-08   266.0
2001-01-09   266.0
2001-01-10   266.0
2001-01-11   266.0
2001-01-12   266.0
2001-01-13   266.0
2001-01-14   266.0
2001-01-15   266.0
2001-01-16   266.0
2001-01-17   266.0
2001-01-18   266.0
2001-01-19   266.0
2001-01-20   266.0
2001-01-21   266.0
2001-01-22   266.0
2001-01-23   266.0
2001-01-24   266.0
2001-01-25   266.0
2001-01-26   266.0
2001-01-27   266.0
2001-01-28   266.0
2001-01-29   266.0
2001-01-30   266.0
           ...
2003-11-02   581.3
2003-11-03   581.3
2003-11-04   581.3
2003-11-05   581.3
2003-11-06   581.3
2003-11-07   581.3
2003-11-08   581.3
2003-11-09   581.3
2003-11-10   581.3
2003-11-11   581.3
2003-11-12   581.3
2003-11-13   581.3
2003-11-14   581.3
2003-11-15   581.3
2003-11-16   581.3
2003-11-17   581.3
2003-11-18   581.3
2003-11-19   581.3
2003-11-20   581.3
2003-11-21   581.3
2003-11-22   581.3
2003-11-23   581.3
2003-11-24   581.3
2003-11-25   581.3
2003-11-26   581.3
2003-11-27   581.3
2003-11-28   581.3
2003-11-29   581.3
2003-11-30   581.3
2003-12-01   646.9

[1065 rows x 1 columns]
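
As a side note (not part of the code above), once the index parses cleanly and has no duplicates, the same daily upsample can also be spelled with resample:

# Equivalent alternative to asfreq + ffill, assuming a clean monthly index:
X = series.rename('actual').to_frame().resample('D').ffill()
# Same 1065 daily rows from 2001-01-01 through 2003-12-01.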
answered Nov 01 '25 by EdChum

