Problem: Given a year, return the century it is in. The first century spans from the year 1 up to and including the year 100, the second from the year 101 up to and including the year 200, and so on.
My code:
    def centuryFromYear(year):
        century = year / 100
        decimal = int(str(century)[-2:-1])
        integer = int(str(century)[:2])
        if decimal > 0:
            return integer + 1
        else:
            return integer

    print(centuryFromYear(2017))
This doesn't work in certain cases, for example when year = 2001 or year = 2000.
Could anyone suggest a simpler piece of code?
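(For anyone wondering why those particular inputs fail: the code slices the string form of a float, and the slice doesn't always land on a digit. A minimal reproduction, assuming Python 3's true division:)

```python
# What str(year / 100)[-2:-1] actually produces for the failing inputs.
year = 2001
century = year / 100             # 20.01 (a float)
print(str(century)[-2:-1])       # "0" -> decimal == 0, so 20 is returned instead of 21

year = 2000
century = year / 100             # 20.0
print(str(century)[-2:-1])       # "." -> int(".") raises ValueError
```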
You can use integer division, the // operator in Python 3:
    def centuryFromYear(year):
        return year // 100 + 1  # + 1 because 2017 is in the 21st century, and 1989 in the 20th

    print(centuryFromYear(2017))  # --> 21
Please note: this does not account for centuries BC, and it uses a cutoff date of Dec 31st xy99, whereas the century boundary is sometimes strictly defined as Dec 31st xy00.
more info here
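As a quick check of that cutoff behaviour (using the same function as above):

```python
def centuryFromYear(year):
    return year // 100 + 1

print(centuryFromYear(1999))  # --> 20
print(centuryFromYear(2000))  # --> 21 (strictly speaking, 2000 still belongs to the 20th century)
print(centuryFromYear(2001))  # --> 21
```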
If you want to set the cutoff at Dec 31st xy00, which is the stricter definition, you can do it like this:
    def centuryFromYear(year):
        return (year - 1) // 100 + 1  # + 1 because 2017 is in the 21st century, and 1989 in the 20th

    print(centuryFromYear(2017))  # --> 21
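Checking this strict version at the century boundaries (same function as above):

```python
def centuryFromYear(year):
    return (year - 1) // 100 + 1

print(centuryFromYear(100))   # --> 1  (the year 100 still belongs to the 1st century)
print(centuryFromYear(101))   # --> 2
print(centuryFromYear(2000))  # --> 20
print(centuryFromYear(2001))  # --> 21
```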
Here is a simple one-liner solution:

    def centuryFromYear(year):
        return (year + 99) // 100
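This is algebraically identical to (year - 1) // 100 + 1, since (year - 1) // 100 + 1 == (year - 1 + 100) // 100, so it uses the strict cutoff as well. A quick sanity check:

```python
def centuryFromYear(year):
    return (year + 99) // 100

print(centuryFromYear(1))     # --> 1
print(centuryFromYear(100))   # --> 1
print(centuryFromYear(101))   # --> 2
print(centuryFromYear(2017))  # --> 21
```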