I have a Python script that at one point performs division. When I run the script normally from the command line, it treats the numbers as floats and performs the division correctly, i.e.
a = 2
b = 5
print(a/b)
>0.4
I have also written a bash script that, among other things, runs this same Python script. When I run this bash script, Python treats the numbers as integers, performs integer division, and returns zero. Why does this happen?
i.e.
a = 2
b = 5
print(a/b)
>0
EDIT:
Python script
#!/usr/bin/python3
import sys
a = 2
b = 5
print(sys.version)
print(a/b)
>3.5.2 (default, Nov 23 2017, 16:37:01)
[GCC 5.4.0 20160609]
0.4
Bash script
python stack.py
>2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609]
0
Add a shebang line to the top of your script:
#!/usr/bin/python3
That will give you a consistent version of Python, but only when the script is executed directly (e.g. chmod +x stack.py and then ./stack.py). Your bash script runs python stack.py, which bypasses the shebang and uses whatever python resolves to on the system; here that is Python 2.7, where / between two integers performs integer division, so 2/5 gives 0. Either run the script directly so the shebang selects Python 3, or change the bash script to call python3 stack.py explicitly.
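If the script also has to give the right result when some caller still invokes it with the Python 2 interpreter, one defensive option is to opt into Python 3 division semantics inside the script itself. This is a minimal sketch, not part of the original answer; the file name stack.py is taken from your EDIT:

#!/usr/bin/python3
# Make / perform true (float) division even under Python 2;
# under Python 3 this import is accepted and has no effect.
from __future__ import division
import sys
a = 2
b = 5
print(sys.version)  # shows which interpreter actually ran the script
print(a/b)          # 0.4 under both Python 2 and Python 3

With this in place, python stack.py and python3 stack.py both print 0.4, regardless of which interpreter the bash script picks up.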