 

Accuracy of python decimals

Tags:

python-3.x

How do I make decimals in Python more accurate, so that I can calculate something as small as

0.0000000000001 * 0.0000000000000000001 = ?

I need to add decimals like 0.0000000145 and 0.00000000000000012314, and also multiply them, and get the exact result. Is there code I need for this, or is there a module? Thanks in advance.

I need something that is more accurate than decimal.Decimal.

asked Sep 05 '25 by wkpk11235

1 Answer

Not sure why you're getting downvoted.

decimal.Decimal represents numbers using floating point in base 10. Because it is implemented in software rather than directly in hardware, you can control the level of precision (which defaults to 28 places):

>>> from decimal import *
>>> getcontext().prec = 6
>>> Decimal(1) / Decimal(7)
Decimal('0.142857')
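
For the specific values in the question, Decimal arithmetic is exact, since the operands are base-10 literals whose digits fit well within the context precision. A minimal sketch continuing the session above, with the precision put back to its default and the literals taken from the question:

>>> getcontext().prec = 28   # restore the default precision
>>> Decimal('0.0000000000001') * Decimal('0.0000000000000000001')
Decimal('1E-32')
>>> Decimal('0.0000000145') + Decimal('0.00000000000000012314')
Decimal('1.450000012314E-8')

Note that the values are passed as strings; constructing a Decimal from a float literal would bake the float's binary rounding error into the result before Decimal ever sees it.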

However, you may prefer to use the mpmath module instead, which supports arbitrary precision real and complex floating point calculations:

>>> from mpmath import mp
>>> mp.dps = 50
>>> # the Gaussian integral evaluates to sqrt(pi), so its square is pi
>>> print(mp.quad(lambda x: mp.exp(-x**2), [-mp.inf, mp.inf]) ** 2)
3.1415926535897932384626433832795028841971693993751
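
mpmath handles the plain arithmetic from the question as well, though its mpf type is a binary float under the hood, so pass decimal literals as strings and expect results that are correct to the working precision rather than exact decimal representations. A minimal sketch under those assumptions, with output shown at 50 significant digits:

>>> from mpmath import mp, mpf
>>> mp.dps = 50
>>> print(mpf('0.0000000000001') * mpf('0.0000000000000000001'))
1.0e-32
>>> print(mpf('0.0000000145') + mpf('0.00000000000000012314'))
0.00000001450000012314

If you only need exact decimal addition and multiplication, Decimal is the simpler fit; mpmath earns its keep when you also need high-precision transcendental functions, as in the integral above.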
answered Sep 08 '25 by Uri Granta