In trying to learn Haskell, I have implemented a pi calculation in order to understand functions and recursion properly.
Using the Leibniz formula for pi, I came up with the following, which computes pi to within the given tolerance and also returns the number of recursive calls it took to get that value:
reverseSign :: (Fractional a, Ord a) => a -> a
reverseSign num = (if num > 0
                      then -1
                      else 1) * (abs num + 2)
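To see how reverseSign steps through the signed odd denominators of the Leibniz series, here is a quick check (the use of iterate and the main wrapper are mine, not from the original post):

```haskell
-- Quick check: reverseSign walks the signed odd denominators of the
-- Leibniz series (1, -3, 5, -7, ...), so 4/denom yields 4 - 4/3 + 4/5 - ...
reverseSign :: (Fractional a, Ord a) => a -> a
reverseSign num = (if num > 0 then -1 else 1) * (abs num + 2)

main :: IO ()
main = print (take 5 (iterate reverseSign (1 :: Double)))
-- prints [1.0,-3.0,5.0,-7.0,9.0]
```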
piCalc :: (Fractional a, Integral b, Ord a) => a -> (a, b)
piCalc tolerance = piCalc' 1 0.0 tolerance 0

piCalc' :: (Ord a, Fractional a, Integral b) => a -> a -> a -> b -> (a, b)
piCalc' denom prevPi tolerance count =
    if abs (newPi - prevPi) < tolerance
        then (newPi, count)
        else piCalc' (reverseSign denom) newPi tolerance (count + 1)
  where newPi = prevPi + (4 / denom)
So when I run this in GHCI, it seems to work as expected:
*Main> piCalc 0.001
(3.1420924036835256,2000)
But if I set my tolerance too fine, this happens:
*Main> piCalc 0.0000001
(3.1415927035898146,*** Exception: stack overflow
This seems wholly counter-intuitive to me; the actual calculation works fine, but merely trying to print how many recursive calls it took fails?
Why is this so?
The problem is lazy evaluation of count: since the recursion never inspects it, nothing forces the additions as they accumulate. When you finally print count, it won't have a value of 2000, it will have a value of ...+1)+1)+1)+1)+1) (I omitted the 2000 left-parentheses at the start :P). Only when you print it is that chain actually added up, and with a fine enough tolerance the deeply nested pile of (+1) thunks overflows the stack. You want count to be forced at each step so it stays something like a plain Int (constant in space). You'll get used to this, but it's definitely something that can bite you.
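One way to apply that advice is to force the accumulators with seq on every recursive call. This is a sketch, not the answerer's exact code; the names piCalcStrict and piCalcStrict' are mine:

```haskell
-- Sketch of a strict variant: newPi and count' are forced with `seq`
-- before each recursive call, so no chain of (+1) thunks builds up
-- and the count is always a fully evaluated number.
reverseSign :: (Fractional a, Ord a) => a -> a
reverseSign num = (if num > 0 then -1 else 1) * (abs num + 2)

piCalcStrict :: (Fractional a, Integral b, Ord a) => a -> (a, b)
piCalcStrict tolerance = piCalcStrict' 1 0.0 tolerance 0

piCalcStrict' :: (Ord a, Fractional a, Integral b) => a -> a -> a -> b -> (a, b)
piCalcStrict' denom prevPi tolerance count =
    let newPi  = prevPi + (4 / denom)
        count' = count + 1
    in if abs (newPi - prevPi) < tolerance
          then (newPi, count)
          else newPi `seq` count' `seq`
               piCalcStrict' (reverseSign denom) newPi tolerance count'

main :: IO ()
main = print (piCalcStrict 0.001 :: (Double, Int))
-- same result as the lazy version, but the count stays constant in space
```

An equivalent, more idiomatic spelling is to enable the BangPatterns extension and write the accumulator argument as !count, which forces it on every call for you.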