I've noticed the following peculiarity in Haskell:
data Tree a = Empty | Branch a (Tree a) (Tree a) deriving (Show, Eq)
leaf :: a -> Tree a
leaf x = Branch x Empty Empty

findNumL :: Tree a -> Integer
findNumL Empty = 0
findNumL (Branch x Empty Empty) = 1
findNumL (Branch x left right) = findNumL left + findNumL right
This code runs perfectly fine and returns the number of leaves in a binary tree. However, if one tries to pattern match on leaf x instead of Branch x Empty Empty, the pattern is rejected by the compiler, which makes the definition of leaf much less useful than it could be. Is there a way to circumvent this issue and use leaf in pattern matching?
For example, this call compiles and works:

findNumL $ Branch "root" (leaf "left") (leaf "right")

and so does this:

findNumL $ leaf "root"

But if I write

findNumL (leaf x) = 1

in the second line of the function definition, the code will not compile.
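For reference, here is a self-contained version of the snippet that reproduces the situation (tested against GHC; the rejected clause is left as a comment, since leaf is an ordinary function rather than a data constructor and so cannot appear in a pattern):

```haskell
data Tree a = Empty | Branch a (Tree a) (Tree a) deriving (Show, Eq)

leaf :: a -> Tree a
leaf x = Branch x Empty Empty

findNumL :: Tree a -> Integer
findNumL Empty = 0
findNumL (Branch _ Empty Empty) = 1
-- findNumL (leaf x) = 1  -- does not compile: leaf is a function, not a constructor
findNumL (Branch _ left right) = findNumL left + findNumL right

main :: IO ()
main = do
  print $ findNumL (Branch "root" (leaf "left") (leaf "right"))  -- 2
  print $ findNumL (leaf "root")                                 -- 1
```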