In many web articles, functional programming is presented as avoiding any kind of variable reassignment, promoting only "final" variables, if only for better readability.
Most of them take as their example a trivial loop incrementing a counter variable (the famous i++ or x = x + 1).
Here is an article by Uncle Bob illustrating it: FP Episode 1
These articles argue that relying on mutable variables very often leads to side effects, breaks what we call "referential transparency", and therefore makes it harder to build a program that runs on multiple threads or, better, multiple processors.
My question is: as we all know, i++ usually involves a thread-LOCAL variable, so no issue can occur even under concurrent processing.
Why choose a loop over a local variable as an example of the drawbacks of assignment, and jump straight to the conclusion that concurrent programming is risky? To me these two things are strictly unrelated.
Why not, to be clearer, choose the reassignment of a global variable (or an object field), which is clearly the enemy of concurrent programming, without overusing all the lock boilerplate that Java requires?
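To make that contrast concrete, here is a minimal sketch (the names `SharedCounter`, `total`, and `unsafeIncrementAll` are my own, purely illustrative): the thread-local counter `i` is perfectly safe, while the unsynchronized read-modify-write on the shared field `total` is the actual race.

```scala
object SharedCounter {
  var total = 0 // shared mutable state: increments from many threads can be lost

  def unsafeIncrementAll(times: Int): Unit = {
    val threads = (1 to 4).map { _ =>
      new Thread(() => {
        var i = 0          // thread-local counter: no concurrency issue here
        while (i < times) {
          total += 1       // racy read-modify-write on shared state
          i += 1
        }
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
  }
}
```

Run with four threads each doing 100 000 increments, `total` will typically end up well below 400 000, even though every local `i` behaved flawlessly.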
I really think this loop example isn't the best illustration for conveying the benefits of functional programming to imperative programmers.
Furthermore, it confuses novice functional programmers, since Scala, for instance, uses the while-loop pattern a lot internally, as in the List.scala class:
override def take(n: Int): List[A] = {
  val b = new ListBuffer[A]
  var i = 0
  var these = this
  while (!these.isEmpty && i < n) {
    i += 1 // reassignment here
    b += these.head
    these = these.tail
  }
  if (these.isEmpty) this
  else b.toList
}
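For comparison, the same behaviour can be written without any reassignment at all. This is only a sketch of a hypothetical reassignment-free equivalent (`takeRec` and its inner `loop` are names I made up, not the stdlib implementation), using a tail-recursive accumulator instead of `var`:

```scala
import scala.annotation.tailrec

// Reassignment-free take: each "mutation" of the while loop becomes
// a new set of arguments to the recursive call.
def takeRec[A](xs: List[A], n: Int): List[A] = {
  @tailrec
  def loop(rest: List[A], k: Int, acc: List[A]): List[A] =
    if (rest.isEmpty || k <= 0) acc.reverse // done: restore original order
    else loop(rest.tail, k - 1, rest.head :: acc)
  loop(xs, n, Nil)
}
```

The compiler turns the `@tailrec` helper into the very same kind of loop, so the stdlib's imperative version is an optimization choice, not a contradiction of the principle.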
Had this take method been written with a map or a fold, we could have substituted a parallel map or fold for it.
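As a sketch of that substitution point: because fold with an associative operator reassigns nothing, the sequential version below could be swapped for a parallel one (`.par.fold`, built into Scala 2.12, or via the scala-parallel-collections module on 2.13+) without changing the result.

```scala
val xs = (1 to 1000).toList

// Sequential fold: no variable is reassigned, and + is associative,
// so the computation can be split across threads freely.
val total = xs.fold(0)(_ + _)

// Hypothetical parallel substitution (requires parallel collections):
// val totalPar = xs.par.fold(0)(_ + _)
```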