The promises of functional programming.
This week's entry is about an article by Konrad Hinsen called "The Promises of Functional Programming". I really liked this particular article because I'm currently working on parallel programming, and I found what Hinsen has to say about that interesting; more on that later.
Hinsen also points out that functional programming works just like math: a function, given the same input, will always produce the same output. I'm finding this style of programming really cool (maybe because it's closer to my field, which is full of applied maths). At first it felt weird programming without all the mutable variables, but once I got the hang of it, I think having functions without side effects is much more practical.
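To make that concrete, here's a little C++ sketch I wrote myself (it's not from the article): the first function behaves like a mathematical function, while the second one doesn't, because it reads and changes state that lives outside of it.

#include <iostream>

// Pure: the result depends only on the argument; nothing else is read or written.
double fahrenheit_to_celsius(double f) {
    return (f - 32.0) * 5.0 / 9.0;
}

// Impure: the result also depends on (and changes) hidden state outside the function.
double total = 0.0;
double add_to_total(double x) {
    total += x;      // side effect: mutates a global
    return total;    // the same input can give different outputs over time
}

int main() {
    std::cout << fahrenheit_to_celsius(212.0) << "\n"; // always 100
    std::cout << add_to_total(1.0) << "\n";            // 1
    std::cout << add_to_total(1.0) << "\n";            // 2: same input, different output
}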
What I still can't get my head around is what everyone says about functions being data. I don't yet see why that matters, but I hope that in time I'll understand what it's all about.
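My current guess is that it means you can pass functions around and store them like any other value. Here's a small C++ sketch of that idea (my own attempt at it, not Hinsen's example):

#include <iostream>
#include <vector>
#include <functional>

// A function that takes another function as an argument: because the
// operation is just a value, the same loop can do many different jobs.
std::vector<int> map_over(const std::vector<int>& xs,
                          const std::function<int(int)>& f) {
    std::vector<int> out;
    for (int x : xs) out.push_back(f(x));
    return out;
}

int main() {
    std::vector<int> xs = {1, 2, 3};
    auto doubled = map_over(xs, [](int x) { return 2 * x; });
    auto squared = map_over(xs, [](int x) { return x * x; });
    std::cout << doubled[2] << " " << squared[2] << "\n"; // prints "6 9"
}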
Now, about parallel programming. When I first got into it (using OpenMP on Ubuntu), I learned that programmers have a hard time parallelizing certain programs, and that it's essentially impossible to have a program that is 100% parallelized: some part of it always has to run serially. And the hardest thing of all is to parallelize a program automatically while still keeping it more efficient than the serial version.
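The usual way to put a number on that limit is Amdahl's law: if a fraction s of the program has to stay serial, the speedup on p cores can never exceed 1 / (s + (1 - s) / p). Here's a tiny sketch of that arithmetic with made-up numbers (my own example; Hinsen doesn't go into this):

#include <iostream>

// Amdahl's law: best possible speedup on p cores when a fraction s of the work stays serial.
double amdahl_speedup(double s, int p) {
    return 1.0 / (s + (1.0 - s) / p);
}

int main() {
    // Even if only 10% of the program is serial, 8 cores give at most ~4.7x,
    // and no number of cores can ever push it past 10x.
    std::cout << amdahl_speedup(0.10, 8) << "\n";    // ~4.71
    std::cout << amdahl_speedup(0.10, 1000) << "\n"; // ~9.91
}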
One of the worst problems I've had is dealing with shared variables. Sometimes one thread modifies a variable while the others are still operating on it, so every thread ends up computing with a corrupted value and the whole program becomes useless. That's why you have to be extra careful about which variables you make private to each thread.
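Here's a minimal OpenMP sketch of the kind of bug I mean, a toy example of my own (compile with g++ -fopenmp): if the accumulator stays shared, the threads race on it and the total usually comes out wrong; asking OpenMP for a reduction, which gives every thread its own private copy, fixes it.

#include <cstdio>

int main() {
    const int n = 1000000;
    long long sum = 0;

    // BROKEN: 'sum' is shared, so threads step on each other's updates
    // (a data race) and the final value is usually wrong:
    //   #pragma omp parallel for
    //   for (int i = 0; i < n; ++i) sum += i;

    // FIXED: each thread accumulates into its own private copy,
    // and the copies are combined at the end.
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < n; ++i) {
        sum += i;
    }

    std::printf("%lld\n", sum); // n*(n-1)/2 = 499999500000
}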
With pure functions you don't have to worry about any of this, precisely because they have no side effects. I think this could be the way to optimize parallel programming.
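One last sketch of my own to show why: when the body of a loop is a pure function and each iteration writes only to its own slot, the iterations can't interfere with each other, so the loop parallelizes safely without any private/shared bookkeeping.

#include <cmath>
#include <cstdio>
#include <vector>

// A pure function: no globals read or written, the result depends only on x.
double f(double x) {
    return std::sqrt(x) * std::sin(x);
}

int main() {
    const int n = 1000000;
    std::vector<double> in(n), out(n);
    for (int i = 0; i < n; ++i) in[i] = i * 0.001;

    // Because f has no side effects and each iteration writes only its own
    // slot, the iterations are independent: they can run in any order, on
    // any number of threads, and still give exactly the same answer.
    #pragma omp parallel for
    for (int i = 0; i < n; ++i) {
        out[i] = f(in[i]);
    }

    std::printf("%f\n", out[n - 1]);
}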