Clojure is not one of the handful of "big" mainstream languages. This means that people are sometimes surprised that we are all in on Clojure. Why go against the grain?
That’s the tricky part. The way I snuck it in at the first job where I got to use it professionally was by building some utilities for internal use first: monitoring dashboards and the like. Then a couple of other devs got interested and started learning. On the next project, we pitched using it to write a prototype quickly, and it worked so well that we just turned the prototype into an actual app we put in production. After I left that job, I kept finding places that already used Clojure, so I didn’t have to go through that process again. :)
If you’re working as a solo dev, it probably won’t be too hard to start using it. The main question you’ll get is how hard it will be to hire somebody new to work with it if you leave. If you work on a team, though, there are a few more things to consider. Having introduced a new piece of technology, you’re gonna be on the hook for helping people get ramped up and comfortable with the language. I’ve also found that some people have trouble switching from an imperative mindset to a functional one. I’ve worked with people who just couldn’t make the switch and became really frustrated as a result. So, whether it’s a good idea to introduce Clojure depends on the team dynamics and what people are comfortable with.
Personally, I do think it’s useful to learn this paradigm because it gives you a different perspective on how to approach problems, and a lot of that translates to working with imperative languages as well. But I wouldn’t push it if people seem averse to it either.
Already found out my team lead is not a fan of Clojure. :( But yeah, if I could sneak it in by way of utils, that would be pretty neat.
I do find it’s one of those languages that people either love or hate. Amusingly, my old job used to hire interns regularly, and what we found was that second- and third-year students could pick it up really fast. We could get them up and coding something useful in a week or two. But fourth-year students had a lot more trouble. It turns out that the difficult part wasn’t learning Clojure, but unlearning the patterns people had internalized using an imperative language.
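To give a flavor of the mindset shift being described, here’s a small illustrative sketch (my own example, not from the original thread): an imperative habit is to build results by mutating an accumulator in a loop, whereas idiomatic Clojure composes transformations over immutable data.

```clojure
;; Illustrative example: sum the squares of the even numbers in a collection.
;; The imperative instinct is a loop with a mutable accumulator;
;; in Clojure you thread the data through a pipeline of pure functions instead.
(defn sum-even-squares [xs]
  (->> xs
       (filter even?)      ; keep only even numbers
       (map #(* % %))      ; square each one
       (reduce + 0)))      ; fold into a total

(sum-even-squares [1 2 3 4 5]) ; => 20
```

Nothing here is Clojure-specific magic; it’s the habit of reaching for `filter`/`map`/`reduce` instead of a mutable loop that takes unlearning.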
True, a lot of us, myself included, have been working with imperative languages since the beginning.
I think it’s an artifact of how computing developed. Back when personal computers started showing up, they were very slow and resources were extremely tight. C basically appeared as a form of portable assembly that let you write code that could be translated to a specific instruction set instead of writing assembly for each chip individually. A whole generation of developers learned to code in this style, and then went on to work in industry and teach at universities. Meanwhile, very few people got exposure to the functional family of languages, which weren’t seen as practical because they needed things like garbage collection and used higher-level abstractions that were considered too expensive. I recall how even when Java started showing up in the 90s, people balked at the idea of using GC.
Today, we live in a completely different world, where squeezing raw performance out of the machine isn’t the top concern for most programs. What we struggle with is maintaining large code bases, working across teams, making code extensible, and so on. And I think this is where the functional approach really shines.