
Not quite - to be able to use factories and dependency injection, you have to use interfaces. And not being able to use functions as parameters (or write them on the fly) is annoying in all kinds of software development, not just enterprise software.

Also, all the nice libraries have been updated to use the new language features like generics and annotations.



You don't have to use interfaces in order to do dependency injection; you just probably should, depending on the task. I wrote something last week that injected dependencies directly into a class. Interfaces are actually one of the best ideas in Java, IMO, if you're using them appropriately.
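To make the point concrete, here's a minimal sketch of constructor injection against a concrete class, no interface involved. The class names (SmtpMailer, SignupService) are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// A concrete dependency -- no interface extracted for it.
class SmtpMailer {
    final List<String> sent = new ArrayList<>();
    void send(String to, String body) {
        sent.add(to + ": " + body);
    }
}

// The dependency is injected through the constructor, so callers
// (or a DI container) decide which instance to supply.
class SignupService {
    private final SmtpMailer mailer;

    SignupService(SmtpMailer mailer) {
        this.mailer = mailer;
    }

    void register(String email) {
        mailer.send(email, "Welcome!");
    }
}
```

Extracting a Mailer interface only becomes worthwhile when you need to swap implementations, e.g. a fake mailer in tests.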

Generics and annotations are pretty useful. What's the matter with them? Have you spent time developing with them? They're certainly better than having to cast all over the place.
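The "cast all over the place" point can be shown side by side; this sketch contrasts the old raw-type style with a generic list (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

class GenericsDemo {
    // Pre-generics style: a raw List holds Objects, so every read
    // needs a cast, and a wrong element type fails only at runtime.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static String rawStyle() {
        List raw = new ArrayList();
        raw.add("hello");
        return (String) raw.get(0);
    }

    // With generics the element type is checked at compile time
    // and no cast is needed.
    static String genericStyle() {
        List<String> typed = new ArrayList<>();
        typed.add("hello");
        // typed.add(42);  // would not compile
        return typed.get(0);
    }
}
```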

RE: functions as parameters... yeah, that's kinda annoying. You can always do new Runnable() { public void run() { doSomeStuff(); } }, but that's ridiculous. So you have a point there. Aside from that, though, the language is suitable for most tasks, and requiring a little extra syntax when you want to pass functions around isn't the end of the world.
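The anonymous-Runnable workaround mentioned above, spelled out as a runnable sketch (the Repeater class is a made-up example of a method that "takes a function"):

```java
class Repeater {
    // Pre-closures Java: accept "a function" as an object with one method.
    static void times(int n, Runnable body) {
        for (int i = 0; i < n; i++) {
            body.run();
        }
    }
}
```

A caller then wraps its behavior in an anonymous class; note that any captured local variables must be final, hence tricks like a one-element array:

```java
final int[] count = {0};
Repeater.times(3, new Runnable() {
    public void run() { count[0]++; }  // run() must be public
});
```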

Anyway, I'm sure I'm arguing against the trendy masses here, and I have a Clojure book too, just trying to set up the opportunity to use it for something real... but there's a reason why most "real" back-end stuff that has to be performant, stable, and reliable winds up getting written in Java these days. That includes Hadoop, Lucene, and half of the NoSQL key-value stores that you're excited about :)


A lot of the "real" performant/stable/reliable backends seem to be written in Scala and Erlang these days. People use whatever languages are production-ready state-of-the-art at the time they begin their systems. C++ was state-of-the-art when Google started in 1998; Java was state-of-the-art when Lucene and Hadoop started; now people are building things in Scala and Erlang.

Anyway, be careful when using "big and accepted" projects to indicate ideal platforms. It takes 5-10 years for a project to become big and accepted. In the process, its technology platform has probably become obsolete, but continues on because of inertia. Your goal should be to shoot for what your role models would do now, with the benefit of 10 years of additional technical development, and not where they were 10 years ago.


You might want to double-check your dates on Erlang. The first version was developed by Joe Armstrong in 1986.[1] It supports hot swapping, so code can be changed without stopping a system.[2] Erlang was originally a proprietary language within Ericsson, but was released as open source in 1998. http://en.wikipedia.org/wiki/Erlang_%28programming_language%...


I know how old Erlang is. My point is that the computing environment has changed in the meantime, such that it's now a viable alternative for servers.


OK, my point was that I recall some Erlang hype back in the '90s. What specifically, in your opinion, made that 20-plus-year-old language "production ready"?


The list I had in my original comment (which news.YC ate):

1. Software projects have increasingly moved from the desktop (where having a complicated runtime is a non-starter) to the server (where you can control everything).

2. Failover and hot swap capabilities are increasingly demanded for consumer apps.

3. It's now viable, from an efficiency standpoint, to have a language that copies data structures around instead of mutating them.

4. There's been a groundswell of interest in new programming languages, probably mostly because of #1. This has opened the door for old languages that previously lacked mainstream acceptance. (See also Python for an example.)


The computer industry focuses on three things: internal corporate systems, consumer applications/COTS, and embedded systems. Erlang was invented over 20 years ago because it was useful at the time for a wide range of applications. I happened to work on a business system written around that time that custom-rolled a database and a GUI, with hot-swap capabilities, running over AppleTalk. It was recreated, but that had more to do with OS X showing up than with anything to be gained from the transition.

I would suggest that you have focused on consumer apps; you may want to look at the rest of the industry, because much of the new hotness is really old news. There seems to be a trend where people pull out an old idea, give it a new name, and start hyping it, but genuinely new ideas are uncommon in computing.



