
> Julia was ahead of the game with automatic differentiation which took a few years for Python to get support.

In what way is this true? Looks like Julia didn't exist until 2012. If I remember correctly, Theano was the big AD thing in Python at that point.



Theano existed, but it didn't use autodiff at the time; most loss functions had preprogrammed derivative functions. Python first got native autodiff in June of 2013 with the ad package, though it previously had bindings to Fortran and C++ libraries that supported autodiff in different use cases. Julia first got native autodiff in April of 2013 with ForwardDiff.jl.


It seems overly nitpicky to differentiate (no pun intended) so much between forward-mode AD with DiffRules and Theano's specific flavor of symbolic differentiation, but I'm no expert there.
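For anyone unfamiliar with the distinction being drawn: forward-mode AD propagates derivatives numerically alongside values as the program runs, while symbolic differentiation manipulates an expression graph ahead of time. Here's a minimal, purely illustrative sketch of the forward-mode idea using dual numbers (this is my own toy example, not the actual implementation of ForwardDiff.jl or the ad package):

```python
# Toy forward-mode AD via dual numbers. Each Dual carries a value
# and its derivative through ordinary arithmetic, so the derivative
# is computed alongside the function evaluation itself -- no
# symbolic expression graph is ever built.

class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative w.r.t. the seeded input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).der


# f(x) = 3x^2 + 2x, so f'(x) = 6x + 2 and f'(4) = 26
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # -> 26.0
```

Theano, by contrast, differentiated a symbolic graph of the computation and then compiled the resulting derivative expression, which is a meaningfully different mechanism even if both produce exact (non-numerical-approximation) derivatives.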



