I'm trying to remember why Ruby fell out of favor ~10 years ago. I don't think it was all down to performance because it was mostly on par with Python. I seem to recall that it came down to community practices that were a little too freewheeling - widespread monkey patching, etc. and these had become reflected in Ruby's flagship product (Rails). (I used to do a lot of Ruby programming in the 2001-2010 timeframe and like the language, but it's mostly been Python since then)
Ruby didn't "fall out of favor" 10 years ago; usage of Ruby and Rails moved into the enterprise space. Instead of startups using Rails, it was larger companies trying to circumvent the bureaucracy of their existing toolchains. I was a Ruby developer that whole time, and only stopped writing it at my day job in 2020.
Others in this comment thread have pointed out JavaScript, and I think that's a pretty good bet on why Ruby isn't really used by newer companies anymore. If you're trying to build an application for the web, it makes a lot more sense to work with _one_ language rather than two. Additionally, because JavaScript is always going to have a good number of developers in the hiring pool due to its platform exclusivity over the web frontend, it's much easier to build out your engineering team than it is when your code is written in Ruby. Ryan Dahl described JS as "the best dynamic language in the world". I'm not entirely sure I agree with that, but I do know where he's coming from. JavaScript is much easier and more ergonomic to use these days, and some of the most innovative technologies for web application development are coming out of that space.
I love Rails, but given the way things are going, I'm not sure I'll ever build a project with it again. We just don't need it anymore. That's not to say it doesn't have a place, and there will definitely always be jobs in Rails since it's being heavily used by the enterprise world, just that I don't think I'll be all that interested in it again now that I can use JS for everything.
Sure, but Ruby could've been a player in data science as well. But for some reason Python won handily in that space. Was it all down to numpy, scipy, etc?
Whenever this subject comes up, people seem to forget how poorly Ruby plays with Windows. WSL helps a lot and gives you a practical experience, but it'll never be a first-class citizen on Windows like Python is.
Maybe, but most data science jobs I see (and have been around) seem to be using Linux... with a few on MacOS. Lack of Windows support doesn't seem like a problem in the realm of data science.
Additionally, I think there are a lot of non-programmers, non-engineers working in data analysis or data "science" and Python syntax is much more approachable for a beginner.
Lots of starter programming courses, whether university or online, are in Python. Again, if you're not actually an engineer or developer of sorts, you're not likely to spend much time and effort moving on from the language you already know and that everyone else in your field is using.
The answer to that is yes. Python was already used a lot in science, and when data science started to blow up it already had numpy, scipy, scikit-learn, and pandas. Those libraries are also really fast because they're wrapping highly optimized Cython, C, C++, and Fortran. I don't think Ruby had anything like that at the time.
Any language with an FFI can have similar bindings to the same libraries Python uses (see Java, .NET with C#/F#, Julia, Swift, etc.), yet it doesn't seem to have happened for Ruby.
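To be fair, the raw mechanism has been sitting in Ruby's standard library for years. Here's a minimal sketch of calling a C library through Fiddle, the stdlib FFI, which is the same kind of bridge that numpy-style bindings are built on. The soname `libm.so.6` is a glibc/Linux assumption; on macOS it would be `libm.dylib`:

```ruby
require 'fiddle'

# Open the C math library. 'libm.so.6' is a glibc/Linux assumption;
# the library name differs on other platforms.
libm = Fiddle.dlopen('libm.so.6')

# Bind the C function: double cos(double)
cos = Fiddle::Function.new(
  libm['cos'],             # symbol address looked up in libm
  [Fiddle::TYPE_DOUBLE],   # argument types
  Fiddle::TYPE_DOUBLE      # return type
)

puts cos.call(0.0)  # => 1.0
```

So the gap wasn't capability, it was that nobody built (and maintained) the equivalent of the numpy/scipy stack on top of this.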
My memory matches yours. Thanks to things like monkey patching you had the problem where changing the order of imports would break other code. And if a dependency of yours added a new import, your code would randomly break for no apparent reason.
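For anyone who didn't live through it, here's a minimal sketch of the order-dependence problem, with two hypothetical libraries (names are made up for illustration) that both reopen `String` and define the same method differently. Whichever loads last silently wins, process-wide:

```ruby
# Hypothetical library A: monkey-patches String with a titlecase method.
module LibA
  def self.install
    String.class_eval do
      def titlecase
        split.map(&:capitalize).join(' ')
      end
    end
  end
end

# Hypothetical library B: reopens String and defines the SAME method
# with different behavior.
module LibB
  def self.install
    String.class_eval do
      def titlecase
        upcase
      end
    end
  end
end

LibA.install
LibB.install  # swap these two lines and every caller's behavior changes

puts "hello world".titlecase  # => "HELLO WORLD" with this load order
```

In a real app the `install` calls are hidden inside `require`s, so a dependency quietly adding one new `require` was enough to flip which definition won.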
As a result I saw more than one organization say, "No new projects in Ruby."
I'm sure that there were many different experiences in many different organizations. And I have no idea how widespread my impression is. But it certainly discouraged some people from using Ruby.
I see it as a pendulum. In the old times, we used to write web applications in CGI, where there was no fixed structure whatsoever; the application programmer was responsible for creating the entire web stack with their own bare hands. Then the pendulum started swinging towards more structured frameworks and reached a peak with J2EE, where the application programmer only had to write a tiny piece of code (servlet) that went into a massive framework (servlet container), in a language that had a huge standard library (Java), following clear industry standards (e.g. JavaBeans, design patterns). By the time Rails appeared, the pendulum had started swinging back towards less structure and less formalism. Rails was still a framework, with a strong set of conventions, but it was quite simplified compared to J2EE. The pendulum then continued and people switched to Node.js, which was much more barebones and flexible than Rails. Right now, the pendulum seems to be reaching the opposite extreme with Go, where not only is there no framework, but there also isn't a virtual machine (everything is compiled to native code), the language itself is almost as simple as C, and there is barely any standard library.
Meanwhile, those of us in the JVM/CLR ecosystems watched Ruby come, influence some of the JEE/Spring/MVC designs and Groovy, come and go as IronRuby, influence CoffeeScript, and eventually fade away. We still keep using the JVM/CLR ecosystems, and the savvy ones will even know how to AOT-compile our applications if needed.