But in cases of needless virtual calls (doesn't Java default to virtual for some strange reason?) it may be a quick and easy win.
Additionally, it's not always so easy to drop to a low-level language. If your architecture is enormous and complicated, it might be totally infeasible to change languages for the hot parts.
In Java, all methods are virtual. You can often achieve a similar effect to non-virtual methods by declaring them final to prevent them from being overridden in subclasses, but the same rules about which method is called still apply. The reason is to simplify the language (in comparison to C++): the rules about which method is called are much simpler and easier to remember.
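A minimal sketch of what that looks like (class and method names here are made up for illustration):

```java
class Base {
    // final prevents subclasses from overriding this method.
    // Dispatch still follows the usual virtual-call rules, but
    // there is only one possible target.
    final String greet() {
        return "hello from Base";
    }
}

class Derived extends Base {
    // Uncommenting this would be a compile error:
    // String greet() { return "hello from Derived"; }
}
```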
Not having to think about the question "should I make this method virtual?" makes the Java language simpler, yes.
In the vast majority of cases where a virtual function call is actually monomorphic or bimorphic at runtime, the JVM JIT can observe that and potentially inline the method (with an if statement in the bimorphic case). It puts guards around the inlined method and deoptimizes in the event that a newly loaded class renders the optimization incorrect.
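For illustration, here is the kind of call site the JIT would typically profile as bimorphic (the `Shape` hierarchy is a made-up example, not anything from the standard library):

```java
interface Shape {
    double area();
}

class Square implements Shape {
    final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

class Circle implements Shape {
    final double radius;
    Circle(double radius) { this.radius = radius; }
    public double area() { return Math.PI * radius * radius; }
}

class Areas {
    // If only two receiver types ever reach this call site, the JIT
    // can inline both bodies behind a type check (roughly:
    // if (s instanceof Square) { ... } else { ... }) and place a
    // deoptimization guard in case a third implementation is loaded.
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) {
            sum += s.area();  // virtual call, bimorphic at runtime
        }
        return sum;
    }
}
```

Nothing in the source marks this call site specially; the optimization comes entirely from runtime profiling.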
It just flips the question to "should I make this method final?", so at best it's a wash. In the majority of cases a method shouldn't be virtual, so users need to mark it final to accurately represent their design.
Well, the answer to "should I make x final" is always "yes" in Java, so you don't really have to think too hard about that either. :-) `final` really should have been the default setting for all methods, classes, and local variables, but unfortunately inheritance was still in vogue when Java was created, and the benefits of immutable values weren't as well understood or appreciated either (or maybe they just figured the C & C++ programmers they were trying to win over would hate it).