As for anecdotal evidence from consulting work: you're wrong to assume that companies always hire database experts or data scientists to write their DB queries, i.e. people who know how to keep the database from becoming the bottleneck.
In reality you'll find developers who claim "we can't use Ruby here because it's too slow!" while unaware that the reason their page takes seconds to load is hilariously inefficient database queries.
In my consulting experience, I'm shocked when inefficient queries aren't the root cause of poor performance; in fact, thinking about it, I don't think that has ever happened. I always look at the DB layer first, because it's virtually guaranteed that someone wrote a "SELECT * FROM MassiveTable" and added it to the common header code used by every page.
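The usual shape of it, in Rails terms (a hypothetical sketch; Widget and load_everything are made-up names):

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      # Runs on every single request, just so the header can render a list.
      before_action :load_everything

      private

      def load_everything
        # Pulls every row and every column into memory on each page load.
        @all_widgets = Widget.all.to_a
      end
    end

The fix is usually a count, a LIMIT, or a cache; the real problem is that it hides in shared code nobody ever profiles.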
Matches my experience too, both consulting and in-house, as the person who'd be the first to even consider looking at the database query logs or running "explain" on a query.
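For anyone who hasn't tried it: in ActiveRecord that's one method call away (sketch assumes a hypothetical User model):

    # Prints the database's EXPLAIN plan for whatever SQL the relation
    # would generate, without copying the query out by hand.
    puts User.where(active: true).order(:created_at).explain

Ten seconds of reading the plan (sequential scan on a huge table, missing index) often tells you more than an hour of profiling the app code.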
A reason for ORMs is that a lot of developers fear the database (EDIT: not the only reason, to be clear; I love and use ORMs). A result of ORMs is that a lot of developers think they can avoid understanding the database.
I'm pretty sure the highest-value-per-character code I've written so far was a 10-line monkey patch back in the Rails 3 era that would crash the app if you tried to run an ActiveRecord query without a limit, or with too high a limit.
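It looked roughly like this (a reconstruction from memory, not the original; this version assumes modern ActiveRecord, where Relation#exec_queries is the private method that actually loads records):

    module QueryLimitGuard
      MAX_LIMIT = 1_000  # arbitrary ceiling, tune per app

      # Intercept the point where a relation turns into real SQL.
      def exec_queries(&block)
        if limit_value.nil? || limit_value > MAX_LIMIT
          raise "Unbounded query on #{klass.name} (limit=#{limit_value.inspect})"
        end
        super
      end
    end

    ActiveRecord::Relation.prepend(QueryLimitGuard)

Crashing loudly in development and staging meant nobody ever shipped an accidental full-table load.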
Hehehe, that reminds me of a similar ActiveRecord monkey patch at an old job, whose name was used as a swear word everywhere except when that particular engineer was present.
+1 to this. I've led performance optimization on enough real-world problems to be conditioned to go straight to the database access patterns from the start...it's always there.
Modern languages, including Ruby, are all plenty fast computationally for the vast majority of business workloads that aren't Google scale. When things slow down...it's the database, or something structurally similar like an N+1 pattern of calls to external APIs.
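The canonical N+1 shape, with hypothetical Post/Comment models:

    # N+1: one query for the posts, then one more query per post.
    Post.limit(50).each do |post|
      puts post.comments.count  # fires a COUNT query for every post
    end

    # Two queries total, no matter how many posts come back.
    Post.limit(50).includes(:comments).each do |post|
      puts post.comments.size   # counts the preloaded records in memory
    end

Same idea when the "query" inside the loop is an HTTP call to another service; batching it is almost always the win.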