There are two reasons creating software will never (in my lifetime) be considered an engineering discipline:
1) There are (practically) no consequences for bad software.
2) The rate of change is too high to introduce true software development standards.
Modern engineering best practice is "follow the standards". The standards were developed in blood -- people were injured or killed, so a standard was written to make sure it didn't happen again. In today's society, no software defects (except maybe in aircraft and medical devices) are considered severe enough for anyone to call for the creation and enforcement of standards. Even Teslas on "full self-driving" steering themselves into parked fire trucks and killing their occupants doesn't seem to be enough.
Engineers who design buildings and bridges also have an advantage not available to programmers: physics doesn't change, at least not at scales and rates that matter. When you have a stable foundation, it is far easier to develop engineering standards on top of it. Programmers have no such luxury. Computers have been around for less than 100 years, and the rate of change in architecture and capabilities is so high that we are constantly having to learn "new physics" every few years.
Even when we do standardize (e.g. the x86 ISA), there is always something bubbling in research labs or locked behind NDAs that is ready to overthrow that standard and render a generation of programmers obsolete -- so quickly that there is no realistic opportunity to convey a "software engineering culture" from one generation to the next.
I look forward to the day when the churn slows down enough that a true engineering culture can develop.
Imagine the situation we would be in if the Standards of Software Engineering (tm) had been laid down 20 years ago. Most of us would likely be chafing against guidelines that make our lives much worse for negative benefit.
In 20 years we'll have a much better idea of how to write good software under economic constraints. Many things we try to nail down today will only get in the way of future advancements.
My hope is that we're starting to get close, though. After all, "general purpose" languages seem to be converging on ML*-style features, as in the sketch below.
* - think Standard ML, not machine learning: static types, (limited) type inference, algebraic data types, pattern matching, no null, lambdas, etc.
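To make the convergence concrete, here is a minimal sketch of those features in one place. Rust is my choice of illustration (the comment above names no particular language); Swift, Kotlin, and TypeScript have grown close analogues of each construct, all of which the ML family has had since the 1980s.

    // Algebraic data type: a sum type with named variants.
    enum Shape {
        Circle { radius: f64 },
        Rect { width: f64, height: f64 },
    }

    // Pattern matching: the compiler checks that every variant is handled.
    fn area(s: &Shape) -> f64 {
        match s {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { width, height } => width * height,
        }
    }

    // "No null": absence is an explicit Option, not a null pointer.
    // The closure's argument types are inferred (limited type inference).
    fn largest_area(shapes: &[Shape]) -> Option<f64> {
        shapes.iter().map(area).fold(None, |best, a| match best {
            Some(b) if b >= a => Some(b),
            _ => Some(a),
        })
    }

    fn main() {
        // Element type of the vector is inferred from its contents.
        let shapes = vec![
            Shape::Circle { radius: 1.0 },
            Shape::Rect { width: 2.0, height: 3.0 },
        ];
        match largest_area(&shapes) {
            Some(a) => println!("largest area: {a}"),
            None => println!("no shapes"),
        }
    }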
The Mythical Man-Month came out in 1975, written after the development of OS/360, which shipped in 1966. Of the many now-universally-acknowledged truths about software development associated with that book, Brooks's 1986 essay "No Silver Bullet" (included in the 20th-anniversary edition) encapsulates why "in 20 years" we will still not have a better idea:
"There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity."
I like to over-simplify that quote down to:
Humans are too stupid to write software any better than they do now.
We have been writing software for 70 years, and real-world outcomes have not gotten much better than when we started. There have been improvements in how software is developed, but the end result is still unpredictable. Without thorough quality control -- which is often disdained, and which nothing requires anyone to perform -- you often can't tell whether the result was created by geniuses or by amateurs.
That's why I would much rather have "chafing guidelines" that control the morass than continue to wade through it and sink deeper and deeper. If we can't make the software "better", we can at least make it more predictable, and control for the many, many, many problems that we keep repeating over and over as if they were somehow new to us after 70 years.
"Guidelines" can't stop researchers from exploring new engineering materials and techniques. Just having standard measures, practices, and guidelines, does not stop the advancement of true science. But it does improve the real-world practice of engineering, and provides more reliable outcomes. This was the reason professional engineering was created, and why it is still used today.