Efficient code is not as important as it used to be, and the situation is getting worse, not better. While the vital parts of a program are still likely to be written efficiently, a growing percentage of the code is going to be quickly written and probably inefficient. Compounding the problem, that code is often going to be interpreted or run on a virtual machine. To understand why this is the case, and why things will get worse before they get better, one just has to look at how programming has evolved over the years.
The earliest computers were programmed in machine language. In the very early days this was done by re-wiring machine components, though this quickly changed to flipping switches and then to punched cards or paper tape. Computers were huge, expensive machines, so every second of machine time was precious. Programs were often run in batches, meaning that you would submit your cards or tape and wait (sometimes days or weeks) to get your results. When you got your results, if there were problems, you would have to re-submit. As computer time was golden, sloppy code was frowned upon. Likewise, because there was a fair amount of time between tests, programmers had plenty of time to make sure their code was well written.
Computer programs were much simpler back then because the machines were slow and had almost no memory. Memory in early computers was measured in kilowords, with the number of bits in a word being machine specific, though 18 was a common choice. A gigabyte is 1,048,576 kilobytes, or 1,073,741,824 bytes, so thinking back in terms of kilowords is scary. Speed was measured in kilohertz; a gigahertz is 1,000,000 kilohertz. Still, programming in machine language would have been painful work. What most programmers did was write their programs in an English-like format where machine instructions were given simple names such as LOAD, STORE, and JUMP. This code was then converted by hand into the numbers that had to be put into the computer.
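The unit comparisons above can be checked directly. This small sketch assumes binary (1024-based) prefixes for bytes, as the article's figures imply, and decimal prefixes for hertz:

```python
# Verify the scale comparisons in the text.
kilobyte = 1024
gigabyte = 1024 ** 3                 # bytes, binary prefix

print(gigabyte // kilobyte)          # 1048576 kilobytes in a gigabyte
print(gigabyte)                      # 1073741824 bytes in a gigabyte
print(1_000_000_000 // 1_000)        # 1000000 kilohertz in a gigahertz
```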
Hand coding machine language was a big waste of programmer time, but programmers cost a lot less than the machines they were programming. I am sure that many early programmers came up with the idea of a tool that would take a human-readable form of machine language and convert it into the actual machine language instructions, but it wasn't until the 1950s that such programs started being used. A program written in assembly language had to be run through a program called an assembler, which converted the human-readable form of machine language into the machine-readable form. While this added an extra step to the process, the assembly step only had to be done once and the resulting output could then be used any number of times, so the assembler was a big boost to programmer productivity.
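The core job of an assembler, translating mnemonic source lines into numeric machine words, can be sketched in a few lines. This is a toy illustration: the three-instruction set and the opcode numbers are hypothetical, not any real machine's encoding.

```python
# Toy sketch of what an assembler does: turn mnemonics into numbers.
# The instruction set and opcode values below are made up for illustration.
OPCODES = {"LOAD": 1, "STORE": 2, "JUMP": 3}

def assemble(source):
    """Translate lines like 'LOAD 10' into (opcode, operand) number pairs."""
    words = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic], int(operand)))
    return words

program = """
LOAD 10
STORE 20
JUMP 0
"""
print(assemble(program))  # [(1, 10), (2, 20), (3, 0)]
```

A real assembler of the era also handled symbolic labels and addresses, but the essential win was the same: write the mnemonic form once, and let the machine do the tedious translation every time.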
The cost of computers began to decline while their capabilities continued to grow, and computers became more common as a result. Soon big businesses and research facilities all wanted their own computers. The problem then became the requirement that programmers write their programs in assembly language (which at this point was starting to be referred to as machine language, even though technically speaking there is a difference). If only there was a way for "normal" people to write programs.
This article will continue in a couple of weeks, as next week I will be writing about the March release on Blazing Games, a game that came from a quick prototype I created. This game is an example of how plans for a game can drastically change.