The aim of any computer architecture is to increase the performance and efficiency of the computer system while reducing cost, memory and resource requirements. One way to achieve this objective is by exploiting concurrency. Concurrency refers to the decomposability of a program into independent and partially ordered components. Concurrency can be introduced at many levels of the system, for example in cache memory, instruction prefetch, memory interleaving and instruction pipelining.
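To make the idea of concurrency as decomposability concrete, the following is a minimal sketch in Go (the choice of language, the function name partialSum and the sample data are illustrative assumptions, not taken from the text). The program is decomposed into two independent partial sums that may run in any order, or at the same time, and their results are then combined.

package main

import (
	"fmt"
	"sync"
)

// partialSum computes the sum of one independent chunk of the data.
// Each call is a separate component of the decomposed program.
func partialSum(chunk []int, out chan<- int, wg *sync.WaitGroup) {
	defer wg.Done()
	s := 0
	for _, v := range chunk {
		s += v
	}
	out <- s
}

func main() {
	data := []int{3, 1, 4, 1, 5, 9, 2, 6}
	out := make(chan int, 2)
	var wg sync.WaitGroup

	// Decompose the problem into two independent, partially ordered components.
	wg.Add(2)
	go partialSum(data[:4], out, &wg)
	go partialSum(data[4:], out, &wg)
	wg.Wait()
	close(out)

	// Combine the independent results; the final sum does not depend on
	// the order in which the components finished.
	total := 0
	for s := range out {
		total += s
	}
	fmt.Println("total =", total) // prints: total = 31
}

The two calls to partialSum touch disjoint parts of the data and do not depend on each other, which is exactly the independence that allows them to execute concurrently.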
To describe any computer architecture, we first need to specify some of its basic attributes, such as the type of ISA (Instruction Set Architecture), the memory addressing scheme, the addressing modes, the types and sizes of operands, interfacing, and so on.
Various technologies influence the design of computer architecture. In the late 1970s and early 1980s, for example, VLSI (Very Large Scale Integration), which packed more transistors onto smaller chips, sharply reduced the cost of computer systems.
With each decade, the fundamental goal of research in computer architecture has been to find the best possible way of providing high computing power, with the computational and memory capacity to solve complex problems, at the lowest possible cost. Parallel computing promised to be a better option than superscalar technologies. In parallel computing, the compute power of multiple computers is aggregated to solve a large, complex problem. In a superscalar computer system, a single computer is dedicated to solving the problem, armed with very high performance techniques and hardware implementations, which may be costlier than parallel computing solutions.
Major areas requiring high performance from cost-effective computer systems include structural analysis, weather forecasting, industrial automation, expert systems, artificial intelligence, defense, biotechnology, genetic engineering, the aviation industry, scientific simulation and so on.
The need for high-speed computing arose from the desire to find cheaper alternatives that addressed a number of constraints, including time, cost, accuracy and resource availability.