If a product performs its function for an application perfectly well then, by definition, it cannot be improved upon, and so new technology is not required for that application. As such, the principle of the right tool for the right job applies in computing as much as in any other field. Some tasks, such as word processing and web browsing, do not need computers anywhere near as powerful as even the cheapest PCs of ten years ago, because they are modest tasks that have not greatly increased in magnitude or complexity.
However, there are some tasks that become increasingly difficult over time, either because the problem itself grows (e.g. air traffic control) or because the problem is so large that attempted solutions can absorb as much computing power as can be acquired (e.g. weather forecasting). In addition, there are classes of problems that need more advanced technology not to solve a problem faster but to solve it with less effort, complexity, or cost. Most electronic systems can benefit from simplification, most commercial systems from lower cost, and most users from less effort.
For the problems that do require new technology, the question must be asked: has the evolutionary path followed for the past two decades provided the right tools for the job, or is a revolutionary solution needed?
Until the end of the 1990s, computing speeds increased rapidly, roughly following Moore's Law (a doubling every 18 months). This was evolutionary progress, but it stumbled in the last years of the 20th century. When the rate of growth of computing needs exceeds the rate of growth of computing power, technological, scientific, and, ultimately, social developments suffer.
There is no point evolving if solutions to today's problems are decades away. By the time the solutions arrive, the problems will have moved on and even greater difficulties will lie in store. Evolutionary development has shown over the past decade that it cannot match demand, so a revolution has been overdue for many years.
However, it is not sufficient to have a revolution if, after the revolution, there is merely evolution at the previous rate. The traditional doubling of power every 18 months is not a fundamental law but a consequence of the way the technology is applied. There is no fundamental reason why computational systems cannot improve faster; it is simply that computers as currently designed cannot. Symbiotics addresses this situation and allows much faster growth in the future.
Conventional modern computers are rooted in the past, not the future. Processors are designed to suit how pre-existing software operates rather than how new software could operate. Software is designed to suit the sequential nature of computers, and so future computers remain sequential. Software is designed to gloss over the complexities of multitasking, multithreading, multiprocessing, and networking, and so future computers do not address these issues, even with the fashionable "multi-core" approach. It is a vicious circle that people are desperate to escape but afraid to break.
Once the decision has been made to adopt a new approach then many new possibilities open up. Ideas previously considered technologically impossible can become straightforward once a new tool is available. Intesym believes that Symbiotics is such an approach.
Symbiotics is a parallel computing architecture, scalable from embedded systems to supercomputers, that efficiently handles fine-grain concurrency at levels of hundreds of thousands of concurrent tasks. Variants include 16- to 64-bit general-purpose systems, transmutable instructions, and arbitrary-precision arithmetic.
A case for a new architecture