After talking about the GNU Project in the previous entry of this blog, today we are going to discuss a topic that once struck me as rather curious: the fifth generation of computers.
But as always, to understand this story it is necessary to first absorb a few concepts and a bit of history. A computer is, ultimately, just that: an information processing machine, with the defining characteristic of being programmable, and therefore versatile and powerful, although it surely never crossed the minds of the parents of modern computing that something like Facebook could one day exist.

In any case, this is not a lesson in computing paleontology, so let us focus a little on the subject. Basically, four main generations of computers are known and recognized.

The second generation of computers is based on transistors. With this new technology, which uses semiconductor materials such as silicon, computers could be more powerful and more economical, while also having a smaller size and lower power consumption. In this second generation we can highlight machines such as the IBM 7090 or the Honeywell 800.


However, if you look at all the computers and companies mentioned, they were all American. Of course there were manufacturers in other countries, such as Bull in France and Siemens in Germany, but the Anglo-Saxon companies dominated the market widely.
Until the 1970s, Japan had been little more than a "replicator" of British and American technology. However, following the huge success of its consumer microelectronics and automobile industries, the next target for the Japanese was clear: to lead the next revolution in computing. That is why the Fifth Generation Computer Systems project was created during the 1980s.
This project was funded by the Ministry of International Trade and Industry (MITI) and developed by the Japan Information Processing Development Center (JIPDEC), and the main idea behind these machines was to base them on technologies and techniques from artificial intelligence.
According to Wikipedia, the main research fields of the project were:
- Technologies for knowledge processing.
- Technologies for processing massive databases and knowledge bases.
- High-performance workstations.
- Distributed functional computing.
- Supercomputers for scientific computing.

But the problem with these machines is that, although interesting from a purely academic standpoint, they were not so from a practical one, since a machine with a general-purpose microprocessor can do the same things at a lower price and with even better performance, even in the field of artificial intelligence itself. Furthermore, any improvements at the architectural level were generally very difficult to carry over to other systems, because we are talking about machines that do not even follow the von Neumann architecture.
That is why, although Japan has not considered the project a failure, not much is said about its success either. However, as a friend of mine used to say, the one who fails is not the one who achieves no goals but the one who does not even try; so despite the apparent waste of resources and money by Japanese industry on a research project with so few positive results, it is always commendable and admirable for a country to decide to engage in this kind of project.