Saturday, September 25, 2010


The birth of UNIX

Hello everyone. I know that a chapter of the Commodore Amiga series is still pending, the third and last, but due to personal circumstances I have no access to the documentation I had gathered for it; so, if I may, until I can finish that series I will keep writing about other subjects.

Today's subject is one that has always aroused my curiosity. Legend has it that Ken Thompson, at loose ends after the failure of the MULTICS operating system project, found a barely used PDP-7 lying around at Bell Labs (where he worked) and began to develop an operating system that kept many of MULTICS's ideas while stripping away others; he then showed it to his colleagues, and they decided to port it to another, more powerful machine using the language Dennis Ritchie had just created, C. That is the myth, but... how much of it is true?

[Photo: Fernando J. Corbató]

As always, every event has to be placed in its proper context, which means going back to the beginning. And the beginning of UNIX lies in MULTICS. MULTICS was a project led by Fernando J. Corbató (pictured), a computer scientist who worked at MIT at the time as an expert in time-sharing systems. Under his guidance, in 1964 General Electric, Bell Labs and MIT itself teamed up to create a time-sharing operating system that would serve as a model for future systems of this style. Although several kinds of time-sharing systems already existed, most were the result of experiments developed in universities and laboratories, so the idea behind MULTICS was to take all those experiments and consolidate them into a single system that incorporated all the good things created so far, plus some ideas of its own.

However, the project ran into many problems (it was too big and heavy for its time; we might even say too ambitious), and the mounting criticism led Bell to abandon the development of the system. MULTICS was indeed completed a few years later, but it was a major commercial failure, even though many of its ideas were very interesting and innovative, as shown by how often they turned up in later operating systems.

However, Bell's abandoning the project did not mean that all the work of the previous five years was wasted: several of the engineers involved in the development of MULTICS (Ken Thompson, Dennis Ritchie, Doug McIlroy and J. F. Ossanna) continued to work in the field of time-sharing systems.

There was just one problem. They wanted to develop a new system capable of supporting multiple users at once (which MULTICS could do) in a simple and economical way (which MULTICS could not), but of course, without a computer to work on, everything was, how to put it, a little more complicated. So they made several requests to Bell for the purchase of equipment to work with, and every request was rejected (think of the enormous cost of computers at the time, hard to justify for a group that was merely doing research and did not even know whether anything would come of it; an enormous cost even for a giant like Bell).

Around that same time, Ken Thompson, who could be called the father of the original idea of UNIX (or at least the first driving force behind the creation of the new operating system), had developed a game called "Space Travel" on a GE 635 (a General Electric machine; GE was one of the three MULTICS partners). However, that computer was not well suited to graphics programs, given the limited capabilities in that regard it had been designed with, so it was a blessing for everyone to find a DEC PDP-7 that had display capabilities far superior to the GE 635's and was hardly used by anyone. Thompson set to work porting his game to that machine. This may not seem to have much to do with UNIX, but thanks to this project Thompson learned the idiosyncrasies of the DEC machine and enough tricks to feel confident when the time came to develop the ideas for a new operating system that had been prowling around his mind for a while. They finally had a computer to work with.

The first thing developed was the file system, followed by a shell, an editor and an assembler capable of "assembling" the system itself (remember that much of the code of such systems was then written in assembly). Once they could create and manipulate files, edit code and assemble it without needing to work on other machines and then transfer the result to the PDP-7, that is, once the system was autonomous enough to work on, they began working directly on the operating system itself.

And like any system, it needed a name. This came in 1970 from Dennis Ritchie, who suggested the name UNIX to Ken Thompson because, on one hand, its pronunciation resembled that of MULTICS, the system it was based on, and on the other it sounded like "eunuchs": it was basically a castrated MULTICS, stripped of things in the pursuit of simplicity.

[Photo: Dennis Ritchie and Ken Thompson at the PDP-11]

The system gradually grew and reached the point where the original machine it was installed on, the PDP-7, had become too small for the project. Fortunately, as the project grew, so did its usefulness, and with it the number of people at Bell who used it.

Given the kind of use UNIX was getting, its developers kept adapting the system to improve its handling of text files. Thanks to the system's facilities for processing flat (plain-text) files, one of the project's first internal customers was Bell's patent department.

Since the experiment was very successful, the group gained credibility within the firm, enough to get permission to buy a new machine: a brand-new DEC PDP-11/45 for $65,000 in 1970 (you can see it in the picture along with Dennis Ritchie, standing, and Ken Thompson).

Now there was a small problem when porting the whole old UNIX system from the PDP-7 to the new PDP-11: the entire system was written in assembly and was therefore incompatible from one machine to the other (nowadays it is hard to imagine a new system being backwards incompatible, but those were other times, and computer time was more expensive than programmer hours).

Of course, it was possible simply to port all the PDP-7 code to the PDP-11, but any future port to a machine that was neither a PDP-7 nor a PDP-11 would require adapting the code all over again, so Ken Thompson and Dennis Ritchie decided to port UNIX to the new language Ritchie had been shaping since 1971 out of the language B. Until then UNIX itself had been written in assembly, and applications were developed in assembly and occasionally in B; but B had some shortcomings, so Ritchie decided to develop a new language based on it that included concepts such as structures and data types, which B lacked. Since it was an evolution of B, Ritchie called it C, and he finished porting the UNIX code from PDP-7 assembly to C in 1973.

Until then, operating systems had been written exclusively in assembly for efficiency reasons, so an operating system written in a portable programming language was a milestone. While assembly code was (presumably) more efficient, C was considered more legible and easier to modify, as well as portable between machines with different architectures, which allowed the system to evolve quickly.

UNIX made its first public appearance in 1974, when the journal Communications of the ACM published an article explaining its simple and elegant design. With this paper the number of machines running UNIX grew to about 600, a success that only compounded when UNIX entered the universities and began to be studied, pushed forward by the number of new students who used it and, above all, worked on it, both creating new applications and improving the system itself.

UNIX kept growing, expanding, mutating, spawning disputes over licensing and rights... But that is another story, and it will be told another time...

Tuesday, March 23, 2010


The missing link: Commodore Amiga (2/3)

If we talk about the video game market, companies like Nintendo, Microsoft with its Xbox, Sony with its PS3, or maybe SEGA, Capcom, LucasArts, Blizzard or some other big company will surely come to mind. Very few people, however, will probably think of Atari. But this was not always so.

Although Atari did not invent the video game, it was the first company to make money with games. It was by far the largest company dedicated to video games throughout the 70s and the first half of the 80s, with legendary games like Pong, Breakout, Centipede and so many others. Its game console, the Atari VCS 2600, was the favorite toy of all the children (and not-so-children) of the first generation of video games, generating billions of dollars for Atari.

One of the designers who worked on the VCS (short for Video Computer System) was Jay Miner. Miner was an engineer born in 1932 who was hired at Atari by Harold Lee. Like many other engineers, his passion was designing ever more advanced and sophisticated electronic devices. Jay's job was to design chips for the console, and his work was so successful that in 1978 Atari entrusted him with a more ambitious project: a personal computer.

After several months of work, Atari released the Atari 400 (pictured), a computer with amazing graphics and sound capabilities for the moment, and soon after introduced the Atari 800, a somewhat more evolved version of the Atari 400 with a professional keyboard.

After these two computers reached the market, Jay Miner wanted to keep developing computers within Atari, and he would have been happy to do so if Atari had followed his instincts instead of showing the more conservative side of the business.

At that time (1979), Motorola was working on a new microprocessor, the Motorola 68000, far more advanced and sophisticated than any other microprocessor of the time, whether from Intel or from MOS Technology (owned, remember, by Commodore). Although the processor was still far from finished, new computers could already be designed around it, and Jay Miner (pictured, though some years later) began designing a computer with capabilities unthinkable in the home computers of the day.

However, there was a problem. Such an advanced design meant using a lot of chips and components, which would make the computer damned expensive (consider, for example, the Apple Lisa and its $10,000 price tag). Unfortunately for Miner, Atari's executives did not know, or did not trust, the so-called Moore's Law (the number of transistors doubles every 18 months, halving the cost of putting them on a chip; in short, what is an extravagance today will be ordinary in 3 years, and will cost even less than the ordinary does now).
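To put rough numbers on that reasoning, here is a minimal sketch in Python (purely illustrative, with made-up figures; it is not from the original article):

    def transistors(initial_count, months, doubling_period=18):
        """Project a transistor count under the doubling rule quoted above."""
        return initial_count * 2 ** (months / doubling_period)

    # A hypothetical chip with 10,000 transistors today, projected 3 years out:
    print(transistors(10_000, 36))  # -> 40000.0 transistors for the same cost
    # Equivalently, the cost of a fixed design falls to a quarter:
    print(1 / 2 ** (36 / 18))       # -> 0.25 of today's cost

In other words, a chip count that made the design look ruinously expensive could be expected to become affordable within a few years, which was precisely the bet Miner wanted Atari to make.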

So, in early 1982, Jay Miner left Atari, taking with him the ideas for the computer he had dreamed of, and went to work at a small chip maker called Zimast.

Shortly after mid-1982, Larry Kaplan, one of the very first programmers of the Atari VCS, called Jay Miner to offer him a business opportunity. Kaplan, after his own departure from Atari, had founded Activision (Activision Blizzard and Electronic Arts are currently the two largest game companies in the world).

The business opportunity Kaplan proposed was to found a company to create a new game system: Jay Miner and Zimast would produce the hardware, Kaplan and Activision the software, and everyone would earn tons of money. The venture would be financed with private capital, namely $7 million from a consortium formed by an oil businessman and three dentists with no experience or knowledge of the world they were getting into, people with so much money that it seemed a fair price to enter a business that by 1982 was already moving billions of dollars.

Finally, the company was founded as Hi-Toro, a name chosen because on one hand it sounded like a high-tech company and on the other it had a Texan touch, Texas being where the consortium that financed the whole venture was based. Larry Kaplan led the project, and David Morse was hired as vice president.

But months later Kaplan left the project, and the company, finding itself without a leader, offered the job to Jay Miner, who accepted on only two conditions: that the final product use the Motorola 68000 processor, and that it also be able to operate as a computer.

And then the video game bubble burst, and creating the ultimate gaming machine no longer seemed such an appealing project. In fact, anything that sounded like video games no longer sounded appealing at all. So the Hi-Toro investors, worried about the turn of events, anxiously asked Jay Miner whether he could turn the project into a fully functional computer and more or less forget about the gaming machine. Music to Miner's ears.

However, there was one problem, though not with the computer: there was already a company called Hi-Toro, a Japanese manufacturer of mowing machines. So a new name had to be found. Miner wanted a name that sounded friendly and sexy at the same time, much as Apple managed to be a nice name, pleasing to the ear. In the end, although Jay found it horrible at the time, the company's name was changed to Amiga, mainly because nobody could think of anything better and also because the name Amiga appeared in the phone book before Apple and Atari.

After hiring several engineers for both hardware and software, work began on the development of the computer. Since the intention was to build something spectacular compared to the competition of the time, they were careful to guard against industrial espionage. Nobody knew what the Amiga company did or made, and that was exactly how they wanted it. However, to avoid drawing attention, they planned to develop some products unimportant enough not to take many resources from the firm yet attractive enough to generate decent income, so they launched several products, both hardware and software, for the VCS console under the Amiga brand (you can see the Amiga Joyboard in the accompanying photo).

Another anti-spy measure was to give women's names to the different parts of the computer. The project itself was called Lorraine, after the name of the wife of Dave Morse (remember, the vice president hired by Kaplan).

One thing that surprised me to discover is that the Lorraine was designed by committee. Since the number of engineers working on the project was rather limited, meetings were attended by everyone, and anyone could comment and contribute ideas, with the whole group arriving at the various trade-offs between efficiency, speed and final cost. Yes, Jay Miner was head of the project, but decisions were made as a group in which everyone could contribute their opinions and their vision.

Unlike the computers of the era, in the Amiga each task (sound, graphics...) was controlled in a decentralized manner, much as in a modern console. This feature gave the Amiga outstanding performance compared to personal computers of its range. Besides this, other novel features the Amiga brought were IRQ sharing, memory-mapped I/O and real preemptive multitasking (consider that Microsoft would not have that until Windows NT, nor Apple until Mac OS X, the latter released in 2001).

The difference between the multitasking of Windows 3.0 or Mac OS 8 (the first Macintosh had no multitasking at all) and that of the Amiga is that the former is so-called cooperative multitasking: there may be multiple applications running on the computer, but it is the applications themselves that hold control of the machine and therefore decide when to let the others run. It is called cooperative because it is not the operating system that distributes time among the applications; it is they who decide when to stop running in favor of the others. Consequently, if an application locks up, you have no choice but to restart, because that application never passes the baton to the others. Under preemptive multitasking, however, it is the operating system, not the applications, that decides when an application runs and for how long. That way, if a program crashes, you just kill the program and that's that: it does not hang the whole system.
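To make the contrast concrete, here is a minimal sketch of a cooperative scheduler in Python (purely illustrative; it has nothing to do with actual Amiga, Windows or Mac OS code):

    from collections import deque

    def task(name, steps):
        """A well-behaved task: it does a bit of work, then yields control."""
        for i in range(steps):
            print(f"{name}: step {i}")
            yield  # the task itself decides to hand the CPU back

    def cooperative_scheduler(tasks):
        """Round-robin loop; it only advances while every task keeps yielding."""
        queue = deque(tasks)
        while queue:
            current = queue.popleft()
            try:
                next(current)          # run the task until its next yield
                queue.append(current)  # then re-queue it behind the others
            except StopIteration:
                pass                   # the task finished; drop it

    cooperative_scheduler([task("A", 3), task("B", 2)])

If one of these tasks entered an infinite loop without ever yielding, next(current) would never return and the whole "system" would hang, which is exactly the failure mode described above. A preemptive system like the Amiga's relies instead on a timer interrupt that takes the CPU away from the running task, so a runaway program can be killed without bringing down everything else.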

Returning to the Lorraine: to understand the titanic effort the development of the Amiga involved, consider how the Apple II or the IBM PC were designed. In those cases the engineers "limited" themselves to taking the chips available on the market and arranging them in the best possible way. The Amiga's engineers, by contrast, designed several of those chips from scratch. It is not that what Steve Wozniak did with the Apple II, or what IBM's engineers did, lacked merit or was easy; it is just that the chips they used were already designed and manufactured, which spared them a job the Amiga team had to do themselves.

And since Amiga was not exactly flush with money, the chips were designed the old way: not with powerful workstations but with paper, pen, prototype boards, logic gates, cables and lots of patience. In the attached picture you can see a prototype of the Amiga Lorraine with the design of three of its major chips (and to complete the picture, bear in mind that the goal was to bring to market a computer for about $2,000).

With respect to software, Jay Miner was aware of his limitations in this area, so Bob Pariseau was hired to take charge of it. Jay did not want an operating system like MS-DOS, CP/M or Apple's. Bob had no experience in software development for microcomputers, since he came from working with the large mainframes used in banking, where preemptive multitasking was the norm, and he saw no reason why a personal computer should not have those advances. Another decision of the software group was to create a graphical interface, an important decision when you consider that we are still talking about 1983/1984, when very, very few computers had a windowing environment.

Finally, the Lorraine was mature enough to be shown at CES 1984. The goal of the project, designing a personal computer several years ahead of anything on the market, had been achieved. Now all that remained was finding a way to bring it to market: a partner was needed, one able to carry the Lorraine to fruition. Would the Amiga be able to find a suitor at CES?

Thursday, March 18, 2010


The last entry that I will dedicate...

I seek,
in the world to drown me,
hugs me and I forget,
in rush of people to return
the corner, and you escape
like fish the banks
like night and day
always close and do not look,
never look ...

And I wanted,
meet face to face,
return from injury,
to start from scratch, without reservations or
lies,
and delivered without fear,
in the light of a new day, always looking
illusions ,
by the footprint of life.

And I face at night to a bed

very empty and filled with stories, adventures and mischief

then comes your memory,
and
farewell song and find myself every night,
at the starting point...

morning I wake up, and breakfast

a new day and paint colors, if you return

my life and I dress up as a poet, minstrel of Andalusia

and I look around the streets, and
people do not look at me ...

And again at night, this bed is so empty

that filled with stories, adventures and mischief
, then comes your memory

and
farewell song and I am night after night, in
starting point ...


Every night I find myself at this starting point. *
Rocío Jurado

[press play]
*
Pablo Neruda once wrote:
"Though this be the last pain she causes me,
and these the last verses that I write for her."
*
This will be the last pain he causes me and
this the last entry that I will dedicate ...
*

Tuesday, March 16, 2010


The missing link: Commodore Amiga (1/3)

I had never before been to RetroMadrid, that is, a retrocomputing fair with a small exhibition, lectures and other activities in which Spectrums, Commodore 64s, old consoles and other forgotten paraphernalia take the leading role. There I was able to see one of the most revolutionary personal computers of all time, one that went far too unnoticed in its day and whose legacy has since been unjustly forgotten. I am speaking, of course, of the Commodore Amiga.

I knew of this computer's existence from video game magazines, namely Micromanía (yes, the regular-size one), and ever since then I had been impressed by its graphics capabilities (and I had never even seen it in motion!). However, the first time I finally sat down in front of an Amiga, specifically an Amiga 500, was in August 1994. A few months earlier, in late June, I had bought a 486 DX2 at 66 MHz with 4 MB of RAM and a 340 MB hard disk, and that computer from the 80s did things my brand-new machine could not do (I had no sound card yet).

So here is my tribute to the Commodore Amiga. Since it is a bit long, I will divide it into three different posts. I hope you like it, and, as always, if you have any comments or questions do not hesitate to write.



A company in trouble

The story of the birth of the Amiga platform is one of converging circumstances and forces. One of them was, of course, Commodore, the company that would eventually release the computer.

Commodore was founded by Jack Tramiel in 1958 and was engaged in the manufacture of mechanical typewriters. In 1966 a Canadian investor called Irving Gould bought the company, keeping Tramiel at its head. Now that Commodore was Gould's, the company decided to focus on the emerging calculator business.

By the 70s, competition in that market (electronic calculators) was already terrible, with manufacturers like Texas Instruments or HP and the Japanese saturating the market and pulling prices (and therefore profits) down. As a way to compete, Commodore bought MOS Technology in 1976, a company that had a strategic product for Commodore: the MOS 6502 microprocessor.

This processor may well be familiar to many of you, since it powered, for example, the Apple I. The initial idea of Jack Tramiel was, of course, to use the chip in his calculators; but since, even before the acquisition by Commodore, MOS was already working on the construction of a new computer using the 6502 as its processor, the KIM-1, this computer was used as a model, and in 1977 the Commodore PET arrived, placing Commodore at the top of the personal computer business.

In the 70s there were three major manufacturers of personal computers. On one side was Tandy with RadioShack, which had the best-selling computer. On another we had Apple, which was the fastest growing and knew best how to spend on advertising but which, until the arrival of VisiCalc, sold the least of the three. And then we had Commodore, the number 2 in sales, focused on affordable home computers, as its slogan "Computers for the masses, not the classes" also suggests.

The Commodore PET was followed in 1981 by the Commodore VIC-20, a computer that cost about $299 and offered everything a home computer could offer, along with a great capacity for video games. The following year came the Commodore 64, a computer that for some time was the world's best seller. While in some respects it was even slower than the VIC-20, its "huge" 64 KB of memory, its graphics and sound (particularly well endowed for the time) and a smart sales strategy in Europe (especially Germany) raised it to first place in the world of home computing.

In 1983 Jack Tramiel decided to focus on the home market and sharply cut the prices of the VIC-20 and the C64, starting a price war that swept away many computer manufacturers. However, while Commodore's market share grew with this strategy, the price cuts also brought reductions in the company's margins, which its owner, Irving Gould, was not willing to tolerate; a small internal power struggle began that ended with Jack and his men out of Commodore.

What's more, just then the video game market bubble burst. From the birth of Atari in the first half of the 70s until 1983, the video game business had done nothing but grow, and grow uncontrollably.

Atari was by far the worldwide number one in this field, but a myriad of home computers and platforms had emerged in Atari's wake, taking advantage of the dazzling fashion and the golden future this newly emerging business seemed to promise.

But as happens in every market that does nothing but grow exponentially, at some point around 1982/1983 the situation became untenable. Famous is the boast of one of Atari's executives about how well video games were selling: if they crapped in a box and put it on sale, people would probably snatch it from their hands.

Anyone who has witnessed the bursting of a few bubbles already knows how this goes. The bursting of this one is usually identified with the release of the game E.T., for whose license an amount exorbitant for the time was paid ($200,000, plus an all-expenses-paid trip to Hawaii), and which was a monumental failure. Atari manufactured both games and consoles on the assumption that everyone would want to play it and, while it did not sell as many as hoped, it sold more than half of what was expected. However, the game was so bad and so riddled with flaws that it triggered an avalanche of returns, and Atari found itself with a stock of millions of cartridges taking up space and no prospect of using them for anything. Rather than keep paying to store them, Atari decided to go to the middle of the desert in New Mexico and bury them: in a hole in the ground the games would not occupy a warehouse that had to be paid for month after month.

This was the starting gun for the video game crisis: the fall of Atari (with huge losses the following year) dragged down the whole market, and games lost that halo of modernity and attraction they had had until then. And with the fall of games, consoles naturally stopped selling too, as did the home computers whose main attraction was the ability to run attractive games. So: fewer computers sold, and sold more cheaply... there you have it, a fine crisis.

Commodore tried to get by with what it had while it threw itself into the development of a new computer, the successor to the Commodore 64. Finally, in 1985, the Commodore 128 came to light, with a more powerful processor, double the memory and many more improvements, such as the incorporation of the CP/M 3.0 operating system (although the implementation was rather thin) and the ability to run in C64 mode with supposedly (ahem) 100% compatibility, which in practice was not all that high.

Unfortunately for Commodore, the C128 was not the sales success the C64 had been in its day; in fact, it was the old model that kept the company afloat. Something had to be done, and done soon, because otherwise the company's future looked very black indeed...

(continued)

Sunday, March 7, 2010


On

I have always been amused by the texts that talk about the opportunities this or that company missed, as if a company or project were destined to succeed regardless of who leads it. The computer world, however, is full of examples of projects and technologies that succeeded in some hands and bore no fruit in others. For example, as has come up before on this blog, we have the opportunities lost by Xerox, which invented in its PARC laboratories in the 70s the technologies that conquered the market in the 80s and enriched many companies other than Xerox (the graphical interface, the laser printer, the Ethernet network...).

Another example: Commodore had the chance to buy Apple in the 70s. An opportunity lost by Commodore? Possibly not. Would Apple have been the same under Commodore's direction, or would it have sunk into the oblivion of history like so many others? Compaq, the leading PC manufacturer of the 90s, could also have been bought by Apple in the 80s. Did Apple miss the chance to become the number 1 in PCs, or would Compaq have been lost in the idiosyncrasies of the apple company?

Today I am going to talk about one of those missed opportunities, perhaps one that came too soon, perhaps a case of business blindness. But better to start at the beginning.

To computer users of the newer generations, the name DEC (short for Digital Equipment Corporation) may at most ring a bell as a manufacturer of big machines lost in the Paleolithic era of information technology, much the way UNIVAC sounds to many of us. DEC was a company that practically invented the minicomputer market: machines much less powerful and pretentious than the large systems of the time, but also much cheaper and with fewer requirements, shall we say, environmental.

This market, minicomputers, continued to develop and grow, making DEC, by the second half of the 80s, the number 2 of the computer world (and here we are talking about computing in general, software as well as hardware).

However, the 80s arrived with very, very deep changes across the whole business ecosystem, overturning the established paradigms and business models. There were companies that adapted very well (HP, for example), others that had a hard time adapting but managed in the end, like IBM, and others that were unable to adapt and shrank year after year until they finally disappeared. Like, for example, DEC.

DEC's situation in the mid-90s was rather delicate. The company had gone a couple of years without generating profits (its 1995 results showed losses of $2,000 million). However, it had to its credit several technologies that could turn its situation around.

One of them was the Alpha processor. These chips were light years ahead of their x86 equivalents in technology and performance, and were generally far superior to those of its competitors Sun and Silicon Graphics. However, DEC had been a minicomputer company throughout its history and consequently failed to take advantage of the Alpha (in fact, its release was delayed by 18 months for the simple reason that a microprocessor, however advanced, sounded like something for a personal computer or at most a workstation, and DEC made minicomputers, not single-user machines).

However, there was an internal project that could take advantage of the great performance and capabilities of the 64-bit Alpha processor (yes, 64-bit, and still in 1995). The project consisted of using the enormous bandwidth DEC had available at the time to crawl the World Wide Web completely and create an index that could be consulted at any moment. Its name was AltaVista, and although in the mid-90s it was just an internal project, used only from DEC's intranet, the people in the company who tried it fell in love with it.

What was so special about AltaVista? Seen through the eyes of 2010, not much, mainly because both Google and Bing do the same. But remember we are talking about 1995: Google did not exist, and Bing was still more than a decade from being born, so let's look at what the situation was at the time.

The main website of the mid-90s was, without doubt, Yahoo. However, Yahoo was not a search engine; it was just a directory. What does that mean? That there was no spider analyzing the web in search of new links and adding them to a weighted index that could then be queried; it was simply a stored list of websites. Yes, like the yellow pages or a telephone directory. To understand Yahoo's success, you have to understand that before it existed, if you wanted to reach a resource on the web you had to know its address, or know some site that linked to it. Yahoo was simply a collection of sites, so you only had to memorize www.yahoo.com and, once there, find what you were looking for.

There were, however, projects to create search engines. Most of them had very limited means or not particularly ambitious objectives. For example, there were engines that processed only the titles of sites and not their content, and there were some that did analyze the contents; but of course, to cover the ENTIRE web (about 18,000 sites in early 1995) you needed a lot of bandwidth and plenty of processing power, since otherwise, by the time you finished processing all the websites, the result would already be useless given the ever-changing nature of the web. It was also necessary, of course, to create a proper interface for interacting with the whole system and performing searches, and most of the "pure" search engines of the time were quite cryptic (with some you had to connect via telnet and learn a handful of commands to perform a query).

AltaVista had solved all the problems of the search engines that came before it. It had DEC's huge (for the time) bandwidth; it had powerful processors and several units processing in parallel all the information gathered by the spiders (so the index did not quickly go stale); and it offered a very clean interface for its day.
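To see the core idea behind such an engine, here is a toy inverted index in Python, a minimal sketch with made-up pages (it shows the general technique of crawler-fed search engines, not AltaVista's actual design):

    from collections import defaultdict

    # Toy "crawl results": page URL -> page text (all invented for the example).
    pages = {
        "http://example.com/unix":  "unix was written in c at bell labs",
        "http://example.com/amiga": "the amiga was a commodore home computer",
        "http://example.com/c":     "dennis ritchie created c at bell labs",
    }

    # Build the inverted index: map every word to the set of pages containing it.
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)

    def search(query):
        """Return the pages that contain every word of the query."""
        words = query.lower().split()
        if not words:
            return []
        return sorted(set.intersection(*(index[w] for w in words)))

    print(search("bell labs"))  # -> the unix page and the c page

The hard part in 1995 was not this logic but the scale: keeping such an index fresh for the whole web demanded exactly the bandwidth and parallel processing power that DEC happened to have.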

So in 1996 DEC agreed to open its firewall and make the AltaVista service public. As with any business move, the ultimate goal was to make money, but... how did DEC intend to earn it?

This is the part where it shows that what matters is not having the right technology but knowing what to do with it. If you do not have it, the possibility always exists of buying it or at least licensing it; but if you have it and do not know what to do with it, it is worth nothing at all.

That was precisely DEC's problem. For DEC's top management (not for the AltaVista project managers), the engine was a great opportunity to get good publicity... with which to sell computers with Alpha processors. That is, for them AltaVista was nothing more than a public demo of how great their hardware was. It is as if Pixar had wanted to make its business selling the computers that created Toy Story instead of making a business of the film itself.

Were DEC's managers blind? Yes, but because the world was full of blind people back then. In 1998, when Compaq bought DEC, the transaction amounted to approximately $9,600 million, of which exactly $0 corresponded to the acquisition of AltaVista (and why should anything have been paid? It was just a demo and had not generated a single dollar. Maybe it was good publicity, but for the time being it was only an expense).

Could AltaVista, despite the blindness of its parent companies, have become Google before Google even existed? The technology was there, but the business model was not. When AltaVista was made public, the mentality of the time was to capture Internet users and keep them inside for as long as possible. The fashion of the moment was to create portals that lumped together all the services a user might need so that the user would always remain within the portal, services ranging from email to weather information, news, chat rooms and dozens of others. Although there were people inside AltaVista who advocated keeping it clean and simple, that is, focused on search technology, in the end the more "traditional" view triumphed, and AltaVista became a large, heavy portal loaded with a multitude of services.

What became of AltaVista in the end? Compaq sold it to CMGI in June 1999 for $2,300 million, and CMGI intended to take it public the following year. However, the dot-com bubble burst, so the IPO was halted. In 2003, CMGI sold AltaVista to Overture Services, Inc. for $140 million. Today AltaVista (www.altavista.com) is still operational, but as an unimportant minor player in the world of web search.

And this is just one example of how having the best technology does not guarantee success. That is why, when I see news about how this or that company could have been bought a few years ago for pennies, or about a large company that sold for almost nothing what is a giant today, I cannot help but smile. Opportunities present themselves to us daily; sometimes we recognize them, sometimes we do not...

Monday, February 8, 2010


Great missed opportunities, or the Apple Newton: grandfather of the iPad

I had long wanted to talk about today's subject, and after the January 27 presentation of the Apple iPad I was left with little reason to keep delaying it, so here you have this new pill of computing history.

However, as I always say, to understand fully why a specific event happens it is necessary to put it in context, that is, to explain its background and the situation in which it takes place.

And the case of the Newton is a special one. Perhaps, in that sense, it is somewhat similar to the hype the iPad has produced and the enormous expectation it has generated, though we shall see whether or not history repeats itself with Apple's new toy. Now, let's focus on today's topic: the birth of the Apple Newton.

Our story begins with an Apple engineer dissatisfied with his work, called Steve Sakoman. If you follow this blog regularly the name may sound familiar, for it appeared when we talked about GO Corp. and Microsoft's tactics. Until 1987, Sakoman had worked as a hardware engineer on various Apple projects, but he was tired of working on yet another redesign of the Macintosh and wanted some "excitement" in his life.

It was around then that Jerry Kaplan, the founder of GO Corp., contacted him, and Sakoman decided to leave Apple to design the hardware of GO's PDA. However, when announcing his departure he spoke with his boss, Jean-Louis Gassée, to explain the reasons for his leaving and his future plans. Gassée, having listened attentively to what GO intended to create, proposed that he stay at Apple and start a new project implementing all those ideas under his leadership.

Jean-Louis Gassée (pictured) had replaced Steve Jobs as Apple's head of new product development even before Jobs himself left Apple to found NeXT, so it was within his power to initiate new projects and negotiate their terms. What Sakoman wanted was a small project, with a limited number of engineers and no added bureaucracy, something like what the original Macintosh project started by Jef Raskin had been.

The idea was to create a new concept of portable computer, completely different from anything seen so far. Until 1987, portable computers were rather slow, very large, very heavy (none weighed less than 6-7 kilos) and very limited in battery life. The Newton would also be relatively large (slightly larger than the new iPad), but with low power consumption and therefore longer battery life, and with an entirely new way of interacting with it: a touchscreen and a stylus that could be used both as a mouse and, above all, to write directly on the screen, with the device recognizing and processing that handwriting.

The project, which was named Newton, was therefore a challenge not only in hardware but also in software. A new operating system had to be created, seeking new forms of interaction between man and machine, since the Macintosh model did not entirely fit what the Newton was meant to be; and, above all, a handwriting recognition system had to be developed that not only worked but also fit into what the device would be.

However, by 1989 Jean-Louis Gassée's star at Apple was fading, and his protected Newton project gradually ceased to be so protected. As more and more funds were allocated to it, the Apple machinery interfered more and more in the project's daily life, adding new red tape, reports, spreadsheets, requirements and all those things that so delight an engineer.

Finally, Gassée was pressured to such an extent that he resigned, and Sakoman resigned along with him. Thus, as had happened with the Macintosh, the man who had started the project left through the back door. Together, Gassée and Sakoman founded Be, Inc., a name that will ring a bell if you have been following these pages, because it was mentioned apropos of the return of Steve Jobs to Apple (1997), when Apple was on the verge of buying Be, Inc. to replace MacOS with BeOS (if you have not read it: Gassée asked for too much money, thinking Apple was desperate, which made Apple look at alternatives and end up buying NeXT instead, thereby enabling Jobs's return to Apple).

Still in 1989, Bill Atkinson, the creative genius behind the user interfaces of the Lisa and the Mac and possibly one of the best programmers in the world, organized a meeting to discuss future projects, inviting former and current senior Apple engineers such as Steve Capps (pictured) and Andy Hertzfeld; John Sculley, Apple's CEO, also attended.

After hearing Capps speak wonders about the Newton project, which had just been left without a leader and was in danger of disappearing, Sculley asked for a demonstration to see what had been done with it, since, as he himself admitted, he did not understand it and wanted to see it with his own eyes before deciding. When Sculley saw what the Newton team was developing, he was delighted and decided to provide everything that was needed (funds, engineers, whatever), setting 1992 as the deadline (that is, a little over two years away).

To understand why Sculley was so excited about the Newton, we must go back to 1987. Apple's success until then rested on two main products, each revolutionary in its time. First came the Apple II, the machine that made Wozniak's name and turned Apple into a heavyweight of the microcomputer industry. Then came the Macintosh, with its graphical interface and its charisma, Jobs's machine and Apple's main support in the late 80s.

Jobs and Woz were both visionaries, people who understood technology. Sculley envied them for that. Apple had never made as much money as under his command, and Sculley had won awards and was admired as Apple's CEO. But despite his apparent success, he was only a top executive. He was not a visionary. He had no machine of his own. There was no Sculley machine the way the Apple II was Woz's or the Mac was Jobs's. In his autobiography "Odyssey" he spoke of the Knowledge Navigator, a futuristic device able to offer at all times the information you need, right when you need it.

This vision, in fact, ended up taking shape in a series of promotional videos. Everything shown in them was pure science fiction, but for Sculley the symmetry between the Knowledge Navigator and the Newton was clear, and he thought he could go down in history not only as Apple's great CEO but as a visionary on the level of Jobs and Wozniak. Here, in this video, is the concept of Sculley's Knowledge Navigator.



However, Sculley's support for the Newton was not grounded only in megalomania. We are talking about the end of 1989: Windows 3.0 was about to hit the market, and Apple was having to cut its prices and, with them, its profit margins; so a new device opening a new line of business, in a completely new market where there was not even competition yet, was a blessing fallen from the sky.

So, in early 1990, Sculley put Larry Tesler in charge of the project, an engineer who had arrived from Xerox PARC in 1980 and had worked hard on both the Lisa and the Macintosh, and whom you can see in a more recent picture right here. Now that the Newton had a new leader, the work began to take new directions. The project was split into the development of three different devices. On one hand there was the Newton Plus, the darling of Apple's "old guard", a slate of about 12 inches by 9. Then there was an intermediate device, whose name I have not even been able to find, of 9 by 6 inches, which had very little support. And finally there was a small device dubbed Pocket Newt, of 4.5 by 7 inches, preferred by most of the project's young engineers, Steve Capps among them.

Tesler's idea was to develop all the Newton technology on the Newton Plus instead of focusing on the less powerful device, so the Pocket Newt practically became an underground project within Apple, carried forward by Capps and two engineers (Michael Tchao and Michael Culbert) almost alone and in near secrecy while the rest of the company devoted itself to the Newton Plus.

As I said, the Newton was a challenge not only at the hardware level but also in software, where many important challenges led to new technologies. One of the most promising was a programming language that tried to combine the efficiency of C++ with the elegance and simplicity of Smalltalk. This language, object-oriented of course, was intended as an open language with a powerful garbage collector and, above all, capable of being embedded in devices such as computers, digital phones or media players. Yes, something similar to what Java would be, a few years before Java. This programming language was called Ralph, after Ralph Ellison, author of "Invisible Man".

However, the Newton was being developed around a processor designed specifically for Apple by AT&T, which was particularly efficient running compiled C code but whose performance waned when it came to running Ralph code. Apple therefore sought alternatives (despite having paid AT&T a fortune for the chip), finding the solution in a Cambridge company called ARM Ltd., from which the Newton would take the ARM 610 processor, and of which Apple also bought 43% for the sum of $2.5 million (in case anyone is curious, when Apple ran into financial trouble in the mid-90s it kept itself going partly by selling off its ARM shares, ending up with 0% of the company).

One of the three engineers working on the Pocket Newt, Tchao, took a chance and contacted Mike Markkula, Apple's chairman at the time and one of the few who had been with the company since its beginnings in Jobs's garage. To Tchao, the Newton Plus was a mistake: a device too large, too expensive and too complex to be finished on time.

The ploy worked, and Sculley ordered Tesler to forget (at least for the time being) about the Newton Plus and to concentrate on completing the development of the Pocket Newt, which from then on became known as Junior.

On January 7, 1992, Sculley spoke during his keynote about the "digital convergence" of the computer, content and communications industries and about the need for a new type of device, the digital assistant. It was then that he made the prediction that these new devices could reach sales of $3 trillion by the beginning of the 21st century (yes, not billion: trillion). The hype had begun. That Sculley later insisted this absurd amount of money would be spread throughout the industry, and would not stay with Apple alone, did not matter to anyone. Everyone wanted a piece of the damned great pie that Apple promised.

Finally, on May 29, Apple staged the Newton's first public appearance. It consisted of a closed-door demonstration for several chosen members of the press, and although the Newton was not yet finished (indeed, the first unit would not even turn on half an hour before the presentation, so an emulation was shown until Tchao got a second unit working), it generated a huge buzz among the attendees, who spoke wonders of Apple's new marvel in their respective columns and articles. The glitches that did appear were overlooked, because... wasn't it a beta? It is normal for such things to fail.

However, while the first presentation had gone (relatively) well, the Newton was still not ready. By the fall of that year, 1992, Calligrapher, the handwriting recognition program, was still not working well. In addition, the wonderful programming language designed for the project, Ralph, needed more machine than Junior could give it in order to run smoothly, so much of the software had to be rewritten in C.

It was in this period that the pressure, the marathon work sessions, the disappointments and yet more pressure got the better of one of the members of the project, Ko Isono, who took his own life on December 12, 1992, shooting himself with a pistol. Naturally and understandably, the project's morale vanished along with Isono's life.

Despite the intensive work at Apple, on June 3 Casio and Tandy took the honor of beating Apple to market with the first PDA, launching the Zoomer. Apple announced a new release date: August 2, 1993.

But the problems kept piling up in Cupertino. In addition to the Newton's delays, Sculley was increasingly being questioned, both because the board felt he was too absorbed in the project (which had not yet brought Apple a single dollar while money was being spent hand over fist) and because Apple lost money in a quarter for the first time in many years, $188.3 million to be exact (small change, in fact, compared with what it would lose in 1996 and 1997). On June 18 the board relieved Sculley as CEO, put Michael Spindler (pictured) in his place and left Sculley as Apple's chairman. However, further financial setbacks led Sculley to submit his irrevocable resignation as chairman, and he left Apple for good.

Finally, during the Macworld Expo of August 2, 1993, the Newton was presented to society. The final model, called the Newton MessagePad, measured 4.5 by 7.25 inches and had an LCD touchscreen with a resolution of 240x336 pixels, an ARM 610 processor at 20 MHz, 4 MB of ROM and 640 KB of RAM. The battery, four AAA cells providing 6 volts, gave a life of approximately 14 hours. And all this for the sum of $699, in 1993 dollars.

The pregnancy had been long and difficult, but in the first few weeks it seemed to have been worth it. Apple sold 50,000 units in the first 10 weeks. Far from the predicted 3 trillion, but a lot of Newtons in any case. However, just as had happened with the Macintosh in 1984, after the initial joy sales declined, settling at around 7,500 units per month.

The reasons for this slowdown were evident. Handwriting recognition, when it worked, was magical, but it failed more often than a fairground shotgun. There were hardly any third-party programs, and at nearly $700 the device was more an expensive gadget for wealthy executives than something truly useful to the masses.

From then on, the Newton stumbled from one side to the other. Little by little, new models with better features came out, the late models such as the eMate 300 and the MessagePad 2000 being particularly interesting (the latter basically a MessagePad redesigned under Jonathan Ive).

By the way, do you remember Ralph? Well, when it was nearly complete, Apple decided to change its name to Dylan, short for Dynamic Language and, incidentally, a tribute to Bob Dylan, who did not feel particularly flattered and sued Apple for the use of his stage name on a product. Finally, in the fall of 1995, Apple canceled the project in one of its rounds of expense cuts.

When Gil Amelio took over as Apple's CEO on February 2, 1996, the project cuts continued, and although the Newton was one of the leading candidates to disappear, Amelio decided to keep it. It was decided to create a separate company, to be called Newton, Inc., which would take charge of the development and commercialization of the Newton, which with the new models (eMate 300 and MessagePad 2000) had finally become profitable. Amelio's idea was to get rid of the Newton by selling it to some other large company (Sun, Oracle, Ericsson, Samsung, Sony...), and the easiest way was to package it all into a company that could be bought. Finally, after all the companies contacted refused to take the "corpse", the Newton, Inc. plan went ahead.

However, before Newton, Inc. could be definitively constituted as a company independent of Apple, Gil Amelio was ousted by Apple's board, and Jobs, taking charge amid the resulting power vacuum, stopped the creation of the new company with the promise of developing the concept of the eMate 300 (pictured). In fact, under Jobs's command Apple released the MessagePad 2100 on October 20, 1997, the last Newton to reach the market: at the end of that year Apple announced the end of the platform's development, and on February 27, 1998 it was announced that every operation having to do with the Newton was canceled.

With this move, Apple closed 11 years of development that had cost it some $500 million. It is estimated that between 150,000 and 300,000 Newtons were sold in the four and a half years the device was on the market. Maybe Apple, with the iPad, has learned its lesson...

Saturday, February 6, 2010


You can wait on the other side...

You really got me... but now it's time to go. Don't make me false promises now; we both know you will not come for me, but you can wait... there where you are. I don't want to ruin your life; it hurts me to be away from you, I die inside, slowly, but don't ask me not to leave, because I must. I should not have let you kiss me; now it makes me sad to see us apart... perhaps because we wasted so much time, and I know there is no more of it to spend at your side. But you can wait, and when you gather enough strength, Zahir, when we both stop being weak and indecisive and immature, and if you still love me, come for me... catch me on the other side. I'll be waiting day and night for you to arrive... because this life was given to us to be born and then to die, and I spend my time thinking about how much I want to be with you.
Kzhael insists that what I want now will change once I mature, that I will change my views, my way of seeing life, Zahir, but love will not change; it may grow, but it will not end even if he does not come for me... I will return to make sure you are well; I will make sure you form a family with the woman you love, the one you are hopelessly in love with, the one you would carry to the other side of the world if she went away... and if you do that, then most probably I was never that woman... when you love someone you fight at their side, but when love is not mutual the only thing you can do is make sure that person will be happier still. I will never give up before the end; I have no idea how he managed to steal my heart.


Kzhael... another character in my life. He is characterized by being stronger than Zahir... I would be lying if I denied that I am in love with him, but in my heart there is something stronger that overshadows my feelings toward this new being in my life. Kzhael represents the dark... the sage who holds the answers to the occult, and I... hence the hidden love, the appeal of that being. He can sometimes be a bit harsh, but he says it is for my own good: "Your life is going to change and you are not prepared for it; I once told you that your trip is not going to be what you expect, and so it will be! So I tell you that you have to fight for yourself and nobody else; nobody will be there to overprotect you, and you know it... I want nothing from you; you owe me nothing! You have to find your happiness in any way you can, for yourself and nobody else! You cling to what you want... and you usually end up hurting yourself; now you have to be strong on your own... more than ever."






If fighting for myself and nobody else means putting my Zahir aside... I refuse to do so. But I no longer think as if I were 14... I have grown at least 20%, and that small percentage shouts at me to cling to what I want for myself; it shouts that soon I will go across the world, where growing up is learned by force. I will be alone... no one beside me to turn to; I will make many mistakes and learn from them; I will pick myself up, because there will be no one around me... my tears will dry up, because there will be no shoulder to cry on... I will wait for him while I go crazy running from place to place... and you should also prepare yourself for what may happen; I cannot afford to be hurt again... a soul like Zahir's will need much affection from now on, and I must be ready to face what is coming, like it or not... but Susan must be sure of the decision she is going to take, taking the time necessary... because even though you love him, and following him is difficult, you say to yourself: "It's time to close the curtains..."


Julio Cardenas once wrote me:
"I can only say that distill a lot of love and affection towards another person in your blog ... I hope it is reciprocated
..."



Sunday, January 31, 2010


Japan and the fifth generation of computers

After talking about the GNU Project in the previous entry on this blog, today we turn to a subject that once made me rather curious: the fifth generation of computers.

But as always, to understand this story it is necessary first to soak up a few concepts and a bit of history. A computer is ultimately just that, an information processing machine, with the distinguishing feature of being programmable, and therefore versatile and powerful; although it certainly never crossed the minds of the fathers of modern computing that something like Facebook could ever exist.

Computers, however, have evolved over time, with a revolution every so often. While the first calculating machines date back centuries, and Charles Babbage designed something conceptually similar to a computer in the 19th century, it was not until the 40s that the first electronic computers were finally built, with examples such as the ENIAC (made in the U.S., in the image) or the Mark I (UK).

In any case, this is not meant to be a lesson in computer paleontology, so let us focus on the subject a little. Basically, four main generations of computers are known and recognized.

The first generation of computers was based on vacuum tube technology. These were the first electronic computers: huge monsters weighing many tons that needed entire refrigerated rooms and had exorbitant power consumption. From this generation we can highlight machines like the UNIVAC (pictured) or the IBM 701.

The second generation of computers was based on the transistor. With this new technology, which used semiconductor materials such as silicon, computers could be more powerful and more economical, as well as smaller and less power-hungry. In this second generation we can highlight machines such as the IBM 7090 or the Honeywell 800.

The third generation made use of the integrated circuit as its major technological innovation over the previous generation. Once again, this technology made the product cheaper and reduced both its size (there were computers that could fit into a closet) and its consumption. The number of computer manufacturers grew after the development of the minicomputer, machines much more limited than the large mainframes but which allowed many companies and universities to have a computer of their own to work with. As examples of notable machines of this generation we have the IBM 360 family, the CDC 6600 (as already mentioned, the first supercomputer in history) or the DEC PDP-1 (pictured).

The fourth generation is the generation of the microprocessor and miniaturization. This is the generation we know best, because today's computers are basically an evolution of those early-70s models. After Intel invented the microprocessor with the Intel 4004 in 1971 (pictured), over the years the computer world underwent an explosion of colossal proportions in which computers became popular and finally reached the general public, first as a curiosity, then as a toy, then as a working tool and finally as the center of a life controlled by microprocessor-driven machines. Machines we could highlight from this generation would be the MITS Altair 8800, the Apple II, the IBM PC or the Apple Macintosh, to name some major milestones.

However, if you look at all the computers and companies mentioned, they were all American. Of course there were manufacturers in other countries, such as Bull in France and Siemens in Germany, but the Anglo-Saxon world dominated widely.

Japan was, until the 70s, little more than a "replicator" of British or American technology. However, following the huge success of its consumer microelectronics and automobile industries, the next target for the Japanese was clear: to lead the next revolution in computing. That is why the fifth generation computer project was created during the 80s.

This project was funded by the Ministry of International Trade and Industry (MITI) and developed by the Japan Information Processing Development Center (JIPDEC), and the main idea behind these machines was to build them on the technologies and techniques of artificial intelligence.

Looking at Wikipedia, the main research fields of the project were:
  • Technologies for knowledge processing.
  • Technologies for processing massive databases and knowledge bases.
  • High-performance workstations.
  • Distributed functional computing.
  • Supercomputers for scientific computing.
Japan was living a pretty sweet moment at that time. It had already overtaken most Western industrialized countries, its growth had been the highest in the world since the end of World War II, and it had an aura of invincibility that made the other industrialized powers nervous. It is for this reason that many Western countries (the USA, the UK and some European countries) launched their own parallel projects, with principles similar to those of the Japanese fifth generation, to try to counter Japan's initiative.

However, after a very large amount of money invested and 11 years of development, Japan brought the project to a close in 1993. The results, however, were not at all the expected ones. A series of technologies were designed, such as the SIMPOS operating system (later rewritten and renamed PIMOS), the KL1 programming language, or the five Parallel Inference Machines (PIM), one of which can be seen in the photo.
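KL1 belonged to the family of concurrent logic programming languages: rather than a sequence of instructions, a program was a set of facts and rules from which the machine inferred new knowledge, in parallel where possible. As a rough illustration of that declarative style (this is not KL1 syntax, just a minimal forward-chaining sketch in Python, with made-up facts and a single made-up rule):

# Facts are tuples; a rule derives new facts until a fixed point is reached.
facts = {("parent", "ken", "dennis"), ("parent", "dennis", "brian")}

def grandparent_rule(known):
    """If X is a parent of Y and Y is a parent of Z, infer that X is a grandparent of Z."""
    derived = set()
    for (rel1, x, y1) in known:
        for (rel2, y2, z) in known:
            if rel1 == "parent" and rel2 == "parent" and y1 == y2:
                derived.add(("grandparent", x, z))
    return derived

# Keep applying the rule until no new facts appear (a fixed point).
while True:
    new_facts = grandparent_rule(facts) - facts
    if not new_facts:
        break
    facts |= new_facts

print(facts)  # now also contains ("grandparent", "ken", "brian")

The real inference machines of course went far beyond this, resolving goals concurrently across many processors, but the flavor is the same: you state what is true, not how to compute it.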

But the problem with these machines is that, although interesting from a purely academic standpoint, they were not so from a practical one, since a machine with a general-purpose microprocessor could do the same things at a lower price and with even better performance, even in the very field of artificial intelligence. Furthermore, any improvements at the architectural level were generally very difficult to carry over to other systems, because we are talking about machines that did not even follow the von Neumann architecture.

That is why, although Japan has never considered the project a failure, not much is said about its success either. However, as a friend of mine used to say, the one who fails is not the one who achieves no goals but the one who does not even try; so, despite the apparent waste of resources and money by Japanese industry on a research project with so few positive results, it is always commendable and admirable for a country to decide to undertake this kind of project.

Saturday, January 16, 2010

Brother Mfc-490 Printer Offline

RMS: the last of the true hackers

I had long wanted to talk about today's subject. Not surprisingly, in my college years I had some (good) teachers who were very committed to the free software movement and who ended up greatly influencing my view of the world of computing in general and of software in particular.

Anyone who knows the computer world even a little has certainly heard of GNU. Perhaps you have never known exactly what it is, or you think GNU and Linux are the same thing, or perhaps you are an expert in free software philosophy and licensing and an ethical hacker. In any case, I do not intend to explain here what GNU is, but rather what its roots are, who founded the project and why he did it.

The first question is easy to answer: Richard M. Stallman (henceforth RMS, as he himself likes to be called), whom you can see in the picture. GNU was founded in 1983 by RMS, but to understand the real reasons we must go back a long way.

Several universities in the USA are well known and recognized worldwide. Off the top of one's head, Berkeley and MIT come to mind. Within this second school, as far back as the 50s, a generation of young people began to appear with a great talent for computers and, generally, rather little talent for relationships. They were the first to be recognized as hackers: the first generation.

These young people shared a common set of values, utopian and even revolutionary, where each person was measured by their competence with computers, by what they could do with them, and not by their sex, race or any other traditional measure. This group evolved over time, just like the technology they worked with and the new generations of hackers who kept arriving.

We could say that the hacker philosophy of the time was best represented by the Artificial Intelligence Laboratory at MIT. RMS arrived at the laboratory in 1971 at the hand of Russ Noftsker, who hired him as a systems programmer, work he combined with his physics studies.

There, RMS soaked up the hacker ethic and became a particularly active member of the community. However, time kept moving, and the idyllic AI Lab was starting to stop being so idyllic.

Some of the changes may not seem particularly important. Until the 70s, access to the laboratory's systems was free and without any bureaucratic hurdle. You just came in, sat down at a terminal, and had access to the same resources as everyone else. Exactly the same: hardware, printers, and even files and programs, since the concept of privacy did not exist in the laboratory. Anyone could see your files, anyone could copy your files, anyone could delete your files. But nobody did. It was a community that shared and worked for the common good, and the common good is not served by destroying the work of others.

However, although this way of thinking and working was accepted inside the laboratory, outside of it things had changed. MIT was in fact receiving warnings about the threat posed by the machines at the AI Lab on the ARPA network: anyone could connect to one of these machines, get onto the network and reach, therefore, possible military secrets.

And so user accounts with passwords arrived at the AI Lab. RMS, however, fought back however he could. For example, against the first deployment of encrypted passwords: he cracked them and sent every user a message telling them what their password was and that it would be better to leave it blank, since a blank password is much easier to type, is known to everyone, and offers about the same security as the chosen key.
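As a rough illustration of how such a crack works (a minimal dictionary-attack sketch in Python; SHA-256 is used here only as a stand-in for the much weaker hashing of that era, and both the accounts and the word list are made up):

import hashlib

def hash_password(password):
    # Stand-in for the one-way function the login system applies before storing.
    return hashlib.sha256(password.encode()).hexdigest()

# The attacker only sees the stored hashes...
stored = {"rms": hash_password("emacs"), "alice": hash_password("secret")}

# ...and simply tries a list of likely candidates against each of them.
wordlist = ["password", "secret", "emacs", "mit"]
for user, stored_hash in stored.items():
    for guess in wordlist:
        if hash_password(guess) == stored_hash:
            print(f"{user}: password is '{guess}'")
            break

The weakness being exploited is exactly the one RMS demonstrated: encrypting the stored password does not help if the passwords people choose are guessable.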

When the encryption system was upgraded and cracking the keys became harder, RMS found instead that, by slightly modifying the login program, he could have it greet each user with their own password every time they authenticated, so in the end he got at the passwords anyway. Moreover, giving ample proof of his disagreement with the whole password business, he decided that the Emacs text editor could not be installed on machines that used a password system.

Why was RMS so against passwords? Again, one must understand the philosophy behind this movement. The MIT AI Lab was characterized as a cooperative world where everyone worked for the common good, and setting passwords and adding privacy made the sharing of knowledge more difficult. For RMS, for the hackers at MIT, sharing was the way things should be. If in your work you obtained results or information that someone else might need to create or improve theirs, would it not be better for all that information to be available to everyone, always? This is how the MIT laboratory worked. And if someone argued that this could not possibly work, that anyone could sabotage your job or worse, the laboratory itself was the counterexample: it was possible, and there was the proof. But adding privacy, security, bureaucracy... broke the utopia.

However, adding passwords to user accounts was far from the greatest threat to the hacker ethic that RMS loved. By the 70s, two generations of hackers had already passed, so to speak, through the MIT laboratory, and the first "shipments" of the third were arriving. The first generation was that of the 50s, people who "grew up" with vacuum tube machines. The second generation arrived in the 60s and were the hackers of the timesharing systems. These two generations (especially the first) were already "moving on", taking on responsibilities such as a family, a job, a mortgage or rent to pay, and so on.

The third generation of hackers was different. The 70s brought a different philosophy, new paradigms and, without the guidance of the more experienced hackers, a new ethic. Just to get an idea, compare the titles of the books about RMS and Linus Torvalds, "Free as in Freedom" for the first and "Just for Fun" for the second, to understand some of the differences Linus would show even years later. For the new generations, the concept of copyright was no longer an aberration or nonsense. In the laboratory, not everything was sharing and working for the common good anymore: not only was private interest born in the lab in the form of new companies, but the lab itself became embroiled in a trade war between two of its own offspring.

This laboratory was the birthplace of the LISP programming language. While not the most popular language in the world, LISP (whose name comes from LISt Processor) was, given its characteristics and considering where it was designed, regarded as the programming language of the field of artificial intelligence.
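To give a taste of what "list processing" means (a minimal sketch in Python rather than in LISP itself, mimicking LISP's classic cons/car/cdr primitives with nested pairs):

# LISP builds everything out of pairs (cons cells); here we fake them with tuples.
def cons(head, tail):
    return (head, tail)

def car(cell):
    return cell[0]  # first element of the pair

def cdr(cell):
    return cell[1]  # the rest of the list

# The list (1 2 3) as nested cons cells, terminated by None (LISP's nil).
lst = cons(1, cons(2, cons(3, None)))

def total(cell):
    # Walk the list recursively, LISP-style, summing as we go.
    return 0 if cell is None else car(cell) + total(cdr(cell))

print(car(lst))       # 1
print(car(cdr(lst)))  # 2
print(total(lst))     # 6

Everything in LISP, including programs themselves, is represented with lists like these, which is a large part of why the language lent itself so well to the symbolic manipulation used in artificial intelligence.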

While doing a new implementation of LISP was not particularly complicated, doing a decent, competitive one was another story. Precisely for this reason, two of the lab's hackers decided to create companies to manufacture and sell LISP machines. The first, founded in 1979 by Richard Greenblatt, was LMI, an acronym for LISP Machines, Inc., which followed to some extent the ethics and values of the MIT hackers. Here you can see an image of a machine developed by this company.

Later, Russ Noftsker, the same man who had hired RMS at MIT in 1971, founded Symbolics, whose main task was... exactly: manufacturing and selling LISP machines. Both companies were therefore competitors, and both also had dealings with MIT. And, of course, both drew on the largest pool of LISP experts that existed at the time, namely the MIT AI Lab.

Symbolics (you can see an example of its machines in the picture) had a more entrepreneurial approach, with more developed marketing and business practices less ethical, from the hacker standpoint, than LMI's; and yet, perhaps precisely because of that, its success attracted far more hackers from the laboratory than LMI did. And because of that, for RMS Symbolics became the symbol of everything that was going wrong in his laboratory.

Since MIT had agreements with both companies, Symbolics was not very keen on opening up its programs because, they claimed, that could mean working for the competition. They therefore stopped providing the source code of their programs.

Although RMS did not work for either company, he did not like what Symbolics was doing ethically, so he decided to act: every time he got hold of the binary of a program, he compared it with the previous version, worked out through reverse engineering what the program now did, reimplemented it, and passed it on to LMI. It is possible, as the English Wikipedia suggests, that the reason was that he did not want any one company to gain enough of an advantage to hold a monopoly on the LISP machines. Or perhaps, as Steven Levy says in his book Hackers, it was a hacker's way of punishing Symbolics for its unethical practices.

In any case, in 1982 he told himself that this way of life could not go on: disassembling and reimplementing programs at night while studying for his doctorate in physics in the morning. He gave himself a deadline: one year. Finally, 1983 arrived, and with it the moment to rethink his life and future. It was then that he announced the GNU project, a completely free system where the community would work for the community, where red tape would not get in the way, and where the rights of no individual, however powerful, would prevail over the common good. There are many achievements left to tell, of both RMS and the GNU project. But that is another story, to be told another time...