
Fifty Glorious Years (1950-2000): the Triumph of the US Computer Engineering




Introduction

The invention of computers was the high point in the development of the US high-tech industry, the area which defined the progress of high tech as a whole. This is an area where the USA really was the greatest nation in the world, a real "shining city on a hill". The USA gave the world great programmers, hardware designers, network architects and managers. Those unique conditions, when the whole country was in effect the Silicon Valley of the world, were destroyed by the neoliberal transformation of society and, especially, the neoliberal transformation of higher education (when universities were turned into for-profit corporations run for the benefit of the university elite), which started in the 1980s and got up to full speed after 2001.

When ENIAC was declassified in 1946 (it made the front page of the New York Times), the computer revolution was set into fast motion. As early as election night 1952, a UNIVAC computer correctly predicted the winner of the presidential election. While the chances were 50% ;-), this was an impressive introduction of computers into mainstream society. IBM, DEC, CDC and later Intel, HP, Apple and Dell emerged as the leading producers of hardware. With the advent of microprocessors, all major CPUs, including the Intel x86, Motorola 68000, PowerPC, etc., were US-designed. US programmers created all major operating systems, such as OS/360, Multics, VM/CMS, VAX/VMS, Unix, CP/M, DOS, Windows, System 7 (Mac OS 7), major Linux distributions such as Red Hat and Debian, and Android. In 1967 they wrote the first hypervisor (CP-67, later renamed CP/CMS, was available to IBM customers from 1968 to 1972, in source code form without support). In 1972 they shipped the first commercial hypervisor, VM/370. Later they created a series of impressive offerings in this area too, such as VirtualBox and VMware.

Most of the leading programming languages, such as Fortran, Cobol, PL/1, PL/M, Snobol, Lisp, Scheme, Basic, C, C++, C#, Objective-C, Korn shell, Perl, PHP, Java, JavaScript and TCL, and the compilers/interpreters for them, were "made in the USA" too. From the early 1950s till approximately 2000, academic computer science was also completely dominated by US scientists. From the early 1950s the ACM was the most influential society of computer professionals, and till approximately 1975 its flagship periodical, Communications of the ACM, was the top professional publication of the field, although the British Computer Society's The Computer Journal was also of some stature and influence.

History is written by the winners, and the computer history of the XX century was definitely written in the USA. If we assume that professional success is a mixture of natural abilities, hard labor and luck (including being born at the right place at the right time), as Malcolm Gladwell suggested in his unscientific but now popular "10,000 Hour Rule", it is clear that US scientists had all three of those components. But they were not alone -- conditions in Great Britain, Germany and France were not bad either. While we should take Gladwell's findings and his 10,000-hour rule with a grain of salt, they point to one interesting observation. Most of those that I mention below were born between 1920 and 1955 -- a window of opportunity in computer science which has since virtually closed. It is similar to the 1830-1840 window for the titans of the Gilded Age, such as Rockefeller (1839), Carnegie (1835), Gould (1836) and J.P. Morgan (1837), that Gladwell mentioned. "No one—not rock stars, not professional athletes, not software billionaires, and not even geniuses—ever makes it alone", writes Gladwell.

At the same time it is important to see this history not only as "people, places and events" but also via artifacts, be they machines, programs or interviews with pioneers. This part of history is badly preserved in the USA. Moreover, there is a trend of dumbing down the history of computer science. As Donald Knuth remarked (Kailath Lecture and Colloquia):

For many years the history of computer science was presented in a way that was useful to computer scientists. But nowadays almost all technical content is excised; historians are concentrating rather on issues like how computer scientists have been able to get funding for their projects, and/or how much their work has influenced Wall Street. We no longer are told what ideas were actually discovered, nor how they were discovered, nor why they are great ideas. We only get a scorecard.

Similar trends are occurring with respect to other sciences. Historians nowadays prefer "external history" to "internal history", so that they can write stories that appeal to readers with almost no expertise.

Historians of mathematics have thankfully been resisting such temptations. In this talk the speaker will explain why he is so grateful for the continued excellence of papers on mathematical history, and he will make a plea for historians of computer science to get back on track.

History is always written by the winners, and that means right now it is written by neoliberals. Dumbing down the history of computer science is just the application of neoliberalism to one particular narrow field. In a way, the essence of the neoliberal approach to history is to dumb down everything: a deliberate lowering of the intellectual level of education, literature, cinema, news, and culture. Deliberate dumbing down is the goal.

They use the power of vanity to rob us of the vision which history can provide. Knuth's lecture "Let's Not Dumb Down the History of Computer Science" can be viewed at Kailath Lecture and Colloquia. He made the important point that historical errors are as important as achievements, and probably more educational. In this "drama of ideas" (and he mentioned the high educational value of the errors/blunders of Linus Torvalds in the design of the Linux kernel), errors and achievements all have their place and historical value. History gives people stories, which are much more educational than anything else; that is the way people learn best.

50 Giants of the field

The giants of the field were either US citizens or people who worked in the USA for a long time. Among them:

  1. Gene Amdahl (born November 16, 1922)  -- architect of  System/360 hardware.  Also formulated Amdahl's law.
  2. Frances E. Allen (born August 4, 1932) -- an American computer scientist and pioneer in the field of optimizing compilers. Her achievements include seminal work in compilers, code optimization, and parallelization. She also had a role in intelligence work on programming languages for the National Security Agency. Allen was the first female IBM Fellow and in 2006 became the first woman to win the Turing Award.
  3. John Backus (December 3, 1924 – March 17, 2007) -- designed FORTRAN and the first Fortran compiler, was one of the designers of Algol 60, and was a co-inventor of the Backus-Naur form. The IEEE awarded Backus the W.W. McDowell Award in 1967 for the development of FORTRAN. He received the National Medal of Science in 1975 and the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for publication of formal procedures for the specification of programming languages."
  4. Gordon Bell (born August 19, 1934) -- designed several of the PDP machines (PDP-4, PDP-6, the PDP-11 Unibus and VAX). The DEC founders Ken Olsen and Harlan Anderson recruited him for their new company in 1960, where he designed the I/O subsystem of the PDP-1, including the first UART. Bell was the architect of the PDP-4 and PDP-6, and made further architectural contributions to the PDP-5 and to the PDP-11 Unibus and General Registers architecture.
  5. Fred Brooks (born April 19, 1931) -- managed the development of IBM's System/360, with its innovative, ground-breaking hardware, and of OS/360, which was the dominant OS on IBM mainframes. He wrote a classic book about his experience as the manager of OS/360 development, The Mythical Man-Month (1975), which remains one of the most widely read computer books from the 1970s. Brooks received the National Medal of Technology in 1985 and the Turing Award in 1999.
  6. Vint Cerf (born June 23, 1943) -- DARPA manager, one of "the fathers of the Internet", sharing this title with Bob Kahn.
  7. John Cocke (May 30, 1925 – July 16, 2002) -- led the design and implementation of famous optimizing compilers produced by IBM (the IBM Fortran H compiler), and was one of the "fathers" of the RISC architecture. He contributed to the theory of program graph analysis. He was instrumental in the design of the IBM 801 minicomputer, based on his realization that matching the architecture's instruction set to relatively simple instructions allows compilers to produce high-performance binaries at relatively low cost. He is also one of the inventors of the CYK algorithm (C for Cocke). He was also involved in the pioneering speech recognition and machine translation work at IBM in the 1970s and 1980s, and is credited by Frederick Jelinek with originating the idea of using a trigram language model for speech recognition.
     
  8. Fernando J. Corbató (born July 1, 1926) -- a pioneer in the development of time-sharing operating systems, including the famous MIT CTSS Time-Sharing System and the Multics OS. Inventor of Corbató's Law. Essentially a godfather of Unix, which would never have happened if AT&T had not been involved in the Multics project and absorbed MIT technology during that period.
  9. Seymour Cray (September 28, 1925 – October 5, 1996) -- founder of Cray Research, "the father of supercomputing". Cray participated in the design of the ERA 1103, the first commercially successful scientific computer. By 1960 he had completed the design of the CDC 1604, an improved low-cost ERA 1103 that had impressive performance for its price range. After that he designed the CDC 6600, the first commercial supercomputer, outperforming everything then available by a wide margin. He then raised the bar further with the later release of the five-fold faster CDC 7600. In 1963, in a Business Week article announcing the CDC 6600, Seymour Cray clearly expressed an idea that is often misattributed to Herb Grosch as so-called Grosch's law: computers should obey a square law -- when the price doubles, you should get at least four times as much speed. After founding Cray Research he released the famous Cray-1 supercomputer in 1976. As with earlier Cray designs, the Cray-1 made sure that the entire computer was fast, as opposed to just the processor.
     
  10. Charles Stark Draper  (October 2, 1901 – July 25, 1987) an American scientist and engineer, known as the "father of inertial navigation". He was the founder and director of the Massachusetts Institute of Technology's Instrumentation Laboratory, later renamed the Charles Stark Draper Laboratory, which made the Apollo moon landings possible through the Apollo Guidance Computer it designed for NASA.
  11. Whitfield Diffie (born June 5, 1944) is an American cryptographer and one of the pioneers of public-key cryptography. His interest in cryptography began at "age 10 when his father, a professor, brought home the entire crypto shelf of the City College Library in New York." Diffie and Martin Hellman's paper New Directions in Cryptography was published in 1976. It introduced a new method of distributing cryptographic keys. It has become known as Diffie–Hellman key exchange. The article also seems to have stimulated the almost immediate public development of a new class of encryption algorithms, the asymmetric key algorithms. Diffie and Susan Landau's influential book Privacy on the Line was published in 1998 on the politics of wiretapping and encryption. An updated and expanded edition appeared in 2007.
  12. Brendan Eich (born 1960 or 1961) -- an American computer programmer who created the JavaScript scripting language. Later he became the chief technology officer at the Mozilla Corporation. See his site Brendan Eich
  13. Douglas Engelbart (January 30, 1925 – July 2, 2013)  -- co-inventor of the computer mouse, instrumental in the development of hypertext. These were demonstrated at The Mother of All Demos in 1968. Engelbart's Law, the observation that the intrinsic rate of human performance is exponential, is named after him.
  14. Philip Don Estridge (June 23, 1937 - August 2, 1985) -- led the team which developed the original IBM Personal Computer (PC), and thus is known as the "father of the IBM PC". His decisions dramatically changed the computer industry, resulting in a vast increase in the number of personal computers sold and bought (a computer for each family), thus creating an entire PC industry.
  15. David C. Evans (February 24, 1924 – October 3, 1998) the founder of the computer science department at the University of Utah and co-founder (with Ivan Sutherland) of Evans & Sutherland, a computer firm which is known as a pioneer in the domain of computer-generated imagery.
  16. Edward Feigenbaum (born January 20, 1936) -- a computer scientist who is often called the "father of expert systems." A former chief scientist of the Air Force, he received the U.S. Air Force Exceptional Civilian Service Award in 1997. In 1984 he was selected as one of the initial fellows of the ACMI and in 2007 was inducted as a Fellow of the ACM. In 2011, Feigenbaum was inducted into IEEE Intelligent Systems' AI's Hall of Fame for "significant contributions to the field of AI and intelligent systems".
  17. Robert W. Floyd (June 8, 1936 – September 25, 2001) -- a young genius who finished school at age 14. Mostly known as the computer scientist who invented the Floyd–Warshall algorithm (independently of Stephen Warshall), which efficiently finds all shortest paths in a graph, Floyd's cycle-finding algorithm for detecting cycles in a sequence, and the Floyd-Evans stack-based language for parsing. He was a pioneer of operator-precedence grammars. He also introduced the important concept of error diffusion for rendering images, also called Floyd–Steinberg dithering (though he distinguished dithering from diffusion). His lecture notes on sorting and searching served as a blueprint for volume three of The Art of Computer Programming (Sorting and Searching). He obtained a full professorship at Stanford without a Ph.D. He received the Turing Award in 1978. Floyd worked closely with Donald Knuth, in particular as the major reviewer for Knuth's seminal book The Art of Computer Programming, and is the person most cited in that work.
  18. Bill Gates (born October 28, 1955) -- created the FAT filesystem, and was instrumental in the creation and success of PC DOS, the Windows 95, 98, NT, 2000 and 2003 OSes, Microsoft Office and, more importantly, the whole PC ecosystem which dominates computing today. He ensured the possibility of Linux's success by marketing Xenix; see XENIX -- Microsoft Short-lived Love Affair with Unix. In a way Microsoft can be called a godfather of Linux, which would have been impossible without mass-produced Windows PC hardware.
  19. Seymour Ginsburg (1927–2004) a pioneer of automata theory, formal language theory, and database theory, in particular; and computer science, in general. Ginsburg was the first to observe the connection between context-free languages and "ALGOL-like" languages.
  20. Robert M. Graham (born in 1929) -- one of the key developers of Multics, one of the first virtual memory time-sharing computer operating systems, which broke ground for all modern operating systems. He had responsibility for protection, dynamic linking, and other key system kernel areas. In 1996 he was inducted as a Fellow of the Association for Computing Machinery. See Robert M. Graham Home Page
  21. David Gries (born April 26, 1939) -- the author of the influential 1971 book Compiler Construction for Digital Computers (John Wiley and Sons, New York, 1971, 491 pages; translated into Spanish, Japanese, Chinese, Italian and Russian). That was the first systematic exposition of compiler technology. He also participated in the development of one of the best educational compilers (for the programming language PL/C), which was probably the only real competitor to the IBM PL/1 debugging compiler in the quality of its diagnostics and correction of syntax errors.
  22. Ralph Griswold (May 19, 1934 – October 4, 2006) -- created the groundbreaking string processing languages SNOBOL, SL5, and, later, Icon.
  23. Richard Hamming (February 11, 1915 – January 7, 1998) -- his contributions include the Hamming code, the Hamming window (digital filters), Hamming numbers, sphere-packing (the Hamming bound) and the Hamming distance.
  24. Martin Hellman (born October 2, 1945) -- an American cryptologist, best known for his invention of public key cryptography in cooperation with Whitfield Diffie and Ralph Merkle. Hellman is a long-time contributor to the computer privacy debate and is more recently known for promoting risk analysis studies on nuclear threats, including via NuclearRisk.org.
  25. David A. Huffman (August 9, 1925 – October 7, 1999) -- known for his Huffman code, an optimal prefix code found using the algorithm he developed while a Ph.D. student at MIT and published in the 1952 paper "A Method for the Construction of Minimum-Redundancy Codes". Huffman's algorithm derives such a code table based on the estimated probability or frequency of occurrence (weight) of each possible value of the source symbol.
  26. Steve Jobs (February 24, 1955 – October 5, 2011) -- co-founder of Apple and the marketing force behind the NeXT computer and the iPad and iPhone brands.
  27. Bill Joy (born November 8, 1954) -- a major contributor to BSD Unix and the Solaris OS. As a UC Berkeley graduate student, Joy created the Berkeley Software Distribution (BSD) of Unix, did important work on improving the Unix kernel, and also handled the BSD distributions. Joy's speed of programming is legendary, with an oft-told anecdote that he wrote the vi editor in a weekend (Joy denies this assertion). He is the creator of the standard Unix editor vi and of the now less used but very influential C shell. Joy co-founded Sun Microsystems in 1982 along with Vinod Khosla, Scott McNealy and Andreas von Bechtolsheim, and served as chief scientist at the company until 2003.
  28. Phil Katz (November 3, 1962 – April 14, 2000) -- a computer programmer best known as the co-creator of the ZIP file format for data compression, which became the de facto standard compression format in DOS and Windows. He is the author of PKZIP, the program which pioneered the ZIP format and for a couple of decades of the DOS era was ahead of the competition in quality of compression.
  29. Alan Kay (born May 17, 1940) -- one of the key researchers at Xerox PARC and Atari's chief scientist for three years. Best known for his contribution to the Smalltalk language.
  30. Gary Kildall (May 19, 1942 – July 11, 1994) -- creator of the concept of the BIOS and of the CP/M and DR-DOS operating systems. Gary Arlen Kildall was an American computer scientist and microcomputer entrepreneur who created the CP/M operating system and founded Digital Research, Inc. (DRI). Kildall was one of the first people to see microprocessors as fully capable computers rather than equipment controllers and to organize a company around this concept. He also co-hosted the PBS TV show The Computer Chronicles. Although his career in computing spanned more than two decades, he is mainly remembered in connection with IBM's unsuccessful attempt in 1980 to license CP/M for the IBM PC.
  31. Donald Knuth (born January 10, 1938) -- made a tremendous contribution by systematizing knowledge of computer algorithms, publishing three volumes of The Art of Computer Programming (starting in 1968); see also Donald Knuth: Leonard Euler of Computer Science. He also created the TeX typesetting system, invented a specific style of programming called Literate Programming, and pioneered the experimental study of programs. While working on The Art of Computer Programming, in 1971 he published his groundbreaking paper "An empirical study of FORTRAN programs" (Software -- Practice and Experience, vol. 1, pages 105-133, 1971). In this paper he laid the foundation of the empirical analysis of computer languages by providing convincing evidence of the critical influence of the level of optimization of "inner loops" on performance, and of the fact that programs appear to exhibit a very important property termed locality of reference. He also provided a powerful argument against orthogonal languages and for introducing "Shannon code style constructs" into languages, by observing that only a small, rather primitive subset of a language is used in 90% of all statements (most arithmetic expressions on the right side of assignment statements are simple increments/decrements or a=a+c where c is a small constant). He formulated the impossibility for a programmer to correctly predict bottlenecks in programs without measurements, and the related Knuth law ("Premature optimization is the root of all evil."). He was also a courageous fighter against the early fundamentalist trends in programming promoted by the Structured Programming cult.
  32. Butler Lampson (born December 23, 1943) -- one of the founding members of Xerox PARC in 1970. In 1973 the Xerox Alto, with its three-button mouse and full-page-sized monitor, was born; it is now considered to be the first actual personal computer (at least in terms of what has become the "canonical" GUI mode of operation). At PARC, Lampson helped work on many other revolutionary technologies, such as laser printer design; two-phase commit protocols; Bravo, the first WYSIWYG text formatting program; and Ethernet, the first high-speed local area network (LAN). He also designed several influential programming languages such as Euclid.
  33. John Mauchly (August 30, 1907 – January 8, 1980) an American physicist who, along with J. Presper Eckert, designed ENIAC, the first general purpose electronic digital computer, as well as EDVAC, BINAC and UNIVAC I, the first commercial computer made in the United States.
  34. John McCarthy (September 4, 1927 – October 24, 2011) -- coined the term "artificial intelligence" (AI), developed the Lisp programming language family, significantly influenced the design of the ALGOL programming language, popularized timesharing, and was very influential in the early development of AI.
  35. Bob Miner (December 23, 1941 – November 11, 1994) -- the co-founder of Oracle Corporation and architect of Oracle's relational database. From 1977 until 1992, Bob Miner led product design and development for the Oracle relational database management system. In December 1992 he left that role and spun off a small advanced technology group within Oracle. He was an Oracle board member until October 1993.
  36. Cleve Moler -- a mathematician and computer programmer specializing in numerical analysis. In the mid to late 1970s, he was one of the authors of LINPACK and EISPACK, Fortran libraries for numerical computing. He invented MATLAB, a numerical computing package, to give his students at the University of New Mexico easy access to these libraries without writing Fortran. In 1984, he co-founded MathWorks with Jack Little to commercialize this program.
  37. Gordon E. Moore (born January 3, 1929)  an American businessman and co-founder and Chairman Emeritus of Intel Corporation and the author of Moore's Law (published in an article April 19, 1965 in Electronics Magazine).
  38. Robert Morris (July 25, 1932 – June 26, 2011)    a researcher at Bell Labs who worked on Multics and later Unix. Morris's contributions to early versions of Unix include the math library, the bc programming language, the program crypt, and the password encryption scheme used for user authentication. The encryption scheme was based on using a trapdoor function (now called a key derivation function) to compute hashes of user passwords which were stored in the file /etc/passwd; analogous techniques, relying on different functions, are still in use today.
  39. Allen Newell  (March 19, 1927 – July 19, 1992) contributed to the Information Processing Language (1956) and two of the earliest AI programs, the Logic Theory Machine (1956) and the General Problem Solver (1957) (with Herbert A. Simon). He was awarded the ACM's A.M. Turing Award along with Herbert A. Simon in 1975 for their basic contributions to artificial intelligence and the psychology of human cognition.
  40. Robert Noyce co-founded Fairchild Semiconductor in 1957 and Intel Corporation in 1968. He is also credited (along with Jack Kilby) with the invention of the integrated circuit or microchip which fueled the personal computer revolution and gave Silicon Valley its name.
  41. Ken Olsen an American engineer who co-founded Digital Equipment Corporation (DEC) in 1957 with colleague Harlan Anderson.
  42. John K. Ousterhout (born October 15, 1954) the creator of the Tcl scripting language and the Tk toolkit.
  43. Alan Perlis (April 1, 1922 – February 7, 1990)  an American computer scientist known for his pioneering work in programming languages and the first recipient of the Turing Award. In 1982, he wrote an article, Epigrams on Programming, for ACM's SIGPLAN journal, describing in one-sentence distillations many of the things he had learned about programming over his career. The epigrams have been widely quoted.
  44. Dennis Ritchie (September 9, 1941 – October 12, 2011) -- an American computer scientist who created the C programming language and, with long-time colleague Ken Thompson, the Unix operating system. Ritchie and Thompson received the Turing Award from the ACM in 1983, the Hamming Medal from the IEEE in 1990 and the National Medal of Technology from President Clinton in 1999.
  45. Claude Shannon (April 30, 1916 – February 24, 2001) -- an American mathematician, electronic engineer, and cryptographer known as "the father of information theory". He is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT), he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct and resolve any logical, numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his basic work on codebreaking and secure telecommunications.
  46. Ivan Sutherland (born May 16, 1938) an American computer scientist and Internet pioneer. He received the Turing Award from the Association for Computing Machinery in 1988 for the invention of Sketchpad, an early predecessor to the sort of graphical user interface that has become ubiquitous in personal computers. He is a member of the National Academy of Engineering, as well as the National Academy of Sciences among many other major awards. In 2012 he was awarded the Kyoto Prize in Advanced Technology for "pioneering achievements in the development of computer graphics and interactive interfaces"
  47. Richard Stallman (born March 16, 1953) -- creator of the GNU project; see also Nikolai Bezroukov. Portraits of Open Source Pioneers. Ch.3 Prince Kropotkin of Software (Richard Stallman and War of Software Clones). He is the founder of the GNU project: the second project (after FreeBSD) explicitly oriented toward the creation of clones of existing commercial software, first of all the Unix OS. Since the mid-90s the GNU project has been by and large superseded by/integrated into the Linux project and movement, but it still has its own historical place and importance due to the value of the GPL and the GNU toolchain for the free/open source software movement. He was also the initial developer of two important software packages, GCC and GNU Emacs.
  48. Robert Tarjan (born April 30, 1948) -- an American computer scientist. He discovered several important graph algorithms, including Tarjan's off-line least common ancestors algorithm, and is a co-inventor of both splay trees and Fibonacci heaps. The Hopcroft-Tarjan planarity testing algorithm was the first linear-time algorithm for planarity testing.
  49. Ken Thompson (born February 4, 1943) -- designed and implemented the original Unix operating system. He also invented the B programming language, the direct predecessor to the C programming language, and was one of the creators and early developers of the Plan 9 operating system. Thompson had developed the CTSS version of the editor QED, which included regular expressions for searching text. QED and Thompson's later editor ed (the standard text editor on Unix) contributed greatly to the eventual popularity of regular expressions.
  50. Larry Wall (born September 27, 1954) -- creator of the Perl language and the patch program. Originator of the idea of dual licensing and of the influential Artistic License. See also Slightly Skeptical View on Larry Wall and Perl

The people mentioned above are all associated with the USA, and I named just a few whose work I personally know... US computer science research was often conducted in close collaboration with British computer scientists, who also made significant contributions (some of the most impressive IBM compilers were actually designed and implemented in Britain), but the leadership role of the USA was indisputable. CACM was always a more important publication than The Computer Journal.

A large part of this unique technology culture was destroyed in the outsourcing frenzy which started around 1998, but the period from approximately 1950 till approximately 2000 was really the triumph of US computer engineering. Simultaneously it was a triumph of New Deal policies. When they were dismantled (starting with Reagan or even Carter) and neoliberalism became the ruling ideology, computer science was quickly overtaken by commercial interests and became very similar to economics in the level of corruption of academics and academic institutions.

But that did not happen overnight, and the inertia lasted till the late 90s.

Firms also did not escape this transformation into money-making machines, with IBM as a primary example of the disastrous results of such transformations, which started under the "American Express"-style leadership of Lou Gerstner -- the first of the financial manipulators to become CEO of a major technical company, a breed that would later destroy several other major US computer companies, all in the interests of shareholders and personal bonuses ;-). See IBM marry Linux to Outsourcing.

Here is a timeline, modified from History of Computer Science:

Timeline of Fifty Glorious Years


1950's

In 1949 the U.S. Army and the University of Illinois jointly fund the construction of two computers, ORDVAC and ILLIAC (ILLInois Automatic Computer), and the Digital Computer Laboratory is organized, with Ralph Meagher, a physicist and chief engineer for ORDVAC, as its head. In 1951 ORDVAC (Ordnance Discrete Variable Automatic Computer), one of the fastest computers in existence, is completed. In 1952 ORDVAC moves to the Army Ballistic Research Laboratory in Aberdeen, Maryland; it is used remotely from the University of Illinois via a teletype circuit for up to eight hours each night until the ILLIAC computer is completed.

Grace Murray Hopper (1906-1992) invented the notion of a compiler at Remington Rand in 1951 and wrote the first compiler in 1952, for the A-0 System language; the term "compiler" was coined by Hopper (see History of compiler construction). Earlier, in 1947, Hopper found the first computer "bug" -- a real one, a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.)

In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".

In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

In the same year, 1952, ILLIAC, the first computer built and owned entirely by an educational institution, becomes operational. It was ten feet long, two feet wide, and eight and one-half feet high, contained 2,800 vacuum tubes, and weighed five tons.

In the same year, 1952, IBM started work on its first magnetic disk. In September 1952, IBM opened a facility in San Jose, Calif. -- a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 RAMAC (Random Access Method of Accounting and Control), the first computer to use a moving-head hard disk drive.

In 1952 UNIVAC correctly predicted the results of the presidential elections in the USA. Remington Rand seized the opportunity to introduce themselves to America as the maker of UNIVAC -- the computer system whose name would become synonymous with computer in the 1950s. Remington Rand was already widely known as the company that made Remington typewriters. The company had bought out the struggling Eckert-Mauchly Computer Corporation in 1950. J. Presper Eckert and John Mauchly had led the ENIAC project and made one of the first commercially available computers, the UNIVAC. See the Computer History Museum's "Have you got a prediction for us, UNIVAC?"

The IBM 650 Magnetic Drum Data Processing Machine was announced 2 July 1953 (as the "Magnetic Drum Calculator", or MDC), but not delivered until December 1954 (same time as the NORC). Principal designer: Frank Hamilton, who had also designed ASCC and SSEC. Two IBM 650s were installed at IBM Watson Scientific Computing Laboratory at Columbia University, 612 West 116th Street, beginning in August 1955.

Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos. For example, see his opinions of some programming languages).
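As a modern aside, the core of that shortest-path idea is small enough to sketch. The following is a minimal, hedged illustration in Python: it uses the priority-queue (heap) refinement that came much later rather than Dijkstra's original 1956 formulation, and the tiny example graph is invented purely for demonstration.

    import heapq

    def dijkstra(graph, source):
        # graph: dict mapping node -> list of (neighbor, edge_weight) pairs
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue  # stale queue entry, a shorter path was already found
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd          # found a shorter path to v
                    heapq.heappush(heap, (nd, v))
        return dist

    # Hypothetical example graph, for illustration only:
    g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1), ("d", 4)], "c": [("d", 1)], "d": []}
    print(dijkstra(g, "a"))   # -> {'a': 0, 'b': 2, 'c': 3, 'd': 4}

The point of the sketch is only to show how little machinery the idea needs; Dijkstra, of course, worked it out for machines with no such conveniences.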

In 1956 IBM 305 RAMAC was announced. It was the first commercial computer that used a moving head hard disk drive (magnetic disk storage) for secondary storage. The 305 was one of the last vacuum tube computers that IBM built. The IBM 350 disk system stored 5 million 8-bit (7 data bits plus 1 parity bit) characters. It had fifty 24-inch-diameter (610 mm) disks.

The same year the computing center of the Case Institute of Technology got an IBM 650; that same year Donald Knuth entered the college and managed to start working at the computing center. That work later led to the creation of his three-volume series The Art of Computer Programming -- "the bible of programming", as it was called.

On October 4, 1957, the first artificial Earth satellite, Sputnik, was launched by the USSR into an elliptical low Earth orbit. In a way it was a happenstance due to the iron will and talent of Sergey Korolev, the charismatic head of the USSR rocket program (who had actually served some years in the GULAG), but it opened a new era. ILLIAC I (Illinois Automatic Computer), the pioneering computer built in 1952 by the University of Illinois and the first computer built and owned entirely by a US educational institution, was the first to calculate Sputnik's orbit. The launch of Sputnik led to the creation of NASA and, indirectly, of the US Advanced Research Projects Agency (later DARPA) in February 1958 to regain the technological lead. It also led to a dramatic increase in U.S. government spending on scientific research and education via President Eisenhower's bill, the National Defense Education Act. This bill encouraged students to go to college and study math and science, with their tuition fees paid for, and led to a new emphasis on science and technology in American schools. In other words, Sputnik created the building blocks which largely shaped the way computer science developed in the USA for the next decade or two. DARPA later funded the creation of the TCP/IP protocol and the Internet as we know it, and also contributed to the development of large integrated circuits. The rivalry in space, even though it had military reasons, served as a tremendous push forward for computers and computer science.

John Backus and his team developed the first complete compiler, the FORTRAN compiler, delivered in April 1957. FORTRAN stands for FORmula TRANslating system. Backus went on to contribute to the development of ALGOL and the well-known syntax-specification system known as BNF. The first FORTRAN compiler took 18 person-years to create.

LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. The same year Alan Perlis, John Backus, Peter Naur and others developed Algol.

In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.

In 1959 LISP 1.5 appears. The same year COBOL is created by the Conference on Data Systems and Languages (CODASYL).

See also Knuth Biographic Notes


1960's

In the 1960s computer science came into its own as a discipline; in fact, this decade became a golden age of computer science. The term "computer science" itself was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962, and the first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965.

Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machines to top-of-the-line. DEC designed the PDP series. The first PDP-1 was delivered to Bolt, Beranek and Newman in November 1960 and formally accepted the next April. The PDP-1 sold in basic form for $120,000, or about $900,000 in 2011 US dollars. By the time production ended in 1969, 53 PDP-1s had been delivered. At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.

In 1960 ALGOL 60, the first block-structured language, appears. This is the root of the family tree that will ultimately produce PL/1, Algol 68, Pascal, Modula, C, Java, C# and other languages. ALGOL became a popular language in Europe in the mid- to late 1960s. Attempts to simplify Algol led to the creation of BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)), which became very popular with the PC revolution.

The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky introduced the notion of context free languages and later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.

Sometime in the early 1960s, Kenneth Iverson begins work on the language that will become APL (A Programming Language). It uses a specialized character set that, for proper use, requires APL-compatible I/O devices. APL is documented in Iverson's book A Programming Language, published in 1962.

In 1962 ILLIAC II, a transistorized computer 100 times faster than the original ILLIAC, becomes operational. ACM Computing Reviews said of the machine, "ILLIAC II, at its conception in the mid-1950s, represents the spearhead and breakthrough into a new generation of machines." In 1963 Professor Donald B. Gillies discovered three Mersenne primes while testing ILLIAC II, including the largest prime number then known, 2**11213 - 1, which has over 3,000 digits.

The famous IBM System/360 (S/360) was first announced by IBM on April 7, 1964. The S/360 became the most popular computer system for more than a decade. It introduced the 8-bit byte, byte addressing, and many other things. The same year (1964) PL/1 was announced. It became the most widely used programming language in Eastern Europe and the USSR, and later served as a prototype for several other languages, including PL/M and C.

In 1964 the IBM 2311 Direct Access Storage Facility was introduced (see History of IBM magnetic disk drives) for the System/360 series. It was also available on the IBM 1130 and (using the 2841 Control Unit) the IBM 1800. The 2311 mechanism was largely identical to the 1311, but recording improvements allowed higher data density. The 2311 stored 7.25 megabytes on a single removable IBM 1316 disk pack (the same type used on the IBM 1311) consisting of six platters that rotated as a single unit. Each recording surface had 200 tracks plus three optional tracks which could be used as alternatives in case faulty tracks were discovered. Average seek time was 85 ms. Data transfer rate was 156 kB/s.

Along with the development of the unified System/360 series of computers, IBM wanted a single programming language for all users. It hoped that Fortran could be extended to include the features needed by commercial programmers. In October 1963 a committee was formed, composed originally of 3 IBMers from New York and 3 members of SHARE, the IBM scientific users group, to propose these extensions to Fortran. Given the constraints of Fortran, they were unable to do this and embarked on the design of a "new programming language" based loosely on Algol, labeled "NPL". This acronym conflicted with that of the UK's National Physical Laboratory and was replaced briefly by MPPL (MultiPurpose Programming Language) and, in 1965, with PL/I (with a Roman numeral "I"). The first definition appeared in April 1964. IBM took NPL as a starting point and completed the design to a level at which the first compiler could be written; the NPL definition was incomplete in scope and in detail. Control of the PL/I language was vested initially in the New York Programming Center and later at the IBM UK Laboratory at Hursley. The SHARE and GUIDE user groups were involved in extending the language and had a role in IBM's process for controlling the language through their PL/I Projects.

The language was first specified in detail in the manual "PL/I Language Specifications. C28-6571", written in New York from 1965, and superseded by "PL/I Language Specifications. GY33-6003", written in Hursley from 1967. IBM continued to develop PL/I in the late sixties and early seventies, publishing it in the GY33-6003 manual. These manuals were used by the Multics group and other early implementers. The first production PL/I compiler was the PL/I F compiler for the OS/360 operating system, built by John Nash's team at Hursley in the UK; the runtime library team was managed by I.M. (Nobby) Clarke. Release 1 shipped in 1966. That was a significant step forward in comparison with earlier compilers. The PL/I D compiler, using 16 kilobytes of memory, was developed by IBM Germany for the DOS/360 low-end operating system. It implemented a subset of the PL/I language requiring all strings and arrays to have fixed extents, thus simplifying the run-time environment. Reflecting the underlying operating system, it lacked dynamic storage allocation and the controlled storage class. It was shipped within a year of PL/I F.

C.A.R. Hoare invented Quicksort while on a business trip to Moscow.

Douglas C. Engelbart invents the computer mouse c. 1968, at SRI.

The first volume of The Art of Computer Programming was published in 1968 and instantly became a classic. Donald Knuth (b. 1938) later published two additional volumes of his world-famous treatise.

In 1968 ALGOL 68, a monster language compared to ALGOL 60, appears. Some members of the specifications committee -- including C.A.R. Hoare and Niklaus Wirth -- protest its approval. ALGOL 68 proves difficult to implement. The same year Niklaus Wirth begins his work on a simple teaching language which later becomes Pascal.

Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.

In the late 60s the PDP-11, one of the first 16-bit minicomputers, was designed in a crash program by Harold McFarland, Gordon Bell, Roger Cady, and others as a response to the 16-bit Data General Nova minicomputers. The project was able to leap forward in design with the arrival of Harold McFarland, who had been researching 16-bit designs at Carnegie Mellon University. One of his simpler designs became the PDP-11. It was launched in 1970 and became a huge success. The first officially named version of Unix ran on the PDP-11/20 in 1970. It is commonly stated that the C programming language took advantage of several low-level PDP-11-dependent programming features, albeit not originally by design. A major advance in the PDP-11 design was Digital's Unibus, which supported all peripherals through memory mapping. This allowed a new device to be added easily, generally requiring only plugging a hardware interface board into the backplane and then installing software that read and wrote to the mapped memory to control it. The relative ease of interfacing spawned a huge market of third-party add-ons for the PDP-11, which made the machine even more useful. The combination of architectural innovations proved superior to competitors, and the "11" architecture was soon the industry leader, propelling DEC back to a strong market position.

A second generation of programming languages, such as Basic, Algol 68 and Pascal (designed by Niklaus Wirth in 1968-1969), appeared at the end of the decade.


1970's

Flat uniform record (relational) databases got a fashionable pseudo-theoretical justification with the work of Edgar F. Codd. While mostly nonsense, it helped to spread relational databases, which became the dominant type of database. That was probably one of the first bouts of fashion in computer science; many more followed. Codd won the Turing Award in 1981.

Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941) after AT&T withdrew from the Multics project. Ritchie developed C (later popularized by the classic Kernighan and Ritchie book), which became the most influential systems programming language and was also used as a general-purpose language on personal computers. The first release of C was made in 1972; the definitive reference manual for it would not appear until 1974.

In the early 1970s the PL/I Optimizer and Checkout compilers produced in Hursley supported a common level of the PL/I language and aimed to replace the PL/I F compiler. The compilers had to produce identical results -- the Checkout Compiler was used to debug programs that would then be submitted to the Optimizer. Given that the compilers had entirely different designs and were handling the full PL/I language, this goal was challenging; it was achieved. The PL/I Optimizing Compiler took over from the PL/I F compiler and was IBM's workhorse compiler from the 1970s to the 1990s. Like PL/I F, it was a multiple-pass compiler with a 44-kilobyte design point, but it was an entirely new design. Unlike the F compiler, it had to perform compile-time evaluation of constant expressions using the run-time library, reducing the maximum memory for a compiler phase to 28 kilobytes. A second-time-around design, it succeeded in eliminating the annoyances of PL/I F such as cascading diagnostics. It was written in S/360 Macro Assembler by a team, led by Tony Burbridge, most of whom had worked on PL/I F. Macros were defined to automate common compiler services and to shield the compiler writers from the task of managing real-mode storage, allowing the compiler to be moved easily to other memory models. The program optimization techniques developed for the contemporary IBM Fortran H compiler were deployed: the Optimizer equaled Fortran execution speeds in the hands of good programmers. Announced with the IBM S/370 in 1970, it shipped first for the DOS/360 operating system in August 1971, and shortly afterward for OS/360 and the first virtual memory IBM operating systems OS/VS1, MVS and VM/CMS (the developers were unaware that while they were shoehorning the code into 28 kB sections, IBM Poughkeepsie was finally ready to ship virtual memory support in OS/360). It supported the batch programming environments and, under TSO and CMS, it could be run interactively.

Simultaneously PL/C, a dialect of PL/1 for education, was developed at Cornell University in the early 1970s. It was designed with the specific goal of being used for teaching programming. The main authors were Richard W. Conway and Thomas R. Wilcox. They published the famous article "Design and implementation of a diagnostic compiler for PL/I" in the Communications of the ACM in March 1973. PL/C eliminated some of the more complex features of PL/I and added extensive debugging and error-recovery facilities. The PL/C compiler had the unusual capability of never failing to compile any program, through the use of extensive automatic correction of many syntax errors and by converting any remaining syntax errors to output statements.

In 1972 Gary Kildall implemented a subset of PL/1, called PL/M, for microprocessors. PL/M was used to write the CP/M operating system and much application software running on CP/M and MP/M. Digital Research also sold a PL/I compiler for the PC written in PL/M. PL/M was used to write much other software at Intel for the 8080, 8085, and Z-80 processors during the 1970s.

In 1973-74 Gary Kildall developed CP/M, an operating system for an Intel Intellec-8 development system equipped with a Shugart Associates 8-inch floppy disk drive interfaced via a custom floppy disk controller. It was written in PL/M. Various aspects of CP/M were influenced by the TOPS-10 operating system of the DECsystem-10 mainframe computer, which Kildall had used as a development environment.

The LSI-11 (PDP-11/03), introduced in February 1975, was the first PDP-11 model produced using large-scale integration, a precursor to the personal computer.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.

In March 1976 one of the first supercomputers, the CRAY-1, designed by Seymour Cray (b. 1925), was shipped. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Later Cray Research was taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity. In 1971 Steve Cook published his seminal paper on NP-completeness, and shortly thereafter Richard Karp showed that many natural combinatorial problems were NP-complete. In 1976 Whit Diffie and Martin Hellman published the paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
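To give a flavor of what public-key cryptography means in practice, here is a toy, hedged sketch of the Diffie-Hellman key exchange in Python. The parameters are deliberately tiny and insecure, made up purely for illustration; real systems use primes hundreds of digits long.

    # Toy Diffie-Hellman exchange (illustrative only; numbers far too small for real use)
    p, g = 23, 5                  # public prime modulus and generator (made-up toy values)
    a, b = 6, 15                  # Alice's and Bob's private exponents
    A = pow(g, a, p)              # Alice publishes g^a mod p
    B = pow(g, b, p)              # Bob publishes g^b mod p
    alice_key = pow(B, a, p)      # Alice computes (g^b)^a mod p
    bob_key = pow(A, b, p)        # Bob computes (g^a)^b mod p
    assert alice_key == bob_key   # both sides derive the same shared secret
    print(alice_key)              # -> 2 with these toy values

An eavesdropper who sees only p, g, A and B would have to solve a discrete logarithm problem to recover the shared key, which is what makes the scheme practical at realistic sizes.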

Microsoft was formed on April 4, 1975 to develop and sell BASIC interpreters for the Altair 8800. Bill Gates and Paul Allen wrote a version of BASIC that they sold to MITS (Micro Instrumentation and Telemetry Systems) on a per-copy royalty basis. MITS was producing the Altair, one of the earliest 8080-based microcomputers, which came with an interpreter for a programming language.

The Apple I went on sale in July 1976 and was market-priced at $666.66 ($2,572 in 2011 dollars, adjusted for inflation.)

The Apple II was introduced on April 16, 1977 at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because it came with color graphics and an open architecture. While early models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface, the Disk II.

In 1976, DEC decided to extend the PDP-11 architecture to 32 bits while adding a complete virtual memory system to the simple paging and memory protection of the PDP-11. The result was the VAX architecture. The first computer to use a VAX CPU was the VAX-11/780, which DEC referred to as a superminicomputer. Although it was not the first 32-bit minicomputer, the VAX-11/780's combination of features, price, and marketing almost immediately propelled it to a leadership position in the market after it was released in 1978. VAX systems were so successful that they propelled Unix to the status of a major OS. In 1983, DEC canceled its Jupiter project, which had been intended to build a successor to the PDP-10 mainframe, and instead focused on promoting the VAX as the single computer architecture for the company.

In 1978 AWK -- a text-processing language named after the designers, Aho, Weinberger, and Kernighan -- appears. The same year the ANSI standard for FORTRAN 77 appears.

In 1977 Bill Joy, then a graduate student at Berkeley, started compiling the first Berkeley Software Distribution (1BSD), which was released on March 9, 1978

In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

The Second Berkeley Software Distribution (2BSD), was released in May 1979. It included updated versions of the 1BSD software as well as two new programs by Joy that persist on Unix systems to this day: the vi text editor (a visual version of ex) and the C shell.

In the same year, 1979, VisiCalc, the first spreadsheet program available for personal computers, was conceived by Dan Bricklin, refined by Bob Frankston, developed by their company Software Arts, and distributed by Personal Software (later named VisiCorp) for the Apple II computer.

The kernel of BSD Unix was largely rewritten by Berkeley students to include a virtual memory implementation, and a complete operating system, including the new kernel and ports of the 2BSD utilities to the VAX, was released as 3BSD at the end of 1979.

Microsoft purchased a license for Version 7 Unix from AT&T in 1979, and announced on August 25, 1980 that it would make it available for the 16-bit microcomputer market.


1980's

The success of 3BSD was a major factor in the Defense Advanced Research Projects Agency's (DARPA) decision to fund Berkeley's Computer Systems Research Group (CSRG), which would develop a standard Unix platform for future DARPA research in the VLSI Project, including a TCP/IP stack. CSRG released 4BSD, containing numerous improvements to the 3BSD system, in November 1980. It offered a number of enhancements over 3BSD, notably job control in the previously released csh, delivermail (the antecedent of sendmail), "reliable" signals, and the Curses programming library.

This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.

In 1981 the IBM PC was launched, which made the personal computer mainstream. The first computer viruses were also developed in 1981; the term was coined by Leonard Adleman, now at the University of Southern California. The same year, 1981, the first truly successful portable computer (a predecessor of modern laptops), the Osborne I, was marketed.

In 1982 one of the first scripting languages REXX was released by IBM as a product. It was four years after AWK was released. Over the years IBM included REXX in almost all of its operating systems (VM/CMS, VM/GCS, MVS TSO/E, AS/400, VSE/ESA, AIX, CICS/ESA, PC DOS, and OS/2), and has made versions available for Novell NetWare, Windows, Java, and Linux.

In 1982 PostScript appears, which revolutionized printing on dot matrix and laser printers.

1983 was a year of major events in the software area:

4.2BSD took over two years to implement and contained several major overhauls, incorporating a modified version of BBN's preliminary TCP/IP implementation and the new Berkeley Fast File System, implemented by Marshall Kirk McKusick. The official 4.2BSD release came in August 1983. In the same year, 1983, Stallman resigned from MIT to start the GNU project, with the explicit goal of reimplementing Unix as a "free" operating system. The name stands for "GNU is Not Unix."

In 1984 Stallman began work on GNU Emacs, a rewrite of Gosling's Emacs (Gosling had sold the rights to his code to a commercial company). It was published as "free" software in 1985, the same year Stallman launched the Free Software Foundation (FSF) to support the GNU project. One of the first programs he decided to write was a C compiler, which became widely known as GCC. In 1984 Steven Levy's book "Hackers" was published, with a chapter devoted to RMS that presented him in an extremely favorable light.

In January 1984 Apple introduced the Macintosh, the first mass-produced GUI-based personal computer. It came three years after the launch of the IBM PC and almost seven years after the Apple II. The Macintosh went on sale on January 24, 1984, two days after the US$1.5 million Ridley Scott television commercial "1984" was aired during Super Bowl XVIII on January 22, 1984. The commercial is now considered a masterpiece. In it an unnamed heroine represents the coming of the Macintosh (indicated by a Picasso-style picture of Apple's Macintosh computer on her white tank top) as a means of saving humanity from the "conformity" of IBM's attempts to dominate the computer industry.

In 1985 Intel introduced the 80386, which provided 32-bit logical addressing. It became instrumental in the PC Unix renaissance, which started the same year with the launch of Xenix 2.0 by Microsoft. Xenix was based on UNIX System V; an update numbered 2.1.1 added support for the Intel 80286 processor. The Sperry PC/IT, an IBM PC AT clone, was advertised as capable of supporting eight simultaneous dumb-terminal users under this version. Subsequent releases improved System V compatibility. The era of PC Unix had begun, and Microsoft became the dominant vendor of Unix: in the late 1980s, Xenix was, according to The Design and Implementation of the 4.3BSD UNIX Operating System, "probably the most widespread version of the UNIX operating system, according to the number of machines on which it runs". In 1987, SCO ported Xenix to the 386 processor. Microsoft used Xenix on Sun workstations and VAX minicomputers extensively within the company as late as 1992.

Microsoft Excel was first released in 1985 for the Macintosh, not the IBM PC. The same year the combination of the Mac, Apple's LaserWriter printer, and Mac-specific software like Boston Software's MacPublisher and Aldus PageMaker enabled users to design, preview, and print page layouts complete with text and graphics, an activity that became known as desktop publishing.

The first version of GCC was able to compile itself in late 1985. The same year the GNU Manifesto was published.

In 1986-1989 a series of computer viruses for PC DOS made headlines. One of the first mass viruses was a boot-sector virus called Brain, created in 1986 by the Farooq Alvi brothers in Lahore, Pakistan, reportedly to deter piracy of the software they had written.

In 1987, the US National Science Foundation started NSFnet, a precursor to part of today's Internet.

The same year, 1987, Perl was released by Larry Wall. In 1988 Perl 2 was released.
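Perl quickly became a workhorse of Unix text processing, absorbing much of the role of awk and sed in system administration scripts. As a minimal sketch of that style of use (a hypothetical example written in modern Perl syntax, not code from the 1987 release), a small filter that counts matching lines might look like this:

    #!/usr/bin/perl
    # Hypothetical example: count lines matching a pattern in the files given
    # on the command line (or on stdin), in the AWK/sed-like spirit of early Perl.
    use strict;
    use warnings;

    my $count = 0;
    while (my $line = <>) {             # <> reads line by line from named files or stdin
        $count++ if $line =~ /error/i;  # case-insensitive match; "error" is just an example pattern
    }
    print "Matched $count line(s)\n";

Usage would be something like: perl count_errors.pl /var/log/messages (the script name and log file path are illustrative only).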

Steve Jobs, ousted from Apple in 1985, formed a new company, NeXT Computer, with a group of former Apple employees. The NeXT Computer, released in 1988, was the first affordable workstation with over a megaflop of computing power; the smaller NeXTstation followed in 1990. It was a NeXT machine that was used to develop the World Wide Web at CERN. NeXT was also instrumental in creating complex modern GUI interfaces and launching object-oriented programming into the mainstream.

In 1988 the Human Genome sequencing project started. From A Brief History of the Human Genome Project:

In 1988, Congress funded both the NIH and the DOE to embark on further exploration of this concept, and the two government agencies formalized an agreement by signing a Memorandum of Understanding to "coordinate research and technical activities related to the human genome."

James Watson was appointed to lead the NIH component, which was dubbed the Office of Human Genome Research. The following year, the Office of Human Genome Research evolved into the National Center for Human Genome Research (NCHGR).

In 1990, the initial planning stage was completed with the publication of a joint research plan, "Understanding Our Genetic Inheritance: The Human Genome Project, The First Five Years, FY 1991-1995." This initial research plan set out specific goals for the first five years of what was then projected to be a 15-year research effort.

In 1992, Watson resigned, and Michael Gottesman was appointed acting director of the center. The following year, Francis S. Collins was named director.

The advent and employment of improved research techniques, including the use of restriction fragment-length polymorphisms, the polymerase chain reaction, bacterial and yeast artificial chromosomes and pulsed-field gel electrophoresis, enabled rapid early progress. Therefore, the 1990 plan was updated with a new five-year plan announced in 1993 in the journal Science (262: 43-46; 1993).

In 1989 the FSF introduced the General Public License (GPL), also known as "copyleft". Stallman redefined the word "free" in software to mean "GPL-compatible". In 1990, as president of the League for Programming Freedom (an organization that fights software patents), Stallman was given a $240,000 fellowship by the John D. and Catherine T. MacArthur Foundation.


1990's  

Microsoft Windows 3.0, which began to approach the Macintosh operating system in both performance and feature set, was released in May 1990 and was a less expensive alternative to the Macintosh platform.

4.3BSD-Reno came in early 1990. It was an interim release during the early development of 4.4BSD, and its use was considered a "gamble", hence the naming after the gambling center of Reno, Nevada. This release was explicitly moving towards POSIX compliance. Among the new features was an NFS implementation from the University of Guelph. In August 2006, Information Week magazine rated 4.3BSD as the "Greatest Software Ever Written", commenting: "BSD 4.3 represents the single biggest theoretical undergirder of the Internet."

On December 25, 1990, the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet was accomplished at CERN. It was running on a NeXT computer:

" Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim [Berners-Lee]. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in "surfing the Internet" are mere passive windows, depriving the user of the possibility to contribute. During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology. Tim proposes "World-Wide Web". I like this very much, except that it is difficult to pronounce in French..." by Robert Cailliau, 2 November 1995.[22]

In 1991 Linux was launched. The same year the USSR was dissolved, which led to an influx of Russian programmers (as well as programmers from Eastern European countries) into the USA.

The first website was online on 6 August 1991:

"Info.cern.ch was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. You may find a later copy (1992) on the World Wide Web Consortium website." -CERN

BSDi, the company formed to commercialize the BSD Unix system, found itself in legal trouble with AT&T's Unix System Laboratories (USL) subsidiary, then the owner of the System V copyright and the Unix trademark. The USL v. BSDi lawsuit was filed in 1992 and led to an injunction on the distribution of Net/2 until the validity of USL's copyright claims on the source could be determined. That launched Linux into the mainstream.

FreeBSD development began in 1993 with a quickly growing, unofficial patchkit maintained by users of the 386BSD operating system. This patchkit forked from 386BSD and grew into an operating system taken from U.C. Berkeley's 4.3BSD-Lite (Net/2) tape with many 386BSD components and code from the Free Software Foundation.

In April 1993 CERN released the Web technology into the public domain.

In 1994 the first official Linux kernel, version 1.0, was released. Linux already had about 500,000 users, and the Unix renaissance gathered pace.

The same year, 1994, Microsoft incorporated Visual Basic for Applications into Excel, creating a way to knock out competitors of Microsoft Office.

In February 1995, ISO accepts the 1995 revision of the Ada language. Called Ada 95, it includes OOP features and support for real-time systems. 

In 1995 TCP/IP connectivity in the USA became mainstream, and the Internet boom (aka the dot-com boom) hit the USA. Red Hat was formed by a merger with ACC, with Robert Young of ACC (a co-founder of Linux Journal) as CEO.

In 1996 the first enterprise monitoring systems, such as Tivoli and OpenView, became established players.

In 1996 a draft of the ANSI C++ standard was circulated; the final standard followed in 1998.

In 1996 Java was released. Although a weak and primitive programming language if we consider its design (it was originally intended for embedded systems), it proved to be a durable and successful successor to Cobol. Sun Microsystems proved to be a capable marketing machine, but that led to a deterioration of Solaris's position and partial neglect of other projects such as Solaris on x86 and TCL. Microsoft later launched a successful derivative of Java, called C#, in 2002.

In 1998 outsourcing, which within ten years would devastate the USA programming industry, became fashionable, fueled by the financial industry's attempts to exploit the Internet boom for quick profits.

In 1999 a craze connected with the so-called Millennium bug (Y2K) hit the USA. It demonstrated the lasting intellectual deterioration of some key US political figures, including Fed chairman Greenspan, a cult-like figure at the time.

In March 1999 Al Gore stated that "During my service in the United States Congress, I took the initiative in creating the Internet," which was partially true.

The decade ended with the dot-com bust of 2000. See Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch. 4: Grand Replicator aka Benevolent Dictator (A Slightly Skeptical View on Linus Torvalds).


Old News ;-)


[Feb 21, 2017] Designing and managing large technologies

Feb 21, 2017 | economistsview.typepad.com
RC AKA Darryl, Ron : February 20, 2017 at 04:48 AM
RE: Designing and managing large technologies

http://understandingsociety.blogspot.com/2017/02/designing-and-managing-large.html

[This is one of those days where the sociology is better than the economics or even the political history.]

What is involved in designing, implementing, coordinating, and managing the deployment of a large new technology system in a real social, political, and organizational environment? Here I am thinking of projects like the development of the SAGE early warning system, the Affordable Care Act, or the introduction of nuclear power into the civilian power industry.

Tom Hughes described several such projects in Rescuing Prometheus: Four Monumental Projects That Changed the Modern World. Here is how he describes his focus in that book:

Telling the story of this ongoing creation since 1945 carries us into a human-built world far more complex than that populated earlier by heroic inventors such as Thomas Edison and by firms such as the Ford Motor Company. Post-World War II cultural history of technology and science introduces us to system builders and the military-industrial-university complex. Our focus will be on massive research and development projects rather than on the invention and development of individual machines, devices, and processes. In short, we shall be dealing with collective creative endeavors that have produced the communications, information, transportation, and defense systems that structure our world and shape the way we live our lives. (3)

The emphasis here is on size, complexity, and multi-dimensionality. The projects that Hughes describes include the SAGE air defense system, the Atlas ICBM, Boston's Central Artery/Tunnel project, and the development of ARPANET...


[Of course read the full text at the link, but here is the conclusion:]


...This topic is of interest for practical reasons -- as a society we need to be confident in the effectiveness and responsiveness of the planning and development that goes into large projects like these. But it is also of interest for a deeper reason: the challenge of attributing rational planning and action to a very large and distributed organization at all. When an individual scientist or engineer leads a laboratory focused on a particular set of research problems, it is possible for that individual (with assistance from the program and lab managers hired for the effort) to keep the important scientific and logistical details in mind. It is an individual effort. But the projects described here are sufficiently complex that there is no individual leader who has the whole plan in mind. Instead, the "organizational intentionality" is embodied in the working committees, communications processes, and assessment mechanisms that have been established.

It is interesting to consider how students, both undergraduate and graduate, can come to have a better appreciation of the organizational challenges raised by large projects like these. Almost by definition, study of these problem areas in a traditional university curriculum proceeds from the point of view of a specialized discipline -- accounting, electrical engineering, environmental policy. But the view provided from a discipline is insufficient to give the student a rich understanding of the complexity of the real-world problems associated with projects like these. It is tempting to think that advanced courses for engineering and management students could be devised making extensive use of detailed case studies as well as simulation tools that would allow students to gain a more adequate understanding of what is needed to organize and implement a large new system. And interestingly enough, this is a place where the skills of humanists and social scientists are perhaps even more essential than the expertise of technology and management specialists. Historians and sociologists have a great deal to add to a student's understanding of these complex, messy processes.

[A big YEP to that.]


cm -> RC AKA Darryl, Ron... , February 20, 2017 at 12:32 PM
Another rediscovery that work is a social process. But certainly well expressed.

It (or the part you quoted) also doesn't say, but hints at the obvious "problem" - social complexity and especially the difficulty of managing large scale collaboration. Easier to do when there is a strong national or comparable large-group identity narrative, almost impossible with neoliberal YOYO. You can always compel token effort but not the "intangible" true participation.

People are taught to ask "what's in it for me", but the answer better be "the same as what's in it for everybody else" - and literally *everybody*. Any doubts there and you can forget it. The question will usually not be asked explicitly or in this clarity, but most people will still figure it out - if not today then tomorrow.

[Jan 11, 2017] Fake History Alert Sorry BBC, but Apple really did invent the iPhone

Notable quotes:
"... In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). ..."
"... The N900, N810 and N800 are to this day far more "little computers" than any other smartphone so far. Indeed, as they ran a Debian Linux derivative with a themed Enlightenment based desktop, which is pretty much off the shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision though. ..."
"... I was reading a BBC news web article and it was wrong too. It missed out emphasising that the real reason for success in 2007 was the deals with operators, cheap high cap data packages, often bundled with iPhone from the Mobile Operator. ..."
"... Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls. ..."
"... I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality ..."
"... Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests. ..."
"... Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS. ..."
"... Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware. ..."
"... The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold. ..."
"... The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles. ..."
"... He identified correctly, as many had before but before the power to do anything about it, that the customers are the final users, not the telcos. ..."
Jan 11, 2017 | theregister.co.uk

deconstructionist

Re: The point stands

the point is flat on it's back just like the sophistic reply.

Lets take apples first machines they copied the mouse from Olivetti , they took the OS look from a rank XEROX engineers work, the private sector take risks and plagiarize when they can, but the missing person here is the amateur, take the BBS private individuals designed, built and ran it was the pre cursor to the net and a lot of .com company's like AOL and CompuServe where born there.

And the poor clarity in the BBC article is mind numbing, the modern tech industry has the Fairchild camera company as it's grand daddy which is about as far from federal or state intervention and innovation as you can get .

Deconstructionism only works when you understand the brief and use the correct and varied sources not just one crackpot seeking attention.

Lotaresco

Re: Engineering change at the BBC?

"The BBC doesn't "do" engineering "

CEEFAX, PAL Colour TV, 625 line transmissions, The BBC 'B', Satellite Broadcasting, Digital Services, the iPlayer, micro:bit, Smart TV services.

There's also the work that the BBC did in improving loudspeakers including the BBC LS range. That work is one reason that British loudspeakers are still considered among the world's best designs.

By all means kick the BBC, but keep it factual.

LDS

Re: I thought I invented it.

That was the first market demographics - iPod users happy to buy one who could also make calls. But that's also were Nokia failed spectacularly - it was by nature phone-centric. Its models where phones that could also make something else. True smartphones are instead little computers that can also make phone calls.

In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard).

the iPod probably taught Apple you need a proper "finger based" UI for this kind of devices - especially for the consumer market - and multitouch solved a lot of problems.

Emmeran

Re: I thought I invented it.

Shortly there-after I duct-taped 4 of them together and invented the tablet.

My version of it all is that the glory goes to iTunes for consumer friendly interface (ignore that concept Linux guys) and easy music purchases, the rest was natural progression and Chinese slave labor.

Smart phones and handheld computers were definitely driven by military dollars world wide but so was the internet. All that fact shows is that a smart balance of Capitalism & Socialism can go a long way.

Ogi

Re: I thought I invented it.

>That was the first market demographics - iPod users happy to buy one who could also make calls. But that's also were Nokia failed spectacularly - it was by nature phone-centric. Its models where phones that could also make something else. True smartphones are instead little computers that can also make phone calls. In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). the iPod probably taught Apple you need a proper "finger based" UI for this kind of devices - especially for the consumer market - and multitouch solved a lot of problems.

I don't know exactly why Nokia failed, but it wasn't because their smart phones were "phone centric". The N900, N810 and N800 are to this day far more "little computers" than any other smartphone so far. Indeed, as they ran a Debian Linux derivative with a themed Enlightenment based desktop, which is pretty much off the shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision though.

I could apt-get (with some sources tweaking) what I wanted outside of their apps. You could also compile and run proper Linux desktop apps on it, including openoffice (back in the day). It ran like a dog and didn't fit the "mobile-UI" they created, but it worked.

It also had a proper X server, so I could forward any phone app to my big PC if I didn't feel like messing about on a small touchscreen. To this day I miss this ability. To just connect via SSH to my phone over wifi, run an smartphone app, and have it appear on my desktop like any other app would.

It had xterm, it had Perl built in, it had Python (a lot of it was written in Python), you even could install a C toolchain on it and develop C code on it. People ported standard desktop UIs on it, and with a VNC/RDP server you could use it as a portable computer just fine (just connect to it using a thin client, or a borrowed PC).

I had written little scripts to batch send New years SMS to contacts, and even piped the output of "fortune" to a select few numbers just for kicks (the days with free SMS, and no chat apps). To this day I have no such power on my modern phones.

Damn, now that I think back, it really was a powerful piece of kit. I actually still miss the features *sniff*

And now that I think about it, In fact I suspect they failed because their phones were too much "little computers" at a time when people wanted a phone. Few people (outside of geeks) wanted to fiddle with X-forwarding, install SSH, script/program/modify, or otherwise customise their stuff.

Arguably the one weakest app on the N900 was the phone application itself, which was not open source, so could not be improved by the community, so much so people used to say it wasn't really a phone, rather it was a computer with a phone attached, which is exactly what I wanted.

Mage

Invention of iPhone

It wasn't even really an invention.

The BBC frequently "invents" tech history. They probably think MS and IBM created personal computing, when in fact they held it back for 10 years and destroyed innovating companies then.

The only significant part was the touch interface by Fingerworks.

I was reading a BBC news web article and it was wrong too. It missed out emphasising that the real reason for success in 2007 was the deals with operators, cheap high cap data packages, often bundled with iPhone from the Mobile Operator.

This is nonsense:

http://www.bbc.com/news/technology-38550016

"Those were the days, by the way, when phones were for making calls but all that was about to change."

Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls.

The revolution was ordinary consumers being able to have a smart phone AND afford the data. The actual HW was commodity stuff. I had the dev system for the SC6400 Samsung ARM cpu used it.

Why did other phones use resistive + stylus instead of capacitive finger touch?

  • 1) Apple Newton and Palm: Handwriting & annotation. Needs high resolution.
  • 2) Dominance of MS CE interface (only usable with with a high resolution stylus.)

The capacitive touch existed in the late 1980s, but "holy grail" was handwriting recognition, not gesture control, though Xerox and IIS both had worked on it and guestures were defined before the 1990s. So the UK guy didn't invent anything.

Also irrelevant.

http://www.bbc.com/news/technology-38552241

Mines the one with a N9110 and later N9210 in the pocket. The first commercial smart phone was 1998 and crippled by high per MByte or per second (or both!) charging. Also in 2002, max speed was often 28K, but then in 2005 my landline was still 19.2K till I got Broadband, though I had 128K in 1990s in the city (ISDN) before I moved.

xeroks

Re: Invention of iPhone

The ground breaking elements of the iPhone were all to do with usability:

The fixed price data tariff was - to me - the biggest innovation. It may have been the hardest to do, as it involved entrenched network operators in a near monopoly. The hardware engineers only had to deal with the laws of physics.

The apple store made it easy to purchase and install apps and media. Suddenly you didn't have to be a geek or an innovator to make your phone do something useful or fun that the manufacturer didn't want to give to everyone.

The improved touch interface, the styling, and apple's cache all helped, and, I assume, fed into the efforts to persuade the network operators to give the average end user access to data without fear.

MrXavia

Re: Invention of iPhone

"Those were the days, by the way, when phones were for making calls but all that was about to change."

I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality

imaginarynumber

Re: Invention of iPhone

"The fixed price data tariff was - to me - the biggest innovation".

In my experience, the iphone killed the "all you can eat" fixed price data tariffs

I purchased a HTC Athena (T-Mobile Ameo) on a T-Mobile-Web and Walk contract in Feb 2007. I had unlimited 3.5G access (including tethering) and fixed call minutes/texts.

When it was time to upgrade, I was told that iphone 3G users were using too much data and that T-Mobile were no longer offering unlimited internet access.

Robert Carnegie

"First smartphone"

For fun, I put "first smartphone" into Google. It wasn't Apple's. I think a BBC editor may have temporarily said that it was.

As for Apple inventing the first multitouch smartphone, though -

http://www.bbc.co.uk/news/technology-38552241 claims, with some credibility, that Apple's engineers wanted to put a keyboard on their phone. The Blackberry phone had a keyboard. But Steve Jobs wanted a phone that you could work with your finger (without a keyboard).

One finger.

If you're only using one finger, you're not actually using multi touch?

nedge2k

Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests.

Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS.

Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware.

People rarely know how long HTC has been going as they used to OEM stuff for the networks - like the original Orange SPV (HTC Canary), a candybar style device running Microsoft Smartphone 2002. Or the original O2 XDA (HTC Wallaby), one the first Pocket PC "phone edition" devices and, IIRC, the first touchscreen smartphone to be made by HTC.

GruntyMcPugh

Re: Apple invented everything...

Yup, I had Windows based smartphones made by Qtek and HTC, and my first smartphone was an Orange SPV M2000 (a Qtek 9090 ) three years before the first iPhone, and I had a O2 XDA after that, which in 2006, had GPS, MMS, and an SD card slot, which held music for my train commute.

Now I'm a fan of the Note series, I had one capacitive screen smartphone without a stylus (HTC HD2), and missed it too much.

nedge2k

Re: Apple invented everything...

Lotaresco, I used to review a lot of the devices back in the day, as well as using them daily and modifying them (my phone history for ref: http://mowned.com/nedge2k ). Not once did they ever fail to make a phone call. Maybe the journalist was biased and made it up (Symbian was massively under threat at the time and all sorts of bullshit stories were flying about), maybe he had dodgy hardware, who knows.

Either way, it doesn't mean that the OS as a whole wasn't superior to what Nokia and Apple produced - because in every other way, it was.

imaginarynumber

Re: Apple invented everything...

@Lotaresco

"The weak spot for Microsoft was that it decided to run telephony in the application layer. This meant that any problem with the OS would result in telephony being lost....

Symbian provided a telephone which could function as a computer. The telephony was a low-level service and even if the OS crashed completely you could still make and receive calls. Apple adopted the same architecture, interface and telephony are low level services which are difficult to kill."

Sorry, but if iOS (or symbian) crashes you cannot make calls. In what capacity were you evaluating phones in 2002? I cannot recall ever seeing a Windows Mobile blue screen. It would hang from time to time, but it never blue screened.

MR J

Seeing how much free advertising the BBC has given Apple over the years I doubt they will care.

And lets be honest here, the guy is kinda correct. We didn't just go from a dumb phone to a smart phone, there was a gradual move towards it as processing power was able to be increased and electronic packages made smaller. Had we gone from the old brick phones straight to an iPhone then I would agree that they owned something like TNT.

Did Apple design the iPhone - Yes, of course.

Did Apple invent the Smart Phone - Nope.

IBM had a touch screen "smart" phone in 1992 that had a square screen with rounded corners.

What Apple did was put it into a great package with a great store behind it and they made sure it worked - and worked well. I personally am not fond of Apple due to the huge price premium they demand and overly locked down ecosystems, but I will admit it was a wonderful product Design.

Peter2

Re: "opinion pieces don't need to be balanced"

"I am no fan of Apple, but to state that something was invented by the State because everyone involved went to state-funded school is a kindergarten-level of thinking that has no place in reasoned argument."

It's actually "Intellectual Yet Idiot" level thinking. Google it. Your right that arguments of this sort of calibre have no place in reasoned argument, but the presence of this sort of quality thinking being shoved down peoples throats by media is why a hell of a lot of people are "fed up with experts".

TonyJ

Hmmm....iPhone 1.0

I actually got one of these for my wife. It was awful. It almost felt like a beta product (and these are just a few of things I still remember):

  • It had no kind of face sensor so it was common for the user to disconnect mid-call via their chin or cheek;
  • It's autocorrect functions were terrible - tiny little words above the word in question and even tinier x to close the option;
  • Inability to forward messages;
  • No email support;
  • No apps.

I think it's reasonably fair to say that it was the app store that really allowed the iPhone to become so successful, combined with the then Apple aura and mystique that Jobs was bringing to their products.

As to who invented this bit or that bit - I suggest you could pull most products released in the last 10-20 years and have the same kind of arguments.

But poor show on the beeb for their lack of fact checking on this one.

TonyJ

Re: Hmmm....iPhone 1.0

"...The original iPhone definitely has a proximity sensor. It is possible that your wife's phone was faulty or there was a software issue...."

Have an upvote - hers definitely never worked (and at the time I didn't even know it was supposed to be there), so yeah, probably faulty. I'd just assumed it didn't have one.

Lotaresco

There is of course...

.. the fact that the iPhone wouldn't exist without its screen and all LCD displays owe their existence to (UK) government sponsored research. So whereas I agree that Mazzucato is guilty of rabidly promoting an incorrect hypothesis to the status of fact, there is this tiny kernel of truth.

The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold.

anonymous boring coward

Re: There is of course...

I had a calculator in the late 1970s with an LCD display. It had no resemblance to my phone's display.

Not even my first LCD screened laptop had much resemblance with a phone's display. That laptop had a colour display, in theory. If looked at at the right angle, in the correct light.

Innovation is ongoing, and not defined by some initial stumbling attempts.

juice

Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

As to how much of a part the state had to play: a lot of things - especially in the IT and medical field - have been spun out of military research, though by the same token, much of this is done by private companies funded by government sources.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft. In WW2, the United Kingdom gave the USA a lot of technology via the Tizard mission (and later, jet-engine technology was also licenced), and both Russia and the USA "acquired" a lot of rocket technology by picking over the bones of Germany's industrial infrastructure. Then, Russia spent the next 40 years stealing whatever nuclear/military technology it could from the USA - though I'm sure some things would have trickled the other way as well!

Anyway, if you trace any modern technology back far enough, there will have been state intervention. That shouldn't subtract in any way from the work done by companies and individuals who have produced something where the sum is greater than the parts...

Roland6

Re: Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

Not so sure, Singer did a little more with respect to the sewing machine - his was the forst that actually worked. Likewise Marconi was the first with a working wireless. Yes both made extensive use of existing technology, but both clearly made that final inventive step; something that isn't so clear in the case of the examples you cite.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft.

Don't disagree, although your analysis omitted Japanese and Chinese acquisition of 'western' technology and know-how...

Anyway, if you trace any modern technology back far enough, there will have been state intervention.

Interesting point, particularly when you consider the case of John Harrison, the inventor of the marine chronometer. Whilst the government did offer a financial reward it was very reluctant to actually pay anything out...

Aitor 1

Apple invented the iPhone, but not the smartphone.

The smartphone had been showed before inseveral incarnations, including the "all touch screen" several years before Apple decided to dabble in smartphones. So no invention here.

As for the experience, again, nothing new. Al thought of before, in good part even implemented.

The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles.

He identified correctly, as many had before but before the power to do anything about it, that the customers are the final users, not the telcos.

The rest of the smartphones were culled before birth by the Telecomm industry, as they demanded certain "features" that nobody wanted but lined their pockets nicely with minumum investment.

So I thank Steve Jobs for that and for being able to buy digital music.

[Dec 26, 2016] FreeDOS 1.2 Is Finally Released

Notable quotes:
"... Jill of the Jungle ..."
Dec 26, 2016 | news.slashdot.org
(freedos.org)

Posted by EditorDavid on Sunday December 25, 2016 @02:56PM from the long-term-projects dept.

Very long-time Slashdot reader Jim Hall -- part of GNOME's board of directors -- has a Christmas gift. Since 1994 he's been overseeing an open source project that maintains a replacement for the MS-DOS operating system, and has just announced the release of the "updated, more modern" FreeDOS 1.2 !

[Y]ou'll find a few nice surprises. FreeDOS 1.2 now makes it easier to connect to a network. And you can find more tools and games, and a few graphical desktop options including OpenGEM. But the first thing you'll probably notice is the all-new new installer that makes it much easier to install FreeDOS. And after you install FreeDOS, try the FDIMPLES program to install new programs or to remove any you don't want. Official announcement also available at the FreeDOS Project blog .

FreeDOS also lets you play classic DOS games like Doom , Wolfenstein 3D , Duke Nukem , and Jill of the Jungle -- and today marks a very special occasion, since it's been almost five years since the release of FreeDos 1.1. "If you've followed FreeDOS, you know that we don't have a very fast release cycle," Jim writes on his blog . "We just don't need to; DOS isn't exactly a moving target anymore..."

[Nov 24, 2016] American Computer Scientists Grace Hopper, Margaret Hamilton Receive Presidential Medals of Freedom

Nov 23, 2016 | developers.slashdot.org
(fedscoop.com)

Posted by BeauHD on Wednesday November 23, 2016 @02:00AM from the blast-from-the-past dept.

An anonymous reader quotes a report from FedScoop:

President Barack Obama awarded Presidential Medals of Freedom to two storied women in tech -- one posthumously to Grace Hopper, known as the "first lady of software," and one to programmer Margaret Hamilton. Hopper worked on the Harvard Mark I computer, and invented the first compiler.

"At age 37 and a full 15 pounds below military guidelines, the gutsy and colorful Grace joined the Navy and was sent to work on one of the first computers, Harvard's Mark 1," Obama said at the ceremony Tuesday. "She saw beyond the boundaries of the possible and invented the first compiler, which allowed programs to be written in regular language and then translated for computers to understand." Hopper followed her mother into mathematics, and earned a doctoral degree from Yale, Obama said.

She retired from the Navy as a rear admiral. "From cell phones to Cyber Command, we can thank Grace Hopper for opening programming up to millions more people, helping to usher in the Information Age and profoundly shaping our digital world," Obama said. Hamilton led the team that created the onboard flight software for NASA's Apollo command modules and lunar modules, according to a White House release . "

At this time software engineering wasn't even a field yet," Obama noted at the ceremony. "There were no textbooks to follow, so as Margaret says, 'there was no choice but to be pioneers.'" He added: "Luckily for us, Margaret never stopped pioneering. And she symbolizes that generation of unsung women who helped send humankind into space."

[Sep 06, 2016] The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory

Notable quotes:
"... The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project. ..."
"... In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast. ..."
"... About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way". ..."
Sep 05, 2016 | economistsview.typepad.com

pgl : Monday, September 05, 2016 at 11:07 AM

Al Gore could not have invented the Internet since Steve Jobs is taking the bow for that. Actually Jobs started NeXT which Apple bought in 1997 for a mere $427 million. NeXT had sold a couple of computer models that did not do so well but the platform software allowed Apple to sell Web based computers. BTW - the internet really began in the 1980's as something called Bitnet. Really clunky stuff back then but new versions and applications followed. But yes - the Federal government in the 1990's was very supportive of the ICT revolution.
ilsm -> pgl... , Monday, September 05, 2016 at 11:59 AM
DARPA did most of it to keep researchers talking.
RC AKA Darryl, Ron -> pgl... , Monday, September 05, 2016 at 12:35 PM
https://en.wikipedia.org/wiki/ARPANET

The Advanced Research Projects Agency Network (ARPANET) was an early packet switching network and the first network to implement the protocol suite TCP/IP. Both technologies became the technical foundation of the Internet. ARPANET was initially funded by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.[1][2][3][4][5]

The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project.

As the project progressed, protocols for internetworking were developed by which multiple separate networks could be joined into a network of networks. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment for national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States from research and education organizations. ARPANET was decommissioned in 1990...

Creation

By mid-1968, Taylor had prepared a complete plan for a computer network, and, after ARPA's approval, a Request for Quotation (RFQ) was issued for 140 potential bidders. Most computer science companies regarded the ARPA–Taylor proposal as outlandish, and only twelve submitted bids to build a network; of the twelve, ARPA regarded only four as top-rank contractors. At year's end, ARPA considered only two contractors, and awarded the contract to build the network to BBN Technologies on 7 April 1969. The initial, seven-person BBN team were much aided by the technical specificity of their response to the ARPA RFQ, and thus quickly produced the first working system. This team was led by Frank Heart. The BBN-proposed network closely followed Taylor's ARPA plan: a network composed of small computers called Interface Message Processors (or IMPs), similar to the later concept of routers, that functioned as gateways interconnecting local resources. At each site, the IMPs performed store-and-forward packet switching functions, and were interconnected with leased lines via telecommunication data sets (modems), with initial data rates of 56kbit/s. The host computers were connected to the IMPs via custom serial communication interfaces. The system, including the hardware and the packet switching software, was designed and installed in nine months...

sanjait -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 01:09 PM
Though the thing we currently regard as "the Internet", including such innovations as the world wide web and the web browser, were developed as part of "the Gore bill" from 1991.

http://www.theregister.co.uk/2000/10/02/net_builders_kahn_cerf_recognise/

https://en.wikipedia.org/wiki/High_Performance_Computing_Act_of_1991

In case anyone is trying to argue Gore didn't massively contribute to the development of the Internet, as he claimed.

pgl -> sanjait... , Monday, September 05, 2016 at 02:37 PM
So the American government help paved the way for this ICT revolution. Steve Jobs figures out how Apple could make incredible amounts of income off of this. He also shelters most of that income in a tax haven so Apple does not pay its share of taxes. And Tim Cook lectures the Senate in May of 2013 why they should accept this. No wonder Senator Levin was so upset with Cook.
ilsm -> pgl... , Monday, September 05, 2016 at 04:29 PM
In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast.
ilsm -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 04:25 PM
About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way".

[Sep 16, 2015] This Is Why Hewlett-Packard Just Fired Another 30,000

"...An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy."
Zero Hedge

SixIsNinE

yeah thanks Carly ...

HP made bullet-proof products that would last forever..... I still buy HP workstation notebooks, especially now when I can get them for $100 on ebay ....

I sold HP products in the 1990s .... we had HP laserjet IIs that companies would run day & night .... virtually no maintenance ... when PCL5 came around then we had LJ IIIs .... and still companies would call for LJ I's, .... 100 pounds of invincible Printing ! .... this kind of product has no place in the World of Planned-Obsolesence .... I'm currently running an 8510w, 8530w, 2530p, Dell 6420 quad i7, hp printers hp scanners, hp pavilion desktops, .... all for less than what a Laserjet II would have cost in 1994, Total.

Not My Real Name

I still have my HP 15C scientific calculator I bought in 1983 to get me through college for my engineering degree. There is nothing better than a hand held calculator that uses Reverse Polish Notation!

BigJim

HP used to make fantastic products. I remember getting their RPN calculators back in th 80's; built like tanks.

Then they decided to "add value" by removing more and more material from their consumer/"prosumer" products until they became unspeakably flimsy. They stopped holding things together with proper fastenings and starting hot melting/gluing it together, so if it died you had to cut it open to have any chance of fixing it.

I still have one of their Laserjet 4100 printers. I expect it to outlast anything they currently produce, and it must be going on 16+ years old now.

Fuck you, HP. You started selling shit and now you're eating through your seed corn. I just wish the "leaders" who did this to you had to pay some kind of penalty greater than getting $25M in a severance package.

Fiscal Reality

HP12C. 31 years old and still humming.WTF happened?

Automatic Choke

+100. The path of HP is everything that is wrong about modern business models. I still have a 5MP laserjet (one of the first), still works great. Also have a number of 42S calculators.....my day-to-day workhorse and several spares. I don't think the present HP could even dream of making these products today.

nope-1004

How well will I profit, as a salesman, if I sell you something that works?

How valuable are you, as a customer in my database, if you never come back?

Confucious say "Buy another one, and if you can't afford it, f'n finance it!"

It's the growing trend. Look at appliances. Nothing works anymore.

Normalcy Bias

https://en.wikipedia.org/wiki/Planned_obsolescence

Son of Loki

GE to cut Houston jobs as work moves overseas

http://www.bizjournals.com/houston/news/2015/09/15/ge-to-cut-houston-job...

" Yes we can! "

Automatic Choke

hey big brother.... if you are curious, there is a damn good android emulator of the HP42S available (Free42). really it is so good that it made me relax about accumulating more spares. still not quite the same as a real calculator. (the 42S, by the way, is the modernization/simplification of the classic HP41, the real hardcord very-programmable, reconfigurable, hackable unit with all the plug-in-modules that came out in the early 80s.)

Miss Expectations

Imagine working at HP and having to listen to Carly Fiorina bulldoze you...she is like a blow-torch...here are 4 minutes of Carly and Ralph Nader (if you can take it):

https://www.youtube.com/watch?v=vC4JDwoRHtk

Miffed Microbiologist

My husband has been a software architect for 30 years at the same company. Never before has he seen the sheer unadulterated panic in the executives. All indices are down and they are planning for the worst. Quality is being sacrificed for " just get some relatively functional piece of shit out the door we can sell". He is fighting because he has always produced a stellar product and refuses to have shit tied to his name ( 90% of competitor benchmarks fail against his projects). They can't afford to lay him off, but the first time in my life I see my husband want to quit...

unplugged

I've been an engineer for 31 years - our managements's unspoken motto at the place I'm at (large company) is: "release it now, we'll put in the quality later". I try to put in as much as possible before the product is shoved out the door without killing myself doing it.

AGuy

Do they even make test equipment anymore?

HP test and measurement was spun off many years ago as Agilent. The electronics part of Agilent was spun off as keysight late last year.

HP basically makes computer equipment (PCs, servers, Printers) and software. Part of the problem is that computer hardware has been commodized. Since PCs are cheap and frequent replacements are need, People just by the cheapest models, expecting to toss it in a couple of years and by a newer model (aka the Flat screen TV model). So there is no justification to use quality components. Same is become true with the Server market. Businesses have switched to virtualization and/or cloud systems. So instead of taking a boat load of time to rebuild a crashed server, the VM is just moved to another host.

HP has also adopted the Computer Associates business model (aka Borg). HP buys up new tech companies and sits on the tech and never improves it. It decays and gets replaced with a system from a competitor. It also has a habit of buying outdated tech companies that never generate the revenues HP thinks it will.

BullyBearish

When Carly was CEO of HP, she instituted a draconian "pay for performance" plan. She ended up leaving with over $146 Million because she was smart enough not to specify "what type" of performance.

InnVestuhrr

An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy.

[Dec 26, 2014] Donald Knuth Worried About the Dumbing Down of Computer Science History

Dec 26, 2014 | Slashdot

Anonymous Coward on Friday December 26, 2014 @01:58PM (#48676489)

The physics does NOT define Computer Science (Score:5, Insightful)

The physics does NOT define Computer Science. Computer Science has nothing that depends on transistors, or tubes, or levers and gears.

Computers can be designed and built, and computing performed, at many different levels of physical abstraction.

You can do computer science all on paper for fucks sake.

Ever heard of this guy called Alan Turing?

Knuth is right, the ignorance, even among technical people, is astounding

Dracos (107777) on Friday December 26, 2014 @12:59PM (#48676173)

Re:False Summary - Haigh Agrees with Knuth's Thesi (Score:5, Insightful)

there are indeed no good technical histories of computer science, and little prospect of any.

I see the posthumous reactions to Steve Jobs and Dennis Ritchie as indicators that Knuth is absolutely right.

I bet anyone here would agree that co-authoring UNIX is a far more important event than being the iPod/iPhone taskmaster.

ripvlan (2609033) on Friday December 26, 2014 @11:53AM (#48675785)

But wait,there's more (Score:3, Insightful)

I returned to college several years ago after a 20-year hiatus (the first 6 years were my creative period). My first time around I studied what might be called pure Computer Science. A lot has happened in the industry after 20 years and I very much enjoyed conversations in class - especially with the perspective of the younger generation. I found it fascinating how many kids today hope to enter the gaming industry (in my generation, Zork was popular when I was a kid and Myst was a breakout success on a new level). Kids today see blockbuster gaming as an almost make-it-rich experience - plus a "real world" job that sounds like fun.

But more interesting was the contrast between Computer Engineering and Computer Science. What is science vs engineering? Are software "engineers" really scientists? Do they need to learn all this sciencey stuff in order to enter the business school? I attended a large, semi-well-known university. Back in the '80s the CS department was "owned" by the school of business. Programming computers was thought to be the money maker - only business really used them, with a strong overlap into engineering because computers were really big calculators. However, it was a real CS curriculum with only one class for business majors. Fast forward a dozen years and CS is now part of the Engineering school (with Business on its own). The "kids" wondered why they needed to study Knuth et al when they were just going to be programming games. What about art? Storytelling? They planned on using visual creative studio tools to create their works. Why all this science stuff? (This was in a haptics class.) Should a poet learn algorithms in order to operate MS-Word?

Since computers are ubiquitous, they are used everywhere. I tell students to get a degree in what interests them - and to learn how to use/program computers because... well... who doesn't use a computer? I used to program my TI calculator in high school to pump out answers to physics & algebra questions (basic formulas).

Are those who program Excel macros computer scientists? No. Computer engineers? No. Business people solving real problems? Yes/maybe. The landscape is now wider. Many people don't care about the details of landing a man on the moon - but they like it when the velcro strap on their shoes holds properly. They receive entertainment via the Discovery Channel and get the dumbed-down edition of all things "science."

When creating entertainment - it needs to be relatable to your target audience. The down and dirty details and technicalities interest only a few of us. My wife's eyes glaze over when I talk about some cool thing I'm working on. Retell it as saving the world and improving quality - she gets it (only to politely say I should go play with the kids -- but at least she was listening to that version of events).

I think that the dumbing down of history is ... well.. history. There was this thing called World War 2. The details I learned in grade school - lots of details. Each battle, names of important individuals. Today - lots of history has happened in the meantime. WW2 is now a bit dumbed down - still an important subject - but students still only have 8 grades in school with more material to cover.

My brain melts when I watch the Discovery Channel. I'm probably not the target audience. The details of historical science probably interest me. The history of Computing needs to be told like "The Social Network."

Virtucon (127420) on Friday December 26, 2014 @12:19PM (#48675913)

it's everywhere (Score:3)

we've raised at least two generations of self obsessed, no attention-span kids who want instant gratification. Retards like Justin Bieber who today tweets that he bought a new plane. As the later generations grow into the workforce and into fields like journalism, history and computer science it's no small wonder they want to reduce everything down to one liners or soundbites. Pick your field because these kids started with censored cartoons and wound up with Sponge Bob. Shit, even the news is now brokered into short paragraphs that just say "this shit happened now onto the next.."

Screw that! Yeah I'm getting older so get the fuck off my lawn!

xororand (860319) writes: on Friday December 26, 2014 @02:07PM ( #48676543)

The Machine That Changed The World

There's a gem of a documentary about the history of computing before the web.

The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced.

It's a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players — several of whom passed away since the filming.

Episode 1 featured interviews with, among others:

Paul Ceruzzi (computer historian), Doron Swade (London Science Museum), Konrad Zuse (inventor of the first functional computer and high-level programming language, died in 1995), Kay Mauchly Antonelli (human computer in WWII and ENIAC programmer, died in 2006), Herman Goldstine (ENIAC developer, died in 2004), J. Presper Eckert (co-inventor of ENIAC, died in 1995), Maurice Wilkes (inventor of EDSAC), Donald Michie (Codebreaker at Bletchley Park)

luis_a_espinal (1810296) on Friday December 26, 2014 @03:49PM (#48676989) Homepage

There is a CS dumbing down going on (Score:2)

Donald Knuth Worried About the "Dumbing Down" of Computer Science History

Whether CS education is appropriate for all people who do computer-assisted technical work is largely irrelevant to me, since practical forces in real life simply settle that issue.

The problem I care about is a problem I have seen in CS for real. I've met quite a few CS grads who don't know who Knuth, Lamport, Liskov, Hoare, Tarjan, or Dijkstra are.

If you (the generic CS grad) do not know who they are, how the hell do you know about basic CS things like routing algorithms, pre and post conditions, data structures, you know, the very basic shit that is supposed to be the bread and butter of CS????

It is ok not to know these things and these people if you are a Computer Engineer, MIS or Network/Telecomm engineer (to a degree dependent on what your job expects from you.)

But if you are Computer Scientist, my God, this is like hiring an Electrical Engineer who doesn't know who Maxwell was. It does not inspire a lot of confidence, does it?

Aikiplayer (804569) on Friday December 26, 2014 @05:59PM (#48677657)

Re:Don't do what they did to math (Score:1)

Knuth did a nice job of articulating why he wants to look at the history of things at the beginning of the video. Those reasons might not resonate with you but he does have definite reasons for wanting technical histories (not social histories which pander to "the stupid") to be written.

[Dec 26, 2014] The Tears of Donald Knuth

History is always written by the winners, and that means right now it is written by neoliberals. Dumbing down the history of computer science is just the application of neoliberalism to one particular narrow field. In a sense the essence of neoliberal history is to dumb down everything: a deliberate lowering of the intellectual level of education, literature, cinema, news, and culture. Deliberate dumbing down is the goal.
They use the power of vanity to rob us of the vision which history can provide. Knuth's lecture "Let's Not Dumb Down the History of Computer Science" can be viewed at Kailath Lecture and Colloquia. He made the important point that historical errors are as important as achievements, and probably more educational. In this "drama of ideas" (and he mentioned the high educational value of the errors and blunders of Linus Torvalds in the design of the Linux kernel) errors and achievements all have their place and historical value. History gives people stories that are more educational than anything else; that is the way people learn best.
Dec 26, 2014 | Communications of the ACM, January 2015

In his lecture Knuth worried that a "dismal trend" in historical work meant that "all we get nowadays is dumbed down" through the elimination of technical detail. According to Knuth "historians of math have always faced the fact that they won't be able to please everybody." He feels that other historians of science have succumbed to "the delusion that ... an ordinary person can understand physics ..."

I am going to tell you why Knuth's tears were misguided, or at least misdirected, but first let me stress that historians of computing deeply appreciate his conviction that our mission is of profound importance. Indeed, one distinguished historian of computing recently asked me what he could do to get flamed by Knuth. Knuth has been engaged for decades with history. This is not one of his passionate interests outside computer science, such as his project reading verses 3:16 of different books of the Bible. Knuth's core work on computer programming reflects a historical sensibility, as he tracks down the origin and development of algorithms and reconstructs the development of thought in specific areas. For years advertisements for IEEE Annals of the History of Computing, where Campbell-Kelly's paper was published, relied on a quote from Knuth that it was the only publication he read from cover to cover. With the freedom to choose a vital topic for a distinguished lecture Knuth chose to focus on history rather than one of his better-known scientific enthusiasms such as literate programming or his progress with The Art of Computer Programming.

... Distinguished computer scientists are prone to blur their own discipline, and in particular a few dozen elite programs, with the much broader field of computing. The tools and ideas produced by computer scientists underpin all areas of IT and make possible the work carried out by network technicians, business analysts, help desk workers, and Excel programmers. That does not make those workers computer scientists. The U.S. alone is estimated to have more than 10 million "information technology workers," which is about a hundred times more than the ACM's membership. Vint Cerf has warned in Communications that even the population of "professional programmers" dwarfs the association's membership.7 ACM's share of the IT workforce has been in decline for a half-century, despite efforts begun back in the 1960s and 1970s by leaders such as Walter Carlson and Herb Grosch to broaden its appeal.

... ... ...

So why is the history of computer science not being written in the volume it deserves, or the manner favored by Knuth? I am, at heart, a social historian of science and technology and so my analysis of the situation is grounded in disciplinary and institutional factors. Books of this kind would demand years of expert research and sell a few hundred copies. They would thus be authored by those not expected to support themselves with royalties, primarily academics.

... ... ...

The history of science is a kind of history, which is in turn part of the humanities. Some historians of science are specialists within broad history departments, and others work in specialized programs devoted to science studies or to the history of science, technology, or medicine. In both settings, historians judge the work of prospective colleagues by the standards of history, not those of computer science. There are no faculty jobs earmarked for scholars with doctoral training in the history of computing, still less in the history of computer science. The persistently brutal state of the humanities job market means that search committees can shortlist candidates precisely fitting whatever obscure combination of geographical area, time period, and methodological approaches are desired. So a bright young scholar aspiring to a career teaching and researching the history of computer science would need to appear to a humanities search committee as an exceptionally well qualified historian of the variety being sought (perhaps a specialist in gender studies or the history of capitalism) who happens to work on topics related to computing.

... ... ...

Thus the kind of historical work Knuth would like to read would have to be written by computer scientists themselves. Some disciplines support careers spent teaching history to their students and writing history for their practitioners. Knuth himself holds up the history of mathematics as an example of what the history of computing should be. It is possible to earn a Ph.D. within some mathematics departments by writing a historical thesis (euphemistically referred to as an "expository" approach). Such departments have also been known to hire, tenure, and promote scholars whose research is primarily historical. Likewise medical schools, law schools, and a few business schools have hired and trained historians. A friend involved in a history of medicine program recently told me that its Ph.D. students are helped to shape their work and market themselves differently depending on whether they are seeking jobs in medical schools or in history programs. In other words, some medical schools and mathematics departments have created a demand for scholars working on the history of their disciplines and in response a supply of such scholars has arisen.

As Knuth himself noted toward the end of his talk, computer science does not offer such possibilities. As far as I am aware no computer science department in the U.S. has ever hired as a faculty member someone who wrote a Ph.D. on a historical topic within computer science, still less someone with a Ph.D. in history. I am also not aware of anyone in the U.S. having been tenured or promoted within a computer science department on the basis of work on the history of computer science. Campbell-Kelly, now retired, did both things (earning his Ph.D. in computer science under Randell's direction) but he worked in England where reputable computer science departments have been more open to "fuzzy" topics than their American counterparts. Neither are the review processes and presentation formats at prestigious computer conferences well suited for the presentation of historical work. Nobody can reasonably expect to build a career within computer science by researching its history.

In its early days the history of computing was studied primarily by those who had already made their careers and could afford to indulge pursuing historical interests from tenured positions or to dabble after retirement. Despite some worthy initiatives, such as the efforts of the ACM History Committee to encourage historical projects, the impulse to write technical history has not spread widely among younger generations of distinguished and secure computer scientists.

... ... ...

Contrary both to Knuth's despair and to Campbell-Kelly's story of a march of progress away from technical history, some scholars with formal training in history and philosophy have been turning to topics with more direct connections to computer science over the past few years. Liesbeth De Mol and Maarten Bullynck have been working to engage the history and philosophy of mathematics with issues raised by early computing practice and to bring computer scientists into more contact with historical work.3 Working with like-minded colleagues, they helped to establish a new Commission for the History and Philosophy of Computing within the International Union of the History and Philosophy of Science. Edgar Daylight has been interviewing famous computer scientists, Knuth included, and weaving their remarks into fragments of a broader history of computer science.8 Matti Tedre has been working on the historical shaping of computer science and its development as a discipline.22 The history of Algol was a major focus of the recent European Science Foundation project Software for Europe. Algol, as its developers themselves have observed, was important not only for pioneering new capabilities such as recursive functions and block structures, but as a project bringing together a number of brilliant research-minded systems programmers from different countries at a time when computer science had yet to coalesce as a discipline.c Pierre Mounier-Kuhn has looked deeply into the institutional history of computer science in France and its relationship to the development of the computer industry.16

Stephanie Dick, who recently earned her Ph.D. from Harvard, has been exploring the history of artificial intelligence with close attention to technical aspects such as the development and significance of the linked list data structure.d Rebecca Slayton, another Harvard Ph.D., has written about the engagement of prominent computer scientists with the debate on the feasibility of the "Star Wars" missile defense system; her thesis has been published as an MIT Press book.20 At Princeton, Ksenia Tatarchenko recently completed a dissertation on the USSR's flagship Akademgorodok Computer Center and its relationship to Western computer science.21 British researcher Mark Priestley has written a deep and careful exploration of the history of computer architecture and its relationship to ideas about computation and logic.18 I have worked with Priestly to explore the history of ENIAC, looking in great detail at the functioning and development of what we believe to be the first modern computer program ever executed.9 Our research engaged with some of the earliest historical work on computing, including Knuth's own examination of John von Neumann's first sketch of a modern computer program10 and Campbell-Kelly's technical papers on early programming techniques.5

[Nov 12, 2014] 2014 in video gaming

Nov 12, 2014 | Wikipedia,

The year 2014 will see release of numerous games, including new installments for some well-received franchises, such as Alone in the Dark, Assassin's Creed, Bayonetta, Borderlands, Call of Duty, Castlevania, Civilization, Dark Souls, Donkey Kong, Dragon Age, The Elder Scrolls, Elite, EverQuest, Far Cry, Final Fantasy, Forza Horizon, Infamous, Kinect Sports, Kirby, LittleBigPlanet, Mario Golf, Mario Kart, Metal Gear, MX vs. ATV, Ninja Gaiden, Persona, Pokémon, Professor Layton, Shantae, Sniper Elite, Sonic the Hedgehog, Strider Hiryu, Super Smash Bros., Tales, The Amazing Spider-Man, The Legend of Zelda, The Settlers, The Sims, Thief, Trials, Tropico, Wolfenstein and World of Warcraft.

[Nov 10, 2014] Why Google Glass Is Creepy

May 17, 2013 | Scientific American

The biggest concern seems to be distraction. Google Glass looks like a pair of glasses, minus the lenses; it's just a band across your forehead, with a tiny screen mounted at the upper-right side. By tapping the earpiece and using spoken commands, you direct it to do smartphone-ish tasks, such as fielding a calendar alert and finding a nearby sushi restaurant.

Just what we need, right? People reading texts and watching movies while they drive and attaining new heights of rudeness by scanning their e-mail during face-to-face conversation.

Those are misguided concerns. When I finally got to try Google Glass, I realized that they don't put anything in front of your eyes. You still make eye contact when you talk. You still see the road ahead. The screen is so tiny, it doesn't block your normal vision.

Hilarious parody videos show people undergoing all kinds of injury while peering at the world through a screen cluttered with alerts and ads. But that's not quite how it works. You glance up now and then, exactly as you would check your phone. But because you don't have to look down and dig around in your pocket, you could argue that there's less distraction. By being so hands-free, it should be incredibly handy.

Stormport May 17, 2013, 12:42 PM

Although the fallibility of the human monkey is much trumpeted (e.g. "To Err is Human", NAS study of out of control corporate iatrogenic death in America), there is one area of human activity where we score an almost 100% reliability: the 'justifiability' of the sport shooting of the mentally ill, Big Pharma crazed, suicidal, or just simply angry folks amongst us by our local and national 'law enforcement' (LE). Well, not all are simply shooting for sport, many are the result of overwhelming panic (e.g. 17 or 57 bullet holes in the human target) of individuals who shouldn't be allowed to own a sharp pencil much less carry a gun with a license to kill. I have not bothered to look for the statistics presuming them to be either not available or obfuscated in some way but rely on my local newspaper for my almost daily story of a local police shooting.


With that said, one can only say YES! to Google Glass and its most obvious use, replacing the patrol car dash cam. Every uniformed 'law enforcement' officer and 'security guard' must be so equipped and required to have the camera on and recording at any time on 'duty' and not doing duty in the can or on some other 'personal' time. Consider it simply as having one's supervisor as a 'partner'. Same rules would apply. No sweat.

[Sep 01, 2014] Ex-IBM CEO John Akers dies at 79

The last technically competent CEO, before Lou Gerstner, whose financial machinations and excessive greed destroyed IBM as we used to know it.
25 Aug 2014 | The Register

Obituary Former IBM CEO John Akers has died in Boston aged 79.

Big Blue announced Akers' passing here, due to a stroke according to Bloomberg. After a stint as a US Navy pilot, the IBM obit states, Akers joined the company in 1960. His 33-year stint with IBM culminated in his appointment as its sixth CEO in 1985, following three years as president.

The top job became something of a poisoned chalice for Akers: the IBM PC project was green-lit during his tenure, and the industry spawned by this computer would cannibalize Big Blue's mainframe revenue, which was already under attack from minicomputers.

His career was founded on the success of the iconic System/360 and System/370 iron, but eventually fell victim to one of the first big disruptions the industry experienced.

He was eventually replaced by Lou Gerstner (as Bloomberg notes, the first CEO to be appointed from outside IBM).

To Gerstner fell the task of reversing the losses IBM was racking up – US$7.8 billion over two years – by embarking on a top-down restructure to shave US$7 billion in costs.

According to retired IBM executive Nicholas Donofrio, Akers took a strong interest in nursing the behind-schedule RS6000 Unix workstation project through to fruition in the late 1980s:

“he asked what additional resources I needed and agreed to meet with me monthly to ensure we made the new schedule”.

[Apr 21, 2014] How Google Screwed Up Google Glass by Gene Marks

Apr 21, 2014 forbes.com

It really is a great idea.

A pair of glasses that can project information or perform actions on a virtual screen in front of you, about pretty much anything, and all you have to do is ask. Driving directions. LinkedIn connections. Order history. A photo. A video. A phone call. An email. The options seem limitless. And they are. Google Glass really is a great idea. The technology can and probably will change the world. So how did Google screw it up?

Yes, screw it up. Since first announcing the product in 2012, Google Glass has been subject to ridicule and even violence. It’s become a symbol of the anti-tech, anti-Silicon Valley crowd. Surveys like this one demonstrate the American public’s general dislike and distrust of Google Glass. The product has not yet spawned an industry. It has not generated revenues for Google. It’s become a frequent joke on late night TV and a target for bloggers and comedians around the country. The word “glasshole” has now risen to the same prominence as “selfie” and “twerk.” Yes, it’s getting attention. But only as a creepy gimmick which, I’m sure, is not the kind of attention that Google intended when they initially introduced it. As cool as it is, let’s admit that Google Glass will go down in the annals of bad product launches. And it will do so because of these reasons.

[Apr 10, 2014] Google Glass Going On Sale To Public For VERY Limited Time

Apr 10, 2014

For a limited time starting Tuesday, Google will make the wearable device available to more than just the select group of users such as apps developers in its Glass Explorer program.

In a blogpost, Google did not say how many pairs it would sell, just that the quantity would be limited.

"Every day we get requests from those of you who haven't found a way into the program yet, and we want your feedback too," the company said on a Thursday blogpost.

"That's why next Tuesday, April 15th, we'll be trying our latest and biggest Explorer Program expansion experiment to date. We'll be allowing anyone in the U.S. to become an Explorer by purchasing Glass."

Many tech pundits expect wearable devices to go mainstream this year, extending smartphone and tablet capabilities to gadgets worn on the body, from watches to headsets. Google has run campaigns in the past to drum up public involvement, including inviting people to tweet under the hashtag #ifihadglass for a chance to buy a pair of the glasses.

Google Glass has raised privacy concerns, prompting some legislators to propose bans on the gadget.

[Apr 08, 2014] Why won't you DIE IBM's S-360 and its legacy at 50

"... IBM's System 360 mainframe, celebrating its 50th anniversary on Monday, was more than a just another computer. The S/360 changed IBM just as it changed computing and the technology industry. ..."
"... Big Blue introduced new concepts and de facto standards with us now: virtualisation - the toast of cloud computing on the PC and distributed x86 server that succeeded the mainframe - and the 8-bit byte over the 6-bit byte. ..."
Apr 08, 2014 | The Register

IBM's System 360 mainframe, celebrating its 50th anniversary on Monday, was more than just another computer. The S/360 changed IBM just as it changed computing and the technology industry.

The digital computers that were to become known as mainframes were already being sold by companies during the 1950s and 1960s - so the S/360 wasn't a first.

Where the S/360 was different was that it introduced a brand-new way of thinking about how computers could and should be built and used.

The S/360 made computing affordable and practical - relatively speaking. We're not talking the personal computer revolution of the 1980s, but it was a step.

The secret was a modern system: a new architecture and design that allowed the manufacturer - IBM - to churn out S/360s at relatively low cost. This had the more important effect of turning mainframes into a scalable and profitable business for IBM, thereby creating a mass market.

The S/360 democratized computing, taking it out of the hands of government and universities and putting its power in the hands of many ordinary businesses.

The birth of IBM's mainframe was made all the more remarkable given that making the machine required not just a new way of thinking but a new way of manufacturing. The S/360 produced a corporate and a mental restructuring of IBM, turning it into the computing giant we have today.

The S/360 also introduced new technologies, such as IBM's Solid Logic Technology (SLT) in 1964, which meant a faster and much smaller machine than what was coming from the competition at the time.

Big Blue introduced new concepts and de facto standards with us now: virtualisation - the toast of cloud computing on the PC and distributed x86 server that succeeded the mainframe - and the 8-bit byte over the 6-bit byte.

The S/360 helped IBM see off a rising tide of competitors such that by the 1970s, rivals were dismissively known as "the BUNCH" or the dwarves. Success was a mixed blessing for IBM, which got in trouble with US regulators for being "too" successful and spent a decade fighting a government anti-trust law suit over the mainframe business.

The legacy of the S/360 is with us today, outside of IBM and the technology sector.

naylorjs

S/360 I knew you well

The S/390 name is a hint to its lineage: S/360 -> S/370 -> S/390 (I'm not sure what happened to the S/380). Having made a huge jump with the S/360, they tried to do the same thing in the 1970s with the Future Systems project. This turned out to be a huge flop: lots of money spent on creating new ideas that were meant to leapfrog the competition, but it ultimately failed. Some of the ideas emerged in the System/38 and then in the original AS/400s, like having a queryable database for the file system rather than what we are used to now.

The link to NASA with the S/360 is explicit with JES2 (Job Entry Subsystem 2), the element of the OS that controls batch jobs and the like. Messages from JES2 start with the prefix HASP, which stands for Houston Automatic Spooling Program.

As a side note, CICS is developed at Hursley Park in Hampshire. It wasn't started there though. CICS system messages start with DFH, which allegedly stands for Denver Foot Hills - a hint to its physical origins; IBM swapped the development sites for CICS and PL/1 long ago.

I've not touched an IBM mainframe for nearly twenty years, and it worries me that I have this information still in my head. I need to lie down!

Ross Nixon

Re: S/360 I knew you well

I have great memories of being a Computer Operator on a 360/40. They were amazing capable and interesting machines (and peripherals).

QuiteEvilGraham

Re: S/360 I knew you well

ESA is the bit that you are missing - the whole extended address thing, data spaces, hyperspaces and cross-memory extensions.

Fantastic machines though - I learned everything I know about computing from Principles of Operation and the source code for VM/SP - they used to ship you all that, and send you the listings for everything else on microfiche. I almost feel sorry for the younger generations that they will never see a proper machine room with the ECL water-cooled monsters and attendant farms of DASD and tape drives. After the 9750's came along they sort of looked like very groovy American fridge-freezers.

Mind you, I can get better mippage on my Thinkpad with Hercules than the 3090 I worked with back in the 80's, but I couldn't run a UK-wide distribution system, with thousands of concurrent users, on it.

Nice article, BTW, and an upvote for the post mentioning The Mythical Man Month; utterly and reliably true.

Happy birthday IBM Mainframe, and thanks for keeping me in gainful employment and beer for 30 years!

Anonymous Coward

Re: S/360 I knew you well

I started programming on the IBM 360/67 and have programmed several IBM mainframe computers. One of the reasons for the ability to handle large amounts of data is that these machines communicate with terminals in EBCDIC characters, which are similar to ASCII. It took very few of these characters to drive the 3270 display terminals, while modern x86 computers use a graphical display and need a lot of data transmitted to paint a screen. I worked for a company that had an IBM 370-168 with VM running both OS and VMS.

We had over 1500 terminals connected to this mainframe across 4 states. IBM had envisioned VM/CMS for that role; CICS was only supposed to be a temporary solution for handling display terminals, but it became the mainstay in many shops.

Our shop had over 50 3330 300 MB disk drives online with at least 15 tape units. These machines are in use today, in part, because the cost of converting to x86 is prohibitive.

On these old 370 CICS systems, the screens were separate from the program. JCL (job control language) was used to initiate jobs, but unlike modern batch files, it would attach resources such as a hard drive or tape to the program. This is totally foreign to any modern OS.

Linux or Unix can come close but MS products are totally different.

Stephen Channell

Re: S/360 I knew you well

S/380 was the "future systems program" that was cut down to the S/38 mini.

HASP was the original "grid scheduler" in Houston, running on a dedicated mainframe and scheduling work to the other 23 mainframes under the bridge. I nearly wet myself with laughter reading DataSynapse documentation and their "invention" of a job control language. 40 years ago HASP was doing Map/Reduce to process data faster than a tape drive could handle.

If we don't learn the lessons of history, we are destined to IEFBR14!

Pete 2

Come and look at this!

As a senior IT bod said to me one time, when I was doing some work for a mobile phone outfit.

"it's an IBM engineer getting his hands dirty".

And so it was: a hardware guy, with his sleeves rolled up and blood and grime on his hands, replacing a failed board in an IBM mainframe.

The reason it was so noteworthy, even in the early 90's, was that it was such a rare occurrence. It was probably one of the major selling points of IBM computers that they didn't blow a gasket if you looked at them wrong (the other one, with just as much traction, being the ability to do a fork-lift upgrade in a weekend and know it will work).

The reliability and compatibility across ranges are why people choose this kit. It may be arcane, old-fashioned, expensive and untrendy - but it keeps on running.

The other major legacy of OS/360 was, of course, The Mythical Man-Month, whose readership is still the most reliable way of telling the professional IT managers from the wannabes who only have buzzwords as a knowledge base.

Amorous Cowherder

Re: Come and look at this!

They were bloody good guys from IBM!

I started off working on mainframes around 1989, as a graveyard-shift "tape monkey" loading tapes for batch jobs. My first solo job was as a Unix admin on a set of RS/6000 boxes; I once blew out the firmware and a test box wouldn't boot.

I called out an IBM engineer after I completely "futzed" the box; he came out and spent about 2 hours with me teaching me how to select and load the correct firmware. He then spent another 30 mins checking my production system with me and even left me his phone number so I could call him directly if I needed help when I did the production box.

I did the prod box with no issues because of the confidence I got and the time he spent with me. Cheers!

David Beck

Re: 16 bit byte?

The typo must be fixed, the article says 6-bit now. The following is for those who have no idea what we are talking about.

Generally, machines prior to the S/360 were 6-bit if character-oriented or 36-bit if word-oriented. The S/360 was the first IBM architecture (thank you Drs. Brooks, Blaauw and Amdahl) to provide both data types with appropriate instructions, to include a "full" character set (256 characters instead of 64), and to provide a concise decimal format (2 digits in one character position instead of 1). 8 bits was chosen as the "character" length.
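
[Editor's note: to make the "2 digits in one character position" point concrete, here is a minimal Python sketch of S/360-style packed decimal, in which each byte holds two BCD digits and the final nibble carries the sign (0xC positive, 0xD negative). The helper name and padding details are the editor's own illustration, not IBM code.]

    def to_packed_decimal(n: int) -> bytes:
        """Pack a signed integer the packed-decimal way: two decimal digits
        per byte, with the sign (0xC = +, 0xD = -) in the last nibble."""
        sign = 0xC if n >= 0 else 0xD
        digits = str(abs(n))
        if len(digits) % 2 == 0:          # pad so digits + sign nibble fill whole bytes
            digits = "0" + digits
        nibbles = [int(d) for d in digits] + [sign]
        packed = bytearray()
        for hi, lo in zip(nibbles[0::2], nibbles[1::2]):
            packed.append((hi << 4) | lo)
        return bytes(packed)

    print(to_packed_decimal(1234).hex())   # 01234c -- 3 bytes instead of 4 character positions
    print(to_packed_decimal(-567).hex())   # 567d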

It did mean a lot of Fortran code had to be reworked to deal with 32-bit single precision or 32 bit integers instead of the previous 36-bit.

If you think the old ways are gone, have a look at the data formats for the Unisys 2200.

John Hughes

Virtualisation

Came with the S/370, not the S/360, which didn't even have virtual memory.

Steve Todd

Re: Virtualisation

The 360/168 had it, but it was a rare beast.

Mike 140

Re: Virtualisation

Nope. CP/67 was the forerunner of IBM's VM. Ran on S/360

David Beck

Re: Virtualisation

S/360 Model 67 running CP67 (CMS which became VM) or the Michigan Terminal System. The Model 67 was a Model 65 with a DAT box to support paging/segmentation but CP67 only ever supported paging (I think, it's been a few years).

Steve Todd

Re: Virtualisation

The 360/168 had a proper MMU and thus supported virtual memory. I interviewed at Bradford University, where they had a 360/168 with which they were doing all sorts of things that IBM hadn't contemplated (like using conventional glass teletypes hooked to minicomputers so they could emulate the page-based - and more expensive - IBM terminals).

I didn't get to use an IBM mainframe in anger until the 3090/600 was available (where DEC told the company that they'd need a 96-VAX cluster and IBM said that one 3090/600J would do the same task). At the time we were using VM/TSO and SQL/DS, and were hitting 16MB memory size limits.

Peter Gathercole

Re: Virtualisation @Steve Todd

I'm not sure that the 360/168 was a real model. The Wikipedia article does not think so either.

As far as I recall, the only /168 model was the 370/168, one of which was at Newcastle University in the UK, serving other Universities in the north-east of the UK, including Durham (where I was) and Edinburgh.

They also still had a 360/65, and one of the exercises we had to do was write some JCL in OS/360. The 370 ran MTS rather than an IBM OS.

Grumpy Guts

Re: Virtualisation

You're right. The 360/67 was the first VM - I had the privilege of trying it out a few times. It was a bit slow though. The first version of CP/67 only supported 2 terminals I recall... The VM capability was impressive. You could treat files as though they were in real memory - no explicit I/O necessary.

Chris Miller

This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were.

* There actually were a few IBM 'plug-compatible' manufacturers - Amdahl and Fujitsu. But even then you couldn't mix and match components - you could only buy a complete system from Amdahl, and then pay their maintenance charges. And since IBM had total control over the interface specs and could change them at will in new models, PCMs were generally playing catch-up.

David Beck

Re: Maintenance

So true re the service costs - "Field Engineering" was a profit centre, and a big one at that. Not true, though, that you had to buy "complete" systems for compatibility. In the 70's I had a room full of CDC disks on a Model 40, bought because they were cheaper and had a faster linear-motor positioner (the thing that moved the heads), while the real 2311's used hydraulic positioners. Bad day when there was a puddle of oil under the 2311.

John Smith

@Chris Miller

"This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were."

True.

Back in the day one of the Scheduler software suppliers made a shed load of money (the SW was $250k a pop) by making new jobs start a lot faster and letting shops put back their memory upgrades by a year or two.

Mainframe memory was expensive.

Now owned by CA (along with many things mainframe) and so probably gone to s**t.

tom dial

Re: Maintenance

Done with some frequency. In the DoD agency where I worked we had mostly Memorex disks as I remember it, along with various non-IBM as well as IBM tape drives, and later got an STK tape library. Occasionally there were reports of problems where the different manufacturers' CEs would try to shift blame before getting down to the fix.

I particularly remember rooting around in a Syncsort core dump that ran to a couple of cubic feet from a problem eventually tracked down to firmware in a Memorex controller. This highlighted the enormous I/O capacity of these systems, something that seems to have been overlooked in the article. The dump showed mainly long sequences of chained channel programs that allowed the mainframe to transfer huge amounts of data by executing a single instruction to the channel processors, and perform other possibly useful work while awaiting completion of the asynchronous I/O.

Mike Pellatt

Re: Maintenance

@ChrisMiller - The IBM I/O channel was so well-specified that it was pretty much a standard. Look at what the Systems Concepts guys did - a Dec10 I/O and memory bus to IBM channel converter. Had one of those in the Imperial HENP group so we could use IBM 6250bpi drives as DEC were late to market with them. And the DEC 1600 bpi drives were horribly unreliable.

The IBM drives were awesome. It was always amusing explaining to IBM techs why they couldn't run online diags. On the rare occasions when they needed fixing.

David Beck

Re: Maintenance

It all comes flooding back.

A long CCW chain, some entries of which are the equivalent of NOPs in channel talk (where did I put that green card?), with a TIC (Transfer In Channel, think branch) at the bottom of the chain pointing back to the top. The idea was to take an interrupt (PCI) on some CCW in the chain and get back in time to convert the NOPs to real CCWs, so the chain continued without ending. Certainly the way the page pool was handled in CP67.

And I too remember the dumps coming on trolleys. There was software to analyse a dump tape but that name is now long gone (as was the origin of most of the problems in the dumps). Those were the days when I could not just add and subtract in hex but multiply as well.
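
[Editor's note: purely as an illustration of the trick Beck describes, and not actual channel code, here is a toy Python model of a circular CCW chain: a TIC at the bottom loops back to the top, and a simulated PCI callback lets the "CPU" overwrite NOP slots with real commands so the chain never ends. All names and the event model are invented for the sketch.]

    from collections import deque

    NOP, READ, TIC = "NOP", "READ", "TIC"      # toy stand-ins for real CCW commands

    class ToyChannel:
        """Walks a circular chain of commands; the TIC entry branches back to
        the top, and on each step a 'PCI' callback gives the CPU a chance to
        convert NOP slots into real work so the chain continues without ending."""
        def __init__(self, slots, on_pci):
            self.chain = [NOP] * slots + [TIC]
            self.on_pci = on_pci

        def run(self, cycles):
            i = 0
            for _ in range(cycles):
                ccw = self.chain[i]
                if ccw == TIC:
                    i = 0                       # branch back to the top of the chain
                    continue
                if ccw == READ:
                    print(f"channel: data transfer for slot {i}")
                    self.chain[i] = NOP         # slot consumed, free for reuse
                self.on_pci(self.chain)         # CPU side runs while I/O is "in flight"
                i += 1

    pending = deque([READ, READ, READ])

    def refill(chain):
        """CPU side of the simulated PCI: top up free slots with queued commands."""
        for j, ccw in enumerate(chain[:-1]):
            if ccw == NOP and pending:
                chain[j] = pending.popleft()

    ToyChannel(slots=2, on_pci=refill).run(cycles=12)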

Peter Simpson

The Mythical Man-Month

Fred Brooks' seminal work on the management of large software projects was written after he managed the design of OS/360. If you can get past the mentions of secretaries, typed meeting notes and keypunches, it's required reading for anyone who manages a software project. Come to think of it... *any* engineering project. I've recommended it to several people and been thanked for it.

// Real Computers have switches and lights...

Madeye

The Mythical Man-Month

The key concepts of this book are as relevant today as they were back in the 60s and 70s - it is still oft quoted ("there are no silver bullets" being one I've heard recently). Unfortunately fewer and fewer people have heard of this book these days and even fewer have read it, even in project management circles.

WatAWorld

Was IBM ever cheaper?

I've been in IT since the 1970s.

My understanding from the guys who were old-timers when I started was that the big thing with the 360 was the standardized op codes, which would remain the same from model to model, with enhancements, but never would an op code be withdrawn.

The beauty of IBM s/360 and s/370 was you had model independence. The promise was made, and the promise was kept, that after re-writing your programs in BAL (360's Basic Assembler Language) you'd never have to re-code your assembler programs ever again.

Also, the relocating loader and the method of link editing meant you didn't have to re-assemble programs to run them on a different computer. Either they would simply run as is, or they would run after being re-linked. (When I started, linking might take 5 minutes, whereas re-assembling might take 4 hours, for one program. I seem to recall talk of assemblies taking all day in the 1960s.)

I wasn't there in the 1950s and 60s, but I don't recall anyone ever boasting about how 360s or 370s were cheaper than competitors.

IBM products were always the most expensive, easily the most expensive, at least in Canada.

But maybe in the UK it was like that. After all the UK had its own native computer manufacturers that IBM had to squeeze out despite patriotism still being a thing in business at the time.

PyLETS

Cut my programming teeth on S/390 TSO architecture

We were developing CAD/CAM programs in this environment starting in the early eighties, because it's what was available then, based on use of this system for stock control in a large electronics manufacturing environment. We fairly soon moved this Fortran code onto smaller machines, DEC/VAX minicomputers and early Apollo workstations. We even had an early IBM-PC in the development lab, but this was more a curiosity than something we could do much real work on initially. The Unix based Apollo and early Sun workstations were much closer to later PCs once these acquired similar amounts of memory, X-Windows like GUIs and more respectable graphics and storage capabilities, and multi-user operating systems.

Gordon 10

Ahh S/360 I knew thee well

Cut my programming teeth on OS/390 assembler (TPF) at Galileo - one of Amadeus' competitors.

I interviewed for Amadeus's initial project for moving off of S/390 in 1999 and it had been planned for at least a year or 2 before that - now that was a long term project!

David Beck

Re: Ahh S/360 I knew thee well

There are people who worked on Galileo still alive? And ACP/TPF still lives, as zTPF? I remember a headhunter chasing me in the early 80's for a job in Oz, Qantas looking for ACP/TPF coders, $80k US, very tempting.

You can do everything in 2k segments of BAL.

Anonymous IV

No mention of microcode?

Unless I missed it, there was no reference to microcode which was specific to each individual model of the S/360 and S/370 ranges, at least, and provided the 'common interface' for IBM Assembler op-codes. It is the rough equivalent of PC firmware. It was documented in thick A3 black folders held in two-layer trolleys (most of which held circuit diagrams, and other engineering amusements), and was interesting to read (if not understand). There you could see that the IBM Assembler op-codes each translated into tens or hundreds of microcode machine instructions. Even 0700, NO-OP, got expanded into surprisingly many machine instructions.

John Smith 19

Re: No mention of microcode?

"I first met microcode by writing a routine to do addition for my company's s/370. Oddly, they wouldn't let me try it out on the production system :-)"

I did not know the microcode store was writeable.

Microcode was a core (no pun intended) feature of the S/360/370/390/4300/z architecture.

It allowed IBM to trade actual hardware (e.g. a full-spec hardware multiplier) for partial (part-word or single-word) or completely software-based (microcode loop) implementations, depending on the machine's spec (and the customer's pocket), without needing a recompile, since at the assembler level it would be the same instruction.

I'd guess hacking the microcode would call for exceptional bravery on a production machine.

Arnaut the less

Re: No mention of microcode? - floppy disk

Someone will doubtless correct me, but as I understood it the floppy was invented as a way of loading the microcode into the mainframe CPU.

tom dial

The rule of thumb in use (from Brooks's Mythical Man Month, as I remember) is around 5 debugged lines of code per programmer per day, pretty much irrespective of the language. And although the end code might have been a million lines, some of it probably needed to be written several times: another memorable Brooks item about large programming projects is "plan to throw one away, because you will."
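
[Editor's note: taking the figures quoted above at face value, a back-of-the-envelope calculation gives a sense of the scale; the 250 working days per year is an assumption added for the arithmetic.]

    # Rough arithmetic using the numbers in the comment above.
    lines_of_code  = 1_000_000   # "the end code might have been a million lines"
    lines_per_day  = 5           # Brooks-style debugged lines per programmer per day
    days_per_year  = 250         # assumed working days per year

    programmer_days  = lines_of_code / lines_per_day
    programmer_years = programmer_days / days_per_year
    print(f"{programmer_days:,.0f} programmer-days, roughly {programmer_years:,.0f} programmer-years")
    # -> 200,000 programmer-days, roughly 800 programmer-years (before any rewrites)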

Tom Welsh

Programming systems product

The main reason for what appears, at first sight, to be low productivity is spelled out in "The Mythical Man-Month". Brooks freely concedes that anyone who has just learned to program would expect to be many times more productive than his huge crew of seasoned professionals. Then he explains, with the aid of a diagram divided into four quadrants.

Top left, we have the simple program. When a program gets big and complex enough, it becomes a programming system, which takes a team to write it rather than a single individual. And that introduces many extra time-consuming aspects and much overhead.

Going the other way, writing a simple program is far easier than creating a product with software at its core. Something that will be sold as a commercial product must be tested seven ways from Sunday, made as maintainable and extensible as possible, be supplemented with manuals, training courses, and technical support services, etc.

Finally, put the two together and you get the programming systems product, which can be 100 times more expensive and time-consuming to create than an equivalent simple program.

Tom Welsh

"Why won't you DIE?"

I suppose that witty, but utterly inappropriate, heading was added by an editor; Gavin knows better.

If anyone is in doubt, the answer would be the same as for other elderly technology such as houses, roads, clothing, cars, aeroplanes, radio, TV, etc. Namely, it works - and after 50 years of widespread practical use, it has been refined so that it now works *bloody well*. In extreme contrast to many more recent examples of computing innovation, I may add.

Whoever added that ill-advised attempt at humour should be forced to write out 1,000 times:

"The definition of a legacy system: ONE THAT WORKS".

Grumpy Guts

Re: Pay Per Line Of Code

I worked for IBM UK in the 60s and wrote a lot of code for many different customers. There was never a charge. It was all part of the built in customer support. I even rewrote part of the OS for one system (not s/360 - IBM 1710 I think) for Rolls Royce aero engines to allow all the user code for monitoring engine test cells to fit in memory.

dlc.usa

Sole Source For Hardware?

Even before the advent of Plug Compatible Machines brought competition for the central processing units, the S/360 peripheral hardware market was open to third parties. IBM published the technical specifications for the bus and tag channel interfaces, allowing, indeed encouraging, vendors to produce plug-and-play devices for the architecture, even in competition with IBM's own. My first S/360 in 1972 had Marshall, not IBM, disks and a Calcomp drum plotter for which IBM offered no counterpart. This was true of the IBM Personal Computer as well. This type of openness dramatically expands the marketability of a new platform architecture.

RobHib

Eventually we stripped scrapped 360s for components.

"IBM built its own circuits for S/360, Solid Logic Technology (SLT) - a set of transistors and diodes mounted on a circuit twenty-eight-thousandths of a square inch and protected by a film of glass just sixty-millionths of an inch thick. The SLT was 10 times more dense the technology of its day."

When these machines were eventually scrapped we used the components from them for electronics projects. Their unusual construction was a pain; much of the 'componentry' couldn't be used because of that construction. (That was further compounded by IBM actually partially smashing modules before they were released as scrap.)

"p3 [Photo caption] The S/360 Model 91 at NASA's Goddard Space Flight Center, with 2,097,152 bytes of main memory, was announced in 1968"

Around that time our 360 had only 44 kB of memory; it was later expanded to 77 kB in about 1969. Why those odd values were chosen is still somewhat of a mystery to me.

David Beck

Re: Eventually we stripped scrapped 360s for components.

@RobHib - The odd memory figure was probably the size of the memory available to the user, not the hardware size (which came in power-of-2 multiples). The size the OS took was a function of what devices were attached and a few other sysgen parameters; whatever was left after the OS was user space. There was usually a 2k boundary, since memory protect keys worked on 2k chunks, but not always - some customers ran naked to squeeze out those extra bytes.
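
[Editor's note: a purely hypothetical illustration of Beck's point; the resident-OS sizes below are invented for the example and are not RobHib's actual configuration. Subtracting a fixed OS residency from a power-of-two machine size lands on exactly the kind of odd user-space figures RobHib remembers.]

    def user_space_kb(machine_kb, os_resident_kb):
        """Memory left for user programs once the resident OS is loaded."""
        return machine_kb - os_resident_kb

    # Hypothetical resident-OS sizes, chosen only to show how non-power-of-two
    # user space falls out of power-of-two hardware sizes.
    print(user_space_kb(64, 20))     # 44, cf. the 44 kB figure above
    print(user_space_kb(128, 51))    # 77, cf. the later 77 kB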

Glen Turner 666

Primacy of software

Good article.

Could have had a little more about the primacy of software: IBM had a huge range of compilers, and having an assembly language common across a wide range of machines was a huge winner (as obvious as that seems today in an age of a handful of processor instruction sets). Furthermore, IBM had a strong focus on binary compatibility, and the lack of that in some competitors' ranges made shipping software for those machines much more expensive than for IBM.

IBM also sustained that commitment to development. Which meant that until the minicomputer age they were really the only possibility if you wanted newer features (such as CICS for screen-based transaction processing or VSAM or DB2 for databases, or VMs for a cheaper test versus production environment). Other manufacturers would develop against their forthcoming models, not their shipped models, and so IBM would be the company "shipping now" with the feature you desired.

IBM were also very focused on business. They knew how to market (eg, the myth of 'idle' versus 'ready' light on tape drives, whitepapers to explain technology to managers). They knew how to charge (eg, essentially a lease, which matched company's revenue). They knew how to do politics (eg, lobbying the Australian PM after they lost a government sale). They knew how to do support (with their customer engineers basically being a little bit of IBM embedded at the customer). Their strategic planning is still world class.

I would be cautious about lauding the $0.5B taken to develop the OS/360 software as progress. As a counterpoint consider Burroughs, who delivered better capability with fewer lines of code, since they wrote in Algol rather than assembler. Both companies got one thing right: huge libraries of code which made life much easier for applications programmers.

DEC's VMS learnt that lesson well. It wasn't until MS-DOS that we were suddenly dropped back into an inferior programming environment (but you'll cope with a lot for sheer responsiveness, and it didn't take too long until you could buy in what you needed).

What killed the mainframe was its sheer optimisation for batch and transaction processing and the massive cost if you used it any other way. Consider that TCP/IP used about 3% of the system's resources, or $30k pa of mainframe time. That would pay for a new Unix machine every year to host your website on.

Continued at Computer History Bulletin, 2010-2019

Copyright © 1996-2016 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable Javascript for this site. The site is perfectly usable without Javascript.

Copyright of the original materials belongs to their respective owners. Quotes are made for educational purposes only, in compliance with the fair use doctrine.

FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law, under which such material can be distributed without profit, exclusively for research and educational purposes.

This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...

You can use PayPal to make a contribution, supporting the development of this site and speeding up access. In case softpanorama.org is down, you can use the mirror at softpanorama.info.

Disclaimer:

The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP, or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

Last modified: February 11, 2017