classes ::: subject,
children ::: programs (Computer Science)
branches ::: Computer Science

bookmarks: Instances - Definitions - Quotes - Chapters - Wordnet - Webgen


object:Computer Science
object:compsci
object:CS
class:subject

--- Concepts
variables
constants
arrays
strings
expression
statements
functions
program flow
pointers
references
classes
objects
inheritance
polymorphism
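The concepts listed above can be sketched in a few lines of code. A minimal illustration (the Shape/Circle/Square names are hypothetical, not from this page) touching constants, variables, arrays, functions, classes, objects, inheritance and polymorphism:

```python
PI = 3.14159  # a constant (by convention, in Python)

class Shape:               # a class
    def area(self):        # a function bound to a class (a method)
        raise NotImplementedError

class Circle(Shape):       # inheritance
    def __init__(self, r):
        self.r = r         # an instance variable

    def area(self):        # polymorphism: overriding the base method
        return PI * self.r ** 2

class Square(Shape):
    def __init__(self, s):
        self.s = s

    def area(self):
        return self.s ** 2

shapes = [Circle(1.0), Square(2.0)]   # an array (list) of objects
areas = [s.area() for s in shapes]    # the same call, different behaviour
```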



see also ::: programming





questions, comments, suggestions/feedback, take-down requests, contribute, etc
contact me @ integralyogin@gmail.com or
join the integral discord server (chatrooms)
if the page you visited was empty, it may be noted and I will try to fill it out. cheers



now begins the generated list of local instances, definitions, quotes, instances in chapters, WordNet info (if available), and instances among weblinks


OBJECT INSTANCES [0] - TOPICS - AUTHORS - BOOKS - CHAPTERS - CLASSES - SEE ALSO - SIMILAR TITLES

TOPICS
Artificial_Intelligence
Computer_Engineering
Expert_System
Hacking
Linux
programming
programs_(Computer_Science)
Scheme
search_engine
Software_Engineering
the_God_of_Computation
The_Internet
the_Internet
SEE ALSO

programming

AUTHORS
Alan_Perlis
Jaron_Lanier

BOOKS
Computer_Power_and_Human_Reason
Eloquent_Javascript
the_Stack

IN CHAPTERS TITLE

IN CHAPTERS CLASSNAME

IN CHAPTERS TEXT

PRIMARY CLASS

subject
SIMILAR TITLES
Computer Science
Essential Books of Computer Science
programs (Computer Science)

DEFINITIONS


TERMS STARTING WITH

computer science ::: The theory, experimentation, and engineering that form the basis for the design and use of computers. It involves the study of algorithms that process, store, and communicate digital information. A computer scientist specializes in the theory of computation and the design of computational systems.[116]


TERMS ANYWHERE

admissible heuristic ::: In computer science, specifically in algorithms related to pathfinding, a heuristic function is said to be admissible if it never overestimates the cost of reaching the goal, i.e. the cost it estimates to reach the goal is not higher than the lowest possible cost from the current point in the path.[12]
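A small illustration of the definition (the grid, walls and names are hypothetical): on a 4-connected grid with unit step costs, Manhattan distance is admissible because obstacles can only make the true remaining cost larger than the obstacle-free distance, never smaller:

```python
from collections import deque

WALLS = {(1, 1), (1, 2), (2, 1)}   # an assumed obstacle layout
SIZE, GOAL = 4, (3, 3)

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def true_cost(start):
    """Exact remaining cost via breadth-first search (unit edge costs)."""
    frontier, dist = deque([start]), {start: 0}
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == GOAL:
            return dist[(x, y)]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < SIZE and 0 <= ny < SIZE \
                    and (nx, ny) not in WALLS and (nx, ny) not in dist:
                dist[(nx, ny)] = dist[(x, y)] + 1
                frontier.append((nx, ny))
    return float("inf")

# The heuristic never overestimates on any free cell:
cells = [(x, y) for x in range(SIZE) for y in range(SIZE) if (x, y) not in WALLS]
admissible = all(manhattan(c, GOAL) <= true_cost(c) for c in cells)
```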


Alan Turing "person" Alan M. Turing, 1912-06-22/3? - 1954-06-07. A British mathematician, inventor of the {Turing Machine}. Turing also proposed the {Turing test}. Turing's work was fundamental in the theoretical foundations of computer science. Turing was a student and fellow of {King's College Cambridge} and was a graduate student at {Princeton University} from 1936 to 1938. While at Princeton Turing published "On Computable Numbers", a paper in which he conceived an {abstract machine}, now called a {Turing Machine}. Turing returned to England in 1938 and during World War II, he worked in the British Foreign Office. He masterminded operations at {Bletchley Park}, UK which were highly successful in cracking the Nazis "Enigma" codes during World War II. Some of his early advances in computer design were inspired by the need to perform many repetitive symbolic manipulations quickly. Before the building of the {Colossus} computer this work was done by a roomful of women. In 1945 he joined the {National Physical Laboratory} in London and worked on the design and construction of a large computer, named {Automatic Computing Engine} (ACE). In 1949 Turing became deputy director of the Computing Laboratory at Manchester where the {Manchester Automatic Digital Machine}, the worlds largest memory computer, was being built. He also worked on theories of {artificial intelligence}, and on the application of mathematical theory to biological forms. In 1952 he published the first part of his theoretical study of morphogenesis, the development of pattern and form in living organisms. Turing was gay, and died rather young under mysterious circumstances. He was arrested for violation of British homosexuality statutes in 1952. He died of potassium cyanide poisoning while conducting electrolysis experiments. An inquest concluded that it was self-administered but it is now thought by some to have been an accident. 
There is an excellent biography of Turing by Andrew Hodges, subtitled "The Enigma of Intelligence" and a play based on it called "Breaking the Code". There was also a popular summary of his work in Douglas Hofstadter's book "Gödel, Escher, Bach". {(http://AlanTuring.net/)}. (2001-10-09)

Alonzo Church "person" A twentieth century mathematician and logician, and one of the founders of computer science. Church invented the {lambda-calculus} and posited a version of the {Church-Turing thesis}. (1995-03-25)


affective computing ::: (Also artificial emotional intelligence or emotion AI.) The study and development of systems and devices that can recognize, interpret, process, and simulate human affects. Affective computing is an interdisciplinary field spanning computer science, psychology, and cognitive science.[13][14]

first-order logic ::: (Also first-order predicate calculus and predicate logic.) A collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man" one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier and x is a variable.[173] This distinguishes it from propositional logic, which does not use quantifiers or relations;[248] in this sense, propositional logic is the foundation of first-order logic.
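The worded example can be written in standard first-order notation:

```latex
% "There exists x such that x is Socrates and x is a man"
\exists x\,\bigl(\mathrm{Socrates}(x) \land \mathrm{Man}(x)\bigr)
% With a universal quantifier, a general claim such as "all men are mortal" becomes
\forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Mortal}(x)\bigr)
```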


artificial intelligence ::: (Also machine intelligence.) Any intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals. In computer science, AI research is defined as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals.[27] Colloquially, the term "artificial intelligence" is applied when a machine mimics "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving".[28]

mathematical optimization ::: (Also mathematical programming.) In mathematics, computer science, and operations research, the selection of a best element (with regard to some criterion) from some set of available alternatives.[212]

computational statistics ::: (Also statistical computing.) The interface between statistics and computer science.

Amulet "processor" An implementation of the {Advanced RISC Machine} {microprocessor} architecture using the {micropipeline} design style. In April 1994 the Amulet group in the Computer Science department of {Manchester University} took delivery of the AMULET1 {microprocessor}. This was their first large scale asynchronous circuit and the world's first implementation of a commercial microprocessor architecture (ARM) in {asynchronous logic}. Work was begun at the end of 1990 and the design despatched for fabrication in February 1993. The primary intent was to demonstrate that an asynchronous microprocessor can consume less power than a synchronous design. The design incorporates a number of concurrent units which cooperate to give instruction level compatibility with the existing synchronous part. These include an Address unit, which autonomously generates instruction fetch requests and interleaves ({nondeterministic}ally) data requests from the Execution unit; a {Register} file which supplies operands, queues write destinations and handles data dependencies; an Execution unit which includes a multiplier, a shifter and an {ALU} with data-dependent delay; a Data interface which performs byte extraction and alignment and includes an {instruction prefetch} buffer, and a control path which performs {instruction decode}. These units only synchronise to exchange data. The design demonstrates that all the usual problems of processor design can be solved in this asynchronous framework: backward {instruction set} compatibility, {interrupts} and exact {exceptions} for {memory faults} are all covered. It also demonstrates some unusual behaviour, for instance {nondeterministic} prefetch depth beyond a branch instruction (though the instructions which actually get executed are, of course, deterministic). 
There are some unusual problems for {compiler} {optimisation}, as the metric which must be used to compare alternative code sequences is continuous rather than discrete, and the {nondeterminism} in external behaviour must also be taken into account. The chip was designed using a mixture of custom {datapath} and compiled control logic elements, as was the synchronous ARM. The fabrication technology is the same as that used for one version of the synchronous part, reducing the number of variables when comparing the two parts. Two silicon implementations have been received and preliminary measurements have been taken from these. The first is a 0.7um process and has achieved about 28 kDhrystones running the standard {benchmark} program. The other is a 1 um implementation and achieves about 20 kDhrystones. For the faster of the parts this is equivalent to a synchronous {ARM6} clocked at around 20MHz; in the case of AMULET1 it is likely that this speed is limited by the memory system cycle time (just over 50ns) rather than the processor chip itself. A fair comparison of devices at the same geometries gives the AMULET1 performance as about 70% of that of an {ARM6} running at 20MHz. Its power consumption is very similar to that of the ARM6; the AMULET1 therefore delivers about 80 MIPS/W (compared with around 120 from a 20MHz ARM6). Multiplication is several times faster on the AMULET1 owing to the inclusion of a specialised asynchronous multiplier. This performance is reasonable considering that the AMULET1 is a first generation part, whereas the synchronous ARM has undergone several design iterations. AMULET2 (under development in 1994) was expected to be three times faster than AMULET1 and use less power. The {macrocell} size (without {pad ring}) is 5.5 mm by 4.5 mm on a 1 micron {CMOS} process, which is about twice the area of the synchronous part. 
Some of the increase can be attributed to the more sophisticated organisation of the new part: it has a deeper {pipeline} than the clocked version and it supports multiple outstanding memory requests; there is also specialised circuitry to increase the multiplication speed. Although there is undoubtedly some overhead attributable to the asynchronous control logic, this is estimated to be closer to 20% than to the 100% suggested by the direct comparison. AMULET1 is code compatible with {ARM6} and so is capable of running existing {binaries} without modification. The implementation also includes features such as interrupts and memory aborts. The work was part of a broad {ESPRIT} funded investigation into low-power technologies within the European {Open Microprocessor systems Initiative} (OMI) programme, where there is interest in low-power techniques both for portable equipment and (in the longer term) to alleviate the problems of the increasingly high dissipation of high-performance chips. This initial investigation into the role {asynchronous logic} might play has now demonstrated that asynchronous techniques can be applied to problems of the scale of a complete {microprocessor}. {(http://cs.man.ac.uk/amulet)}. (1994-12-08)

algorithm ::: A set of instructions for solving a problem or accomplishing a task. One common example of an algorithm is a recipe, which consists of specific instructions for preparing a dish/meal. Every computerized device uses algorithms to perform its functions.   BREAKING DOWN 'Algorithm'   Financial companies use algorithms in areas such as loan pricing, stock trading, and asset-liability management. For example, algorithmic trading, known as "algo," is used for deciding the timing, pricing, and quantity of stock orders. Algo trading, also known as automated trading or black-box trading, uses a computer program to buy or sell securities at a pace not possible for humans. Since prices of stocks, bonds, and commodities appear in various formats online and in trading data, the process by which an algorithm digests scores of financial data becomes easy. The user of the program simply sets the parameters and gets the desired output when securities meet the trader's criteria.   Types of Algos   Several types of trading algorithms help investors decide whether to buy or sell. A mean reversion algorithm examines short-term prices over the long-term average price, and if a stock goes much higher than the average, a trader may sell it for a quick profit. Seasonality refers to the practice of traders buying and selling securities based on the time of year when markets typically rise or fall. A sentiment analysis algorithm gauges news about a stock price that could lead to higher volume for a trading period.  Algorithm Example   The following is an example of an algorithm for trading. A trader creates instructions within his automated account to sell 100 shares of a stock if the 50-day moving average goes below the 200-day moving average. Contrarily, the trader could create instructions to buy 100 shares if the 50-day moving average of a stock rises above the 200-day moving average. 
Sophisticated algorithms consider hundreds of criteria before buying or selling securities. Computers quickly synthesize the automated account instructions to produce desired results. Without computers, complex trading would be time-consuming and possibly impossible.   Algorithms in Computer Science   In computer science, a programmer must employ five basic parts of an algorithm to create a successful program. First, he/she describes the problem in mathematical terms before creating the formulas and processes that create results. Next, the programmer inputs the outcome parameters, and then he/she executes the program repeatedly to test its accuracy. The conclusion of the algorithm is the result given after the parameters go through the set of instructions in the program.  For financial algorithms, the more complex the program, the more data the software can use to make accurate assessments to buy or sell securities. Programmers test complex algorithms thoroughly to ensure the programs are without errors. Many algorithms can be used for one problem; however, there are some that simplify the process better than others.
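The moving-average crossover rule described above can be sketched as a short program. Window lengths of 3 and 5 stand in for the 50-day and 200-day averages to keep the example small; the function names are hypothetical:

```python
def moving_average(prices, window):
    """Simple moving average; one value per day once the window is full."""
    return [sum(prices[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signals(prices, short=3, long=5):
    """Return (day, 'BUY'/'SELL') whenever the short MA crosses the long MA.

    Days are indices into the aligned series, i.e. day 0 is the first day
    on which both averages exist.
    """
    short_ma = moving_average(prices, short)[long - short:]  # align to long MA
    long_ma = moving_average(prices, long)
    signals = []
    for day in range(1, len(long_ma)):
        prev = short_ma[day - 1] - long_ma[day - 1]
        curr = short_ma[day] - long_ma[day]
        if prev <= 0 < curr:
            signals.append((day, "BUY"))    # short MA rose above long MA
        elif prev >= 0 > curr:
            signals.append((day, "SELL"))   # short MA fell below long MA
    return signals
```

On a hypothetical rising price series such as [1, 1, 1, 1, 1, 2, 3, 4, 5, 6], the rule fires a single BUY when the short average first climbs above the long one.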

archie "tool, networking" A system to automatically gather, index and serve information on the {Internet}. The initial implementation of archie by {McGill University} School of Computer Science provided an indexed directory of filenames from all {anonymous FTP} archives on the Internet. Later versions provide other collections of information. See also {archive site}, {Gopher}, {Prospero}, {Wide Area Information Servers}. (1995-12-28)

artificial intelligence (AI): in computer science, the attempt to build machines which can function intelligently, and the use of such machines to test our understanding of human intelligence.

Artificial_intelligence ::: (AI) is a term for simulated intelligence in machines. These machines are programmed to "think" like a human and mimic the way a person acts. The ideal characteristic of artificial intelligence is its ability to rationalize and take actions that have the best chance of achieving a specific goal, although the term can be applied to any machine that exhibits traits associated with a human mind, such as learning and solving problems.   BREAKING DOWN 'Artificial Intelligence - AI'   Artificial intelligence is based around the idea that human intelligence can be defined in such exact terms that a machine can mimic it. The goals of artificial intelligence include learning, reasoning and perception, and machines are wired using a cross-disciplinary approach based in mathematics, computer science, linguistics, psychology and more.  As technology advances, previous benchmarks that defined artificial intelligence become outdated. For example, machines that calculate basic functions or recognize text through methods such as optical character recognition are no longer said to have artificial intelligence, since this function is now taken for granted as an inherent computer function.  Some examples of machines with artificial intelligence include computers that play chess, which have been around for years, and self-driving cars, which are a relatively new development. Each of these machines must weigh the consequences of any action they take, as each action will impact the end result. In chess, this end result is winning the game. For self-driving cars, the computer system must take into account all external data and compute it to act in a way that prevents a collision.


artificial intelligence "artificial intelligence" (AI) The subfield of computer science concerned with the concepts and methods of {symbolic inference} by computer and symbolic {knowledge representation} for use in making inferences. AI can be seen as an attempt to model aspects of human thought on computers. It is also sometimes defined as trying to solve by computer any problem that a human can solve faster. The term was coined by Stanford Professor {John McCarthy}, a leading AI researcher. Examples of AI problems are {computer vision} (building a system that can understand images as well as a human) and {natural language processing} (building a system that can understand and speak a human language as well as a human). These may appear to be modular, but all attempts so far (1993) to solve them have foundered on the amount of context information and "intelligence" they seem to require. The term is often used as a selling point, e.g. to describe programming that drives the behaviour of computer characters in a game. This is often no more intelligent than "Kill any humans you see; keep walking; avoid solid objects; duck if a human with a gun can see you". See also {AI-complete}, {neats vs. scruffies}, {neural network}, {genetic programming}, {fuzzy computing}, {artificial life}. {ACM SIGART (http://sigart.acm.org/)}. {U Cal Davis (http://phobos.cs.ucdavis.edu:8001)}. {CMU Artificial Intelligence Repository (http://cs.cmu.edu/Web/Groups/AI/html/repository.html)}. (2002-01-19)


Association for Computing "body" (ACM, before 1997 - "Association for Computing Machinery") The largest and oldest international scientific and educational computer society in the industry. Founded in 1947, only a year after the unveiling of {ENIAC}, ACM was established by mathematicians and electrical engineers to advance the science and application of {Information Technology}. {John Mauchly}, co-inventor of the ENIAC, was one of ACM's founders. Since its inception ACM has provided its members and the world of computer science a forum for the sharing of knowledge on developments and achievements necessary to the fruitful interchange of ideas. ACM has 90,000 members - educators, researchers, practitioners, managers, and engineers - who drive the Association's major programs and services - publications, special interest groups, chapters, conferences, awards, and special activities. The ACM Press publishes journals (notably {CACM}), book series, conference proceedings, {CD-ROM}, {hypertext}, {video}, and specialized publications such as curricula recommendations and self-assessment procedures. {(http://info.acm.org/)}. (1998-02-24)

automata theory ::: The study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a theory in theoretical computer science and discrete mathematics (a subject of study in both mathematics and computer science).
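One of the abstract machines automata theory studies can be written down directly. A sketch of a deterministic finite automaton, just a transition table plus a run function, that accepts binary strings containing an even number of 1s:

```python
DFA = {
    "states": {"even", "odd"},
    "start": "even",
    "accept": {"even"},
    "delta": {  # (state, input symbol) -> next state
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    },
}

def accepts(dfa, word):
    """Run the machine over the word and test membership in the language."""
    state = dfa["start"]
    for symbol in word:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]
```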

automated reasoning ::: An area of computer science and mathematical logic dedicated to understanding different aspects of reasoning. The study of automated reasoning helps produce computer programs that allow computers to reason completely, or nearly completely, automatically. Although automated reasoning is considered a sub-field of artificial intelligence, it also has connections with theoretical computer science, and even philosophy.

behavior tree (BT) ::: A mathematical model of plan execution used in computer science, robotics, control systems and video games. Behavior trees describe switching between a finite set of tasks in a modular fashion. Their strength comes from their ability to create very complex tasks composed of simple tasks, without worrying how the simple tasks are implemented. BTs present some similarities to hierarchical state machines, with the key difference that the main building block of a behavior is a task rather than a state. Their ease of human understanding makes BTs less error-prone and very popular in the game developer community. BTs have been shown to generalize several other control architectures.[57][58]
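A toy sketch of the modular composition described above, using the two classic composite nodes (the task names are hypothetical; real BT libraries also track a RUNNING status for tasks spanning multiple ticks):

```python
SUCCESS, FAILURE = "success", "failure"

class Task:
    """A leaf task with a fixed outcome, standing in for real behaviour."""
    def __init__(self, name, status):
        self.name, self.status = name, status
    def tick(self):
        return self.status

class Sequence:
    """Succeeds only if every child succeeds, in order."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class Selector:
    """Fallback: succeeds as soon as any child succeeds."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == SUCCESS:
                return SUCCESS
        return FAILURE

# "Open the door, or else break it down", built from simple tasks:
tree = Selector(
    Sequence(Task("door_unlocked", FAILURE), Task("open_door", SUCCESS)),
    Task("break_door", SUCCESS),
)
```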


Berkeley EDIF200 translator-building toolkit Wendell C. Baker and Prof A. Richard Newton of the Electronics Research Laboratory, Department of Electrical Engineering and Computer Sciences at the {University of California, Berkeley}. Version 7.6. Restriction: no-profit without permission. {(ftp://ic.berkeley.edu/pub/edif)}. (1990-07-01)

bioinformatics "application" The field of science concerning the application of {computer science} and {information technology} to biology; using computers to handle biological information, especially {computational molecular biology}. (2005-01-07)


Bioinformatics ::: is the application of computational technology to handle the rapidly growing repository of information related to molecular biology. Bioinformatics combines different fields of study, including computer sciences, molecular biology, biotechnology, statistics and engineering. It is particularly useful for managing and analyzing large sets of data, such as those generated by the fields of genomics and proteomics.

Bioinformatics - the science of collecting and analyzing complex biochemical and biological data using mathematics and computer science, as in the study of genomes. See /r/bioinformatics.

Boolean algebra "logic" (After the logician {George Boole}) 1. Commonly, and especially in computer science and digital electronics, this term is used to mean {two-valued logic}. 2. This is in stark contrast with the definition used by pure mathematicians who in the 1960s introduced "Boolean-valued {models}" into logic precisely because a "Boolean-valued model" is an interpretation of a {theory} that allows more than two possible truth values! Strangely, a Boolean algebra (in the mathematical sense) is not strictly an {algebra}, but is in fact a {lattice}. A Boolean algebra is sometimes defined as a "complemented {distributive lattice}". Boole's work which inspired the mathematical definition concerned {algebras} of {sets}, involving the operations of intersection, union and complement on sets. Such algebras obey the following identities where the operators ^, V, - and constants 1 and 0 can be thought of either as set intersection, union, complement, universal, empty; or as two-valued logic AND, OR, NOT, TRUE, FALSE; or any other conforming system.

 a ^ b = b ^ a                    a V b = b V a                    (commutative laws)
 (a ^ b) ^ c = a ^ (b ^ c)        (a V b) V c = a V (b V c)        (associative laws)
 a ^ (b V c) = (a ^ b) V (a ^ c)  a V (b ^ c) = (a V b) ^ (a V c)  (distributive laws)
 a ^ a = a                        a V a = a                        (idempotence laws)
 --a = a
 -(a ^ b) = (-a) V (-b)           -(a V b) = (-a) ^ (-b)           (de Morgan's laws)
 a ^ -a = 0                       a V -a = 1
 a ^ 1 = a                        a V 0 = a
 a ^ 0 = 0                        a V 1 = 1
 -1 = 0                           -0 = 1

There are several common alternative notations for the "-" or {logical complement} operator. If a and b are elements of a Boolean algebra, we define a <= b to mean that a ^ b = a, or equivalently a V b = b. Thus, for example, if ^, V and - denote set intersection, union and complement then <= is the inclusive subset relation. The relation <= is a {partial ordering}, though it is not necessarily a {linear ordering} since some Boolean algebras contain incomparable values. 
Note that these laws only refer explicitly to the two distinguished constants 1 and 0 (sometimes written as {LaTeX} \top and \bot), and in {two-valued logic} there are no others, but according to the more general mathematical definition, in some systems variables a, b and c may take on other values as well. (1997-02-27)
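For the two-valued case, the identities above can be checked mechanically, reading ^ as AND, V as OR and - as NOT over the values {0, 1}:

```python
from itertools import product

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Exhaustively verify a representative sample of the laws over all
# assignments of 0/1 to a, b, c.
laws_hold = all(
    AND(a, b) == AND(b, a)                            # commutative
    and OR(a, OR(b, c)) == OR(OR(a, b), c)            # associative
    and AND(a, OR(b, c)) == OR(AND(a, b), AND(a, c))  # distributive
    and NOT(AND(a, b)) == OR(NOT(a), NOT(b))          # de Morgan
    and NOT(OR(a, b)) == AND(NOT(a), NOT(b))          # de Morgan
    and AND(a, NOT(a)) == 0 and OR(a, NOT(a)) == 1    # complement
    for a, b, c in product((0, 1), repeat=3)
)
```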



Carnegie Mellon University "body, education" (CMU) A university in Pittsburgh, Pennsylvania. {School of Computer Science (http://cs.cmu.edu/Web/FrontDoor.html)}. (1997-06-23)

Centrum voor Wiskunde en Informatica (CWI, Centre for Mathematics and Computer Science) An independent research institute active in the fields of mathematics and computer science. CWI also aims to transfer new knowledge in these fields to society, trade and industry. CWI is funded for 70 percent by NWO, the National Organisation for Scientific Research. The remaining 30 percent is obtained through national and international programmes and contract research commissioned by industry. Address: Kruislaan 413, 1098 SJ Amsterdam, The Netherlands; P.O.Box 94079, 1090 GB Amsterdam, The Netherlands. Telephone: +31 (20) 5929 333. {(http://cwi.nl/)}. {(ftp://ftp.cwi.nl/pub/)}.

Claytronics - an abstract concept that combines nanoscale robotics and computer science to create individual nanometer-scale computers called claytronic atoms, or catoms, which can interact with each other to form tangible 3D objects that a user can interact with.

CLEAR "language" A {specification language} based on {initial algebras}. ["An Informal Introduction to Specification Using CLEAR", R.M. Burstall in The Correctness Problem in Computer Science, R.S. Boyer et al eds, Academic Press 1981, pp. 185-213]. (1994-11-03)

combinatorial optimization ::: In Operations Research, applied mathematics and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects.[91]
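As a concrete illustration (not part of the entry), a brute-force solver for a tiny 0/1 knapsack instance, a classic combinatorial optimization problem; exhaustive search is possible precisely because the set of candidate objects (the subsets) is finite:

```python
from itertools import combinations

def best_subset(items, capacity):
    """Enumerate every subset of (weight, value) items and return the
    (value, subset) pair maximizing value with total weight <= capacity."""
    best = (0, ())
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for w, v in subset)
            value = sum(v for w, v in subset)
            if weight <= capacity and value > best[0]:
                best = (value, subset)
    return best

# (weight, value) pairs; with capacity 5 the single (5, 8) item is optimal.
value, chosen = best_subset([(2, 3), (3, 4), (4, 5), (5, 8)], capacity=5)
```

Enumeration is exponential in the number of items, which is why most of combinatorial optimization is about doing better than this.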

Communications of the ACM "publication" (CACM) A monthly publication by the {Association for Computing Machinery} sent to all members. CACM is an influential publication that keeps computer science professionals up to date on developments. Each issue includes articles, case studies, practitioner-oriented pieces, regular columns, commentary, departments, the ACM Forum, technical correspondence and advertisements. {(http://acm.org/cacm/)}. (1995-01-18)

computability theory "mathematics" The area of theoretical computer science concerning what problems can be solved by any computer. A function is computable if an {algorithm} can be implemented which will give the correct output for any valid input. Since computer programs are {countable} but {real numbers} are not, it follows that there must exist real numbers that cannot be calculated by any program. Unfortunately, by definition, there isn't an easy way of describing any of them! In fact, there are many tasks (not just calculating real numbers) that computers cannot perform. The most well-known is the {halting problem}, the {busy beaver} problem is less famous but just as fascinating. ["Computability", N.J. Cutland. (A well written undergraduate-level introduction to the subject)]. ["The Turing Omnibus", A.K. Dewdeney]. (1995-01-13)

computational learning theory ::: In computer science, computational learning theory (or just learning theory) is a subfield of artificial intelligence devoted to studying the design and analysis of machine learning algorithms.[96]

computational problem ::: In theoretical computer science, a computational problem is a mathematical object representing a collection of questions that computers might be able to solve.

computer science ::: The theory, experimentation, and engineering that form the basis for the design and use of computers. It involves the study of algorithms that process, store, and communicate digital information. A computer scientist specializes in the theory of computation and the design of computational systems.[116]

constructive proof "mathematics" A proof that something exists that provides an example or a method for actually constructing it. For example, for any pair of finite real numbers n < 0 and p > 0, there exists a real number 0 < k < 1 such that f(k) = (1-k)*n + k*p = 0. A constructive proof would proceed by rearranging the above to derive an equation for k: k = n/(n-p) From this and the constraints on n and p, we can show that 0 < k < 1. A few mathematicians actually reject *all* non-constructive arguments as invalid; this means, for instance, that the law of the {excluded middle} (either P or not-P must hold, whatever P is) has to go; this makes {proof by contradiction} invalid. See {intuitionistic logic}. Constructive proofs are popular in theoretical computer science, both because computer scientists are less given to abstraction than mathematicians and because {intuitionistic logic} turns out to be an appropriate theoretical treatment of the foundations of computer science. (2014-08-24)
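The constructive content of the example above is directly executable: the derived formula is itself a program that produces the witness (the helper name witness_k is illustrative):

```python
def witness_k(n: float, p: float) -> float:
    """Given n < 0 < p, construct k in (0, 1) with (1-k)*n + k*p == 0.
    Rearranging (1-k)*n + k*p = 0 gives k = n/(n-p)."""
    assert n < 0 < p
    return n / (n - p)

k = witness_k(-1.0, 2.0)                 # the witness for n = -1, p = 2
residual = (1 - k) * (-1.0) + k * 2.0    # f(k), which should be 0
```

A non-constructive proof (e.g. via the intermediate value theorem) would assert that such a k exists without yielding any such procedure.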

Consul "language" A {constraint}-based {declarative language} based on {axiomatic set theory} and designed for {parallel} execution on {MIMD} architectures. Consul's fundamental {data type} is the {set} and its fundamental {operators} are the {logical connectives} ("and", "or", "not") and {quantifiers} ("forall", "exists"). It is written in {Lisp}-like {syntax}, e.g., (plus x y z) which means the relation x = y+z (not an {assignment statement}). {["Design of the CONSUL Programming Language", D. Baldwin, C. A. Quiroz Gonzalez, University of Rochester. Computer Science Department, TR208, 1987 Feb (http://hdl.handle.net/1802/6372)]} {["Consul: A Parallel Constraint Language", D. Baldwin, IEEE Software 6(4):62-71, 1989 July (http://dx.doi.org/10.1109/52.31653)]} (2014-10-04)

Core War "games" (Or more recently, "Core Wars") A game played between {assembly code} programs running in the {core} of a simulated machine (and vicariously by their authors). The objective is to kill your opponents' programs by overwriting them. The programs are written using an {instruction set} called "{Redcode}" and run on a {virtual machine} called "{MARS}" (Memory Array Redcode Simulator). Core War was devised by Victor Vyssotsky, Robert Morris Sr., and {Dennis Ritchie} in the early 1960s (their original game was called "{Darwin}" and ran on a {PDP-1} at {Bell Labs}). It was first described in the "Core War Guidelines" of March, 1984 by D. G. Jones and A. K. Dewdney of the Department of Computer Science at The University of Western Ontario (Canada). Dewdney wrote several "Computer Recreations" articles in "Scientific American" which discussed Core War, starting with the May 1984 article. Those articles are contained in the two anthologies cited below. A.K. Dewdney's articles are still the most readable introduction to Core War, even though the {Redcode} dialect described in there is no longer current. The International Core War Society (ICWS) creates and maintains Core War standards and the runs Core War tournaments. There have been six annual tournaments and two standards (ICWS'86 and ICWS'88). ["The Armchair Universe: An Exploration of Computer Worlds", A. K. Dewdney, W. H. Freeman, New York, 1988, ISBN 0-7167-1939-8, LCCN QA76.6 .D517 1988] ["The Magic Machine: A Handbook of Computer Sorcery", A. K. Dewdney, W. H. Freeman, New York, 1990, ISBN 0-7167-2125-2 (Hardcover), 0-7167-2144-9 (Paperback), LCCN QA76.6 .D5173 1990]. (1998-10-30)
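The flavour of the game is easy to sketch. The classic one-instruction warrior, the Imp ("MOV 0, 1"), copies itself one cell ahead on every cycle and so marches through core. A toy simulator reduced to relative-addressed MOV only (a sketch, not a conforming MARS implementation):

```python
CORESIZE = 16
IMP = ("MOV", 0, 1)                   # copy cell (pc+0) to cell (pc+1)
core = [("DAT", 0, 0)] * CORESIZE     # empty core; DAT kills a process in real MARS
core[0] = IMP
pc = 0

for _ in range(5):                    # run the Imp for five cycles
    op, a, b = core[pc]
    if op == "MOV":                   # the only opcode this toy supports
        core[(pc + b) % CORESIZE] = core[(pc + a) % CORESIZE]
    pc = (pc + 1) % CORESIZE

# Cells 0..5 now all hold the Imp's instruction and pc has advanced to 5:
# the Imp overwrites core as it goes, which is how it kills opponents.
```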

CSNET Computers and Science Network, operated by {CREN} for US computer science institutes. It provides {electronic mail} service via {dial-up} lines, {X.25} and {Internet} services.

data model "database" The product of the {database} design process which aims to identify and organize the required data logically and physically. A data model says what information is to be contained in a database, how the information will be used, and how the items in the database will be related to each other. For example, a data model might specify that a customer is represented by a customer name and credit card number and a product as a product code and price, and that there is a one-to-many relation between a customer and a product. It can be difficult to change a database layout once code has been written and data inserted. A well thought-out data model reduces the need for such changes. Data modelling enhances application maintainability and future systems may re-use parts of existing models, which should lower development costs. A data modelling language is a mathematical formalism with a notation for describing data structures and a set of operations used to manipulate and validate that data. One of the most widely used methods for developing data models is the {entity-relationship model}. The {relational model} is the most widely used type of data model. Another example is {NIAM}. ["Principles of Database and Knowledge-Base Systems", J.D. Ullman, Volume I, Computer Science Press, 1988, p. 32]. (2000-06-24)
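The customer/product example above can be sketched as a toy in-memory schema; the class and field names are illustrative, not prescribed by the entry:

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    code: str
    price: float

@dataclass
class Customer:
    name: str
    card_number: str
    products: list = field(default_factory=list)   # one-to-many relation

alice = Customer("Alice", "4111-0000", [])
alice.products.append(Product("P100", 9.99))
alice.products.append(Product("P200", 4.50))
```

In a relational data model the same one-to-many relation would instead be expressed with a foreign key from the product (or an order row) back to the customer.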

data science ::: An interdisciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from data in various forms, both structured and unstructured,[135][136] similar to data mining. Data science is a "concept to unify statistics, data analysis, machine learning and their related methods" in order to "understand and analyze actual phenomena" with data.[137] It employs techniques and theories drawn from many fields within the context of mathematics, statistics, information science, and computer science.

dirtball ({XEROX PARC}) A small, perhaps struggling outsider; not in the major or even the minor leagues. For example, "Xerox is not a dirtball company". Outsiders often observe in the PARC culture an institutional arrogance which usage of this term exemplifies. The brilliance and scope of PARC's contributions to computer science have been such that this superior attitude is not much resented. - ESR [{Jargon File}] (1994-12-07)

domain theory "theory" A branch of mathematics introduced by Dana Scott in 1970 as a mathematical theory of programming languages, and for nearly a quarter of a century developed almost exclusively in connection with {denotational semantics} in computer science. In {denotational semantics} of programming languages, the meaning of a program is taken to be an element of a domain. A domain is a mathematical structure consisting of a set of values (or "points") and an ordering relation, "= on those values. Domain theory is the study of such structures. (""=" is written in {LaTeX} as {\subseteq}) Different domains correspond to the different types of object with which a program deals. In a language containing functions, we might have a domain X -" Y which is the set of functions from domain X to domain Y with the ordering f "= g iff for all x in X, f x "= g x. In the {pure lambda-calculus} all objects are functions or {applications} of functions to other functions. To represent the meaning of such programs, we must solve the {recursive} equation over domains, D = D -" D which states that domain D is ({isomorphic} to) some {function space} from D to itself. I.e. it is a {fixed point} D = F(D) for some operator F that takes a domain D to D -" D. The equivalent equation has no non-trivial solution in {set theory}. There are many definitions of domains, with different properties and suitable for different purposes. One commonly used definition is that of Scott domains, often simply called domains, which are {omega-algebraic}, {consistently complete} {CPOs}. There are domain-theoretic computational models in other branches of mathematics including {dynamical systems}, {fractals}, {measure theory}, {integration theory}, {probability theory}, and {stochastic processes}. See also {abstract interpretation}, {bottom}, {pointed domain}. (1999-12-09)
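The fixed-point idea can be made concrete on a finite lattice, where least fixed points of monotone functions are computed by Kleene iteration from the bottom element (the function names and the reachability example are illustrative, not from the entry):

```python
def lfp(f, bottom=frozenset()):
    """Least fixed point of a monotone function on a finite lattice,
    as the limit of the chain bottom <= f(bottom) <= f(f(bottom)) <= ..."""
    x = bottom
    while (y := f(x)) != x:
        x = y
    return x

# Reachability in a graph as a least fixed point: F(S) = {start} | successors(S).
edges = {"a": {"b"}, "b": {"c"}, "c": set(), "d": {"a"}}
step = lambda s: {"a"} | {m for n in s for m in edges[n]}
reachable = lfp(lambda s: frozenset(step(s)))
```

This is the same construction, specialised to a trivially simple domain, that denotational semantics uses to give meaning to recursive definitions.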

Donald Knuth "person" Donald E. Knuth, the author of the {TeX} document formatting system, {Metafont}, its {font}-design program, and the three-volume computer science "Bible" of {algorithms}, "The Art of Computer Programming". Knuth suggested the name "{Backus-Naur Form}" and was also involved in the {SOL} simulation language, and developed the {WEB} {literate programming} system. See also {MIX}, {Turingol}. (1994-11-04)

Doug Lenat "person" One of the world's leading computer scientists specialising in {Artificial Intelligence}. He is currently (1999) head of the {Cyc} Project at {MCC}, and President of Cycorp. He has been a Professor of Computer Science at {Carnegie-Mellon University} and {Stanford University}. See also {microLenat}. (1999-08-24)

European Computer-Industry Research Centre GmbH "body" (ECRC) A joint research organisation founded in 1984 on the initiative of three major European manufacturers: {Bull} (France), {ICL} (UK) and {Siemens} (Germany). Its activities were intended to enhance the future competitive ability of the European {Information Technology} industry and thus complement the work of national and international bodies. The Centre is intended to be the breeding ground for those ideas, techniques and products which are essential for the future use of electronic information processing. The work of the Centre will focus on advanced information processing technology for the next generation of computers. ECRC is an independent company, owned equally by its shareholders. The formal interface between ECRC and its shareholders consists of two bodies: The Shareholders' Council, which approves the Centre's programmes and budgets and supervises their execution and the Scientific Advisory Board, which advises the Shareholders' Council in determining future research directions. There are many collaborations between ECRC and its shareholders' companies on specific projects (Technology Transfer, prospective studies etc). The Centre is staffed by highly qualified scientists drawn from different countries. Research staff are hired directly by ECRC, as well as some who come on assignment from the member companies, and others seconded from public research agencies and universities. Seminars are held which bring together specialists from the Centre and the member companies. ECRC's mission is to pursue research in fundamental areas of computer science. The aim is to develop the theory, methodologies and tools needed to build innovative computer applications. ECRC contributes actively to the international effort that is expanding the frontiers of knowledge in computer science. 
It plays an important role in bridging the gap between research and industry by striving to work at the highest academic level with a strong industrial focus. ECRC constitutes an opportunity in Europe for the best scientists and offers young researchers the possibility to mature in an environment which exposes them to both fundamental research and the process of delivering the results to industry. ECRC plays an important role in Europe and is involved in several European Community initiatives. It is regularly consulted by the Commission of the European Communities on strategic issues, such as the definition of future research plans, international co-operation and relationships between academia and industry. Address: ECRC GmbH, Arabellastrasse 17, D-81925 Munich, Germany. {(http://ecrc.de/)}. Telephone: +49 (89) 926 99 0. Fax: +49 (89) 926 99 170. (1994-12-01)

FOundation for Research and Technology - Hellas "company" (FORTH) A small Greek software and research company associated with the Institute of Computer Science. Address: Science and Technology Park of Crete, Vassilika Vouton, P.O.Box 1385 GR 711 10 Heraklion, Crete, Greece. Telephone: +30 (81) 39 16 00, Fax: +30 (81) 39 16 01. (1997-04-12)

German National Research Center for Computer Science {GMD}

GMD "company, history" A former German research centre. Full name: "GMD - Forschungszentrum Informationstechnik GmbH" (German National Research Center for Information Technology). Before April 1995, GMD stood for "Gesellschaft für Mathematik und Datenverarbeitung" - National Research Center for Computer Science; the abbreviation is retained for historical reasons. In 2000-2001 GMD was integrated into the {FhG} (Fraunhofer Society for the Advancement of Applied Research). The gmd.de website says (in German): "GMD (Forschungszentrum Informationstechnik GmbH, before March 1995: Gesellschaft für Mathematik und Datenverarbeitung mbH) no longer exists!" Address: PO Box 1316, D-53731 Sankt Augustin 1, Germany (1995-04-10)

Gofer "language" A {lazy} {functional language} designed by Mark Jones "mpj@cs.nott.ac.uk" at the {Programming Research Group}, Oxford, UK in 1991. It is very similar to {Haskell} 1.2. It has {lazy evaluation}, {higher order functions}, {pattern matching}, and {type class}es, lambda, case, conditional and let expressions, and wild card, "as" and {irrefutable patterns}. It lacks {modules}, {arrays} and standard {classes}. Gofer comes with an {interpreter} (in C), a {compiler} which compiles to {C}, documentation and examples. Unix Version 2.30 (1994-06-10) Mac_Gofer version 0.16 beta. Ported to {Sun}, {Acorn} {Archimedes}, {IBM PC}, {Macintosh}, {Atari}, {Amiga}. Version 2.30 added support for contexts in datatype and member function definitions, Haskell style {arrays}, an external function calling mechanism for gofc, an experimental implementation of Launchbury/Peyton Jones style lazy functional state threads, an experimental implementation of "do" notation for {monad comprehensions}. ["Introduction to Gofer 2.20", M.P. Jones.] [The implementation of the Gofer functional programming system, Mark P. Jones, Research Report YALEU/DCS/RR-1030, Yale University, Department of Computer Science, May 1994. FTP: nebula.cs.yale.edu/pub/yale-fp/reports]. {(http://cs.nott.ac.uk/Department/Staff/mpj/)}. {FTP Yale (ftp://nebula.cs.yale.edu/)}, {FTP Glasgow (ftp://ftp.dcs.glasgow.ac.uk/)}, {FTP Chalmers (ftp://ftp.cs.chalmers.se/pub/haskell/gofer/)}. (1995-02-14)

graph (abstract data type) ::: In computer science, a graph is an abstract data type that is meant to implement the undirected graph and directed graph concepts from mathematics; specifically, the field of graph theory.
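A minimal adjacency-list sketch of the abstract data type; the class and method names are illustrative, not standardised:

```python
class Graph:
    """Adjacency-list graph; directed=False stores each edge in both directions."""

    def __init__(self, directed=False):
        self.directed = directed
        self.adj = {}                     # vertex -> set of neighbouring vertices

    def add_vertex(self, v):
        self.adj.setdefault(v, set())

    def add_edge(self, u, v):
        self.add_vertex(u)
        self.add_vertex(v)
        self.adj[u].add(v)
        if not self.directed:
            self.adj[v].add(u)            # undirected: mirror the edge

    def neighbors(self, v):
        return self.adj.get(v, set())

g = Graph()                               # undirected by default
g.add_edge("x", "y")
```

An adjacency matrix is the other common representation; the list form is usually preferred for sparse graphs.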

hacker "person, jargon" (Originally, someone who makes furniture with an axe) 1. A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary. 2. One who programs enthusiastically (even obsessively) or who enjoys programming rather than just theorizing about programming. 3. A person capable of appreciating {hack value}. 4. A person who is good at programming quickly. 5. An expert at a particular program, or one who frequently does work using it or on it; as in "a {Unix} hacker". (Definitions 1 through 5 are correlated, and people who fit them congregate.) 6. An expert or enthusiast of any kind. One might be an astronomy hacker, for example. 7. One who enjoys the intellectual challenge of creatively overcoming or circumventing limitations. 8. (Deprecated) A malicious meddler who tries to discover sensitive information by poking around. Hence "password hacker", "network hacker". The correct term is {cracker}. The term "hacker" also tends to connote membership in the global community defined by the net (see {The Network} and {Internet address}). It also implies that the person described is seen to subscribe to some version of the {hacker ethic}. It is better to be described as a hacker by others than to describe oneself that way. Hackers consider themselves something of an elite (a meritocracy based on ability), though one to which new members are gladly welcome. Thus while it is gratifying to be called a hacker, false claimants to the title are quickly labelled as "bogus" or a "{wannabee}". 9. (University of Maryland, rare) A programmer who does not understand proper programming techniques and principles and doesn't have a Computer Science degree. Someone who just bangs on the keyboard until something happens. For example, "This program is nothing but {spaghetti code}. It must have been written by a hacker". [{Jargon File}] (1996-08-26)

ICSI {International Computer Science Institute} at Berkeley, CA.

information technology "business, jargon" (IT) Applied computer systems - both {hardware} and {software}, and often including {networking} and {telecommunications}, usually in the context of a business or other enterprise. Often the name of the part of an enterprise that deals with all things electronic. The term "{computer science}" is usually reserved for the more theoretical, academic aspects of computing, while the vaguer terms "information systems" (IS) or "information services" may include more of the human activities and non-computerised business processes like {knowledge management}. Others say that IT includes computer science. (2000-10-02)

Institute of Electrical and Electronics Engineers, Inc. (IEEE) The world's largest technical professional society, based in the USA. Founded in 1884 by a handful of practitioners of the new electrical engineering discipline, today's Institute has more than 320,000 members who participate in its activities in 147 countries. The IEEE sponsors technical conferences, symposia and local meetings worldwide, publishes nearly 25% of the world's technical papers in electrical, electronics and computer engineering and computer science, provides educational programs for its members and promotes standardisation. Areas covered include aerospace, computers and communications, biomedical technology, electric power and consumer electronics. {(http://ieee.org/)}. {Gopher (gopher://gopher.ieee.org/)}. {(ftp://ftp.ieee.org/)}. E-mail file-server: "fileserver-help@info.ieee.org". { IEEE Standards Process Automation (SPA) System (http://stdsbbs.ieee.org/)}, {telnet (telnet:stdsbbs.ieee.org)} [140.98.1.11]. (1995-03-10)

Institut National de Recherche en Informatique et Automatique "body" (INRIA) A French research institute for computer science, {control theory}, and applied mathematics. INRIA has research units in Rocquencourt (near Paris), Sophia-Antipolis (near Nice), Grenoble, Nancy (also known as LORIA) and Rennes (known as IRISA), the last two in partnership with {CNRS} and local universities. INRIA works on various projects, including the development of {free software} such as {SciLab}, {Objective Caml}, {Bigloo}, and projects such as {GNU MP}. (2003-07-13)

Interface Description Language (IDL) A language designed by Nestor, Lamb and Wulf of {CMU} in 1981 for describing the data structures passed between parts of an application, to provide a language-independent intermediate representation. It forms part of Richard Snodgrass "rts@cs.arizona.edu"'s {Scorpion} environment development system. Not to be confused with any of the other {IDLs}. Mailing list: info-idl@sei.cmu.edu. ["The Interface Description Language: Definition and Use," by Richard Snodgrass, Computer Science Press, 1989, ISBN 0-7167-8198-0]. [SIGPLAN Notices 22(11) (Nov 1987) special issue]. (1994-11-11)

interpretation ::: An assignment of meaning to the symbols of a formal language. Many formal languages used in mathematics, logic, and theoretical computer science are defined in solely syntactic terms, and as such do not have any meaning until they are given some interpretation. The general study of interpretations of formal languages is called formal semantics.
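
The gap between uninterpreted syntax and interpreted meaning can be sketched concretely. This is an illustrative example, not part of the definition above; the tuple encoding and function names are invented here. A formula over proposition symbols is pure syntax until an interpretation assigns each symbol a truth value:

```python
# Interpreting a tiny propositional language. Formulas are nested tuples;
# an interpretation is a mapping from proposition symbols to truth values.

def evaluate(formula, interpretation):
    """Evaluate a formula under an interpretation (symbol -> bool)."""
    if isinstance(formula, str):               # atomic proposition
        return interpretation[formula]
    op, *args = formula
    if op == "not":
        return not evaluate(args[0], interpretation)
    if op == "and":
        return all(evaluate(a, interpretation) for a in args)
    if op == "or":
        return any(evaluate(a, interpretation) for a in args)
    raise ValueError("unknown connective: " + op)

# "p and not q": true or false depending entirely on the interpretation chosen.
f = ("and", "p", ("not", "q"))
```

The same formula f is true under the interpretation {"p": True, "q": False} and false under {"p": True, "q": True}, which is the sense in which the symbols have no meaning of their own.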

Ivan Sutherland Ivan E. Sutherland is widely known for his pioneering contributions. His 1963 MIT PhD thesis, {Sketchpad}, opened the field of computer graphics. His 1966 work, with Sproull, on a head-mounted display anticipated today's {virtual reality} by 25 years. He co-founded {Evans and Sutherland}, which manufactures the most advanced computer image generators now in use. As head of Computer Science Department of {Caltech} he helped make {integrated circuit} design an acceptable field of academic study. Dr. Sutherland is on the boards of several small companies and is a member of the National Academy of Engineering and the National Academy of Sciences, the {ACM} and {IEEE}. He received the {ACM}'s {Turing Award} in 1988. He is now Vice President and Fellow of {Sun Microsystems} Laboratories in Mountain View, CA, USA. (1994-11-16)

John von Neumann "person" /jon von noy'mahn/ Born 1903-12-28, died 1957-02-08. A Hungarian-born mathematician who did pioneering work in quantum physics, game theory, and {computer science}. He contributed to the USA's Manhattan Project that built the first atomic bomb. von Neumann was invited to Princeton University in 1930, and was a mathematics professor at the {Institute for Advanced Studies} from its formation in 1933 until his death. From 1936 to 1938 {Alan Turing} was a visitor at the Institute and completed a Ph.D. dissertation under Alonzo Church's supervision. This visit occurred shortly after Turing's publication of his 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem", which involved the concepts of logical design and the universal machine. von Neumann must have known of Turing's ideas but it is not clear whether he applied them to the design of the IAS Machine ten years later. While serving on the BRL Scientific Advisory Committee, von Neumann joined the developers of {ENIAC} and made some critical contributions. In 1945, while working on the design for the successor machine, {EDVAC}, von Neumann realized that ENIAC's lack of a centralized control unit could be overcome to obtain a rudimentary stored program computer. He also proposed the {fetch-execute cycle}. His ideas led to what is now often called the {von Neumann architecture}. {(http://sis.pitt.edu/~mbsclass/is2000/hall_of_fame/vonneuma.htm)}. {(http://ei.cs.vt.edu/~history/VonNeumann.html)}. {(http://ftp.arl.mil/~mike/comphist/54nord/)}. (2004-01-14)

Knowledge Systems Laboratory (KSL) An {artificial intelligence} research laboratory within the Department of Computer Science at {Stanford University}. Current work focuses on {knowledge representation} for sharable engineering knowledge bases and systems, computational environments for modelling physical devices, architectures for adaptive intelligent systems, and {expert systems} for science and engineering. (1994-12-06)

Lab for Computer Science {MIT}. {(http://lcs.mit.edu/)}.

LaTeX "language, text, tool" (Lamport TeX) Leslie Lamport "lamport@pa.dec.com"'s document preparation system built on top of {TeX}. LaTeX was developed at {SRI International}'s Computer Science Laboratory and was built to resemble {Scribe}. LaTeX adds commands to simplify typesetting and lets the user concentrate on the structure of the text rather than on formatting commands. {BibTeX} is a LaTeX package for bibliographic citations. Lamport's LaTeX book has an exemplary index listing every symbol, concept and example in the book. The index in the, now obsolete, first edition includes (on page 221) the mysterious entry "Gilkerson, Ellen, 221". The second edition (1994) has an entry for "{infinite loop}" instead. ["LaTeX, A Document Preparation System", Leslie Lamport, A-W 1986, ISBN 0-201-15790-X (first edition, now obsolete)]. (1997-11-17)

Machine learning - a subfield of computer science which focuses on the development of algorithms that can learn from and make predictions on data without being explicitly programmed. See /r/machinelearning
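
The "learn from data, predict without being explicitly programmed" idea can be shown with one of the simplest learners, a 1-nearest-neighbour classifier. This is an illustrative sketch, not part of the definition above; the data and function names are invented here:

```python
# 1-nearest-neighbour classification: nothing in the code encodes the
# labelling rule; it is inferred entirely from the training examples.

def nearest_neighbour(train, query):
    """train: list of (point, label) pairs; return the label of the closest point."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    point, label = min(train, key=lambda pl: dist2(pl[0], query))
    return label

# Points above the line y = x are labelled "high", below it "low" -- but the
# classifier is never told that rule, only shown examples of it.
train = [((0, 1), "high"), ((1, 3), "high"), ((2, 0), "low"), ((3, 1), "low")]
```

Querying an unseen point such as (0, 2) yields "high" purely by similarity to the training data, which is the essence shared by far more sophisticated methods.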

machine vision (MV) ::: The technology and methods used to provide imaging-based automatic inspection and analysis for such applications as automatic inspection, process control, and robot guidance, usually in industry. Machine vision is a term encompassing a large number of technologies, software and hardware products, integrated systems, actions, methods and expertise. Machine vision as a systems engineering discipline can be considered distinct from computer vision, a form of computer science. It attempts to integrate existing technologies in new ways and apply them to solve real world problems. The term is the prevalent one for these functions in industrial automation environments but is also used for these functions in other environments such as security and vehicle guidance.

Marc Andreessen "person" The man who founded {Netscape Communications Corporation} in April 1994 with {Dr. James H. Clark}. Andreessen has been a director since September 1994. As an undergraduate at the {University of Illinois} in Champaign, Andreessen created the {Mosaic} {web browser} prototype with a team of students and staff at the university's {National Center for Supercomputing Applications} (NCSA). With a friendly, {point-and-click} method for {navigating} the {Internet} and free distribution to network users, NCSA Mosaic gained an estimated two million users worldwide in just over one year. Andreessen earned his Bachelor of Science degree in Computer Science at the University of Illinois in 1993. {Home (http://netscape.com/columns/techvision/index.html)}. (1999-04-12)

Margaret Hamilton "person" (born 1936-08-17) A {computer scientist}, {systems engineer} and business owner, credited with coining the term {software engineering}. Margaret Hamilton published over 130 papers, proceedings and reports about the 60 projects and six major programs in which she has been involved. In 1965 she became Director of Software Programming at MIT's {Charles Stark Draper Laboratory} and Director of the Software Engineering Division of the {MIT Instrumentation Laboratory}, which developed on-board {flight software} for the Apollo space program. At {NASA}, Hamilton pioneered the Apollo on-board guidance software that navigated to and landed on the Moon and formed the basis for software used in later missions. At the time, programming was a hands-on engineering discipline; computer science and software engineering barely existed. Hamilton produced innovations in {system design} and software development, enterprise and {process modelling}, development paradigms, {formal systems modelling languages}, system-oriented objects for systems modelling and development, {automated life-cycle environments}, {software reliability}, {software reuse}, {domain analysis}, correctness by built-in language properties, open architecture techniques for robust systems, full {life-cycle automation}, {quality assurance}, {seamless integration}, {error detection and recovery}, {man-machine interface} systems, {operating systems}, {end-to-end testing} and {life-cycle management}. She developed concepts of {asynchronous software}, {priority scheduling} and {Human-in-the-loop} decision capability, which became the foundation for modern, ultra-reliable software design. The Apollo 11 moon landing would have aborted when spurious data threatened to overload the computer, but thanks to the innovative asynchronous, priority-based scheduling, the software shed the unnecessary processing and completed the landing successfully.
In 1986, she founded {Hamilton Technologies, Inc.}, developed around the {Universal Systems Language} and her systems and software design {paradigm} of {Development Before the Fact} (DBTF). (2015-03-08)

Massachusetts Institute of Technology (MIT) An independent, coeducational university located in Cambridge, MA, USA. Its best-known computer-related labs are the {Artificial Intelligence Lab}, the {Lab for Computer Science} and the Media Lab. It is also known for its {hacks} or practical jokes, such as {The Great Dome Police Car Hack (http://the-tech.mit.edu/Bulletins/hack.html)}. Resident computer {hackers} include {Richard Stallman}, {Gerald Sussman} and {Tom Knight}. See also {6.001}. {(http://web.mit.edu/)}.

mcvax mcvax.cwi.nl used to be the international {backbone} node of {EUnet}, the European Unix network. It was located in Amsterdam, Netherlands and belonged to "Centrum voor Wiskunde en Informatica" (Centre for Mathematics and Computer Science), which is an institute belonging to a foundation called "Mathematisch Centrum". Since the first mcvax was one of the first {VAXen} in Europe and one of its first {uucp} connections was to a machine called decvax, it was quickly christened mcvax. Some also say this was done to give Jim McKie a nice mail address: mcvax!mckie. But this is certainly not true at all. The function of EUnet international backbone moved to another VAX later but the name moved with it, because in those days of mainly uucp-based mail and before widespread use of {pathalias} it was simply not feasible to rename the machine to "europa" as was suggested at one stage. Mcsun (or relay.eu.net or net.eu.relay in some parts of Europe) replaced the international backbone host of EUnet around 1990. This machine was donated by {Sun Microsystems} and owned by the {European Unix Systems User Group} (EUUG). It was located about 5m from where mcvax used to be and operated by the same people. Mcvax has finally ceased to exist in the {domain} and {uucp} {namespaces}. It still exists in the {EARN}/{BITNET} namespace. [Posting by Daniel Karrenberg "dfk@eu.net" to eunet.general]. (1990-03-02)

metaheuristic ::: In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity.[221][222] Metaheuristics sample a set of solutions which is too large to be completely sampled.
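
One classic metaheuristic is random-restart hill climbing: greedy local search escapes its local optima by being restarted from random points, sampling the solution space rather than enumerating it. This is an illustrative sketch (the entry above does not prescribe any particular method); the objective function and parameters are invented here:

```python
# Random-restart hill climbing over the integers 0..100. The objective has
# several local maxima (near x = 8, 39 and 71), so a single greedy climb can
# get stuck; restarts make finding the global maximum near x = 71 likely.
import math
import random

def objective(x):
    return math.sin(x / 5) * x

def hill_climb(start, lo=0, hi=100):
    """Greedy local search: move to the better neighbour until none improves."""
    x = start
    while True:
        neighbours = [n for n in (x - 1, x + 1) if lo <= n <= hi]
        best = max(neighbours, key=objective)
        if objective(best) <= objective(x):
            return x
        x = best

def random_restart(restarts=40, seed=0):
    rng = random.Random(seed)
    candidates = [hill_climb(rng.randint(0, 100)) for _ in range(restarts)]
    return max(candidates, key=objective)
```

A single hill_climb started below x = 24 returns the poor local maximum near 8; the restarts trade a guarantee of optimality for a good solution at modest cost, which is the metaheuristic bargain described above.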

model checking ::: In computer science, model checking or property checking is, for a given model of a system, exhaustively and automatically checking whether this model meets a given specification. Typically, one has hardware or software systems in mind, whereas the specification contains safety requirements such as the absence of deadlocks and similar critical states that can cause the system to crash. Model checking is a technique for automatically verifying correctness properties of finite-state systems.
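
The core idea, exhaustive enumeration of reachable states against a safety property, fits in a few lines of explicit-state search. This is an illustrative sketch (real model checkers are vastly more sophisticated); the toy two-process model and all names are invented here:

```python
# Explicit-state model checking by breadth-first search: enumerate every
# reachable state and test the safety property in each one, returning a
# shortest counterexample trace if the property can be violated.
from collections import deque

def check(initial, successors, safe):
    """Return a trace to an unsafe state, or None if the model is safe."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not safe(state):
            return trace                       # counterexample found
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

def successors(state):
    # Two processes that may each independently enter or leave a critical
    # section -- deliberately unsynchronised, so mutual exclusion fails.
    a, b = state
    return {("critical" if a == "idle" else "idle", b),
            (a, "critical" if b == "idle" else "idle")}

def mutual_exclusion(state):
    return state != ("critical", "critical")
</```

Running check(("idle", "idle"), successors, mutual_exclusion) produces a concrete trace ending in the state where both processes are critical, illustrating how a model checker reports not just failure but a path that causes it.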

Monte Carlo tree search ::: In computer science, Monte Carlo tree search (MCTS) is a heuristic search algorithm for some kinds of decision processes.
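
The Monte Carlo half of MCTS can be sketched in its simplest "flat" form: evaluate each legal move by playing many random games to completion and pick the move with the best average outcome. Full MCTS additionally grows a search tree and balances exploration against exploitation (e.g. with UCT); the toy subtraction game below and all names are invented for illustration:

```python
# Flat Monte Carlo move selection for a subtraction game: players alternately
# take 1-3 counters and whoever takes the last counter wins. A position that
# is a multiple of 4 is lost for the player to move under perfect play.
import random

def rollout(n, rng):
    """Play uniformly random moves from n counters; True iff the player to move wins."""
    player_to_move_wins = True
    while n > 0:
        n -= rng.randint(1, min(3, n))
        if n == 0:
            return player_to_move_wins     # this player took the last counter
        player_to_move_wins = not player_to_move_wins
    return False                           # no counters left: player to move has lost

def best_move(n, rollouts=2000, seed=0):
    rng = random.Random(seed)
    def value(move):
        # Estimated probability that the opponent loses after this move.
        return sum(not rollout(n - move, rng) for _ in range(rollouts)) / rollouts
    return max(range(1, min(3, n) + 1), key=value)
```

From 5 counters the sampled values favour taking 1 (leaving the opponent the losing position 4), matching perfect play without any game-specific evaluation function, which is exactly what makes Monte Carlo methods attractive for games like Go.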

naive semantics ::: An approach used in computer science for representing basic knowledge about a specific domain, which has been used in applications such as the representation of the meaning of natural language sentences in artificial intelligence. In a general setting the term has been used to refer to the use of a limited store of generally understood knowledge about a specific domain in the world, and has been applied to fields such as the knowledge-based design of data schemas.[224]

natural language processing (NLP) ::: A subfield of computer science, information engineering, and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data.

Ninety-Ninety Rule "humour" "The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time". An aphorism attributed to Tom Cargill of Bell Labs, and popularised by Jon Bentley's September 1985 "Bumper-Sticker Computer Science" column in "Communications of the ACM". It was there called the "Rule of Credibility", a name which seems not to have stuck. [{Jargon File}] (1995-07-14)

occurs check "programming" A feature of some implementations of {unification} which causes unification of a {logic variable} V and a structure S to fail if S contains V. Binding a variable to a structure containing that variable results in a cyclic structure which may subsequently cause unification to loop forever. Some implementations use extra pointer comparisons to avoid this. Most implementations of {Prolog} do not perform the occurs check for reasons of efficiency: without the occurs check the {complexity} of {unification} is O(min(size(term1), size(term2))); with the occurs check it is O(max(size(term1), size(term2))). In {theorem proving}, unification without the occurs check can lead to unsound inference. For example, in {Prolog} it is quite valid to write X = f(X). which will succeed, binding X to a cyclic structure. Clearly however, if f is taken to stand for a function rather than a {constructor}, then the above equality is only valid if f is the {identity function}. Weijland calls unification without the occurs check "complete unification". The reference below describes a complete unification algorithm in terms of Colmerauer's consistency algorithm. ["Semantics for Logic Programs without Occur Check", W.P. Weijland, Theoretical Computer Science 71 (1990) pp 155-174]. (1996-01-11)
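
The behaviour described above can be sketched with a small first-order unifier. This is an illustrative sketch, not Prolog itself; the term encoding (variables are capitalised strings, compound terms are tuples) and all names are invented here:

```python
# Unification with an optional occurs check. With occurs_check=True, unifying
# X with f(X) fails; with it disabled (as in most Prologs) X is simply bound
# to f(X), a binding that would be cyclic if ever fully dereferenced.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings until a non-variable or unbound variable."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Does variable v occur anywhere inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, subst) for arg in t[1:])
    return False

def unify(a, b, subst=None, occurs_check=True):
    """Return an extended substitution unifying a and b, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        if occurs_check and occurs(a, b, subst):
            return None                        # reject cyclic binding
        subst[a] = b
        return subst
    if is_var(b):
        return unify(b, a, subst, occurs_check)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst, occurs_check)
            if subst is None:
                return None
        return subst
    return None
```

Unifying ("f", "X", "b") with ("f", "a", "Y") succeeds with {X: a, Y: b} either way, but unifying "X" with ("f", "X") fails only when the occurs check is on, mirroring the Prolog X = f(X) example in the entry.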

PROgrammed Graph REwriting Systems "language" (PROGRES) A very high level language based on {graph grammars}, developed by Andy Scheurr "andy@i3.informatik.rwth-aachen.de" and Albert Zuendorf "albert@i3.informatik.rwth-aachen.de" of {RWTH}, Aachen in 1991. PROGRES supports structurally {object-oriented specification} of {attributed graph} structures with {multiple inheritance} hierarchies and types of types (for {parametric polymorphism}). It also supports declarative/relational specification of derived attributes, node sets, binary relationships (directed edges) and {Boolean} {constraints}, rule-oriented/visual specification of parameterised graph rewrite rules with complex application conditions, {nondeterministic} and {imperative programming} of composite graph transformations (with built-in {backtracking} and cancelling arbitrary sequences of failing graph modifications). It is used for implementing {abstract data types} with graph-like internal structure, as a visual language for the {graph-oriented database} {GRAS}, and as a rule-oriented language for prototyping {nondeterministic}ally specified data/rule base transformations. PROGRES has a formally defined {semantics} based on "PROgrammed Graph Rewriting Systems". It is an almost {statically typed} language which additionally offers "down casting" operators for run time checked type casting/conversion (in order to avoid severe restrictions concerning the language's expressiveness). Version RWTH 5.10 includes an integrated environment. [A. Scheurr, "Introduction to PROGRES, an Attribute Graph Grammar Based Specification Language", in Proc WG89 Workshop on Graphtheoretic Concepts in Computer Science", LNCS 411, Springer 1991]. {(ftp://ftp.informatik.rwth-aachen.de/pub/Unix/PROGRES/)} for {Sun-4}. (1993-11-02)

Quintus Prolog "language, product" A version of {Prolog} developed by {Quintus}. Development of Quintus Prolog had transferred to the {Swedish Institute of Computer Science} by December 1998. {(ftp://ftp.quintus.com/)}. Telephone: +1 (800) 542 1283. [More details? Features?] (1998-12-12)

RC4 "cryptography" A {cipher} designed by {RSA Data Security, Inc.} which can accept {keys} of arbitrary length, and is essentially a {pseudo random number generator} with the output of the generator being {XOR}ed with the data stream to produce the encrypted data. For this reason, it is very important that the same RC4 key never be used to encrypt two different data streams. The encryption mechanism used to be a trade secret, until someone posted source code for an {algorithm} onto {Usenet News}, claiming it to be equivalent to RC4. The algorithm is very fast, its security is unknown, but breaking it does not seem trivial either. There is very strong evidence that the posted algorithm is indeed equivalent to RC4. The United States government routinely approves RC4 with 40-bit keys for export. Keys this small can be easily broken by governments, criminals, and amateurs. The exportable version of {Netscape}'s {Secure Socket Layer}, which uses RC4-40, was broken by at least two independent groups. Breaking it took about eight days; in many universities or companies the same computing power is available to any computer science student. See also {Damien Doligez's SSL cracking page (http://pauillac.inria.fr/~doligez/ssl/)}, {RC4 Source and Information (http://cs.hut.fi/crypto/rc4)}, {SSLeay (http://cs.hut.fi/crypto/software.html)}. (1996-10-28)
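
The algorithm that was posted (and shown equivalent to RC4) is short enough to sketch in full: a key-scheduling pass permutes a 256-byte state, then a generator pass emits a keystream that is XORed with the data. Because encryption is a pure XOR with the keystream, applying the same call again decrypts, which is also why a key must never be reused across two streams:

```python
# RC4 as published: key-scheduling algorithm (KSA) followed by the
# pseudo-random generation algorithm (PRGA), XORed with the data.

def rc4(key: bytes, data: bytes) -> bytes:
    # KSA: initialise and key-dependently permute the state array S
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # PRGA: generate one keystream byte per data byte and XOR it in
    i = j = 0
    out = bytearray()
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

For example, rc4(b"Key", b"Plaintext") gives the well-known test ciphertext bbf316e8d940af0ad3, and running rc4 with the same key over that ciphertext returns the plaintext.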

Real Programmers Don't Use Pascal ::: (humour) Back in the good old days - the Golden Era of computers, it was easy to separate the men from the boys (sometimes called Real Men and out that Real Men don't relate to anything, and aren't afraid of being impersonal.) But, as usual, times change. We are faced today with a world in which little old ladies can get computers in their microwave ovens, 12-year-old kids can blow danger of becoming extinct, of being replaced by high-school students with TRASH-80s. There is a clear need to point out the differences between the typical high-school junior Pac-Man player and a Real Programmer. If this difference is why it would be a mistake to replace the Real Programmers on their staff with 12-year-old Pac-Man players (at a considerable salary savings).

LANGUAGES
The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use Fortran. Quiche Eaters use need all these abstract concepts to get their jobs done - they are perfectly happy with a keypunch, a Fortran IV compiler, and a beer.
Real Programmers do List Processing in Fortran.
Real Programmers do String Manipulation in Fortran.
Real Programmers do Accounting (if they do it at all) in Fortran.
Real Programmers do Artificial Intelligence programs in Fortran.
If you can't do it in Fortran, do it in assembly language. If you can't do it in assembly language, it isn't worth doing.

STRUCTURED PROGRAMMING
The academics in computer science have gotten into the structured programming rut over the past several years. They claim that programs are more easily in the world won't help you solve a problem like that - it takes actual talent.
Some quick observations on Real Programmers and Structured Programming:
Real Programmers aren't afraid to use GOTOs.
Real Programmers can write five-page-long DO loops without getting confused.
Real Programmers like Arithmetic IF statements - they make the code more interesting.
Real Programmers write self-modifying code, especially if they can save 20 nanoseconds in the middle of a tight loop.
Real Programmers don't need comments - the code is obvious.
Since Fortran doesn't have a structured IF, REPEAT ... UNTIL, or CASE statement, Real Programmers don't have to worry about not using them. Besides, they can be simulated when necessary using assigned GOTOs.
Data Structures have also gotten a lot of press lately. Abstract Data Types, Structures, Pointers, Lists, and Strings have become popular in certain circles. Languages, as we all know, have implicit typing based on the first letter of the (six character) variable name.

OPERATING SYSTEMS
What kind of operating system is used by a Real Programmer? CP/M? God forbid - CP/M, after all, is basically a toy operating system. Even little old ladies and grade school students can understand and use CP/M.
Unix is a lot more complicated of course - the typical Unix hacker never can remember what the PRINT command is called this week - but when it gets right systems: they send jokes around the world on UUCP-net and write adventure games and research papers.
No, your Real Programmer uses OS 370. A good programmer can find and understand the description of the IJK305I error he just got in his JCL manual. A great outstanding programmer can find bugs buried in a 6 megabyte core dump without using a hex calculator. (I have actually seen this done.)
OS is a truly remarkable operating system. It's possible to destroy days of work with a single misplaced space, so alertness in the programming staff is people claim there is a Time Sharing system that runs on OS 370, but after careful study I have come to the conclusion that they were mistaken.

PROGRAMMING TOOLS
What kind of tools does a Real Programmer use? In theory, a Real Programmer could run his programs by keying them into the front panel of the computer. Back the first operating system for the CDC7600 in on the front panel from memory when it was first powered on. Seymore, needless to say, is a Real Programmer.
One of my favorite Real Programmers was a systems programmer for Texas Instruments. One day he got a long distance call from a user whose system had includes a keypunch and lineprinter in his toolkit, he can get along with just a front panel and a telephone in emergencies.
In some companies, text editing no longer consists of ten engineers standing in line to use an 029 keypunch. In fact, the building I work in doesn't contain a system is called SmallTalk, and would certainly not talk to the computer with a mouse.
Some of the concepts in these Xerox editors have been incorporated into editors running on more reasonably named operating systems - Emacs and VI being two. The Real Programmer wants a you asked for it, you got it text editor - complicated, cryptic, powerful, unforgiving, dangerous. TECO, to be precise.
It has been observed that a TECO command sequence more closely resembles transmission line noise than readable text [4]. One of the more entertaining will probably destroy your program, or even worse - introduce subtle and mysterious bugs in a once working subroutine.
For this reason, Real Programmers are reluctant to actually edit a program that is close to working. They find it much easier to just patch the binary object Programmer to do the job - no Quiche Eating structured programmer would even know where to start. This is called job security.
Some programming tools NOT used by Real Programmers:
Fortran preprocessors like MORTRAN and RATFOR. The Cuisinarts of programming - great for making Quiche. See comments above on structured programming.
Source language debuggers. Real Programmers can read core dumps.
Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient.
Source code maintenance systems. A Real Programmer keeps his code locked up in a card file, because it implies that its owner cannot leave his important programs unguarded [5].

THE REAL PROGRAMMER AT WORK
Where does the typical Real Programmer work? What kind of programs are worthy of the efforts of so talented an individual? You can be sure that no Real or sorting mailing lists for People magazine. A Real Programmer wants tasks of earth-shaking importance (literally!).
Real Programmers work for Los Alamos National Laboratory, writing atomic bomb simulations to run on Cray I supercomputers.
Real Programmers work for the National Security Agency, decoding Russian transmissions.
It was largely due to the efforts of thousands of Real Programmers working for NASA that our boys got to the moon and back before the Russkies.
Real Programmers are at work for Boeing designing the operating systems for cruise missiles.
Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter.
The current plan for the Galileo spacecraft is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/-3 kilometers of the surface of Mars.
Nobody is going to trust a Pascal program (or a Pascal programmer) for navigation to these tolerances.As you can tell, many of the world's Real Programmers work for the U.S. Government - mainly the Defense Department. This is as it should be. Recently, programmers and Quiche Eaters alike.) Besides, the determined Real Programmer can write Fortran programs in any language.The Real Programmer might compromise his principles and work on something slightly more trivial than the destruction of life as we know it, providing Fortran, so there are a fair number of people doing graphics in order to avoid having to write COBOL programs.THE REAL PROGRAMMER AT PLAYGenerally, the Real Programmer plays the same way he works - with computers. He is constantly amazed that his employer actually pays him to do what he would be breath of fresh air and a beer or two. Some tips on recognizing Real Programmers away from the computer room:At a party, the Real Programmers are the ones in the corner talking about operating system security and how to get around it.At a football game, the Real Programmer is the one comparing the plays against his simulations printed on 11 by 14 fanfold paper.At the beach, the Real Programmer is the one drawing flowcharts in the sand.At a funeral, the Real Programmer is the one saying Poor George, he almost had the sort routine working before the coronary.In a grocery store, the Real Programmer is the one who insists on running the cans past the laser checkout scanner himself, because he never could trust keypunch operators to get it right the first time.THE REAL PROGRAMMER'S NATURAL HABITATWhat sort of environment does the Real Programmer function best in? This is an important question for the managers of Real Programmers. Considering the amount of money it costs to keep one on the staff, it's best to put him (or her) in an environment where he can get his work done.The typical Real Programmer lives in front of a computer terminal. 
Surrounding this terminal are:Listings of all programs the Real Programmer has ever worked on, piled in roughly chronological order on every flat surface in the office.Some half-dozen or so partly filled cups of cold coffee. Occasionally, there will be cigarette butts floating in the coffee. In some cases, the cups will contain Orange Crush.Unless he is very good, there will be copies of the OS JCL manual and the Principles of Operation open to some particularly interesting pages.Taped to the wall is a line-printer Snoopy calendar for the year 1969.Strewn about the floor are several wrappers for peanut butter filled cheese bars - the type that are made pre-stale at the bakery so they can't get any worse while waiting in the vending machine.Hiding in the top left-hand drawer of the desk is a stash of double-stuff Oreos for special occasions.Underneath the Oreos is a flowcharting template, left there by the previous occupant of the office. (Real Programmers write programs, not documentation. Leave that to the maintenance people.)The Real Programmer is capable of working 30, 40, even 50 hours at a stretch, under intense pressure. In fact, he prefers it that way. Bad response time project done on time, but creates a convenient excuse for not doing the documentation. In general:No Real Programmer works 9 to 5 (unless it's the ones at night).Real Programmers don't wear neckties.Real Programmers don't wear high-heeled shoes.Real Programmers arrive at work in time for lunch [9].A Real Programmer might or might not know his wife's name. He does, however, know the entire ASCII (or EBCDIC) code table.Real Programmers don't know how to cook. Grocery stores aren't open at three in the morning. Real Programmers survive on Twinkies and coffee.THE FUTUREWhat of the future? It is a matter of some concern to Real Programmers that the latest generation of computer programmers are not being brought up with the same ever learning Fortran! 
Are we destined to become an industry of Unix hackers and Pascal programmers?From my experience, I can only report that the future is bright for Real Programmers everywhere. Neither OS 370 nor Fortran show any signs of dying out, one of them has a way of converting itself back into a Fortran 66 compiler at the drop of an option card - to compile DO loops like God meant them to be.Even Unix might not be as bad on Real Programmers as it once was. The latest release of Unix has the potential of an operating system worthy of any Real in - like having the best parts of Fortran and assembly language in one place. (Not to mention some of the more creative uses for

Real Programmers Don't Use Pascal "humour" Back in the good old days - the "Golden Era" of computers, it was easy to separate the men from the boys (sometimes called "Real Men" and "Quiche Eaters" in the literature). During this period, the Real Men were the ones that understood computer programming, and the Quiche Eaters were the ones that didn't. A real computer programmer said things like "DO 10 I=1,10" and "ABEND" (they actually talked in capital letters, you understand), and the rest of the world said things like "computers are too complicated for me" and "I can't relate to computers - they're so impersonal". (A previous work [1] points out that Real Men don't "relate" to anything, and aren't afraid of being impersonal.) But, as usual, times change. We are faced today with a world in which little old ladies can get computers in their microwave ovens, 12-year-old kids can blow Real Men out of the water playing Asteroids and Pac-Man, and anyone can buy and even understand their very own Personal Computer. The Real Programmer is in danger of becoming extinct, of being replaced by high-school students with {TRASH-80s}. There is a clear need to point out the differences between the typical high-school junior Pac-Man player and a Real Programmer. If this difference is made clear, it will give these kids something to aspire to -- a role model, a Father Figure. It will also help explain to the employers of Real Programmers why it would be a mistake to replace the Real Programmers on their staff with 12-year-old Pac-Man players (at a considerable salary savings). LANGUAGES The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use {Fortran}. Quiche Eaters use {Pascal}. Nicklaus Wirth, the designer of Pascal, gave a talk once at which he was asked how to pronounce his name. He replied, "You can either call me by name, pronouncing it 'Veert', or call me by value, 'Worth'." 
One can tell immediately from this comment that Nicklaus Wirth is a Quiche Eater. The only parameter passing mechanism endorsed by Real Programmers is call-by-value-return, as implemented in the {IBM 370} {Fortran-G} and H compilers. Real programmers don't need all these abstract concepts to get their jobs done - they are perfectly happy with a {keypunch}, a {Fortran IV} {compiler}, and a beer. Real Programmers do List Processing in Fortran. Real Programmers do String Manipulation in Fortran. Real Programmers do Accounting (if they do it at all) in Fortran. Real Programmers do {Artificial Intelligence} programs in Fortran. If you can't do it in Fortran, do it in {assembly language}. If you can't do it in assembly language, it isn't worth doing. STRUCTURED PROGRAMMING The academics in computer science have gotten into the "structured programming" rut over the past several years. They claim that programs are more easily understood if the programmer uses some special language constructs and techniques. They don't all agree on exactly which constructs, of course, and the examples they use to show their particular point of view invariably fit on a single page of some obscure journal or another - clearly not enough of an example to convince anyone. When I got out of school, I thought I was the best programmer in the world. I could write an unbeatable tic-tac-toe program, use five different computer languages, and create 1000-line programs that WORKED. (Really!) Then I got out into the Real World. My first task in the Real World was to read and understand a 200,000-line Fortran program, then speed it up by a factor of two. Any Real Programmer will tell you that all the Structured Coding in the world won't help you solve a problem like that - it takes actual talent. Some quick observations on Real Programmers and Structured Programming: Real Programmers aren't afraid to use {GOTOs}. Real Programmers can write five-page-long DO loops without getting confused. 
Real Programmers like Arithmetic IF statements - they make the code more interesting. Real Programmers write self-modifying code, especially if they can save 20 {nanoseconds} in the middle of a tight loop. Real Programmers don't need comments - the code is obvious. Since Fortran doesn't have a structured IF, REPEAT ... UNTIL, or CASE statement, Real Programmers don't have to worry about not using them. Besides, they can be simulated when necessary using {assigned GOTOs}. Data Structures have also gotten a lot of press lately. Abstract Data Types, Structures, Pointers, Lists, and Strings have become popular in certain circles. Wirth (the above-mentioned Quiche Eater) actually wrote an entire book [2] contending that you could write a program based on data structures, instead of the other way around. As all Real Programmers know, the only useful data structure is the Array. Strings, lists, structures, sets - these are all special cases of arrays and can be treated that way just as easily without messing up your programming language with all sorts of complications. The worst thing about fancy data types is that you have to declare them, and Real Programming Languages, as we all know, have implicit typing based on the first letter of the (six character) variable name. OPERATING SYSTEMS What kind of operating system is used by a Real Programmer? CP/M? God forbid - CP/M, after all, is basically a toy operating system. Even little old ladies and grade school students can understand and use CP/M. Unix is a lot more complicated of course - the typical Unix hacker never can remember what the PRINT command is called this week - but when it gets right down to it, Unix is a glorified video game. People don't do Serious Work on Unix systems: they send jokes around the world on {UUCP}-net and write adventure games and research papers. No, your Real Programmer uses OS 370. A good programmer can find and understand the description of the IJK305I error he just got in his JCL manual.
A great programmer can write JCL without referring to the manual at all. A truly outstanding programmer can find bugs buried in a 6 megabyte {core dump} without using a hex calculator. (I have actually seen this done.) OS is a truly remarkable operating system. It's possible to destroy days of work with a single misplaced space, so alertness in the programming staff is encouraged. The best way to approach the system is through a keypunch. Some people claim there is a Time Sharing system that runs on OS 370, but after careful study I have come to the conclusion that they were mistaken. PROGRAMMING TOOLS What kind of tools does a Real Programmer use? In theory, a Real Programmer could run his programs by keying them into the front panel of the computer. Back in the days when computers had front panels, this was actually done occasionally. Your typical Real Programmer knew the entire bootstrap loader by memory in hex, and toggled it in whenever it got destroyed by his program. (Back then, memory was memory - it didn't go away when the power went off. Today, memory either forgets things when you don't want it to, or remembers things long after they're better forgotten.) Legend has it that {Seymore Cray}, inventor of the Cray I supercomputer and most of Control Data's computers, actually toggled the first operating system for the CDC7600 in on the front panel from memory when it was first powered on. Seymore, needless to say, is a Real Programmer. One of my favorite Real Programmers was a systems programmer for Texas Instruments. One day he got a long distance call from a user whose system had crashed in the middle of saving some important work. Jim was able to repair the damage over the phone, getting the user to toggle in disk I/O instructions at the front panel, repairing system tables in hex, reading register contents back over the phone. 
The moral of this story: while a Real Programmer usually includes a keypunch and lineprinter in his toolkit, he can get along with just a front panel and a telephone in emergencies. In some companies, text editing no longer consists of ten engineers standing in line to use an 029 keypunch. In fact, the building I work in doesn't contain a single keypunch. The Real Programmer in this situation has to do his work with a "text editor" program. Most systems supply several text editors to select from, and the Real Programmer must be careful to pick one that reflects his personal style. Many people believe that the best text editors in the world were written at Xerox Palo Alto Research Center for use on their Alto and Dorado computers [3]. Unfortunately, no Real Programmer would ever use a computer whose operating system is called SmallTalk, and would certainly not talk to the computer with a mouse. Some of the concepts in these Xerox editors have been incorporated into editors running on more reasonably named operating systems - {Emacs} and {VI} being two. The problem with these editors is that Real Programmers consider "what you see is what you get" to be just as bad a concept in Text Editors as it is in women. No, the Real Programmer wants a "you asked for it, you got it" text editor - complicated, cryptic, powerful, unforgiving, dangerous. TECO, to be precise. It has been observed that a TECO command sequence more closely resembles transmission line noise than readable text [4]. One of the more entertaining games to play with TECO is to type your name in as a command line and try to guess what it does. Just about any possible typing error while talking with TECO will probably destroy your program, or even worse - introduce subtle and mysterious bugs in a once working subroutine. For this reason, Real Programmers are reluctant to actually edit a program that is close to working.
They find it much easier to just patch the binary {object code} directly, using a wonderful program called SUPERZAP (or its equivalent on non-IBM machines). This works so well that many working programs on IBM systems bear no relation to the original Fortran code. In many cases, the original source code is no longer available. When it comes time to fix a program like this, no manager would even think of sending anything less than a Real Programmer to do the job - no Quiche Eating structured programmer would even know where to start. This is called "job security". Some programming tools NOT used by Real Programmers: Fortran preprocessors like {MORTRAN} and {RATFOR}. The Cuisinarts of programming - great for making Quiche. See comments above on structured programming. Source language debuggers. Real Programmers can read core dumps. Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient. Source code maintenance systems. A Real Programmer keeps his code locked up in a card file, because it implies that its owner cannot leave his important programs unguarded [5]. THE REAL PROGRAMMER AT WORK Where does the typical Real Programmer work? What kind of programs are worthy of the efforts of so talented an individual? You can be sure that no Real Programmer would be caught dead writing accounts-receivable programs in {COBOL}, or sorting {mailing lists} for People magazine. A Real Programmer wants tasks of earth-shaking importance (literally!). Real Programmers work for Los Alamos National Laboratory, writing atomic bomb simulations to run on Cray I supercomputers. Real Programmers work for the National Security Agency, decoding Russian transmissions. 
It was largely due to the efforts of thousands of Real Programmers working for NASA that our boys got to the moon and back before the Russkies. Real Programmers are at work for Boeing designing the operating systems for cruise missiles. Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the Pioneer and Voyager spacecraft by heart. With a combination of large ground-based Fortran programs and small spacecraft-based assembly language programs, they are able to do incredible feats of navigation and improvisation - hitting ten-kilometer wide windows at Saturn after six years in space, repairing or bypassing damaged sensor platforms, radios, and batteries. Allegedly, one Real Programmer managed to tuck a pattern-matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter. The current plan for the Galileo spacecraft is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/-3 kilometers of the surface of Mars. Nobody is going to trust a Pascal program (or a Pascal programmer) for navigation to these tolerances. As you can tell, many of the world's Real Programmers work for the U.S. Government - mainly the Defense Department. This is as it should be. Recently, however, a black cloud has formed on the Real Programmer horizon. It seems that some highly placed Quiche Eaters at the Defense Department decided that all Defense programs should be written in some grand unified language called "ADA" ((C), DoD). For a while, it seemed that ADA was destined to become a language that went against all the precepts of Real Programming - a language with structure, a language with data types, {strong typing}, and semicolons. In short, a language designed to cripple the creativity of the typical Real Programmer. 
Fortunately, the language adopted by DoD has enough interesting features to make it approachable -- it's incredibly complex, includes methods for messing with the operating system and rearranging memory, and Edsger Dijkstra doesn't like it [6]. (Dijkstra, as I'm sure you know, was the author of "GoTos Considered Harmful" - a landmark work in programming methodology, applauded by Pascal programmers and Quiche Eaters alike.) Besides, the determined Real Programmer can write Fortran programs in any language. The Real Programmer might compromise his principles and work on something slightly more trivial than the destruction of life as we know it, providing there's enough money in it. There are several Real Programmers building video games at Atari, for example. (But not playing them - a Real Programmer knows how to beat the machine every time: no challenge in that.) Everyone working at LucasFilm is a Real Programmer. (It would be crazy to turn down the money of fifty million Star Trek fans.) The proportion of Real Programmers in Computer Graphics is somewhat lower than the norm, mostly because nobody has found a use for computer graphics yet. On the other hand, all computer graphics is done in Fortran, so there are a fair number of people doing graphics in order to avoid having to write COBOL programs. THE REAL PROGRAMMER AT PLAY Generally, the Real Programmer plays the same way he works - with computers. He is constantly amazed that his employer actually pays him to do what he would be doing for fun anyway (although he is careful not to express this opinion out loud). Occasionally, the Real Programmer does step out of the office for a breath of fresh air and a beer or two. Some tips on recognizing Real Programmers away from the computer room: At a party, the Real Programmers are the ones in the corner talking about operating system security and how to get around it.
At a football game, the Real Programmer is the one comparing the plays against his simulations printed on 11 by 14 fanfold paper. At the beach, the Real Programmer is the one drawing flowcharts in the sand. At a funeral, the Real Programmer is the one saying "Poor George, he almost had the sort routine working before the coronary." In a grocery store, the Real Programmer is the one who insists on running the cans past the laser checkout scanner himself, because he never could trust keypunch operators to get it right the first time. THE REAL PROGRAMMER'S NATURAL HABITAT What sort of environment does the Real Programmer function best in? This is an important question for the managers of Real Programmers. Considering the amount of money it costs to keep one on the staff, it's best to put him (or her) in an environment where he can get his work done. The typical Real Programmer lives in front of a computer terminal. Surrounding this terminal are: Listings of all programs the Real Programmer has ever worked on, piled in roughly chronological order on every flat surface in the office. Some half-dozen or so partly filled cups of cold coffee. Occasionally, there will be cigarette butts floating in the coffee. In some cases, the cups will contain Orange Crush. Unless he is very good, there will be copies of the OS JCL manual and the Principles of Operation open to some particularly interesting pages. Taped to the wall is a line-printer Snoopy calendar for the year 1969. Strewn about the floor are several wrappers for peanut butter filled cheese bars - the type that are made pre-stale at the bakery so they can't get any worse while waiting in the vending machine. Hiding in the top left-hand drawer of the desk is a stash of double-stuff Oreos for special occasions. Underneath the Oreos is a flowcharting template, left there by the previous occupant of the office. (Real Programmers write programs, not documentation. Leave that to the maintenance people.) 
The Real Programmer is capable of working 30, 40, even 50 hours at a stretch, under intense pressure. In fact, he prefers it that way. Bad response time doesn't bother the Real Programmer - it gives him a chance to catch a little sleep between compiles. If there is not enough schedule pressure on the Real Programmer, he tends to make things more challenging by working on some small but interesting part of the problem for the first nine weeks, then finishing the rest in the last week, in two or three 50-hour marathons. This not only impresses the hell out of his manager, who was despairing of ever getting the project done on time, but creates a convenient excuse for not doing the documentation. In general: No Real Programmer works 9 to 5 (unless it's the ones at night). Real Programmers don't wear neckties. Real Programmers don't wear high-heeled shoes. Real Programmers arrive at work in time for lunch [9]. A Real Programmer might or might not know his wife's name. He does, however, know the entire {ASCII} (or EBCDIC) code table. Real Programmers don't know how to cook. Grocery stores aren't open at three in the morning. Real Programmers survive on Twinkies and coffee. THE FUTURE What of the future? It is a matter of some concern to Real Programmers that the latest generation of computer programmers are not being brought up with the same outlook on life as their elders. Many of them have never seen a computer with a front panel. Hardly anyone graduating from school these days can do hex arithmetic without a calculator. College graduates these days are soft - protected from the realities of programming by source level debuggers, text editors that count parentheses, and "user friendly" operating systems. Worst of all, some of these alleged "computer scientists" manage to get degrees without ever learning Fortran! Are we destined to become an industry of Unix hackers and Pascal programmers? 
From my experience, I can only report that the future is bright for Real Programmers everywhere. Neither OS 370 nor Fortran show any signs of dying out, despite all the efforts of Pascal programmers the world over. Even more subtle tricks, like adding structured coding constructs to Fortran have failed. Oh sure, some computer vendors have come out with Fortran 77 compilers, but every one of them has a way of converting itself back into a Fortran 66 compiler at the drop of an option card - to compile DO loops like God meant them to be. Even Unix might not be as bad on Real Programmers as it once was. The latest release of Unix has the potential of an operating system worthy of any Real Programmer - two different and subtly incompatible user interfaces, an arcane and complicated teletype driver, virtual memory. If you ignore the fact that it's "structured", even 'C' programming can be appreciated by the Real Programmer: after all, there's no type checking, variable names are seven (ten? eight?) characters long, and the added bonus of the Pointer data type is thrown in - like having the best parts of Fortran and assembly language in one place. (Not to mention some of the more creative uses for

Redundant Array of Independent Disks "storage, architecture" (RAID) A standard naming convention for various ways of using multiple disk drives to provide redundancy and distributed I/O. The original ("..Inexpensive..") term referred to the 3.5 and 5.25 inch disks used for the first RAID system but no longer applies. As {solid state drives} are becoming a practical replacement for magnetic disks, "RAID" is sometimes expanded as "Redundant Array of Independent Drives". The following standard RAID specifications exist:

RAID 0  Non-redundant striped array
RAID 1  Mirrored arrays
RAID 2  Parallel array with ECC
RAID 3  Parallel array with parity
RAID 4  Striped array with parity
RAID 5  Striped array with rotating parity

RAID originated in a project at the computer science department of the {University of California at Berkeley}, under the direction of Professor Katz, in conjunction with Professor {John Ousterhout} and Professor {David Patterson}. A prototype disk array file server with a capacity of 40 GBytes and a sustained bandwidth of 80 MBytes/second was interfaced to a 1 Gb/s {local area network}. It was planned to extend the storage array to include automated {optical disks} and {magnetic tapes}. {(ftp://wuarchive.wustl.edu/doc/techreports/berkeley.edu/raid/raidPapers)}. {(http://HTTP.CS.Berkeley.EDU/projects/parallel/research_summaries/14-Computer-Architecture/)}. ["A Case for Redundant Arrays of Inexpensive Disks (RAID)", "D. A. Patterson and G. Gibson and R. H. Katz", Proc ACM SIGMOD Conf, Chicago, IL, Jun 1988]. ["Introduction to Redundant Arrays of Inexpensive Disks (RAID)", "D. A. Patterson and P. Chen and G. Gibson and R. H. Katz", IEEE COMPCON 89, San Francisco, Feb-Mar 1989]. (2012-08-26)
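The parity-based levels above (RAID 3, 4, and 5) all rest on the same idea, which a short sketch can make concrete: the parity block is the XOR of the data blocks, so any single lost block can be rebuilt by XORing the surviving blocks with the parity. This example is an illustration only, not part of the original entry; the function name and sample data are invented for the demo.

```python
# Minimal sketch of XOR parity as used by RAID 3/4/5 (illustrative only).

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

data = [b"disk", b"fail", b"safe"]   # three data stripes
parity = xor_blocks(data)            # stored on the parity disk

# Simulate losing data[1]; rebuild it from the survivors plus parity.
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == b"fail"
```

Because XOR is its own inverse, the same operation serves for both computing and reconstructing; RAID 5 merely rotates which disk holds the parity block from stripe to stripe so no single disk becomes a write bottleneck.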


Richard Gabriel "person" (Dick, RPG) Dr. Richard P. Gabriel. A noted {SAIL} {LISP} {hacker} and volleyball fanatic. Consulting Professor of Computer Science at {Stanford University}. Richard Gabriel is a leader in the {Lisp} and {OOP} community, with years of contributions to {standardisation}. He founded the successful company, {Lucid Technologies, Inc.}. In 1996 he was Distinguished Computer Scientist at ParcPlace-Digitalk, Inc. (later renamed {ObjectShare, Inc.}). See also {gabriel}, {Qlambda}, {QLISP}, {saga}. (1999-10-12)

Richard Hamming "person" Professor Richard Wesley Hamming (1915-02-11 - 1998-01-07). An American mathematician known for his work in {information theory} (notably {error detection and correction}), having invented the concepts of {Hamming code}, {Hamming distance}, and {Hamming window}. Richard Hamming received his B.S. from the University of Chicago in 1937, his M.A. from the University of Nebraska in 1939, and his Ph.D. in mathematics from the University of Illinois at Urbana-Champaign in 1942. In 1945 Hamming joined the Manhattan Project at Los Alamos. In 1946, after World War II, Hamming joined the {Bell Telephone Laboratories} where he worked with both {Shannon} and {John Tukey}. He worked there until 1976 when he accepted a chair of computer science at the Naval Postgraduate School at Monterey, California. Hamming's fundamental paper on error-detecting and error-correcting codes ("{Hamming codes}") appeared in 1950. His work on the {IBM 650} led to the development in 1956 of the {L2} programming language. This never displaced the workhorse language {L1} devised by Michael V Wolontis. By 1958 the 650 had been elbowed aside by the 704. Although best known for error-correcting codes, Hamming was primarily a numerical analyst, working on integrating {differential equations} and the {Hamming spectral window} used for smoothing data before {Fourier analysis}. He wrote textbooks, propounded aphorisms ("the purpose of computing is insight, not numbers"), and was a founder of the {ACM} and a proponent of {open-shop} computing ("better to solve the right problem the wrong way than the wrong problem the right way."). In 1968 he was made a fellow of the {Institute of Electrical and Electronics Engineers} and awarded the {Turing Prize} from the {Association for Computing Machinery}. The Institute of Electrical and Electronics Engineers awarded Hamming the Emanuel R Piore Award in 1979 and a medal in 1988. 
{(http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Hamming.html)}. {(http://zapata.seas.smu.edu/~gorsak/hamming.html)}. {(http://webtechniques.com/archives/1998/03/homepage/)}. [Richard Hamming. Coding and Information Theory. Prentice-Hall, 1980. ISBN 0-13-139139-9]. (2003-06-07)
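The Hamming distance mentioned above is simply the number of positions at which two equal-length sequences differ; a code whose codewords are pairwise at distance 3 or more can correct any single error by decoding to the nearest codeword. A short illustrative sketch (the example strings are the classic textbook ones, not from Hamming's paper):

```python
# Hamming distance: the number of positions at which two
# equal-length sequences differ.

def hamming_distance(a, b):
    if len(a) != len(b):
        raise ValueError("sequences must have equal length")
    return sum(x != y for x, y in zip(a, b))

# "karolin" and "kathrin" differ in positions 2, 3 and 4:
assert hamming_distance("karolin", "kathrin") == 3
# Works on bit strings too:
assert hamming_distance("1011101", "1001001") == 2
```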

Richard Korf "person" A Professor of computer science at the University of California, Los Angeles. Richard Korf received his B.S. from {MIT} in 1977, and his M.S. and Ph.D. in computer science from Carnegie-Mellon University in 1980 and 1983. From 1983 to 1985 he served as Herbert M. Singer Assistant Professor of Computer Science at Columbia University. Dr. Korf studies problem-solving, {heuristic search} and {planning} in {artificial intelligence}. He wrote "Learning to Solve Problems by Searching for Macro-Operators" (Pitman, 1985). He serves on the editorial boards of Artificial Intelligence, and the Journal of Applied Intelligence. Dr. Korf is the recipient of several awards and is a Fellow of the {American Association for Artificial Intelligence}. {Richard Korf home page (http://www.cs.ucla.edu/~korf/)}. (2007-05-01)

robotics ::: An interdisciplinary branch of science and engineering that includes mechanical engineering, electronic engineering, information engineering, computer science, and others. Robotics deals with the design, construction, operation, and use of robots, as well as computer systems for their control, sensory feedback, and information processing.

rule-based system ::: In computer science, a rule-based system is used to store and manipulate knowledge to interpret information in a useful way. It is often used in artificial intelligence applications and research. Normally, the term rule-based system is applied to systems involving human-crafted or curated rule sets. Rule-based systems constructed using automatic rule inference, such as rule-based machine learning, are normally excluded from this system type.
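The core of a human-crafted rule-based system is a working memory of facts plus rules that fire when their premises are satisfied. A minimal forward-chaining sketch in Python (the facts and rules are invented for illustration):

```python
# Minimal forward-chaining rule-based system: rules are
# (premises, conclusion) pairs over a working memory of facts.
# Rules fire repeatedly until no rule can add a new fact.

rules = [
    ({"has_feathers"}, "is_bird"),
    ({"is_bird", "can_fly"}, "nests_in_trees"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"has_feathers", "can_fly"}, rules)
assert "nests_in_trees" in derived   # derived via the chained rules
```

Note the chaining: the first rule's conclusion (`is_bird`) enables the second rule, which is why the loop must repeat until the fact set stops growing.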

Sather "language" /Say-ther/ (Named after the Sather Tower at {UCB}, as opposed to the Eiffel Tower). An interactive {object-oriented} language designed by Steve M. Omohundro at {ICSI} in 1991. Sather has simple {syntax}, similar to {Eiffel}, but it is non-proprietary and faster. Sather 0.2 was nearly a subset of Eiffel 2.0, but Sather 1.0 adds many distinctive features: parameterised {class}es, {multiple inheritance}, statically-checked {strong typing}, {garbage collection}. The compiler generates {C} as an {intermediate language}. There are versions for most {workstations}. Sather attempts to retain much of {Eiffel}'s theoretical cleanliness and simplicity while achieving the efficiency of {C++}. The compiler generates efficient and portable C code which is easily integrated with existing code. A variety of development tools including a debugger and {browser} based on {gdb} and a {GNU Emacs} development environment have also been written. There is also a {class library} with several hundred classes that implement a variety of basic data structures and numerical, geometric, connectionist, statistical, and graphical abstractions. The authors would like to encourage contributions to the library and hope to build a large collection of efficient, well-written, well-tested classes in a variety of areas of computer science. Sather runs on {Sun-4}, {HP9000}/300, {Decstation} 5000, {MIPS}, {Sony News} 3000, {Sequent}/{Dynix}, {SCO} {SysV}R3.2, {NeXT}, {Linux}. See also {dpSather}, {pSather}, {Sather-K}. {(ftp://ftp.icsi.berkeley.edu/pub/sather)}. E-mail: "sather-admin@icsi.berkeley.edu". Mailing list: sather-request@icsi.berkeley.edu. (1995-04-26)

Scheme84 {Scheme} from {Indiana University}. It requires {Franz Lisp} on a {VAX} under {VMS} or {BSD}. E-mail: Nancy Garrett "nlg@indiana.edu". Send a tape with return postage to Scheme84 Distribution, Nancy Garrett, c/o Dan Friedman, Department of Computer Science, Indiana University, Bloomington, Indiana. Telephone: +1 (812) 335 9770.

SICS {Swedish Institute for Computer Science}

sml2c A Standard ML to C compiler. sml2c is a batch compiler and compiles only module-level declarations, i.e. signatures, structures and functors. It provides the same pervasive environment for the compilation of these programs as SML/NJ. As a result, module-level programs that run on SML/NJ can be compiled by sml2c without any changes. It is based on SML/NJ version 0.67 and shares its front end and most of its run-time system, but does not support SML/NJ style debugging and profiling. School of Computer Science, Carnegie Mellon University {(ftp://dravido.soar.cs.cmu.edu/usr/nemo/sml2c/sml2c.tar.Z)}. {Linux (ftp://ftp.dcs.glasgow.ac.uk/pub/linux/smlnj-0.82-linux.tar.Z)}. conformance: superset + first-class continuations, + asynchronous signal handling + separate compilation + freeze and restart programs ports: IBM-RT Decstation3100 Omron-Luna-88k Sun-3 Sun-4 386(Mach) portability: easy, easier than SML/NJ E-mail: "david.tarditi@cs.cmu.edu", "peter.lee@cs.cmu.edu" (1991-06-27)

software ::: A collection of data or computer instructions that tell the computer how to work. This is in contrast to physical hardware, from which the system is built and actually performs the work. In computer science and software engineering, computer software is all information processed by computer systems, programs and data. Computer software includes computer programs, libraries and related non-executable data, such as online documentation or digital media.

SPARKS "language" Fortran superset, used in Fundamentals of Data Structures, E. Horowitz & S. Sahni, Computer Science Press 1976. (2007-03-21)

spatial-temporal reasoning ::: An area of artificial intelligence which draws from the fields of computer science, cognitive science, and cognitive psychology. The theoretic goal, on the cognitive side, involves representing and reasoning about spatial-temporal knowledge in the mind. The applied goal, on the computing side, involves developing high-level control systems of automata for navigating and understanding time and space.

speech recognition ::: An interdisciplinary subfield of computational linguistics that develops methodologies and technologies that enable the recognition and translation of spoken language into text by computers. It is also known as automatic speech recognition (ASR), computer speech recognition or speech to text (STT). It incorporates knowledge and research in the linguistics, computer science, and electrical engineering fields.

state ::: In information technology and computer science, a program is described as stateful if it is designed to remember preceding events or user interactions;[291] the remembered information is called the state of the system.
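A small illustration of the definition above: a stateful object remembers preceding interactions, so the same call can return different results depending on the accumulated state (the class and names here are invented for the example):

```python
# A stateful counter: the remembered value is the "state" of the system.

class Counter:
    def __init__(self):
        self.count = 0        # the state, persisting between calls

    def increment(self):
        self.count += 1
        return self.count

c = Counter()
assert c.increment() == 1
assert c.increment() == 2    # same call, different result: state at work
```

A stateless design would instead compute each result purely from its inputs, which is why stateless components are generally easier to test and scale.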

steganography "security" Hiding a secret message within a larger one in such a way that others can not discern the presence or contents of the hidden message. For example, a message might be hidden within an {image} by changing the {least significant bits} to be the message bits. [{Chaffing and Winnowing: Confidentiality without Encryption, Ronald L. Rivest, MIT Lab for Computer Science, 1998-03-22 (http://theory.lcs.mit.edu/~rivest/chaffing.txt)}]. (1998-07-13)
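The least-significant-bit technique in the entry above can be sketched in a few lines of Python (a toy illustration on raw bytes rather than a real image format; the cover values and helper names are made up):

```python
# LSB steganography sketch: hide one message bit in the low bit of
# each byte of a cover (e.g. pixel data), changing each byte by at most 1.

def hide(cover: bytes, message_bits: list) -> bytes:
    assert len(message_bits) <= len(cover)
    out = bytearray(cover)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & 0xFE) | bit   # clear the LSB, then set it
    return bytes(out)

def reveal(stego: bytes, n_bits: int) -> list:
    return [b & 1 for b in stego[:n_bits]]

cover = bytes([200, 201, 202, 203, 204, 205, 206, 207])
secret = [1, 0, 1, 1, 0, 1, 0, 0]
stego = hide(cover, secret)
assert reveal(stego, 8) == secret
```

Since each byte changes by at most one, the alteration is imperceptible in typical image data, which is the point: an observer should not be able to tell a message is present at all.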

Stephen Kleene "person" Professor Stephen Cole Kleene (1909-01-05 - 1994-01-26) /steev'n (kohl) klay'nee/ An American mathematician whose work at the {University of Wisconsin-Madison} helped lay the foundations for modern computer science. Kleene was best known for founding the branch of {mathematical logic} known as {recursion theory} and for inventing {regular expressions}. The {Kleene star} and {Ascending Kleene Chain} are named after him. Kleene was born in Hartford, Connecticut, USA. He received his Bachelor of Arts degree from Amherst College in 1930. From 1930 to 1935, he was a graduate student and research assistant at {Princeton University} where he received his doctorate in mathematics in 1934. In 1935, he joined UW-Madison mathematics department as an instructor. He became an assistant professor in 1937. From 1939 to 1940, he was a visiting scholar at Princeton's {Institute for Advanced Study} where he laid the foundation for recursive function theory, an area that would be his lifelong research interest. In 1941 he returned to Amherst as an associate professor of mathematics. During World War II Kleene was a lieutenant commander in the United States Navy. He was an instructor of navigation at the U.S. Naval Reserve's Midshipmen's School in New York, and then a project director at the Naval Research Laboratory in Washington, D.C. In 1946, he returned to Wisconsin, eventually becoming a full professor. He was chair of mathematics, and computer sciences in 1962 and 1963 and dean of the College of Letters and Science from 1969 to 1974. In 1964 he was named the Cyrus C. MacDuffee professor of mathematics. An avid mountain climber, Kleene had a strong interest in nature and the environment and was active in many conservation causes. He led several professional organisations, serving as president of the {Association of Symbolic Logic} from 1956 to 1958. In 1961, he served as president of the International Union of the History and the Philosophy of Science. 
Kleene pronounced his last name /klay'nee/. /klee'nee/ and /kleen/ are extremely common mispronunciations. His first name is /steev'n/, not /stef'n/. His son, Ken Kleene "kenneth.kleene@umb.edu", wrote: "As far as I am aware this pronunciation is incorrect in all known languages. I believe that this novel pronunciation was invented by my father." {(gopher://gopher.adp.wisc.edu/00/.data/.news-rel/.9401/.940126a)}. (1999-03-03)

stochastic semantic analysis ::: An approach used in computer science as a semantic component of natural language understanding. Stochastic models generally use the definition of segments of words as basic semantic units for the semantic models, and in some cases involve a two layered approach.[295]

theoretical computer science (TCS) ::: A subset of general computer science and mathematics that focuses on more mathematical topics of computing and includes the theory of computation.

theory of computation ::: In theoretical computer science and mathematics, the theory of computation is the branch that deals with how efficiently problems can be solved on a model of computation, using an algorithm. The field is divided into three major branches: automata theory and languages, computability theory, and computational complexity theory, which are linked by the question: "What are the fundamental capabilities and limitations of computers?".[309]

Tim Berners-Lee "person" (Sir -) The man who invented the {web} while working at the {Center for European Particle Research} (CERN). He is Director of the {World Wide Web Consortium}. Tim Berners-Lee graduated from the Queen's College at Oxford University, England, 1976. Whilst there he built his first computer with a soldering iron, {TTL} gates, an {M6800} processor and an old television. He then went on to work for {Plessey Telecommunications}, and D.G. Nash Ltd (where he wrote software for intelligent printers and a {multi-tasking} {operating system}), before joining CERN, where he designed a program called 'Enquire', which was never published, but formed the conceptual basis for today's {web}. In 1984, he took up a fellowship at CERN, and in 1989, he wrote the first {web server}, "{httpd}", and the first client, "WorldWideWeb" a {hypertext} browser/editor which ran under {NEXTSTEP}. The program "WorldWideWeb" was first made available within CERN in December, and on the {Internet} as a whole in the summer of 1991. In 1994, Tim joined the {Laboratory for Computer Science} (LCS) at the {Massachusetts Institute of Technology} (MIT). In 1999, he became the first holder of the {3Com} Founders chair. He is also the author of "Weaving the Web", on the past present and future of the Web. In 2001, Tim was made a fellow of The Royal Society. Tim is married to Nancy Carlson. They have two children, born 1991 and 1994. {(http://w3.org/People/Berners-Lee/Longer.html)}. (2001-06-17)

toolsmith The software equivalent of a tool-and-die specialist; one who specialises in making the {tools} with which other programmers create applications. Many hackers consider this more fun than applications per se; to understand why, see {uninteresting}. Jon Bentley, in the "Bumper-Sticker Computer Science" chapter of his book "More Programming Pearls", quotes Dick Sites from DEC as saying "I'd rather write programs to write programs than write programs". [{Jargon File}]

Towers of Hanoi "games" A classic computer science problem, invented by Edouard Lucas in 1883, often used as an example of {recursion}. "In the great temple at Benares, says he, beneath the dome which marks the centre of the world, rests a brass plate in which are fixed three diamond needles, each a cubit high and as thick as the body of a bee. On one of these needles, at the creation, God placed sixty-four discs of pure gold, the largest disc resting on the brass plate, and the others getting smaller and smaller up to the top one. This is the Tower of Bramah. Day and night unceasingly the priests transfer the discs from one diamond needle to another according to the fixed and immutable laws of Bramah, which require that the priest on duty must not move more than one disc at a time and that he must place this disc on a needle so that there is no smaller disc below it. When the sixty-four discs shall have been thus transferred from the needle on which at the creation God placed them to one of the other needles, tower, temple, and Brahmins alike will crumble into dust, and with a thunderclap the world will vanish." The recursive solution is: Solve for n-1 discs recursively, then move the remaining largest disc to the free needle. Note that there is also a non-recursive solution: On odd-numbered moves, move the smallest sized disk clockwise. On even-numbered moves, make the single other move which is possible. ["Mathematical Recreations and Essays", W W R Ball, p. 304] {The rec.puzzles Archive (http://rec-puzzles.org/sol.pl/induction/hanoi)}. (2003-07-13)
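The recursive solution described in the entry above can be written directly in Python (needle names and the move list are illustrative choices):

```python
# Recursive Towers of Hanoi: move n-1 discs to the spare needle,
# move the largest disc to the target, then move the n-1 discs on top.

def hanoi(n, source, target, spare, moves):
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)
    moves.append((source, target))       # move the largest remaining disc
    hanoi(n - 1, spare, target, source, moves)

moves = []
hanoi(3, "A", "C", "B", moves)
assert len(moves) == 7                   # 2**n - 1 moves for n discs
```

The 2**n - 1 move count is why the legend's 64-disc tower is safe: even at one move per second, the priests would need over 500 billion years.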

transition system ::: In theoretical computer science, a transition system is a concept used in the study of computation. It is used to describe the potential behavior of discrete systems. It consists of states and transitions between states, which may be labeled with labels chosen from a set; the same label may appear on more than one transition. If the label set is a singleton, the system is essentially unlabeled, and a simpler definition that omits the labels is possible.
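A labelled transition system is naturally represented as a set of (state, label, state) triples. A tiny sketch using the classic turnstile example (the states and labels here are illustrative, not from any particular formalism's notation):

```python
# A labelled transition system for a coin-operated turnstile,
# stored as a set of (state, label, state) triples. Note transitions
# may be nondeterministic: step() returns a *set* of successor states.

transitions = {
    ("locked",   "coin", "unlocked"),
    ("unlocked", "push", "locked"),
    ("locked",   "push", "locked"),      # pushing a locked turnstile does nothing
}

def step(state, label):
    """All states reachable from `state` by one transition labelled `label`."""
    return {t for (s, l, t) in transitions if s == state and l == label}

assert step("locked", "coin") == {"unlocked"}
assert step("unlocked", "push") == {"locked"}
```

Dropping the label component (or using a one-element label set) gives the simpler unlabelled definition mentioned in the entry.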

Turing tar-pit A place where anything is possible but nothing of interest is practical. {Alan M. Turing} helped lay the foundations of computer science by showing that all machines and languages capable of expressing a certain very primitive set of operations are logically equivalent in the kinds of computations they can carry out, and in principle have capabilities that differ only in speed from those of the most powerful and elegantly designed computers. However, no machine or language exactly matching Turing's primitive set has ever been built (other than possibly as a classroom exercise), because it would be horribly slow and far too painful to use. A "Turing tar-pit" is any computer language or other tool that shares this property. That is, it's theoretically universal but in practice, the harder you struggle to get any real work done, the deeper its inadequacies suck you in. Compare {bondage-and-discipline language}. A tar pit is a geological occurrence where subterranean tar leaks to the surface, creating a large puddle (or pit) of tar. Animals wandering or falling in get stuck, being unable to extricate themselves from the tar. La Brea, California, has a museum built around the fossilized remains of mammals and birds found in such a tar pit. [{Jargon File}] (1998-06-27)

ubiquitous computing Computers everywhere. Making many computers available throughout the physical environment, while making them effectively invisible to the user. Ubiquitous computing is held by some to be the Third Wave of computing. The First Wave was many people per computer, the Second Wave was one person per computer. The Third Wave will be many computers per person. Three key technical issues are: power consumption, user interface, and wireless connectivity. The idea of ubiquitous computing as invisible computation was first articulated by Mark Weiser in 1988 at the Computer Science Lab at {Xerox PARC}. {(http://ubiq.com/hypertext/weiser/weiser.html)}. (1994-12-23)

University of Twente "body, education" A university in the east of The Netherlands for technical and social sciences. It was founded in 1961, making it one of the youngest universities in The Netherlands. It has 7000 students studying Applied Educational Science; Applied Mathematics; Applied Physics; Chemical Technology; Computer Science; Electrical Engineering; Mechanical Engineering; Philosophy of science, Technology and Society; Educational Technology. {(http://nic.utwente.nl/uthomuk.htm)}. (1995-04-16)

Vint Cerf "person" (Vinton G. Cerf) The co-inventor with {Bob Kahn} of the {Internet} and its base {protocol}, {TCP/IP}. Like {Jon Postel}, he was crucial in the development of many higher-level protocols, and has written several dozen {RFCs} since the late 1960s. Vinton Cerf is senior vice president of Internet Architecture and Technology for {MCI WorldCom}. His team of architects and engineers design advanced Internet frameworks for delivering a combination of data, information, voice and video services for business and consumer use. In December 1997, President Clinton presented the U.S. National Medal of Technology to Cerf and his partner, Robert E. Kahn, for founding and developing the Internet. Prior to rejoining MCI in 1994, Cerf was vice president of the Corporation for National Research Initiatives (CNRI). As vice president of MCI Digital Information Services from 1982-1986, he led the engineering of {MCI Mail}, the first commercial e-mail service to be connected to the Internet. During his tenure from 1976-1982 with the U.S. Department of {Defense Advanced Research Projects Agency} (DARPA), Cerf played a key role leading the development of Internet and Internet-related data packet and security technologies. Cerf served as founding president of the {Internet Society} from 1992-1995 and is currently chairman of the Board. Cerf is a member of the U.S. Presidential Information Technology Advisory Committee (PITAC) and the Advisory Committee for Telecommunications (ACT) in Ireland. Cerf is a recipient of numerous awards and commendations in connection with his work on the Internet. In December 1994, People magazine identified Cerf as one of that year's "25 Most Intriguing People." In addition to his work on behalf of MCI and the Internet, Cerf serves as technical advisor to production for "Gene Roddenberry's Earth: Final Conflict," the number one television show in first-run syndication. He also made a special guest appearance in May 1998. 
Cerf also holds an appointment as distinguished visiting scientist at the Jet Propulsion Laboratory where he is working on the design of an interplanetary Internet. Cerf holds a Bachelor of Science degree in Mathematics from Stanford University and Master of Science and Ph.D. degrees in Computer Science from UCLA. He also holds honorary Doctorate degrees from the Swiss Federal Institute of Technology, Zurich; Lulea University of Technology, Sweden; University of the Balearic Islands, Palma; Capitol College and Gettysburg College. {(http://mci.com/cerfsup/)}. (1999-02-25)

WA-12 Workflow Analysis in 12 different organisations. A project from the Department of Computer Science from the {University of Twente}, Enschede, The Netherlands. The final report of this project is available to the public (ISBN 90-365-0683-2).

Wizard Book "publication" {Hal Abelson}, {Gerald Sussman} and Julie Sussman's "Structure and Interpretation of Computer Programs" (MIT Press, 1984; ISBN 0-262-01077-1), an excellent computer science text used in introductory courses at MIT. So called because of the wizard on the jacket. One of the {bibles} of the LISP/Scheme world. Also, less commonly, known as the {Purple Book}. [{Jargon File}] (1995-01-10)

X Consortium A vendor consortium supporting development, evolution and maintenance of the {X Window System}. The X Consortium is an independent, not-for-profit company. It was formed in 1993 as the successor to the {MIT} X Consortium, a research group of the MIT {Laboratory for Computer Science}. {(ftp://ftp.x.org)}. {(http://x.org/)}. [Members?]

Yourdon, Inc. "company" The company founded in 1974 by {Edward Yourdon} to provide educational, publishing, and consulting services in state-of-the-art software engineering technology. Over the next 12 years, the company grew to a staff of over 150 people, with offices throughout North America and Europe. As CEO of the company, Yourdon oversaw an operation that trained over 250,000 people around the world; the company was sold in 1986 and eventually became part of {CGI}, the French software company that is now part of {IBM}. The publishing division, Yourdon Press (now part of Prentice Hall), has produced over 150 technical computer books on a wide range of software engineering topics; many of these "classics" are used as standard university computer science textbooks. (1995-04-16)



QUOTES [2 / 2 - 196 / 196]


KEYS (10k)

   1 Wikipedia
   1 Harold Abelson

NEW FULL DB (2.4M)

   9 Frederick Lenz
   7 Anonymous
   6 Pedro Domingos
   6 Donald Knuth
   5 Alan Perlis
   4 Randy Pausch
   4 Edsger Dijkstra
   4 Brad Stone
   4 Bill Gates
   3 Richard Hamming
   3 Melinda Gates
   3 Hal Abelson
   3 Guy Kawasaki
   3 Donald A Norman
   3 Alan Kay
   2 Walter Isaacson
   2 Tracy Kidder
   2 Steven Levy
   2 Steve Ballmer
   2 Stan Kelly Bootle

1:Alan Mathison Turing OBE FRS (/ˈtjʊərɪŋ/; 23 June 1912 - 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst and theoretical biologist. He was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general purpose computer.[2][3][4] Turing is widely considered to be the father of theoretical computer science and artificial intelligence.[5]
   ~ Wikipedia,
2:[Computer science] is not really about computers -- and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes...and geometry isn't really about using surveying instruments. Now the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments: when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use. ~ Harold Abelson, Introductory lecture to Structure and Interpretation of Computer Programs,

*** WISDOM TROVE ***

1:Computer science really involves the same mindset, particularly artificial intelligence. ~ frederick-lenz, @wisdomtrove
2:If you seek to develop the mind fully, for the enlightenment process, you will benefit if your career is related to computer science, law, medicine, or the arts. ~ frederick-lenz, @wisdomtrove
3:I recommend computer science to people who practice meditation. The mental structures that are used in computer science are very similar exercises done in Buddhist monasteries. ~ frederick-lenz, @wisdomtrove
4:I was lucky to get into computers when it was a very young and idealistic industry. There weren't many degrees offered in computer science, so people in computers were brilliant people from mathematics, physics, music, zoology, whatever. They loved it, and no one was really in it for the money. ~ steve-jobs, @wisdomtrove
5:Atlantis was a highly evolved civilization where the sciences and arts were far more advanced than one might guess. Atlantis was technologically advanced in genetic engineering, computer science, inter-dimensional physics, and artistically developed with electronic music and crystal art forms. ~ frederick-lenz, @wisdomtrove
6:The burgeoning field of computer science has shifted our view of the physical world from that of a collection of interacting material particles to one of a seething network of information. In this way of looking at nature, the laws of physics are a form of software, or algorithm, while the material world-the hardware-plays the role of a gigantic computer. ~ paul-davies, @wisdomtrove

*** NEWFULLDB 2.4M ***

1:Computer Science is embarrassed by the computer. ~ Alan Perlis,
2:Trees sprout up just about everywhere in computer science. ~ Donald Knuth,
3:Computer science is the operating system for all innovation. ~ Steve Ballmer,
4:Science is to computer science as hydrodynamics is to plumbing. ~ Stan Kelly Bootle,
5:Computer science was then generally a subdepartment of electrical engineering, ~ Ellen Ullman,
6:All problems in Computer Science can be solved by another level of indirection. ~ Butler Lampson,
7:Computer science is no more about computers than astronomy is about telescopes. ~ Edsger Dijkstra,
8:Computer Science is no more about computers than astronomy is about telescopes ~ Edsger W Dijkstra,
9:Computer science really involves the same mindset, particularly artificial intelligence. ~ Frederick Lenz,
10:The first law of computer science: Every problem is solved by yet another indirection. ~ Bjarne Stroustrup,
11:Theoretical Computer Science is just as useless as everything we mathematicians do. ~ Jennifer Tour Chayes,
12:Computer science has as much to do with computers as astronomy has to do with telescopes. ~ Edsger Dijkstra,
13:As so often happens in computer science, we’re willing to sacrifice efficiency for generality. ~ Pedro Domingos,
14:When a professor insists computer science is X but not Y, have compassion for his graduate students. ~ Alan Perlis,
15:Computer science is one of the worst things that ever happened to either computers or to science. ~ Neil Gershenfeld,
16:Computer science departments have always considered 'user interface' research to be sissy work. ~ Nicholas Negroponte,
17:I shopped at J. Crew in high school, I studied computer science. I was a nerd-nerd, now I'm a music-nerd. ~ Mayer Hawthorne,
18:Computer Science is the only discipline in which we view adding a new wing to a building as being maintenance. ~ Jim Horning,
19:I have yet to see a career that is similar in benefit as computer science for doing the advanced exercises. ~ Frederick Lenz,
20:Software Engineering is that part of Computer Science which is too difficult for the Computer Scientist. ~ Friedrich L Bauer,
21:Until Systers came into existence, the notion of a global community of women in computer science did not exist. ~ Anita Borg,
22:I never took a computer science course in college, because then it was a thing you just learned on your own. ~ Mitchel Resnick,
23:Remember, there are only two hard problems in computer science: cache invalidation, naming, and off-by-one errors. ~ Anonymous,
24:The goal of Computer Science is to build something that will last at least until we've finished building it. ~ William C Brown,
25:What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik ~ James Gleick,
26:I am a professor at the computer science department, but I don't know how to use a computer, not even for Email. ~ Endre Szemeredi,
27:Buck’s girlfriend went by the porny name of Miracle though she had a master’s in computer science from Florida State. ~ Carl Hiaasen,
28:the best data scientists tend to be “hard scientists,” particularly physicists, rather than computer science majors. ~ Mike Loukides,
29:When people think about computer science, they imagine people with pocket protectors and thick glasses who code all night. ~ Marissa Mayer,
30:Both women and computer science are the losers when a geeky stereotype serves as an unnecessary gatekeeper to the profession. ~ Cordelia Fine,
31:But being considered the best speaker in a computer science department is like being known as the tallest of the Seven Dwarfs. ~ Randy Pausch,
32:I went to a school that's predominantly computer science and engineering. So, there's a real shortage of hot girls, let's say. ~ Joe Manganiello,
33:I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. ~ Alan Kay,
34:Coding is today's language of creativity. All our children deserve a chance to become creators instead of consumers of computer science. ~ Maria Klawe,
35:There's a good part of Computer Science that's like magic. Unfortunately there's a bad part of Computer Science that's like religion. ~ Hal Abelson,
36:computer science has traditionally been all about thinking deterministically, but machine learning requires thinking statistically. ~ Pedro Domingos,
37:I considered law and math. My Dad was a lawyer. I think though I would have ended up in physics if I didn't end up in computer science. ~ Bill Gates,
38:And they came to be included in a culture and community that placed the computer science engineer at the highest level of social status. ~ Alec J Ross,
39:Computer Science: A study akin to numerology and astrology, but lacking the precision of the former and the success of the latter. ~ Stan Kelly Bootle,
40:I don't know how many of you have ever met Dijkstra, but you probably know that arrogance in computer science is measured in nano-Dijkstras. ~ Alan Kay,
41:IBM veteran and computer science professor Frederick Brooks argued that adding manpower to complex software projects actually delayed progress. ~ Brad Stone,
42:I've been programming computers since elementary school, where they taught us, and I stuck with computer science through high school and college. ~ Masi Oka,
43:the most advanced computer science programs in the world, and over the course of the Computer Center’s life, thousands of students passed ~ Malcolm Gladwell,
44:Let’s give them credit,” Schmidt says. “The book guys got computer science, they figured out the analytics, and they built something significant. ~ Brad Stone,
45:For those who wish to stay and work in computer science or technology, fields badly in need of their services, let’s roll out the welcome mat. ~ Sheldon Adelson,
46:He asked the class how many of us were taking computer science, and everybody but me and this one girl who didn’t speak English raised their hands. ~ Ned Vizzini,
47:Computer science is to biology what calculus is to physics. It's the natural mathematical technique that best maps the character of the subject. ~ Harold Morowitz,
48:Computer science needs to be part of the core curriculum - like algebra, biology, physics, or chemistry. We need all schools to teach it, not just 10%. ~ Brad Feld,
49:I can't be as confident about computer science as I can about biology. Biology easily has 500 years of exciting problems to work on. It's at that level. ~ Donald Knuth,
50:When I was 19 years old, I wrote my first book. I took a computer science class, and the book was garbage. I thought I could write a better one, so I did. ~ Jim McKelvey,
51:But biology and computer science - life and computation - are related. I am confident that at their interface great discoveries await those who seek them. ~ Leonard Adleman,
52:As you study computer science you develop this wonderful mental acumen, particularly with relational databases, systems analysis, and artificial intelligence. ~ Frederick Lenz,
53:She shrugged noncommittally. “Not bad.”
Kyle scoffed. “Not bad? Counselor, there are two things I’ve got mad skills at: And computer science is the other one. ~ Julie James,
54:If we suppose that many natural phenomena are in effect computations, the study of computer science can tell us about the kinds of natural phenomena that can occur. ~ Rudy Rucker,
55:If you seek to develop the mind fully, for the enlightenment process, you will benefit if your career is related to computer science, law, medicine, or the arts. ~ Frederick Lenz,
56:Only in high school when I began programming computers, did I become interested in tech and start-ups, which led me to attend Stanford and major in Computer Science. ~ Clara Shih,
57:My hope is that in the future, women stop referring to themselves as 'the only woman' in their physics lab or 'only one of two' in their computer science jobs. ~ Kirsten Gillibrand,
58:Computer science is fascinating. As you study computer science, you will find that you develop your mind. It is literally like doing Buddhist exercises all day long. ~ Frederick Lenz,
59:Computer science doesn't know how to build complex systems that work reliably. This has been a well-understood problem since the very beginning of programmable computers. ~ Matt Blaze,
60:I had almost no background for the work in computer science, artificial intelligence, and cognitive psychology...Interdisciplinary adventure is easiest in new fields. ~ Herbert A Simon,
61:We're losing track of the vastness of the potential for computer science. We really have to revive the beautiful intellectual joy of it, as opposed to the business potential. ~ Jaron Lanier,
62:I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study. ~ Donald Knuth,
63:My background was computer science and business school, so eventually I worked my way up where I was running product groups - development, testing, marketing, user education. ~ Melinda Gates,
64:I recommend computer science to people who practice meditation. The mental structures that are used in computer science are very similar exercises done in Buddhist monasteries. ~ Frederick Lenz,
65:I recommend, for many people, the study of computer science. Our natural resource in America is the mind. The mindset in computer science is very similar to the mindset in Zen. ~ Frederick Lenz,
66:when I arrived at Stanford in 1985, economics, not computer science, was the most popular major. To most people on campus, the tech sector seemed idiosyncratic or even provincial. ~ Peter Thiel,
67:John von Neumann, one of the founding fathers of computer science, famously said that “with four parameters I can fit an elephant, and with five I can make him wiggle his trunk. ~ Pedro Domingos,
68:Computer science … jobs should be way more interesting than even going to Wall Street or being a lawyer--or, I can argue, than anything but perhaps biology, and there it's just a tie. ~ Bill Gates,
69:People think that computer science is the art of geniuses but the actual reality is the opposite, just many people doing things that build on each other, like a wall of mini stones. ~ Donald Knuth,
70:See, Berkeley has always drawn the nuts and flakes of the academic world. That's what happens when you have a university that offers degrees in both computer science and parapsychology. ~ Mira Grant,
71:The rise of Google, the rise of Facebook, the rise of Apple, I think are proof that there is a place for computer science as something that solves problems that people face every day. ~ Eric Schmidt,
72:Computer science is the most misunderstood field there is. You are being paid to solve puzzles. For a person who has practiced meditation in past lives, that is the way your mind works. ~ Frederick Lenz,
73:I decry the current tendency to seek patents on algorithms. There are better ways to earn a living than to prevent other people from making use of one's contributions to computer science. ~ Donald Knuth,
74:Too few people in computer science are aware of some of the informational challenges in biology and their implications for the world. We can store an incredible amount of data very cheaply. ~ Sergey Brin,
75:passed in those days for computer terminals. In 1971, this was state of the art. The University of Michigan had one of the most advanced computer science programs in the world, and over ~ Malcolm Gladwell,
76:Throughout my academic career, I'd given some pretty good talks. But being considered the best speaker in the computer science department is like being known as the tallest of the Seven Dwarfs. ~ Randy Pausch,
77:Computer science is a restless infant and its progress depends as much on shifts in point of view as on the orderly development of our current concepts. ~ Alan Perlis, The Synthesis of Algorithmic Systems, 1966,
78:I remember that mathematicians were telling me in the 1960s that they would recognize computer science as a mature discipline when it had 1,000 deep algorithms. I think we've probably reached 500. ~ Donald Knuth,
79:really a hedge fund but a versatile technology laboratory full of innovators and talented engineers who could apply computer science to a variety of different problems.5 Investing was only the first ~ Brad Stone,
80:I can’t be as confident about computer science as I can about biology. Biology easily has 500 years of exciting problems to work on. It’s at that level. ~ Donald Knuth (1993) Computer Literacy Bookshops Interview,
81:The training one receives when one becomes a technician, like a data scientist - we get trained in mathematics or computer science or statistics - is entirely separated from a discussion of ethics. ~ Cathy O Neil,
82:My undergraduate work was in computer science and economics. It just happened to be at that time when 34 percent of computer-science majors were women. We didn't realize it was at the peak at the time. ~ Melinda Gates,
83:Why is computer science a good field for women? For one thing, thats where the jobs are, and for another, the pay is better than for many jobs, and finally, its easier to combine career and family. ~ Madeleine M Kunin,
84:Even when I was studying mathematics, physics, and computer science, it always seemed that the problem of consciousness was about the most interesting problem out there for science to come to grips with. ~ David Chalmers,
85:Persons with Disability (PWD), Ex-Serviceman (XSM), Kashmiri Migrant (KM). Please refer to the Norms for the same. There are 394 vacancies for the above position (200 Electronics, 120 Mechanical, 57 Computer Science, ~ Anonymous,
86:ProPublica’s technology reporter Jeff Larson joined the bunker in London. A computer science graduate, Larson knew his stuff. Using diagrams, he could explain the NSA’s complex data-mining programs – no mean feat. ~ Luke Harding,
87:I was on this path to becoming a computer-science guy, but I didn't like it. I got no joy from it. It was very, very scary. It was suffocating to think that I was just going to do this thing for the rest of my life. ~ Kumail Nanjiani,
88:Perhaps writers should never be allowed to get together in a workplace context. It's not like studying computer science, after all. The emotions are at large, and are shared and are questioned. There is a vulnerability. ~ Graham Joyce,
89:Starting early and getting girls on computers, tinkering and playing with technology, games and new tools, is extremely important for bridging the gender divide that exists now in computer science and in technology. ~ Beth Simone Noveck,
90:Harvard’s Leslie Valiant received the Turing Award, the Nobel Prize of computer science, for inventing this type of analysis, which he describes in his book entitled, appropriately enough, Probably Approximately Correct. ~ Pedro Domingos,
91:leading with computational thinking instead of code itself, and helping students imagine how being computer savvy could help them in any career, boosts the number of girls and kids of color taking—and sticking with—computer science. ~ Anonymous,
92:For years, computer scientists were treating operating systems design as sort of an open-research issue, when the field's direction had been decided by commercial operations. Computer science has become completely cut off from reality. ~ David Gelernter,
93:Software engineering is the part of computer science which is too difficult for the computer scientist. ~ Friedrich Bauer, "Software Engineering." Information Processing: Proceedings of the IFIP Congress 1971, Ljubljana, Yugoslavia, August 23-28, 1971.,
94:Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers and test tubes. Science is not about tools. It is about how we use them, and what we find out when we do. ~ Edsger Dijkstra,
95:The best way to prepare [to be a programmer] is to write programs, and to study great programs that other people have written. In my case, I went to the garbage cans at the Computer Science Center and I fished out listings of their operating systems. ~ Bill Gates,
96:I was never as focused in math, science, computer science, etcetera, as the people who were best at it. I wanted to create amazing screensavers that did beautiful visualizations of music. It's like, "Oh, I have to learn computer science to do that." ~ Kevin Systrom,
97:Computer science is neither mathematics nor electrical engineering. ~ Alan Perlis (1968) title of article "Computer Science is neither Mathematics nor Electrical Engineering" in: A. Finerman (Hg.), University Education in Computing Science, New York, London, pp. 69-77,
98:Every weekend the drama department would have parties. The 20 hot girls on campus? All of them were in the drama dept. So we'd have somebody standing guard at the door to keep all the computer science guys out. We had to guard our women at all times. ~ Joe Manganiello,
99:Computer science has some of the most colorful language of any field. In what other field can you walk into a sterile room, carefully controlled at 68°F, and find viruses, Trojan horses, worms, bugs, bombs, crashes, flames, twisted sex changers, and fatal errors? ~ Anonymous,
100:Drs. Margolis and Fisher have done a great service to education, computer science, and the culture at large. Unlocking the Clubhouse should be required reading for anyone and everyone who is concerned about the decreasing rate of women studying computer science. ~ Anita Borg,
101:I actually started off majoring in computer science, but I knew right away I wasn't going to stay with it. It was because I had this one professor who was the loneliest, saddest man I've ever known. He was a programmer, and I knew that I didn't want to do whatever he did. ~ J Cole,
102:I have met bright students in computer science who have never seen the source code of a large program. They may be good at writing small programs, but they can't begin to learn the different skills of writing large ones if they can't see how others have done it. ~ Richard Stallman,
103:[Computer science] is not really about computers and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes... and geometry isn't really about using surveying instruments. ~ Hal Abelson,
104:Computer science is not as old as physics; it lags by a couple of hundred years. However, this does not mean that there is significantly less on the computer scientist's plate than on the physicist's: younger it may be, but it has had a far more intense upbringing! ~ Richard P Feynman,
105:Two decades later, when I got my PhD in computer science from Carnegie Mellon, I thought that made me infinitely qualified to do anything, so I dashed off my letters of application to Walt Disney Imagineering. And they sent me the nicest go-to-hell letter I'd ever received. ~ Randy Pausch,
106:Daniel Dennett is our best current philosopher. He is the next Bertrand Russell. Unlike traditional philosophers, Dan is a student of neuroscience, linguistics, artificial intelligence, computer science, and psychology. He's redefining and reforming the role of the philosopher. ~ Marvin Minsky,
107:As the Era of Stagnation began, the Soviet scientific establishment lavished resources on the immediate priorities of the state—space exploration, water diversion, nuclear power—while emergent technologies, including computer science, genetics, and fiber optics, fell behind. ~ Adam Higginbotham,
108:The company I invested in is probably a leader in that area. They're a company called Second Spectrum, which happens to be based in LA but was started by two USC computer-science professors. It's filled with guys who love sports, who played sports, but really look like programmers. ~ Steve Ballmer,
109:I was lucky to get into computers when it was a very young and idealistic industry. There weren't many degrees offered in computer science, so people in computers were brilliant people from mathematics, physics, music, zoology, whatever. They loved it, and no one was really in it for the money. ~ Steve Jobs,
110:Atlantis was a highly evolved civilization where the sciences and arts were far more advanced than one might guess. Atlantis was technologically advanced in genetic engineering, computer science, inter-dimensional physics, and artistically developed with electronic music and crystal art forms. ~ Frederick Lenz,
111:Any problem in computer science can be solved with another level of indirection. ~ David Wheeler (Attributed in: Butler Lampson. Principles for Computer System Design. Turing Award Lecture. February 17, 1993.) Wheeler is said to have added the appendage "Except for the problem of too many layers of indirection.",
112:Computer science only indicates the retrospective omnipotence of our technologies. In other words, an infinite capacity to process data (but only data -- i.e. the already given) and in no sense a new vision. With that science, we are entering an era of exhaustivity, which is also an era of exhaustion. ~ Jean Baudrillard,
113:It is hardly surprising that children should enthusiastically start their education at an early age with the Absolute Knowledge of computer science; while they are unable to read, for reading demands making judgments at every line. Conversation is almost dead, and soon so too will be those who knew how to speak. ~ Guy Debord,
114:I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification. ~ Alan Kay,
115:In our opinion, most search engine optimization (SEO) is bullshit. It involves trying to read Google’s mind and then gaming the system to make Google find crap. There are three thousand computer science PhDs at Google trying to make each search relevant, and then there’s you trying to fool them. Who’s going to win? ~ Guy Kawasaki,
116:If somebody is working on a new medicine, computer science helps us model those things. We have a whole group here in Seattle called the Institute for Disease Modelling that is a mix of computer science and math-type people, and the progress we're making in polio or plans for malaria or really driven by their deep insights. ~ Bill Gates,
117:[Though computer science is a fairly new discipline, it is predominantly based on the Cartesian world view. As Edsgar W. Dijkstra has pointed out] A scientific discipline emerges with the - usually rather slow! - discovery of which aspects can be meaningfully 'studied' in isolation for the sake of their own consistency. ~ Edsger Dijkstra,
118:1958 - John McCarthy and Paul Graham invent LISP. Due to high costs caused by a post-war depletion of the strategic parentheses reserve LISP never becomes popular... Fortunately for computer science the supply of curly braces and angle brackets remains high. ~ James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages,
119:One of the problems we've had is that the ICT curriculum in the past has been written for a subject that is changing all the time. I think that what we should have is computer science in the future - and how it fits in to the curriculum is something we need to be talking to scientists, to experts in coding and to young people about. ~ Michael Gove,
120:The prerequisite that people have a scientific or engineering degree or a medical degree limits the number of female astronauts. Right now, still, we have about 20 per cent of people who have that prerequisite who are female. So hey, girls: Embrace the very fun career of science and technology. Look at computer science. That's what I did. ~ Julie Payette,
121:The issues involved are sufficiently important that courses are now moving out of the philosophy departments and into mainstream computer science. And they affect everyone. Many of the students attracted to these courses are not technology majors, and many of the topics we discuss relate to ethical challenges that transcend the computer world. ~ D Michael Quinn,
122:Now, the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments. And that is, when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use. ~ Hal Abelson,
124:The burgeoning field of computer science has shifted our view of the physical world from that of a collection of interacting material particles to one of a seething network of information. In this way of looking at nature, the laws of physics are a form of software, or algorithm, while the material world-the hardware-plays the role of a gigantic computer. ~ Paul Davies,
125:I actually remember very specifically the night that I launched Facebook at Harvard. I used to go out to get pizza with a friend who I did all my computer science homework with. And I remember talking to him and saying I am so happy we have this at Harvard because now our community can be connected but one day someone is going to build this for the world. ~ Mark Zuckerberg,
126:However, a real implementation may still have to include code to handle the case where something happens that was assumed to be impossible, even if that handling boils down to printf("Sucks to be you") and exit(666) — i.e., letting a human operator clean up the mess [93]. (This is arguably the difference between computer science and software engineering.) ~ Martin Kleppmann,
127:My background, I really am a computer hacker. I've studied computer science, I work in computer security. I'm not actively a hacker, I'm an executive, but I understand the mindset of changing a system to get the outcome that you want. It turns out that to make the coffee, the problem is actually how the beans get turned into green coffee. That's where most of the problems happen. ~ Dave Asprey,
128:Throughout my academic career, I'd given some pretty good talks. But being considered the best speaker in the computer science department is like being known as the tallest of the Seven Dwarfs. And right then, I had the feeling that I had more in me, that if I gave it my all, I might be able to offer people something special. "Wisdom" is a strong word, but maybe that was it. ~ Randy Pausch,
129:Without real experience in using the computer to get useful results the computer science major is apt to know all about the marvelous tool except how to use it. Such a person is a mere technician, skilled in manipulating the tool but with little sense of how and when to use it for its basic purposes. ~ Richard Hamming, 1968 Turing Award lecture, Journal of the ACM 16 (1), January 1969, p. 6,
130:It's interesting that the greatest minds of computer science, the founding fathers, like Alan Turing and Claude Shannon and Norbert Wiener, they all looked at chess as the ultimate test. So they thought, "Oh, if a machine can play chess, and beat strong players, set aside a world champion, that would be the sign of a dawn of the AI era." With all due respect, they were wrong. ~ Garry Kasparov,
131:The best computer science students at Stanford were some of the best computer science students anywhere. Under Clark they gathered together into a new, potent force. ‘The difference was phenomenal, for me. I don’t know how many people around me noticed. But my God I noticed. The first manifestation was when all of these people started coming up and wanting to be part of my project.’ That ~ Michael Lewis,
132:What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it. —HERBERT SIMON, recipient of Nobel Memorial Prize in Economics8 and the A.M. Turing Award, the “Nobel Prize of Computer Science” ~ Timothy Ferriss,
133:You know that Estonia, based largely on how successful Skype was, built by Estonian developers, that was a tenth of the entire country's GDP when eBay bought it. That was like a decade ago, it was f****** Estonia, they were behind the Iron Curtain two decades earlier. They're now pushing for K-12 education in computer science in public schools. They've gotten the message. They know how much value that can bring. ~ Alexis Ohanian,
134:The C-list girls who just banded together to create their own little utopia. Those are the girls you want to be, it couldn’t be clearer in hindsight. Early anarchists. Badasses. They didn’t bother, exempted themselves, turned their backs and took up softball, computer science, gardening, poetry, sewing. Those are the ones with a shot at becoming fairly content happy/tough/certain/fulfilled/gray-haired grown women. An ~ Elisa Albert,
135:Computer science... differs from physics in that it is not actually a science. It does not study natural objects. Neither is it, as you might think, mathematics; although it does use mathematical reasoning pretty extensively. Rather, computer science is like engineering; it is all about getting something to do something, rather than just dealing with abstractions, as in the pre-Smith geology. ~ Richard Feynman, Feynman Lectures on Computation, 1970,
136:meantime, here is a list of degrees for five of the nerdiest writers:
J. STEWART BURNS - BS Mathematics, Harvard University; MS Mathematics, UC Berkeley
DAVID S. COHEN - BS Physics, Harvard University; MS Computer Science, UC Berkeley
AL JEAN - BS Mathematics, Harvard University
KEN KEELER - BS Applied Mathematics, Harvard University; PhD Applied Mathematics, Harvard University
JEFF WESTBROOK - BS Physics, Harvard University; PhD Computer Science, Princeton University
~ Simon Singh,
137:I had the opportunity, as a child, to grow up in a community center where I was exposed to theater, music, art, and computer science; things that I would have never had the opportunity to even meet had it not been for those people taking time out of their schedules, helping us as children to travel all over the world while sitting in a gymnasium. That's what I did before I was a musician, before I was a recording artist, I was a teacher and a community leader. ~ Erykah Badu,
138:We are lucky in the United States to have our liberal arts system. In most countries, if you go to university, you have to decide for all English literature or no literature, all philosophy or no philosophy. But we have a system that is one part general education and one part specialization. If your parents say you've got to major in computer science, you can do that. But you can also take general education courses in the humanities, and usually you have to. ~ Martha C Nussbaum,
139:When I studied computer science at Duke University in the first half of the 1980s, I had professors who treated women differently than men. I kind of got used to it. At Microsoft, I had to use my elbows and make sure I spoke up at the table, but it was an incredibly meritocratic place. Outside, in the industry, I would feel the sexism. I'd walk into a room and until I proved my worth, everyone would assume that the guy presenting with me had credibility and I didn't. ~ Melinda Gates,
140:So I think a humanities major who also did a lot of computer science, economics, psychology, or other sciences can be quite valuable and have great career flexibility,’’ Katz said. ‘‘But you need both, in my view, to maximize your potential. And an economics major or computer science major or biology or engineering or physics major who takes serious courses in the humanities and history also will be a much more valuable scientist, financial professional, economist or entrepreneur. ~ Anonymous,
141:Computer science is an empirical discipline. [...] Each new machine that is built is an experiment. Actually constructing the machine poses a question to nature; and we listen for the answer by observing the machine in operation and analyzing it by all analytical and measurement means available. Each new program that is built is an experiment. It poses a question to nature, and its behavior offers clues to an answer. ~ Allen Newell (1975) Computer Science as Empirical Inquiry: Symbols and Search. p. 114,
142:Because of its origins and guiding principles, symbolist machine learning is still closer to the rest of AI than the other schools. If computer science were a continent, symbolist learning would share a long border with knowledge engineering. Knowledge is traded in both directions— manually entered knowledge for use in learners, induced knowledge for addition to knowledge bases— but at the end of the day the rationalist-empiricist fault line runs right down that border, and crossing it is not easy. ~ Pedro Domingos,
143:Error processing is turning out to be one of the thorniest problems of modern computer science, and you can't afford to deal with it haphazardly. Some people have estimated that as much as 90 percent of a program's code is written for exceptional, error-processing cases or housekeeping, implying that only 10 percent is written for nominal cases (Shaw in Bentley 1982). With so much code dedicated to handling errors, a strategy for handling them consistently should be spelled out in the architecture. ~ Steve McConnell,
144:I did think about a Ph.D. in computer science, but this is a time in industry where theory and practice are coming together in amazing ways. Yes, there's money, but what really interests me is that private-sector innovation happens faster. You can get more done and on a larger scale and have more impact. With all the start-ups out there, I think this is a time like the Renaissance. Not just one person doing great work, but so many feeding off one another. If you lived then, wouldn't you go out and paint? ~ Allegra Goodman,
145:A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out. ~ Jaron Lanier,
146:Alan Mathison Turing OBE FRS (/ˈtjʊərɪŋ/; 23 June 1912 - 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst and theoretical biologist. He was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general purpose computer.[2][3][4] Turing is widely considered to be the father of theoretical computer science and artificial intelligence.[5]
   ~ Wikipedia,
147:We are unaware of them. As a result, many of our beliefs about how people behave—including beliefs about ourselves—are wrong. That is why we have the multiple social and behavioral sciences, with a good dash of mathematics, economics, computer science, information science, and neuroscience. Consider the following simple experiment. Do all three steps:        1.   Wiggle the second finger of your hand.        2.   Wiggle the third finger of the same hand.        3.   Describe what you did differently those two times. ~ Donald A Norman,
148:2 Metaphors for a Richer CC2E.COM/ 0278 Understanding of Software Development Contents 2.1 The Importance of Metaphors 2.2 How to Use Software Metaphors 2.3 Common Software Metaphors Related Topic Heuristics in design: “Design is a Heuristic Process” in Section 5.1. Computer science has some of the most colorful language of any field. In what other field can you walk into a sterile room, carefully controlled at 68°F, and find viruses, Trojan horses, worms, bugs, bombs, crashes, flames, twisted sex changers, and fatal errors ~ Anonymous,
150:adventure, one usually found me, and now I weave those tales into my stories. I am blessed to have written the bestselling Jack Stratton mystery series. The collection includes And Then She Was Gone, Girl Jacked, Jack Knifed, Jacks Are Wild, Jack and the Giant Killer, and Data Jack. My background is an eclectic mix of degrees in theatre, communications, and computer science. Currently I reside in Massachusetts with my lovely wife and two fantastic children. My wife, Katherine Greyson, who is my chief content editor, is an author of her own romance ~ Christopher Greyson,
151:My own feeling is this: mathematics and computer science are the two unnatural sciences, the things that are man-made. We get to set up the rules so it doesn't matter the way the universe works – we create our own universe... And I feel strongly that they are different, but I tried to convince Bill Thurston and he disagreed with me. My opinion though is I can feel rather strongly when I am wearing my mathematician's cloak versus when I am wearing my computer scientist's cap. ~ Donald Knuth (May 29, 2011). "All Questions Answered" by Donald Knuth. GoogleTechTalks. YouTube.,
152:It’s fairly intuitive that never exploring is no way to live. But it’s also worth mentioning that never exploiting can be every bit as bad. In the computer science definition, exploitation actually comes to characterize many of what we consider to be life’s best moments. A family gathering together on the holidays is exploitation. So is a bookworm settling into a reading chair with a hot cup of coffee and a beloved favorite, or a band playing their greatest hits to a crowd of adoring fans, or a couple that has stood the test of time dancing to “their song.” ~ Brian Christian,
153:One of the first people I interviewed was Alvy Ray Smith, a charismatic Texan with a Ph.D. in computer science and a sparkling resume that included teaching stints at New York University and UC Berkeley and a gig at Xerox PARC, the distinguished R&D lab in Palo Alto. I had conflicting feelings when I met Alvy because, frankly, he seemed more qualified to lead the lab than I was. I can still remember the uneasiness in my gut, that instinctual twinge spurred by a potential threat: This, I thought, could be the guy who takes my job one day. I hired him anyway. ~ Ed Catmull,
154:What is the central core of the subject [computer science]? What is it that distinguishes it from the separate subjects with which it is related? What is the linking thread which gathers these disparate branches into a single discipline. My answer to these questions is simple -it is the art of programming a computer. It is the art of designing efficient and elegant methods of getting a computer to solve problems, theoretical or practical, small or large, simple or complex. It is the art of translating this design into an effective and accurate computer program. ~ Tony Hoare,
155:In our opinion, most search engine optimization (SEO) is bullshit. It involves trying to read Google’s mind and then gaming the system to make Google find crap. There are three thousand computer science PhDs at Google trying to make each search relevant, and then there’s you trying to fool them. Who’s going to win? Tricking Google is futile. Instead, you should let Google do what it does best: find great content. So defy all the SEO witchcraft out there and focus on creating, curating, and sharing great content. This is what’s called SMO: social-media optimization ~ Guy Kawasaki,
156:This is because computer science has traditionally been all about thinking deterministically, but machine learning requires thinking statistically. If a rule for, say, labeling e-mails as spam is 99 percent accurate, that does not mean it’s buggy; it may be the best you can do and good enough to be useful. This difference in thinking is a large part of why Microsoft has had a lot more trouble catching up with Google than it did with Netscape. At the end of the day, a browser is just a standard piece of software, but a search engine requires a different mind-set. ~ Pedro Domingos,
157:Starting when computer technology first emerged during World War II and continuing into the 1960s, women made up most of the computing workforce. By 1970, however, women only accounted for 13.6% of bachelor's in computer science graduates. In 1984 that number rose to 37%, but it has since declined to 18% -- around the same time personal computers started showing up in homes. According to NPR, personal computers were marketed almost exclusively to men and families were more likely to buy computers for boys than girls. ~ Computerscience.org, “The Current State of Women in Computer Science”.,
158:In a book called Computer Power and Human Reason, a professor of computer science at MIT named Joseph Weizenbaum writes of a malady he calls “the compulsion to program.” He describes the afflicted as “bright young men of disheveled appearance, often with sunken, glowing eyes,” who play out “megalomaniacal fantasies of omnipotence” at computer consoles; they sit at their machines, he writes, “their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice. ~ Tracy Kidder,
159:I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. ~ Alan Perlis,
161:The percentage of women working in computer science-related professions has declined since the 1990s, dropping from 35% to 26% between 1990 and 2013. According to the American Association of University Women, we can reverse this trend by removing negative connotations around women in computer science. Educators and parents must work together to help girls maintain their confidence and curiosity in STEM subjects. Professional women already in the field can become mentors, while men can help create a more inclusive workplace. ~ Computerscience.org, “The Current State of Women in Computer Science”.,
162:Computer science research is different from these more traditional disciplines. Philosophically it differs from the physical sciences because it seeks not to discover, explain, or exploit the natural world, but instead to study the properties of machines of human creation. In this it is analogous to mathematics, and indeed the "science" part of computer science is, for the most part mathematical in spirit. But an inevitable aspect of computer science is the creation of computer programs: objects that, though intangible, are subject to commercial exchange. ~ Dennis Ritchie (1984) Reflections on Software Research.,
163:the groundbreakers in many sciences were devout believers. Witness the accomplishments of Nicolaus Copernicus (a priest) in astronomy, Blaise Pascal (a lay apologist) in mathematics, Gregor Mendel (a monk) in genetics, Louis Pasteur in biology, Antoine Lavoisier in chemistry, John von Neumann in computer science, and Enrico Fermi and Erwin Schrodinger in physics. That’s a short list, and it includes only Roman Catholics; a long list could continue for pages. A roster that included other believers—Protestants, Jews, and unconventional theists like Albert Einstein, Fred Hoyle, and Paul Davies—could fill a book. ~ Scott Hahn,
164:Indeed, one of my major complaints about the computer field is that whereas Newton could say, "If I have seen a little farther than others, it is because I have stood on the shoulders of giants," I am forced to say, "Today we stand on each other's feet." Perhaps the central problem we face in all of computer science is how we are to get to the situation where we build on top of the work of others rather than redoing so much of it in a trivially different way. Science is supposed to be cumulative, not almost endless duplication of the same kind of things. ~ Richard Hamming, 1968 Turing Award lecture, Journal of the ACM 16 (1), January 1969, p. 7,
165:Andy: ugh I’ve never felt so old and slimy.
Sinter: You’re 25. That isn’t old to a 19 year old
Andy: Old. And slimy. Slimy like a slug. Like seaweed.
Sinter: Are you done with your metaphors
Andy: I think these are similes

Though he’d been a computer science major, he’d also been, like me, an English minor. It made him remarkably hot at moments such as this.

Sinter: Right, you’re right
Andy: And no. There are many slimy things and I’m like them all. Slimy like mayo
Sinter: Gross
Andy: Exactly, I am gross
Sinter: Ha no, mayo is gross
Andy: Slimy like a dog’s tongue.
Sinter: Seriously stop. ~ Molly Ringle,
166:The mind is more difficult to comprehend than actions. Most of us start by believing we already understand both human behavior and the human mind. After all, we are all human: we have all lived with ourselves all of our lives, and we like to think we understand ourselves. But the truth is, we don’t. Most of human behavior is a result of subconscious processes. We are unaware of them. As a result, many of our beliefs about how people behave—including beliefs about ourselves—are wrong. That is why we have the multiple social and behavioral sciences, with a good dash of mathematics, economics, computer science, information science, and neuroscience. ~ Donald A Norman,
167:I returned to our surveillance. The houses around us reminded me of Ryan Kessler’s place. About every fifth one was, if not identical, then designed from the same mold. We were staring through bushes at a split-level colonial, on the other side of a dog-park-cum-playground. It was the house of Peter Yu, the part-time professor of computer science at Northern Virginia College and a software designer for Global Software Innovations. The company was headquartered along the Dulles “technology corridor,” which was really just a dozen office buildings on the tollway, housing corporations whose claim to tech fame was mostly that they were listed on the NASDAQ stock exchange. I ~ Jeffery Deaver,
168:[Computer science] is not really about computers -- and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes...and geometry isn't really about using surveying instruments. Now the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments: when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use. ~ Harold Abelson, Introductory lecture to Structure and Interpretation of Computer Programs,
169:Underlying our approach to this subject is our conviction that "computer science" is not a science and that its significance has little to do with computers. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology—the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects. Mathematics provides a framework for dealing precisely with notions of "what is". Computation provides a framework for dealing precisely with notions of "how to". ~ Harold Abelson,
170:Interviewer: Is studying computer science the best way to prepare to be a programmer? Bill Gates: No. The best way to prepare is to write programs, and to study great programs that other people have written. In my case, I went to the garbage cans at the Computer Science Center and I fished out listings of their operating system. You got to be willing to read other people's code, then write your own, then have other people review your code. You've got to want to be in this incredible feedback loop where you get the world-class people to tell you what you're doing wrong. ~ Bill Gates cited in: "Programmers at Work: Interviews With 19 Programmers Who Shaped the Computer Industry", Tempus, by Susan Lammers (Editor),
172:Livingston: Why did users like Viaweb? Graham: I think the main thing was that it was easy. Practically all the software in the world is either broken or very difficult to use. So users dread software. They've been trained that whenever they try to install something, or even fill out a form online, it's not going to work. I dread installing stuff, and I have a PhD in computer science. So if you're writing applications for end users, you have to remember that you're writing for an audience that has been traumatized by bad experiences. We worked hard to make Viaweb as easy as it could possibly be, and we had this confidence-building online demo where we walked people through using the software. That was what got us all the users. ~ Jessica Livingston,
173:During the last years of the 1950s, the terminology in the field of computing was discussed in the Communications of the ACM, and a number of terms for the practitioners of the field of computing were suggested: turingineer, turologist, flowcharts-man, applied meta-mathematician, applied epistemologist, comptologist, hypologist, and computologist. The corresponding names of the discipline were, for instance, comptology, hypology, and computology. Later Peter Naur suggested the terms datalogy, datamatics, and datamaton for the names of the field, its practitioners, and the machine, and recently George McKee suggested the term computics. None of these terms stuck... ~ Matti Tedre (2006). The Development of Computer Science: A Sociocultural Perspective. p. 260,
174:[Computers] are developing so rapidly that even computer scientists cannot keep up with them. It must be bewildering to most mathematicians and engineers... In spite of the diversity of the applications, the methods of attacking the difficult problems with computers show a great unity, and the name of Computer Sciences is being attached to the discipline as it emerges. It must be understood, however, that this is still a young field whose structure is still nebulous. The student will find a great many more problems than answers. ~ George Forsythe (1961) "Engineering students must learn both computing and mathematics". J. Eng. Educ. 52 (1961), p. 177. as cited in (Knuth, 1972) According to Donald Knuth in this quote Forsythe coined the term "computer science".,
175:Just three or four decades ago, if you wanted to access a thousand core processors, you’d need to be the chairman of MIT’s computer science department or the secretary of the US Defense Department. Today the average chip in your cell phone can perform about a billion calculations per second. Yet today has nothing on tomorrow. “By 2020, a chip with today’s processing power will cost about a penny,” CUNY theoretical physicist Michio Kaku explained in a recent article for Big Think,23 “which is the cost of scrap paper. . . . Children are going to look back and wonder how we could have possibly lived in such a meager world, much as when we think about how our own parents lacked the luxuries—cell phone, Internet—that we all seem to take for granted. ~ Peter H Diamandis,
176:Usenet bulletin-board posting, August 21, 1994: Well-capitalized start-up seeks extremely talented C/C++/Unix developers to help pioneer commerce on the Internet. You must have experience designing and building large and complex (yet maintainable) systems, and you should be able to do so in about one-third the time that most competent people think possible. You should have a BS, MS, or PhD in Computer Science or the equivalent. Top-notch communication skills are essential. Familiarity with web servers and HTML would be helpful but is not necessary. Expect talented, motivated, intense, and interesting co-workers. Must be willing to relocate to the Seattle area (we will help cover moving costs). Your compensation will include meaningful equity ownership. Send resume and cover letter to Jeff Bezos. ~ Brad Stone,
177:My son Aaron, who is a professor of computer science, encountered just such a careless signal when he was on the admissions committee at Carnegie Mellon University. One Ph.D. applicant submitted a passionate letter about why he wanted to study at CMU, writing that he regarded CMU as the best computer science department in the world, that the CMU faculty was best equipped to help him pursue his research interests, and so on. But the final sentence of the letter gave the game away: I will certainly attend CMU if adCMUted. It was proof that the applicant had merely taken the application letter he had written to MIT and done a search-and-replace with “CMU” . . . and hadn’t even taken the time to reread it! Had he done so, he would have noticed that every occurrence of those three letters had been replaced. ~ Alvin E Roth,
178:Throw in the valley’s rich history of computer science breakthroughs, and you’ve set the stage for the geeky-hippie hybrid ideology that has long defined Silicon Valley. Central to that ideology is a wide-eyed techno-optimism, a belief that every person and company can truly change the world through innovative thinking. Copying ideas or product features is frowned upon as a betrayal of the zeitgeist and an act that is beneath the moral code of a true entrepreneur. It’s all about “pure” innovation, creating a totally original product that generates what Steve Jobs called a “dent in the universe.” Startups that grow up in this kind of environment tend to be mission-driven. They start with a novel idea or idealistic goal, and they build a company around that. Company mission statements are clean and lofty, detached from earthly concerns or financial motivations. ~ Kai Fu Lee,
179:In terms of funding, Google dwarfs even its own government: U.S. federal funding for math and computer science research amounts to less than half of Google’s own R&D budget. That spending spree has bought Alphabet an outsized share of the world’s brightest AI minds. Of the top one hundred AI researchers and engineers, around half are already working for Google. The other half are distributed among the remaining Seven Giants, academia, and a handful of smaller startups. Microsoft and Facebook have soaked up substantial portions of this group, with Facebook bringing on superstar researchers like Yann LeCun. Of the Chinese giants, Baidu went into deep-learning research earliest—even trying to acquire Geoffrey Hinton’s startup in 2013 before being outbid by Google—and scored a major coup in 2014 when it recruited Andrew Ng to head up its Silicon Valley AI Lab. ~ Kai Fu Lee,
180:The spectacle's instruction and the spectators' ignorance are wrongly seen as antagonistic factors when in fact they give birth to each other. In the same way, the computer's binary language is an irresistible inducement to the continual and unreserved acceptance of what has been programmed according to the wishes of someone else and passes for the timeless source of a superior, impartial and total logic. Such progress, such speed, such breadth of vocabulary! Political? Social? Make your choice. You cannot have both. My own choice is inescapable. They are jeering at us, and we know whom these programs are for. Thus it is hardly surprising that children should enthusiastically start their education at an early age with the Absolute Knowledge of computer science; while they are still unable to read, for reading demands making judgements at every line; and is the only access to the wealth of pre-spectacular human experience. Conversation is almost dead, and soon so too will be those who knew how to speak. ~ Guy Debord,
181:Everyone is talking about what’s going on with sales, and Sergey was paying no attention, just pushing buttons on the AV system and trying to unscrew a panel to understand it,” says Levick. “And I remember thinking, this man does not give a rat’s ass about this part of the business. He doesn’t get what we do. He never will. That set the tone for me very early in terms of the two Googles—the engineering Google and this other Google, the sales and business side.” No matter how much you exceeded your sales quota, a salesperson wouldn’t be coddled as much as a guy with a computer science degree who spent all day creating code. And some tried-and-true sales methods were verboten. For instance, golf outings. “Larry and Sergey hate golf,” says Levick. “Google has never sponsored a golf event and never will.” There would be days when Google salespeople would call agencies and discover that everybody was off on a golf retreat with Yahoo. But Tim Armstrong would tell his troops, “They have to take people on golf outings because they have nothing else. ~ Steven Levy,
182:I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don't become missionaries. Don't feel as if you're Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don't feel as if the key to successful computing is only in your hands. What's in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more. ~ Alan Perlis,     Quoted in The Structure and Interpretation of Computer Programs by Hal Abelson, Gerald Jay Sussman and Julie Sussman (McGraw-Hill, 2nd edition, 1996).,
183:In the face of this difficulty [of defining "computer science"] many people, including myself at times, feel that we should ignore the discussion and get on with doing it. But as George Forsythe points out so well in a recent article*, it does matter what people in Washington D.C. think computer science is. According to him, they tend to feel that it is a part of applied mathematics and therefore turn to the mathematicians for advice in the granting of funds. And it is not greatly different elsewhere; in both industry and the universities you can often still see traces of where computing first started, whether in electrical engineering, physics, mathematics, or even business. Evidently the picture which people have of a subject can significantly affect its subsequent development. Therefore, although we cannot hope to settle the question definitively, we need frequently to examine and to air our views on what our subject is and should become. ~ Richard Hamming, 1968 Turing Award lecture, Journal of the ACM 16 (1), January 1969, p. 4. In this quote Hamming refers to George Forsythe, "What to do until the computer scientist comes", Am. Math. Monthly 75 (5), May 1968, p. 454-461.,
184:In the 21st century, intellectual capital is what will matter in the job market and will help a country grow its economy. Investments in biosciences, computers and electronics, engineering, and other growing high-tech industries have been the major differentiator in recent decades. More careers than ever now require technical skills so in order to be competitive in those fields, a nation must invest in STEM studies. Economic growth has slowed and unemployment rates have spiked, making employers much pickier about qualifications to hire. There is now an overabundance of liberal arts majors. A study from Georgetown University lists the five college majors with the highest unemployment rates (crossed against popularity): clinical psychology, 19.5 percent; miscellaneous fine arts, 16.2 percent; U.S. history, 15.1 percent; library science, 15 percent; and (tied for No. 5) military technologies and educational psychology, 10.9 percent each. Unemployment rates for STEM subjects hovered around 0 to 3 percent: astrophysics/astronomy, around 0 percent; geological and geophysics engineering, 0 percent; physical science, 2.5 percent; geosciences, 3.2 percent; and math/computer science, 3.5 percent. ~ Philip G Zimbardo,
185:If biology limited women’s ability to code, then the ratio of women to men in programming ought to be similar in other countries. It isn’t. In India, roughly 40 percent of the students studying computer science and related fields are women. This is despite even greater barriers to becoming a female coder there; India has such rigid gender roles that female college students often have an 8 p.m. curfew, meaning they can’t work late in the computer lab, as the social scientist Roli Varma learned when she studied them in 2015. The Indian women had one big cultural advantage over their American peers, though: They were far more likely to be encouraged by their parents to go into the field, Varma says. What’s more, the women regarded coding as a safer job because it kept them indoors, lessening their exposure to street-level sexual harassment. It was, in other words, considered normal in India that women would code. The picture has been similar in Malaysia, where in 2001 — precisely when the share of American women in computer science had slid into a trough — women represented 52 percent of the undergraduate computer-science majors and 39 percent of the Ph.D. candidates at the University of Malaya in Kuala Lumpur. ~ Clive Thompson, “The Secret History of Women in Coding”, The New York Times, (Feb. 13, 2019),
186:Even more controversial was Google’s insistence on relying on academic metrics for mature adults whose work experience would seem to make college admission test scores and GPAs moot. In her interview for Google’s top HR job, Stacy Sullivan, then age thirty-five, was shocked when Brin and Page asked for her SAT scores. At first she challenged the practice. “I don’t think you should ask something from when people were sixteen or seventeen years old,” she told them. But Page and Brin seemed to believe that Google needed those … data. They believed that SAT scores showed how smart you were. GPAs showed how hard you worked. The numbers told the story. It never failed to astound midcareer people when Google asked to exhume those old records. “You’ve got to be kidding,” said R. J. Pittman, thirty-nine years old at the time, to the recruiter who asked him to produce his SAT scores and GPA. He was a Silicon Valley veteran, and Google had been wooing him. “I was pretty certain I didn’t have a copy of my SATs, and you can’t get them after five years or something,” he says. “And they’re, ‘Well, can you try to remember, make a close guess?’ I’m like, ‘Are you really serious?’ And they were serious. They will ask you questions about a grade that you got in a particular computer science class in college: Was there any reason why that wasn’t an A? And you think, ‘What was I doing way back then? ~ Steven Levy,
187:The best entrepreneurs don’t just follow Moore’s Law; they anticipate it. Consider Reed Hastings, the cofounder and CEO of Netflix. When he started Netflix, his long-term vision was to provide television on demand, delivered via the Internet. But back in 1997, the technology simply wasn’t ready for his vision—remember, this was during the era of dial-up Internet access. One hour of high-definition video requires transmitting 40 GB of compressed data (over 400 GB without compression). A standard 28.8K modem from that era would have taken over four months to transmit a single episode of Stranger Things. However, there was a technological innovation that would allow Netflix to get partway to Hastings’s ultimate vision—the DVD. Hastings realized that movie DVDs, then selling for around $ 20, were both compact and durable. This made them perfect for running a movie-rental-by-mail business. Hastings has said that he got the idea from a computer science class in which one of the assignments was to calculate the bandwidth of a station wagon full of backup tapes driving across the country! This was truly a case of technological innovation enabling business model innovation. Blockbuster Video had built a successful business around buying VHS tapes for around $ 100 and renting them out from physical stores, but the bulky, expensive, fragile tapes would never have supported a rental-by-mail business. ~ Reid Hoffman,
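The modem arithmetic in the quote above is easy to verify. A minimal back-of-the-envelope sketch, assuming an ideal 28,800 bit/s line with no protocol overhead (the real transfer would be even slower):

```python
# Check the claim that a 28.8K modem would need "over four months"
# to move one 40 GB episode of compressed HD video.

def transfer_days(size_bytes: float, bits_per_second: float) -> float:
    """Days needed to move size_bytes over a link of the given speed."""
    return size_bytes * 8 / bits_per_second / 86_400  # 86,400 s per day

episode_bytes = 40e9   # one hour of compressed HD video, per the quote
modem_bps = 28_800     # a standard dial-up modem of the era

days = transfer_days(episode_bytes, modem_bps)
print(f"{days:.0f} days")  # about 129 days, i.e. over four months
```

The same function also makes the "station wagon full of backup tapes" point concrete: a few terabytes driven across the country in a week beats the modem's effective bandwidth by orders of magnitude.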
188:Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9 Based on this report, Congress established the National Science Foundation. ~ Walter Isaacson,
189:Where people were once dazzled to be online, now their expectations had soared, and they did not bother to hide their contempt for those who sought to curtail their freedom on the Web. Nobody was more despised than a computer science professor in his fifties named Fang Binxing. Fang had played a central role in designing the architecture of censorship, and the state media wrote admiringly of him as the “father of the Great Firewall.” But when Fang opened his own social media account, a user exhorted others, “Quick, throw bricks at Fang Binxing!” Another chimed in, “Enemies of the people will eventually face trial.” Censors removed the insults as fast as possible, but they couldn’t keep up, and the lacerating comments poured in. People called Fang a “eunuch” and a “running dog.” Someone Photoshopped his head onto a voodoo doll with a pin in its forehead. In digital terms, Fang had stepped into the hands of a frenzied mob. Less than three hours after Web users spotted him, the Father of the Great Firewall shut down his account and recoiled from the digital world that he had helped create. A few months later, in May 2011, Fang was lecturing at Wuhan University when a student threw an egg at him, followed by a shoe, hitting the professor in the chest. Teachers tried to detain the shoe thrower, a science student from a nearby college, but other students shielded him and led him to safety. He was instantly famous online. People offered him cash and vacations in Hong Kong and Singapore. A female blogger offered to sleep with him. ~ Evan Osnos,
190:I work in theoretical computer science: a field that doesn’t itself win Fields Medals (at least not yet), but that has occasions to use parts of math that have won Fields Medals. Of course, the stuff we use cutting-edge math for might itself be dismissed as “ivory tower self-indulgence.” Except then the cryptographers building the successors to Bitcoin, or the big-data or machine-learning people, turn out to want the stuff we were talking about at conferences 15 years ago—and we discover to our surprise that, just as the mathematicians gave us a higher platform to stand on, so we seem to have built a higher platform for the practitioners. The long road from Hilbert to Gödel to Turing and von Neumann to Eckert and Mauchly to Gates and Jobs is still open for traffic today.

Yes, there’s plenty of math that strikes even me as boutique scholasticism: a way to signal the brilliance of the people doing it, by solving problems that require years just to understand their statements, and whose “motivations” are about 5,000 steps removed from anything Caplan or Bostrom would recognize as motivation. But where I part ways is that there’s also math that looked to me like boutique scholasticism, until Greg Kuperberg or Ketan Mulmuley or someone else finally managed to explain it to me, and I said: “ah, so that’s why Mumford or Connes or Witten cared so much about this. It seems … almost like an ordinary applied engineering question, albeit one from the year 2130 or something, being impatiently studied by people a few moves ahead of everyone else in humanity’s chess game against reality. It will be pretty sweet once the rest of the world catches up to this. ~ Scott Aaronson,
191:Soon, I found myself criss-crossing the country with Steve, in what we called our “dog and pony show,” trying to drum up interest in our initial public offering. As we traveled from one investment house to another, Steve (in a costume he rarely wore: suit and tie) pushed to secure early commitments, while I added a professorial presence by donning, at Steve’s insistence, a tweed jacket with elbow patches. I was supposed to embody the image of what a “technical genius” looks like—though, frankly, I don’t know anyone in computer science who dresses that way. Steve, as pitch man, was on fire. Pixar was a movie studio the likes of which no one had ever seen, he said, built on a foundation of cutting-edge technology and original storytelling. We would go public one week after Toy Story opened, when no one would question that Pixar was for real. Steve turned out to be right. As our first movie broke records at the box office and as all our dreams seemed to be coming true, our initial public offering raised nearly $140 million for the company—the biggest IPO of 1995. And a few months later, as if on cue, Eisner called, saying that he wanted to renegotiate the deal and keep us as a partner. He accepted Steve’s offer of a 50/50 split. I was amazed; Steve had called this exactly right. His clarity and execution were stunning. For me, this moment was the culmination of such a lengthy series of pursuits, it was almost impossible to take in. I had spent twenty years inventing new technological tools, helping to found a company, and working hard to make all the facets of this company communicate and work well together. All of this had been in the service of a single goal: making a computer-animated feature film. And now, we’d not only done it; thanks to Steve, we were on steadier financial ground than we’d ever been before. For the first time since our founding, our jobs were safe. I ~ Ed Catmull,
192:So far this does not tell us anything very general about structure except that it is hierarchical. But we can say more. Each assembly or subassembly or part has a task to perform. If it did not it would not be there. Each therefore is a means to a purpose. Each therefore, by my earlier definition, is a technology. This means that the assemblies, subassemblies, and individual parts are all executables-are all technologies. It follows that a technology consists of building blocks that are technologies, which consist of further building blocks that are technologies, which consist of yet further building blocks that are technologies, with the pattern repeating all the way down to the fundamental level of elemental components. Technologies, in other words, have a recursive structure. They consist of technologies within technologies all the way down to the elemental parts.

Recursiveness will be the second principle we will be working with. It is not a very familiar concept outside mathematics, physics, and computer science, where it means that structures consist of components that are in some way similar to themselves. In our context of course it does not mean that a jet engine consists of systems and parts that are little jet engines. That would be absurd. It means simply that a jet engine (or more generally, any technology) consists of component building blocks that are also technologies, and these consist of sub-parts that are also technologies, in a repeating (or recurring) pattern.

Technologies, then, are built from a hierarchy of technologies, and this has implications for how we should think of them, as we will see shortly. It also means that whatever we can say in general about technologies (singular) must hold also for assemblies or subsystems at lower levels as well. In particular, because a technology consists of a main assembly and supporting assemblies, each assembly or subsystem must be organized this way too. ~ W Brian Arthur,
193:THE AUTHORS Neal Lathia is a research associate in the Computer Laboratory at the University of Cambridge. His research falls at the intersection of data mining, mobile systems, and personalization/recommender systems. Lathia has a PhD in computer science from University College London. Contact him at neal.lathia@cl.cam.ac.uk. Veljko Pejovic is a postdoctoral research fellow at the School of Computer Science at the University of Birmingham, UK. His research focuses on adaptive wireless technologies and their impact on society. Pejovic received a PhD in computer science from the University of California, Santa Barbara. Contact him at v.pejovic@cs.bham.ac.uk. Kiran K. Rachuri is a PhD student in the Computer Laboratory at the University of Cambridge. His research interests include smartphone sensing systems, energy-efficient sensing, and sensor networks. Rachuri received an MS in computer science from the Indian Institute of Technology Madras. Contact him at kiran.rachuri@cl.cam.ac.uk. Cecilia Mascolo is a reader in mobile systems in the Computer Laboratory at the University of Cambridge. Her interests are in the area of mobility modeling, sensing, and social network analysis. Mascolo has a PhD in computer science from the University of Bologna. Contact her at cecilia.mascolo@cl.cam.ac.uk. Mirco Musolesi is a senior lecturer in the School of Computer Science at the University of Birmingham, UK. His research interests include mobile sensing, large-scale data mining, and network science. Musolesi has a PhD in computer science from University College London. Contact him at m.musolesi@cs.bham.ac.uk. Peter J. Rentfrow is a senior lecturer in the Psychology Department at the University of Cambridge. His research focuses on behavioral manifestations of personality and psychological processes. Rentfrow earned a PhD in psychology from the University of Texas at Austin.
Contact him at pjr39@cam.ac.uk. Selected CS articles and columns are also available for free at http ~ Anonymous,
194:In fact, the same basic ingredients can easily be found in numerous start-up clusters in the United States and around the world: Austin, Boston, New York, Seattle, Shanghai, Bangalore, Istanbul, Stockholm, Tel Aviv, and Dubai. To discover the secret to Silicon Valley’s success, you need to look beyond the standard origin story. When people think of Silicon Valley, the first things that spring to mind—after the HBO television show, of course—are the names of famous start-ups and their equally glamorized founders: Apple, Google, Facebook; Jobs/ Wozniak, Page/ Brin, Zuckerberg. The success narrative of these hallowed names has become so universally familiar that people from countries around the world can tell it just as well as Sand Hill Road venture capitalists. It goes something like this: A brilliant entrepreneur discovers an incredible opportunity. After dropping out of college, he or she gathers a small team who are happy to work for equity, sets up shop in a humble garage, plays foosball, raises money from sage venture capitalists, and proceeds to change the world—after which, of course, the founders and early employees live happily ever after, using the wealth they’ve amassed to fund both a new generation of entrepreneurs and a set of eponymous buildings for Stanford University’s Computer Science Department. It’s an exciting and inspiring story. We get the appeal. There’s only one problem. It’s incomplete and deceptive in several important ways. First, while “Silicon Valley” and “start-ups” are used almost synonymously these days, only a tiny fraction of the world’s start-ups actually originate in Silicon Valley, and this fraction has been getting smaller as start-up knowledge spreads around the globe. Thanks to the Internet, entrepreneurs everywhere have access to the same information. 
Moreover, as other markets have matured, smart founders from around the globe are electing to build companies in start-up hubs in their home countries rather than immigrating to Silicon Valley. ~ Reid Hoffman,
195:a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.”28 This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.”29 It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. 
There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone? ~ Walter Isaacson,
196:The breakthrough came in the early 1980s, when Judea Pearl, a professor of computer science at the University of California, Los Angeles, invented a new representation: Bayesian networks. Pearl is one of the most distinguished computer scientists in the world, his methods having swept through machine learning, AI, and many other fields. He won the Turing Award, the Nobel Prize of computer science, in 2012. Pearl realized that it’s OK to have a complex network of dependencies among random variables, provided each variable depends directly on only a few others. We can represent these dependencies with a graph like the ones we saw for Markov chains and HMMs, except now the graph can have any structure (as long as the arrows don’t form closed loops). One of Pearl’s favorite examples is burglar alarms. The alarm at your house should go off if a burglar attempts to break in, but it could also be triggered by an earthquake. (In Los Angeles, where Pearl lives, earthquakes are almost as frequent as burglaries.) If you’re working late one night and your neighbor Bob calls to say he just heard your alarm go off, but your neighbor Claire doesn’t, should you call the police? Here’s the graph of dependencies: If there’s an arrow from one node to another in the graph, we say that the first node is a parent of the second. So Alarm’s parents are Burglary and Earthquake, and Alarm is the sole parent of Bob calls and Claire calls. A Bayesian network is a graph of dependencies like this, together with a table for each variable, giving its probability for each combination of values of its parents. For Burglary and Earthquake we only need one probability each, since they have no parents. For Alarm we need four: the probability that it goes off even if there’s no burglary or earthquake, the probability that it goes off if there’s a burglary and no earthquake, and so on. For Bob calls we need two probabilities (given alarm and given no alarm), and similarly for Claire. 
Here’s the crucial point: Bob calling depends on Burglary and Earthquake, but only through Alarm. Bob’s call is conditionally independent of Burglary and Earthquake given Alarm, and so is Claire’s. If the alarm doesn’t go off, your neighbors sleep soundly, and the burglar proceeds undisturbed. Also, Bob and Claire are independent given Alarm. Without this independence structure, you’d need to learn 2^5 = 32 probabilities, one for each possible state of the five variables. (Or 31, if you’re a stickler for details, since the last one can be left implicit.) With the conditional independencies, all you need is 1 + 1 + 4 + 2 + 2 = 10, a savings of 68 percent. And that’s just in this tiny example; with hundreds or thousands of variables, the savings would be very close to 100 percent. ~ Pedro Domingos,
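The parameter count in the quote above can be reproduced in a few lines. This is a minimal sketch: the dictionary encodes the parent structure of Pearl's burglar-alarm example as described, all five variables are assumed binary, and each node needs one probability per combination of its parents' values:

```python
# Parent structure of the burglar-alarm Bayesian network from the quote.
parents = {
    "Burglary": [],
    "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "BobCalls": ["Alarm"],
    "ClaireCalls": ["Alarm"],
}

def num_parameters(parents: dict) -> int:
    # Each binary node's conditional probability table has one entry
    # per combination of its parents' values: 2 ** (number of parents).
    return sum(2 ** len(p) for p in parents.values())

print(num_parameters(parents))  # 1 + 1 + 4 + 2 + 2 = 10
print(2 ** len(parents))        # 32 without the independence structure
```

The 68 percent savings in the quote is just 1 - 10/32; with hundreds of variables, the full joint table grows as 2^n while the factored form grows only with the sizes of the individual tables.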

IN CHAPTERS [0/0]









WORDNET



--- Overview of noun computer_science

The noun computer science has 1 sense (no senses from tagged texts)
              
1. computer science, computing ::: (the branch of engineering science that studies (with the aid of computers) computable processes and structures)


--- Synonyms/Hypernyms (Ordered by Estimated Frequency) of noun computer_science

1 sense of computer science                      

Sense 1
computer science, computing
   => engineering, engineering science, applied science, technology
     => discipline, subject, subject area, subject field, field, field of study, study, bailiwick
       => knowledge domain, knowledge base, domain
         => content, cognitive content, mental object
           => cognition, knowledge, noesis
             => psychological feature
               => abstraction, abstract entity
                 => entity


--- Hyponyms of noun computer_science

1 sense of computer science                      

Sense 1
computer science, computing
   => object
   => artificial intelligence, AI





--- Coordinate Terms (sisters) of noun computer_science

1 sense of computer science                      

Sense 1
computer science, computing
  -> engineering, engineering science, applied science, technology
   => aeronautical engineering
   => bionics
   => biotechnology, bioengineering, ergonomics
   => chemical engineering
   => civil engineering
   => electrical engineering, EE
   => computer science, computing
   => architectural engineering
   => industrial engineering, industrial management
   => information technology, IT
   => mechanical engineering
   => nanotechnology
   => nuclear engineering
   => naval engineering
   => rocketry




--- Grep of noun computer_science
computer science
department of computer science



IN WEBGEN [10000/585]

Wikipedia - 2-satisfiability -- Theoretical computer science problem
Wikipedia - 4D vector -- 4-component vector data type in computer science
Wikipedia - AAAI Squirrel AI Award -- American annual computer science prize
Wikipedia - Abstraction (computer science) -- Technique for arranging complexity of computer systems
Wikipedia - ACID (computer science)
Wikipedia - Advice (computer science)
Wikipedia - Affective computing -- Area of research in computer science aiming to understand the emotional state of users
Wikipedia - African American women in computer science
Wikipedia - African-American women in computer science
Wikipedia - Alan Turing Centenary Conference -- Computer Science Conference celebrating Alan Turing in his centenary year
Wikipedia - Algebraic semantics (computer science)
Wikipedia - Alphabet (computer science)
Wikipedia - Anti-unification (computer science)
Wikipedia - AP Computer Science Principles -- AP high school course in procedural programming and computer science concepts
Wikipedia - AP Computer Science -- Concept in Computer Science
Wikipedia - Architectural pattern (computer science)
Wikipedia - Argument (computer science)
Wikipedia - Arrow (computer science)
Wikipedia - Aspect (computer science)
Wikipedia - Assignment (computer science)
Wikipedia - Atomic (computer science)
Wikipedia - Backus-Naur form -- One of the two main notation techniques for context-free grammars in computer science
Wikipedia - Barrier (computer science)
Wikipedia - Behat (computer science)
Wikipedia - Book:Computer science
Wikipedia - Boxing (computer science)
Wikipedia - Branch (computer science)
Wikipedia - British Colloquium for Theoretical Computer Science
Wikipedia - Business entity (computer science)
Wikipedia - Callback (computer science)
Wikipedia - Cambridge Diploma in Computer Science
Wikipedia - Carnegie Mellon School of Computer Science
Wikipedia - Cast (computer science)
Wikipedia - Category:Bibliographic databases in computer science
Wikipedia - Category:Computer science articles needing attention
Wikipedia - Category:Computer science articles needing expert attention
Wikipedia - Category:Computer science articles without infoboxes
Wikipedia - Category:Computer science awards
Wikipedia - Category:Computer science award winners
Wikipedia - Category:Computer science books
Wikipedia - Category:Computer science competitions
Wikipedia - Category:Computer science education in the United Kingdom
Wikipedia - Category:Computer science education
Wikipedia - Category:Computer science educators
Wikipedia - Category:Computer science in India
Wikipedia - Category:Computer science in the Netherlands
Wikipedia - Category:Computer science journal stubs
Wikipedia - Category:Computer science journals
Wikipedia - Category:Computer science papers
Wikipedia - Category:Computer science-related professional associations
Wikipedia - Category:Computer science research organizations
Wikipedia - Category:Computer science stubs
Wikipedia - Category:Computer science
Wikipedia - Category:Computer science writers
Wikipedia - Category:Concurrency (computer science)
Wikipedia - Category:Department of Computer Science, University of Manchester
Wikipedia - Category:History of computer science
Wikipedia - Category:Israel Prize in computer sciences recipients
Wikipedia - Category:Logic in computer science
Wikipedia - Category:Members of the Department of Computer Science, University of Oxford
Wikipedia - Category:People associated with the Department of Computer Science, University of Manchester
Wikipedia - Category:Philosophy of computer science
Wikipedia - Category:Polymorphism (computer science)
Wikipedia - Category:Stanford University Department of Computer Science faculty
Wikipedia - Category:Subfields of computer science
Wikipedia - Category:Theoretical computer science stubs
Wikipedia - Category:Theoretical computer science
Wikipedia - Category:Unassessed Computer science articles
Wikipedia - Category:Unknown-importance Computer science articles
Wikipedia - Category:Unsolved problems in computer science
Wikipedia - Category:WikiProject Computer science articles
Wikipedia - Cellular automaton -- A discrete model studied in computer science
Wikipedia - Center for Discrete Mathematics and Theoretical Computer Science
Wikipedia - Chan-Jin Chung -- Computer science professor (born 1959)
Wikipedia - Channel system (computer science) -- Finite-state machine with fifo buffers for memory
Wikipedia - Circuit (computer science)
Wikipedia - Class (computer science)
Wikipedia - Classes (computer science)
Wikipedia - Closure (computer science)
Wikipedia - Codata (computer science)
Wikipedia - Cohesion (computer science)
Wikipedia - Collection (computer science)
Wikipedia - Collection of Computer Science Bibliographies
Wikipedia - College of Technology and Computer Science at East Carolina University
Wikipedia - Collision (computer science)
Wikipedia - Collision detection -- Term in computer science
Wikipedia - Computability theory (computer science)
Wikipedia - Computability theory -- Branch of mathematical logic, computer science, and the theory of computation studying computable functions and Turing degrees
Wikipedia - Computational geometry -- Branch of computer science
Wikipedia - Computational musicology -- Interdisciplinary research area between musicology and computer science
Wikipedia - Computer algebra -- Scientific area at the interface between computer science and mathematics
Wikipedia - Computer graphics (computer science) -- Sub-field of computer science
Wikipedia - Computer Science and Engineering
Wikipedia - Computer science and engineering -- University academic program
Wikipedia - Computer science education
Wikipedia - Computer Science Ontology
Wikipedia - Computer Science Press, Inc.
Wikipedia - Computer Science Press
Wikipedia - Computer Sciences Corporation
Wikipedia - Computer sciences
Wikipedia - Computer Science Teachers Association -- Professional association
Wikipedia - Computer science theory
Wikipedia - Computer Science Tripos
Wikipedia - Computer Science
Wikipedia - Computer science -- Study of the foundations and applications of computation
Wikipedia - Computer scientist -- Scientist specializing in computer science
Wikipedia - Conceptual model (computer science)
Wikipedia - Concern (computer science)
Wikipedia - Concurrency (computer science) -- Ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the final outcome
Wikipedia - Connascence (computer science)
Wikipedia - Consensus (computer science)
Wikipedia - Constant (computer science)
Wikipedia - Constructor (computer science)
Wikipedia - Correctness (computer science)
Wikipedia - Coupling (computer science)
Wikipedia - Covariance and contravariance (computer science)
Wikipedia - CS50 -- Computer science course
Wikipedia - Cynthia B. Lee -- Computer science lecturer at Stanford University
Wikipedia - Dana Angluin -- Professor of computer science
Wikipedia - David R. Cheriton School of Computer Science
Wikipedia - Decision problem -- Yes/no problem in computer science
Wikipedia - Declaration (computer science)
Wikipedia - Decomposition (computer science)
Wikipedia - Deforestation (computer science)
Wikipedia - Demetri Terzopoulos -- American professor of computer science
Wikipedia - Department of Computer Science and Technology, University of Cambridge
Wikipedia - Department of Computer Science of TU Darmstadt -- Department of Computer Science of the Technische Universität Darmstadt
Wikipedia - Department of Computer Science, University of Bristol
Wikipedia - Department of Computer Science, University of Illinois at Urbana-Champaign
Wikipedia - Department of Computer Science, University of Manchester
Wikipedia - Department of Computer Science, University of Oxford -- Department of the University of Oxford
Wikipedia - Department of Computer Science (University of Toronto)
Wikipedia - Dependency (computer science)
Wikipedia - Design pattern (computer science)
Wikipedia - Destructor (computer science)
Wikipedia - Deterministic parsing -- Parsing related to computer science
Wikipedia - Discrete Mathematics and Theoretical Computer Science
Wikipedia - Divergence (computer science)
Wikipedia - Donald Bren School of Information and Computer Sciences
Wikipedia - Dragon Book (computer science)
Wikipedia - ECSE (Academic Degree) -- Academic Degree in computer science
Wikipedia - Edit distance -- Computer science metric of string similarity
Wikipedia - Electrical Engineering and Computer Science
Wikipedia - Encapsulation (computer science)
Wikipedia - End-user (computer science)
Wikipedia - Enrique Alba -- Spanish computer science professor (born 1968)
Wikipedia - Enumerator (in theoretical computer science)
Wikipedia - Erik Demaine -- Professor of Computer Science
Wikipedia - European Association for Theoretical Computer Science
Wikipedia - Exception (computer science)
Wikipedia - Expression (computer science)
Wikipedia - Expressive power (computer science)
Wikipedia - Fahiem Bacchus -- Canadian professor of computer science
Wikipedia - Fiber (computer science) -- Lightweight thread of execution in the field of computer science
Wikipedia - Field (computer science)
Wikipedia - First-order logic -- Collection of formal systems used in mathematics, philosophy, linguistics, and computer science
Wikipedia - Flavors (computer science)
Wikipedia - Foundations and Trends in Theoretical Computer Science
Wikipedia - French Institute for Research in Computer Science and Automation
Wikipedia - Function composition (computer science)
Wikipedia - Function (computer science)
Wikipedia - Gadget (computer science)
Wikipedia - Garbage collection (computer science) -- Form of automatic memory management
Wikipedia - Garbage (computer science)
Wikipedia - Gates-Dell Complex -- Computer Science department at the University of Texas at Austin
Wikipedia - Generator (computer science)
Wikipedia - Genetic memory (computer science)
Wikipedia - Georgia Institute of Technology School of Computational Science & Engineering -- School of computer science in Atlanta, Georgia
Wikipedia - Georgia Institute of Technology School of Computer Science
Wikipedia - Georgia Tech Online Master of Science in Computer Science
Wikipedia - Glossary of computer science -- List of definitions of terms and concepts commonly used in computer science
Wikipedia - Grace Murray Hopper Award -- Computer science award
Wikipedia - Graph (computer science)
Wikipedia - Greedy algorithm -- Algorithmic approach that makes the locally optimal choice at each step
Wikipedia - Grigore Rosu -- Computer science professor
Wikipedia - Guard (computer science)
Wikipedia - Harlan Mills -- Computer science professor
Wikipedia - Heap (data structure) -- Computer science data structure
Wikipedia - Heuristic (computer science)
Wikipedia - History of computer science -- Aspect of history
Wikipedia - Hot spot (computer science)
Wikipedia - IEEE Symposium on Foundations of Computer Science
Wikipedia - IEEE Symposium on Logic in Computer Science
Wikipedia - IEEE Xplore -- Research database focused on computer science, electrical engineering, electronics, and allied fields
Wikipedia - Informatics -- Concept in computer science
Wikipedia - Information and Computer Science
Wikipedia - Information and computer science
Wikipedia - Inheritance (computer science)
Wikipedia - Input (computer science)
Wikipedia - Instance (computer science) -- Concrete manifestation of an object (class) in software development
Wikipedia - Institute for Theoretical Computer Science
Wikipedia - Institute of Computer Science
Wikipedia - Institution (computer science)
Wikipedia - Instruction (computer science)
Wikipedia - Integer (computer science) -- Datum of integral data type
Wikipedia - Interface (computer science)
Wikipedia - Interface (computing) -- Concept of computer science; point of interaction between two things
Wikipedia - International Computer Science Institute
Wikipedia - International Journal of Foundations of Computer Science
Wikipedia - Introspection (computer science)
Wikipedia - Invariant (computer science)
Wikipedia - Jennifer Widom -- University professor in Computer Science
Wikipedia - Joan Francioni -- American Professor of Computer Science
Wikipedia - Journal of Universal Computer Science
Wikipedia - Judith Gal-Ezer -- Israeli computer science professor
Wikipedia - Kernel (computer science)
Wikipedia - Label (computer science)
Wikipedia - Laboratory for Computer Science
Wikipedia - Laboratory for Foundations of Computer Science
Wikipedia - Language (computer science)
Wikipedia - Leaf subroutine -- Subroutines in computer science
Wikipedia - Lecture Notes in Computer Science
Wikipedia - Leonard Adleman -- American theoretical computer scientist and professor of computer science and molecular biology at the University of Southern California
Wikipedia - Levenshtein distance -- Computer science metric for string similarity
Wikipedia - Lexical analysis -- Conversion of character sequences into token sequences in computer science
Wikipedia - Library (computer science)
Wikipedia - List (abstract data type) -- Abstract data type used in computer science
Wikipedia - List (computer science)
Wikipedia - List of academic computer science departments -- Wikipedia list article
Wikipedia - List of computer science awards -- Wikipedia list article
Wikipedia - List of computer science conference acronyms -- Wikipedia list article
Wikipedia - List of computer science conferences -- Wikimedia list article
Wikipedia - List of computer science journals -- Wikipedia list article
Wikipedia - List of important publications in computer science -- Wikimedia list article
Wikipedia - List of important publications in theoretical computer science -- Wikipedia list article
Wikipedia - List of members of the National Academy of Engineering (Computer science)
Wikipedia - List of open problems in computer science
Wikipedia - List of pioneers in computer science -- Wikipedia list article
Wikipedia - List of unsolved problems in computer science -- Wikipedia list article
Wikipedia - Literal (computer science)
Wikipedia - Lixia Zhang -- Professor of Computer Science
Wikipedia - Liz Bacon -- Professor of computer science
Wikipedia - Lock (computer science)
Wikipedia - Logical Methods in Computer Science
Wikipedia - Logic in computer science -- Academic discipline
Wikipedia - Macro (computer science) -- In computer science, a concise representation of a pattern
Wikipedia - Map (computer science)
Wikipedia - Marshalling (computer science)
Wikipedia - Mary Kenneth Keller -- First American woman to receive a PhD in computer science
Wikipedia - Mathematical Foundations of Computer Science
Wikipedia - Max Planck Institute for Computer Science
Wikipedia - McGill University School of Computer Science
Wikipedia - Memory leak -- Computer science term
Wikipedia - Message (computer science)
Wikipedia - Meta-learning (computer science)
Wikipedia - Meta learning (computer science) -- Subfield of machine learning
Wikipedia - Method (computer science)
Wikipedia - Michael Backes -- German professor of computer science
Wikipedia - MIT Computer Science and Artificial Intelligence Laboratory -- CS and AI Laboratory at MIT
Wikipedia - MIT Electrical Engineering and Computer Science Department
Wikipedia - MIT Laboratory for Computer Science
Wikipedia - Monoculture (computer science)
Wikipedia - N. Asokan -- Professor of Computer Science at University of Waterloo
Wikipedia - Natural language processing -- Field of computer science and linguistics
Wikipedia - Node (computer science)
Wikipedia - Object (computer science)
Wikipedia - Offset (computer science)
Wikipedia - Omega (computer science)
Wikipedia - On the Cruelty of Really Teaching Computer Science
Wikipedia - Ontology (computer science)
Wikipedia - Ontology language (computer science)
Wikipedia - Open-shop scheduling -- Scheduling problem in computer science
Wikipedia - Operating Systems: Design and Implementation -- Computer science textbook
Wikipedia - Optimization (computer science)
Wikipedia - Order of operations -- In mathematics and computer science, order in which operations are performed
Wikipedia - Outline of computer science
Wikipedia - Outstanding Contribution to Computer Science Education
Wikipedia - Oxford University Department of Computer Science
Wikipedia - Paola Velardi -- Professor of computer science
Wikipedia - Parameter (computer science)
Wikipedia - Partial word -- Computer science string term
Wikipedia - Pat Hayes -- Computer science researcher in artificial intelligence
Wikipedia - Paxos (computer science)
Wikipedia - Persistence (computer science)
Wikipedia - Philosophy of computer science
Wikipedia - Pointer (computer science)
Wikipedia - Polling (computer science)
Wikipedia - Polymorphism (computer science)
Wikipedia - Pool (computer science) -- Collection of computer resources that are kept ready to use rather than acquired on use and released afterwards
Wikipedia - Portability (computer science)
Wikipedia - Precision (computer science)
Wikipedia - Prentice Hall International Series in Computer Science
Wikipedia - Priority queue -- Abstract data type in computer science
Wikipedia - Privilege (computer science)
Wikipedia - Procedure (computer science)
Wikipedia - Production (computer science) -- In computer science, a rewrite rule specifying a substitution that can be recursively performed to generate new sequences
Wikipedia - Production system (computer science)
Wikipedia - Profiler (computer science)
Wikipedia - Program analysis (computer science)
Wikipedia - Programming language theory -- Branch of computer science
Wikipedia - Protocol (computer science)
Wikipedia - Protocol Wars -- Computer science debate
Wikipedia - P versus NP problem -- Unsolved problem in computer science
Wikipedia - Question answering -- Computer science discipline
Wikipedia - Raft (computer science)
Wikipedia - Randy Pausch -- American professor of computer science, human-computer interaction and design
Wikipedia - Reachability analysis -- Solution to the reachability problem in distributed systems (computer science)
Wikipedia - Record (computer science) -- Information block that is part of a database (data row)
Wikipedia - Recursion (computer science)
Wikipedia - Reference (computer science)
Wikipedia - Reflection (computer science)
Wikipedia - Regius Professor of Computer Science
Wikipedia - Reification (computer science)
Wikipedia - Replication (computer science)
Wikipedia - Resource (computer science)
Wikipedia - Robustness (computer science)
Wikipedia - Roxana Geambasu -- Associate professor of Computer Science at Columbia University
Wikipedia - School of Electronics and Computer Science, University of Southampton
Wikipedia - Scope (computer science)
Wikipedia - Scottish Informatics and Computer Science Alliance
Wikipedia - Self-healing (computer science)
Wikipedia - Self-management (computer science)
Wikipedia - Semantics (computer science) -- The field concerned with the rigorous mathematical study of the meaning of programming languages
Wikipedia - Serge Belongie -- Professor of Computer Science
Wikipedia - Session (computer science)
Wikipedia - Set (computer science)
Wikipedia - Seymour Cray Computer Science and Engineering Award
Wikipedia - Shadow table -- Object in computer science used to improve the way machines, networks and programs handle information
Wikipedia - Sheila McIlraith -- Professor of Computer Science
Wikipedia - Side-effect (computer science)
Wikipedia - Side effect (computer science) -- Of a function, an additional effect besides returning a value
Wikipedia - SIGCSE Award for Lifetime Service to Computer Science Education
Wikipedia - SIGCSE Award for Outstanding Contribution to Computer Science Education
Wikipedia - SIGCSE Technical Symposium on Computer Science Education
Wikipedia - Signature (computer science)
Wikipedia - Solver (computer science)
Wikipedia - Stack (computer science)
Wikipedia - Starvation (computer science) -- Resource shortage in computers
Wikipedia - State (computer science) -- Remembered information in a computer system
Wikipedia - Statement (computer science)
Wikipedia - String (computer science) -- Sequence of characters, data type
Wikipedia - Structure and Interpretation of Computer Programs -- Computer science textbook
Wikipedia - Subclass (computer science)
Wikipedia - Superclass (computer science)
Wikipedia - Supinfo -- Private institution of higher education in general Computer Science
Wikipedia - Swap (computer science)
Wikipedia - Symposium on Foundations of Computer Science
Wikipedia - Symposium on Logic in Computer Science
Wikipedia - Symposium on Theoretical Aspects of Computer Science
Wikipedia - Synchronization (computer science) -- Concept in computer science, referring to processes, or data
Wikipedia - Technology transfer in computer science
Wikipedia - Template talk:Computer-science-journal-stub
Wikipedia - Template talk:Computer science
Wikipedia - Template talk:TopicTOC-Computer science
Wikipedia - Template talk:WikiProject Computer science
Wikipedia - The Collection of Computer Science Bibliographies
Wikipedia - The Journal of Supercomputing -- Academic computer science journal
Wikipedia - The Limoges Computer Sciences Engineering School -- French engineering school
Wikipedia - Theoretical Computer Science (journal)
Wikipedia - Theoretical Computer Science
Wikipedia - Theoretical computer science
Wikipedia - Theory of computation -- Academic subfield of computer science
Wikipedia - Thrashing (computer science) -- Computer constantly exchanging data between memory and storage leaving little capacity for productive processing
Wikipedia - Thread (computer science)
Wikipedia - Threads (computer science)
Wikipedia - Thundering herd problem -- Resource allocation problem in computer science
Wikipedia - Tiberiu Popoviciu High School of Computer Science -- High school
Wikipedia - Timothy Budd -- Associate professor of computer science at Oregon State University
Wikipedia - Toni Scullion -- Scottish computer science teacher
Wikipedia - Top type -- In mathematical logic and computer science, a type that contains all types as subtypes
Wikipedia - Toyota Technological Institute at Chicago -- Private college focused on computer science
Wikipedia - Trait (computer science)
Wikipedia - Traits (computer science)
Wikipedia - Tuple (computer science)
Wikipedia - Turing Award -- American annual computer science prize
Wikipedia - Type (computer science)
Wikipedia - UBC Department of Computer Science
Wikipedia - Ubiquitous computing -- Concept in software engineering and computer science
Wikipedia - UCPH Department of Computer Science -- Department at University of Copenhagen
Wikipedia - Unification (computer science)
Wikipedia - Union (computer science)
Wikipedia - University of Southampton School of Electronics and Computer Science
Wikipedia - Unsolved problems in computer science
Wikipedia - UNSW School of Computer Science and Engineering
Wikipedia - Upper ontology (computer science)
Wikipedia - Valeria De Antonellis -- Italian professor of computer science and engineering
Wikipedia - Value (computer science) -- Expression in computer science which cannot be evaluated further
Wikipedia - Variable (computer science)
Wikipedia - Variance (computer science)
Wikipedia - Vectored interrupt -- Processing technique in computer science
Wikipedia - Visibility (computer science)
Wikipedia - Vulnerability (computer science)
Wikipedia - Walls and Mirrors -- Computer science textbook
Wikipedia - Whitespace (computer science)
Wikipedia - Widening (computer science)
Wikipedia - Wikipedia talk:WikiProject Computer science
Wikipedia - Wikipedia:WikiProject Computer science -- Wikimedia subject-area collaboration
Wikipedia - William Gates Computer Science Building (Stanford)
Wikipedia - X-tree -- Index tree structure in computer science
Wikipedia - Yuying Li -- Chinese-Canadian professor of computer science
Wikipedia - Zombie (computer science)
Wikipedia - Zuse Institute Berlin -- Research institute for applied mathematics and computer science in Berlin, Germany
https://www.goodreads.com/book/show/112248.Selected_Papers_on_Computer_Science
https://www.goodreads.com/book/show/112268.Foundations_of_Computer_Science
https://www.goodreads.com/book/show/12413355-on-the-cruelty-of-teaching-computer-science
https://www.goodreads.com/book/show/14377896-computer-science
https://www.goodreads.com/book/show/1894064.Notes_on_Introductory_Combinatorics_Progress_in_Computer_Science_and_Applied_Logic
https://www.goodreads.com/book/show/20617928-connecting-with-computer-science
https://www.goodreads.com/book/show/26260896-computer-science
https://www.goodreads.com/book/show/9995070-computer-science-an-overview
Psychology Wiki - Computer_science
Stanford Encyclopedia of Philosophy - computer-science
https://en.wikiquote.org/wiki/Category:Computer_science
https://en.wikiquote.org/wiki/Computer_science
https://academia.fandom.com/wiki/Journal_of_Computer_Science_and_Software_Engineering
https://codelyoko.fandom.com/wiki/List_of_Computer_Science_Topics
https://computerscience.fandom.com/wiki/Computer-science_Wiki
https://computerscience.fandom.com/wiki/Computer-science_Wiki:Community_Portal
https://computerscience.fandom.com/wiki/Computer-science_Wiki:Wanted_Pages
https://engineering.fandom.com/wiki/Computer_science
https://memory-alpha.fandom.com/wiki/Computer_science
https://tardis.fandom.com/wiki/Computer_science
https://wiki.archlinux.org/index.php/List_of_applications#Computer_science
https://commons.wikimedia.org/wiki/Category:Computer_science
https://commons.wikimedia.org/wiki/Category:Computer_science_diagrams
Aarhus University Department of Computer Science
Abstraction (computer science)
Accounting method (computer science)
Adaptation (computer science)
African-American women in computer science
Al-Khawarizmi Institute of Computer Science
ANU College of Engineering and Computer Science
AP Computer Science
AP Computer Science A
AP Computer Science Principles
Arrow (computer science)
Assignment (computer science)
Bachelor of Computer Science
Bandelet (computer science)
Barrier (computer science)
Behat (computer science)
Book:Computer Science
Branch (computer science)
British Colloquium for Theoretical Computer Science
Cambridge Diploma in Computer Science
Channel system (computer science)
Circuit (computer science)
Coalescing (computer science)
Cohesion (computer science)
Collection of Computer Science Bibliographies
Collision (computer science)
Computer graphics (computer science)
Computer science
Computer science and engineering
Computer Science (journal)
Computer Science Ontology
Computer Sciences
Computer Sciences Corporation
Computer Science Undergraduate Association
Conceptual model (computer science)
Concern (computer science)
Concurrency (computer science)
Consensus (computer science)
Convolution (computer science)
Correctness (computer science)
Covariance and contravariance (computer science)
Dalhousie University Faculty of Computer Science
Decomposition (computer science)
Default (computer science)
Department of Computer Science and Technology, University of Cambridge
Department of Computer Science, FMPI, Comenius University
Department of Computer Science, Stony Brook University
Department of Computer Science, University of Illinois at Urbana-Champaign
Department of Computer Science, University of Manchester
Department of Computer Science, University of Oxford
Diploma in Computer Science
Discrete Mathematics & Theoretical Computer Science
Divergence (computer science)
Doctor of Computer Science
Dovetailing (computer science)
Draft:International Journal of Computer Science and Mobile Computing
Dr. A. Q. Khan Institute of Computer Sciences and Information Technology
Electronic Notes in Theoretical Computer Science
Electronic Proceedings in Theoretical Computer Science
Enumerator (computer science)
European Association for Theoretical Computer Science
Expression (computer science)
Expressive power (computer science)
Fiber (computer science)
Field (computer science)
Florida Atlantic University College of Engineering and Computer Science
Foundations and Trends in Theoretical Computer Science
French Institute for Research in Computer Science and Automation
Frontiers of Computer Science
Function composition (computer science)
Gadget (computer science)
Garbage collection (computer science)
Garbage (computer science)
Gates Computer Science Building, Stanford
Genetic improvement (computer science)
Georgia Tech Online Master of Science in Computer Science
Goal node (computer science)
Grigore Moisil National College of Computer Science (Brașov)
Guard (computer science)
Heuristic (computer science)
Hylomorphism (computer science)
INFOCOMP Journal of Computer Science
Information and computer science
Input (computer science)
Input enhancement (computer science)
Instance (computer science)
Institute for Computer Science and Control
Institute of Computer Science
Institute of Cryptography, Telecommunications and Computer Science
Integer (computer science)
International Journal of Applied Mathematics and Computer Science
International Journal of Foundations of Computer Science
International Journal of Mathematics and Computer Science
International Symposium on Mathematical Foundations of Computer Science
Journal of Universal Computer Science
Khoury College of Computer Sciences
Label (computer science)
Laboratory for Foundations of Computer Science
Lecture Notes in Computer Science
List of computer science conferences
List of important publications in computer science
List of important publications in theoretical computer science
List of pioneers in computer science
List of unsolved problems in computer science
Lock (computer science)
Logical Methods in Computer Science
Logic in computer science
Macaroons (computer science)
Macro (computer science)
Marshalling (computer science)
Meta learning (computer science)
MIT Computer Science and Artificial Intelligence Laboratory
Neumont College of Computer Science
Node (computer science)
Object (computer science)
Offset (computer science)
On the Cruelty of Really Teaching Computer Science
Outline of computer science
Overwriting (computer science)
Paxos (computer science)
Persistence (computer science)
Polling (computer science)
Polymorphism (computer science)
Pool (computer science)
Prentice Hall International Series in Computer Science
Production (computer science)
Raymond Lister (computer science researcher)
Record (computer science)
Recursion (computer science)
Reference (computer science)
Reification (computer science)
Research Institute for Advanced Computer Science
Research Institute of Computer Science and Random Systems
Robustness (computer science)
Scale factor (computer science)
Schedule (computer science)
Scope (computer science)
Self-management (computer science)
Semantics (computer science)
Session (computer science)
Side effect (computer science)
SIGCSE Award for Lifetime Service to the Computer Science Education Community
SIGCSE Award for Outstanding Contribution to Computer Science Education
Starvation (computer science)
Stream (computer science)
String (computer science)
SUHA (computer science)
Swedish Institute of Computer Science
Symposium on Foundations of Computer Science
Symposium on Logic in Computer Science
Symposium on Theoretical Aspects of Computer Science
Synchronization (computer science)
Theoretical computer science
Theoretical Computer Science (journal)
Thomas M. Siebel Center for Computer Science
Thrashing (computer science)
Transition (computer science)
Tudor Vianu National College of Computer Science
Turku Centre for Computer Science
UBC Department of Computer Science
UCPH Department of Computer Science
Unification (computer science)
University of Central Florida College of Engineering and Computer Science
University of Colorado Boulder Computer Science Department
University of Massachusetts Amherst College of Information and Computer Sciences
University of Toronto Department of Computer Science
UP Diliman Department of Computer Science
Value (computer science)
Vanish (computer science)
Variable (computer science)





last updated: 2022-05-07 20:28:02