classes ::: subject,
children :::
branches ::: Computer Science


object:Computer Science

--- Concepts
program flow

see also ::: programming


Computer Science
Essential Books of Computer Science

--- DICTIONARIES (in Dictionaries, in Quotes, in Chapters)

--- QUOTES [2 / 2 - 172 / 172] (in Dictionaries, in Quotes, in Chapters)

KEYS (10k)

   1 Wikipedia
   1 Harold Abelson


   9 Frederick Lenz
   7 Pedro Domingos
   7 Anonymous
   6 Donald Knuth
   4 Randy Pausch
   4 Edsger Dijkstra
   4 Brad Stone
   4 Bill Gates
   3 Melinda Gates
   3 Hal Abelson
   3 Guy Kawasaki
   3 Donald A Norman
   3 Alan Perlis
   3 Alan Kay
   2 Walter Isaacson
   2 Tracy Kidder
   2 Steven Levy
   2 Stan Kelly Bootle
   2 Reid Hoffman
   2 Malcolm Gladwell
   2 Kai Fu Lee
   2 Julie James
   2 Joe Manganiello
   2 Jaron Lanier
   2 Guy Debord
   2 Ed Catmull
   2 Anita Borg

1:Alan Mathison Turing OBE FRS (/ˈtjʊərɪŋ/; 23 June 1912 - 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst and theoretical biologist. He was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence. ~ Wikipedia,
2:[Computer science] is not really about computers -- and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes...and geometry isn't really about using surveying instruments. Now the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments: when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use. ~ Harold Abelson, Introductory lecture to Structure and Interpretation of Computer Programs ,

*** NEWFULLDB 2.4M ***

1:Computer Science is embarrassed by the computer. ~ Alan Perlis,
2:Trees sprout up just about everywhere in computer science. ~ Donald Knuth,
3:Computer science is the operating system for all innovation. ~ Steve Ballmer,
4:Science is to computer science as hydrodynamics is to plumbing. ~ Stan Kelly Bootle,
5:Computer science was then generally a subdepartment of electrical engineering, ~ Ellen Ullman,
6:All problems in Computer Science can be solved by another level of indirection. ~ Butler Lampson,
7:Computer science is no more about computers than astronomy is about telescopes. ~ Edsger Dijkstra,
8:Computer Science is no more about computers than astronomy is about telescopes ~ Edsger W Dijkstra,
9:Computer science really involves the same mindset, particularly artificial intelligence. ~ Frederick Lenz,
10:The first law of computer science: Every problem is solved by yet another indirection. ~ Bjarne Stroustrup,
11:Theoretical Computer Science is just as useless as everything we mathematicians do. ~ Jennifer Tour Chayes,
12:Computer science has as much to do with computers as astronomy has to do with telescopes. ~ Edsger Dijkstra,
13:As so often happens in computer science, we’re willing to sacrifice efficiency for generality. ~ Pedro Domingos,
14:When a professor insists computer science is X but not Y, have compassion for his graduate students. ~ Alan Perlis,
15:Computer science is one of the worst things that ever happened to either computers or to science. ~ Neil Gershenfeld,
16:Computer science departments have always considered 'user interface' research to be sissy work. ~ Nicholas Negroponte,
17:I shopped at J. Crew in high school, I studied computer science. I was a nerd-nerd, now I'm a music-nerd. ~ Mayer Hawthorne,
18:Computer Science is the only discipline in which we view adding a new wing to a building as being maintenance. ~ Jim Horning,
19:I have yet to see a career that is similar in benefit as computer science for doing the advanced exercises. ~ Frederick Lenz,
20:Software Engineering is that part of Computer Science which is too difficult for the Computer Scientist. ~ Friedrich L Bauer,
21:Until Systers came into existence, the notion of a global community of women in computer science did not exist. ~ Anita Borg,
22:I never took a computer science course in college, because then it was a thing you just learned on your own. ~ Mitchel Resnick,
23:Remember, there are only two hard problems in computer science: cache invalidation, naming, and off-by-one errors. ~ Anonymous,
24:The goal of Computer Science is to build something that will last at least until we've finished building it. ~ William C Brown,
25:What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik ~ James Gleick,
26:I am a professor at the computer science department, but I don't know how to use a computer, not even for Email. ~ Endre Szemeredi,
27:Buck’s girlfriend went by the porny name of Miracle though she had a master’s in computer science from Florida State. ~ Carl Hiaasen,
28:the best data scientists tend to be “hard scientists,” particularly physicists, rather than computer science majors. ~ Mike Loukides,
29:When people think about computer science, they imagine people with pocket protectors and thick glasses who code all night. ~ Marissa Mayer,
30:Both women and computer science are the losers when a geeky stereotype serves as an unnecessary gatekeeper to the profession. ~ Cordelia Fine,
31:But being considered the best speaker in a computer science department is like being known as the tallest of the Seven Dwarfs. ~ Randy Pausch,
32:I went to a school that's predominantly computer science and engineering. So, there's a real shortage of hot girls, let's say. ~ Joe Manganiello,
33:I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. ~ Alan Kay,
34:Coding is today's language of creativity. All our children deserve a chance to become creators instead of consumers of computer science. ~ Maria Klawe,
35:There's a good part of Computer Science that's like magic. Unfortunately there's a bad part of Computer Science that's like religion. ~ Hal Abelson,
36:computer science has traditionally been all about thinking deterministically, but machine learning requires thinking statistically. ~ Pedro Domingos,
37:I considered law and math. My Dad was a lawyer. I think though I would have ended up in physics if I didn't end up in computer science. ~ Bill Gates,
38:And they came to be included in a culture and community that placed the computer science engineer at the highest level of social status. ~ Alec J Ross,
39:Computer Science: A study akin to numerology and astrology, but lacking the precision of the former and the success of the latter. ~ Stan Kelly Bootle,
40:I don't know how many of you have ever met Dijkstra, but you probably know that arrogance in computer science is measured in nano-Dijkstras. ~ Alan Kay,
41:IBM veteran and computer science professor Frederick Brooks argued that adding manpower to complex software projects actually delayed progress. ~ Brad Stone,
42:I've been programming computers since elementary school, where they taught us, and I stuck with computer science through high school and college. ~ Masi Oka,
43:the most advanced computer science programs in the world, and over the course of the Computer Center’s life, thousands of students passed ~ Malcolm Gladwell,
44:Let’s give them credit,” Schmidt says. “The book guys got computer science, they figured out the analytics, and they built something significant. ~ Brad Stone,
45:For those who wish to stay and work in computer science or technology, fields badly in need of their services, let’s roll out the welcome mat. ~ Sheldon Adelson,
46:He asked the class how many of us were taking computer science, and everybody but me and this one girl who didn’t speak English raised their hands. ~ Ned Vizzini,
47:Computer science is to biology what calculus is to physics. It's the natural mathematical technique that best maps the character of the subject. ~ Harold Morowitz,
48:Computer science needs to be part of the core curriculum - like algebra, biology, physics, or chemistry. We need all schools to teach it, not just 10%. ~ Brad Feld,
49:I can't be as confident about computer science as I can about biology. Biology easily has 500 years of exciting problems to work on. It's at that level. ~ Donald Knuth,
50:When I was 19 years old, I wrote my first book. I took a computer science class, and the book was garbage. I thought I could write a better one, so I did. ~ Jim McKelvey,
51:But biology and computer science - life and computation - are related. I am confident that at their interface great discoveries await those who seek them. ~ Leonard Adleman,
52:As you study computer science you develop this wonderful mental acumen, particularly with relational databases, systems analysis, and artificial intelligence. ~ Frederick Lenz,
53:She shrugged noncommittally. “Not bad.”
Kyle scoffed. “Not bad? Counselor, there are two things I’ve got mad skills at: And computer science is the other one. ~ Julie James,
54:If we suppose that many natural phenomena are in effect computations, the study of computer science can tell us about the kinds of natural phenomena that can occur. ~ Rudy Rucker,
55:If you seek to develop the mind fully, for the enlightenment process, you will benefit if your career is related to computer science, law, medicine, or the arts. ~ Frederick Lenz,
56:Only in high school when I began programming computers, did I become interested in tech and start-ups, which led me to attend Stanford and major in Computer Science. ~ Clara Shih,
57:My hope is that in the future, women stop referring to themselves as 'the only woman' in their physics lab or 'only one of two' in their computer science jobs. ~ Kirsten Gillibrand,
58:Computer science is fascinating. As you study computer science, you will find that you develop your mind. It is literally like doing Buddhist exercises all day long. ~ Frederick Lenz,
59:Computer science doesn't know how to build complex systems that work reliably. This has been a well-understood problem since the very beginning of programmable computers. ~ Matt Blaze,
60:I had almost no background for the work in computer science, artificial intelligence, and cognitive psychology...Interdisciplinary adventure is easiest in new fields. ~ Herbert A Simon,
61:We're losing track of the vastness of the potential for computer science. We really have to revive the beautiful intellectual joy of it, as opposed to the business potential. ~ Jaron Lanier,
62:I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study. ~ Donald Knuth,
63:My background was computer science and business school, so eventually I worked my way up where I was running product groups - development, testing, marketing, user education. ~ Melinda Gates,
64:I recommend computer science to people who practice meditation. The mental structures that are used in computer science are very similar exercises done in Buddhist monasteries. ~ Frederick Lenz,
65:I recommend, for many people, the study of computer science. Our natural resource in America is the mind. The mindset in computer science is very similar to the mindset in Zen. ~ Frederick Lenz,
66:when I arrived at Stanford in 1985, economics, not computer science, was the most popular major. To most people on campus, the tech sector seemed idiosyncratic or even provincial. ~ Peter Thiel,
67:John von Neumann, one of the founding fathers of computer science, famously said that “with four parameters I can fit an elephant, and with five I can make him wiggle his trunk. ~ Pedro Domingos,
68:Computer science … jobs should be way more interesting than even going to Wall Street or being a lawyer--or, I can argue, than anything but perhaps biology, and there it's just a tie. ~ Bill Gates,
69:People think that computer science is the art of geniuses but the actual reality is the opposite, just many people doing things that build on each other, like a wall of mini stones. ~ Donald Knuth,
70:See, Berkeley has always drawn the nuts and flakes of the academic world. That's what happens when you have a university that offers degrees in both computer science and parapsychology. ~ Mira Grant,
71:The rise of Google, the rise of Facebook, the rise of Apple, I think are proof that there is a place for computer science as something that solves problems that people face every day. ~ Eric Schmidt,
72:Computer science is the most misunderstood field there is. You are being paid to solve puzzles. For a person who has practiced meditation in past lives, that is the way your mind works. ~ Frederick Lenz,
73:I decry the current tendency to seek patents on algorithms. There are better ways to earn a living than to prevent other people from making use of one's contributions to computer science. ~ Donald Knuth,
74:Too few people in computer science are aware of some of the informational challenges in biology and their implications for the world. We can store an incredible amount of data very cheaply. ~ Sergey Brin,
75:passed in those days for computer terminals. In 1971, this was state of the art. The University of Michigan had one of the most advanced computer science programs in the world, and over ~ Malcolm Gladwell,
76:Throughout my academic career, I'd given some pretty good talks. But being considered the best speaker in the computer science department is like being known as the tallest of the Seven Dwarfs. ~ Randy Pausch,
77:I remember that mathematicians were telling me in the 1960s that they would recognize computer science as a mature discipline when it had 1,000 deep algorithms. I think we've probably reached 500. ~ Donald Knuth,
78:really a hedge fund but a versatile technology laboratory full of innovators and talented engineers who could apply computer science to a variety of different problems.5 Investing was only the first ~ Brad Stone,
79:The training one receives when one becomes a technician, like a data scientist - we get trained in mathematics or computer science or statistics - is entirely separated from a discussion of ethics. ~ Cathy O Neil,
80:My undergraduate work was in computer science and economics. It just happened to be at that time when 34 percent of computer-science majors were women. We didn't realize it was at the peak at the time. ~ Melinda Gates,
81:Why is computer science a good field for women? For one thing, that's where the jobs are, and for another, the pay is better than for many jobs, and finally, it's easier to combine career and family. ~ Madeleine M Kunin,
82:Even when I was studying mathematics, physics, and computer science, it always seemed that the problem of consciousness was about the most interesting problem out there for science to come to grips with. ~ David Chalmers,
83:Persons with Disability (PWD), Ex-Serviceman (XSM), Kashmiri Migrant (KM). Please refer to the Norms for the same. There are 394 vacancies for the above position (200 Electronics, 120 Mechanical, 57 Computer Science, ~ Anonymous,
84:ProPublica’s technology reporter Jeff Larson joined the bunker in London. A computer science graduate, Larson knew his stuff. Using diagrams, he could explain the NSA’s complex data-mining programs – no mean feat. ~ Luke Harding,
85:Perhaps writers should never be allowed to get together in a workplace context. It's not like studying computer science, after all. The emotions are at large, and are shared and are questioned. There is a vulnerability. ~ Graham Joyce,
86:Starting early and getting girls on computers, tinkering and playing with technology, games and new tools, is extremely important for bridging the gender divide that exists now in computer science and in technology. ~ Beth Simone Noveck,
87:Harvard’s Leslie Valiant received the Turing Award, the Nobel Prize of computer science, for inventing this type of analysis, which he describes in his book entitled, appropriately enough, Probably Approximately Correct. ~ Pedro Domingos,
88:leading with computational thinking instead of code itself, and helping students imagine how being computer savvy could help them in any career, boosts the number of girls and kids of color taking—and sticking with—computer science. ~ Anonymous,
89:For years, computer scientists were treating operating systems design as sort of an open-research issue, when the field's direction had been decided by commercial operations. Computer science has become completely cut off from reality. ~ David Gelernter,
90:Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers and test tubes. Science is not about tools. It is about how we use them, and what we find out when we do. ~ Edsger Dijkstra,
91:The best way to prepare [to be a programmer] is to write programs, and to study great programs that other people have written. In my case, I went to the garbage cans at the Computer Science Center and I fished out listings of their operating systems. ~ Bill Gates,
92:I was never as focused in math, science, computer science, etcetera, as the people who were best at it. I wanted to create amazing screensavers that did beautiful visualizations of music. It's like, "Oh, I have to learn computer science to do that." ~ Kevin Systrom,
93:Every weekend the drama department would have parties. The 20 hot girls on campus? All of them were in the drama dept. So we'd have somebody standing guard at the door to keep all the computer science guys out. We had to guard our women at all times. ~ Joe Manganiello,
94:Computer science has some of the most colorful language of any field. In what other field can you walk into a sterile room, carefully controlled at 68°F, and find viruses, Trojan horses, worms, bugs, bombs, crashes, flames, twisted sex changers, and fatal errors? ~ Anonymous,
95:Drs. Margolis and Fisher have done a great service to education, computer science, and the culture at large. Unlocking the Clubhouse should be required reading for anyone and everyone who is concerned about the decreasing rate of women studying computer science. ~ Anita Borg,
96:I actually started off majoring in computer science, but I knew right away I wasn't going to stay with it. It was because I had this one professor who was the loneliest, saddest man I've ever known. He was a programmer, and I knew that I didn't want to do whatever he did. ~ J Cole,
97:I have met bright students in computer science who have never seen the source code of a large program. They may be good at writing small programs, but they can't begin to learn the different skills of writing large ones if they can't see how others have done it. ~ Richard Stallman,
98:[Computer science] is not really about computers and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes... and geometry isn't really about using surveying instruments. ~ Hal Abelson,
99:Computer science is not as old as physics; it lags by a couple of hundred years. However, this does not mean that there is significantly less on the computer scientist's plate than on the physicist's: younger it may be, but it has had a far more intense upbringing! ~ Richard P Feynman,
100:Two decades later, when I got my PhD in computer science from Carnegie Mellon, I thought that made me infinitely qualified to do anything, so I dashed off my letters of application to Walt Disney Imagineering. And they sent me the nicest go-to-hell letter I'd ever received. ~ Randy Pausch,
101:Daniel Dennett is our best current philosopher. He is the next Bertrand Russell. Unlike traditional philosophers, Dan is a student of neuroscience, linguistics, artificial intelligence, computer science, and psychology. He's redefining and reforming the role of the philosopher. ~ Marvin Minsky,
102:As the Era of Stagnation began, the Soviet scientific establishment lavished resources on the immediate priorities of the state—space exploration, water diversion, nuclear power—while emergent technologies, including computer science, genetics, and fiber optics, fell behind. ~ Adam Higginbotham,
103:I was lucky to get into computers when it was a very young and idealistic industry. There weren't many degrees offered in computer science, so people in computers were brilliant people from mathematics, physics, music, zoology, whatever. They loved it, and no one was really in it for the money. ~ Steve Jobs,
104:Atlantis was a highly evolved civilization where the sciences and arts were far more advanced than one might guess. Atlantis was technologically advanced in genetic engineering, computer science, inter-dimensional physics, and artistically developed with electronic music and crystal art forms. ~ Frederick Lenz,
105:Computer science only indicates the retrospective omnipotence of our technologies. In other words, an infinite capacity to process data (but only data -- i.e. the already given) and in no sense a new vision. With that science, we are entering an era of exhaustivity, which is also an era of exhaustion. ~ Jean Baudrillard,
106:It is hardly surprising that children should enthusiastically start their education at an early age with the Absolute Knowledge of computer science; while they are unable to read, for reading demands making judgments at every line. Conversation is almost dead, and soon so too will be those who knew how to speak. ~ Guy Debord,
107:I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification. ~ Alan Kay,
108:In our opinion, most search engine optimization (SEO) is bullshit. It involves trying to read Google’s mind and then gaming the system to make Google find crap. There are three thousand computer science PhDs at Google trying to make each search relevant, and then there’s you trying to fool them. Who’s going to win? ~ Guy Kawasaki,
109:If somebody is working on a new medicine, computer science helps us model those things. We have a whole group here in Seattle called the Institute for Disease Modelling that is a mix of computer science and math-type people, and the progress we're making in polio or plans for malaria or really driven by their deep insights. ~ Bill Gates,
110:[Though computer science is a fairly new discipline, it is predominantly based on the Cartesian world view. As Edsgar W. Dijkstra has pointed out] A scientific discipline emerges with the - usually rather slow! - discovery of which aspects can be meaningfully 'studied' in isolation for the sake of their own consistency. ~ Edsger Dijkstra,
111:One of the problems we've had is that the ICT curriculum in the past has been written for a subject that is changing all the time. I think that what we should have is computer science in the future - and how it fits in to the curriculum is something we need to be talking to scientists, to experts in coding and to young people about. ~ Michael Gove,
112:The prerequisite that people have a scientific or engineering degree or a medical degree limits the number of female astronauts. Right now, still, we have about 20 per cent of people who have that prerequisite who are female. So hey, girls: Embrace the very fun career of science and technology. Look at computer science. That's what I did. ~ Julie Payette,
113:The issues involved are sufficiently important that courses are now moving out of the philosophy departments and into mainstream computer science. And they affect everyone. Many of the students attracted to these courses are not technology majors, and many of the topics we discuss relate to ethical challenges that transcend the computer world. ~ D Michael Quinn,
114:Now, the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments. And that is, when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use. ~ Hal Abelson,
115:Most of human behavior is a result of subconscious processes. We are unaware of them. As a result, many of our beliefs about how people behave—including beliefs about ourselves—are wrong. That is why we have the multiple social and behavioral sciences, with a good dash of mathematics, economics, computer science, information science, and neuroscience. ~ Donald A Norman,
116:The burgeoning field of computer science has shifted our view of the physical world from that of a collection of interacting material particles to one of a seething network of information. In this way of looking at nature, the laws of physics are a form of software, or algorithm, while the material world-the hardware-plays the role of a gigantic computer. ~ Paul Davies,
117:I actually remember very specifically the night that I launched Facebook at Harvard. I used to go out to get pizza with a friend who I did all my computer science homework with. And I remember talking to him and saying I am so happy we have this at Harvard because now our community can be connected but one day someone is going to build this for the world. ~ Mark Zuckerberg,
118:However, a real implementation may still have to include code to handle the case where something happens that was assumed to be impossible, even if that handling boils down to printf("Sucks to be you") and exit(666) — i.e., letting a human operator clean up the mess [93]. (This is arguably the difference between computer science and software engineering.) ~ Martin Kleppmann,
119:My background, I really am a computer hacker. I've studied computer science, I work in computer security. I'm not actively a hacker, I'm an executive, but I understand the mindset of changing a system to get the outcome that you want. It turns out, to make the coffee, the problem is actually how the beans get turned into green coffee. That's where most of the problems happen. ~ Dave Asprey,
120:Throughout my academic career, I'd given some pretty good talks. But being considered the best speaker in the computer science department is like being known as the tallest of the Seven Dwarfs. And right then, I had the feeling that I had more in me, that if I gave it my all, I might be able to offer people something special. "Wisdom" is a strong word, but maybe that was it. ~ Randy Pausch,
121:It's interesting that the greatest minds of computer science, the founding fathers, like Alan Turing and Claude Shannon and Norbert Wiener, they all looked at chess as the ultimate test. So they thought, "Oh, if a machine can play chess, and beat strong players, set aside a world champion, that would be the sign of a dawn of the AI era." With all due respect, they were wrong. ~ Garry Kasparov,
122:The best computer science students at Stanford were some of the best computer science students anywhere. Under Clark they gathered together into a new, potent force. ‘The difference was phenomenal, for me. I don’t know how many people around me noticed. But my God I noticed. The first manifestation was when all of these people started coming up and wanting to be part of my project.’ That ~ Michael Lewis,
123:What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it. —HERBERT SIMON, recipient of Nobel Memorial Prize in Economics and the A.M. Turing Award, the “Nobel Prize of Computer Science ~ Timothy Ferriss,
124:You know that Estonia, based largely on how successful Skype was, built by Estonian developers, that was a tenth of the entire country's GDP when eBay bought it. That was like a decade ago, it was f****** Estonia, they were behind the Iron Curtain two decades earlier. They're now pushing for K-12 education in computer science in public schools. They've gotten the message. They know how much value that can bring. ~ Alexis Ohanian,
125:The C-list girls who just banded together to create their own little utopia. Those are the girls you want to be, it couldn’t be clearer in hindsight. Early anarchists. Badasses. They didn’t bother, exempted themselves, turned their backs and took up softball, computer science, gardening, poetry, sewing. Those are the ones with a shot at becoming fairly content happy/tough/certain/fulfilled/gray-haired grown women. An ~ Elisa Albert,
126:meantime, here is a list of degrees for five of the nerdiest writers: J. STEWART BURNS BS Mathematics, Harvard University MS Mathematics, UC Berkeley DAVID S. COHEN BS Physics, Harvard University MS Computer Science, UC Berkeley AL JEAN BS Mathematics, Harvard University KEN KEELER BS Applied Mathematics, Harvard University PhD Applied Mathematics, Harvard University JEFF WESTBROOK BS Physics, Harvard University PhD Computer Science, Princeton University ~ Simon Singh,
127:I had the opportunity, as a child, to grow up in a community center where I was exposed to theater, music, art, and computer science; things that I would have never had the opportunity to even meet had it not been for those people taking time out of their schedules, helping us as children to travel all over the world while sitting in a gymnasium. That's what I did before I was a musician, before I was a recording artist, I was a teacher and a community leader. ~ Erykah Badu,
128:We are lucky in the United States to have our liberal arts system. In most countries, if you go to university, you have to decide for all English literature or no literature, all philosophy or no philosophy. But we have a system that is one part general education and one part specialization. If your parents say you've got to major in computer science, you can do that. But you can also take general education courses in the humanities, and usually you have to. ~ Martha C Nussbaum,
129:When I studied computer science at Duke University in the first half of the 1980s, I had professors who treated women differently than men. I kind of got used to it. At Microsoft, I had to use my elbows and make sure I spoke up at the table, but it was an incredibly meritocratic place. Outside, in the industry, I would feel the sexism. I'd walk into a room and until I proved my worth, everyone would assume that the guy presenting with me had credibility and I didn't. ~ Melinda Gates,
130:So I think a humanities major who also did a lot of computer science, economics, psychology, or other sciences can be quite valuable and have great career flexibility,’’ Katz said. ‘‘But you need both, in my view, to maximize your potential. And an economics major or computer science major or biology or engineering or physics major who takes serious courses in the humanities and history also will be a much more valuable scientist, financial professional, economist or entrepreneur. ~ Anonymous,
131:Because of its origins and guiding principles, symbolist machine learning is still closer to the rest of AI than the other schools. If computer science were a continent, symbolist learning would share a long border with knowledge engineering. Knowledge is traded in both directions— manually entered knowledge for use in learners, induced knowledge for addition to knowledge bases— but at the end of the day the rationalist-empiricist fault line runs right down that border, and crossing it is not easy. ~ Pedro Domingos,
132:Error processing is turning out to be one of the thorniest problems of modern computer science, and you can't afford to deal with it haphazardly. Some people have estimated that as much as 90 percent of a program's code is written for exceptional, error-processing cases or housekeeping, implying that only 10 percent is written for nominal cases (Shaw in Bentley 1982). With so much code dedicated to handling errors, a strategy for handling them consistently should be spelled out in the architecture. ~ Steve McConnell,
133:I did think about a Ph.D. in computer science, but this is a time in industry where theory and practice are coming together in amazing ways. Yes, there's money, but what really interests me is that private-sector innovation happens faster. You can get more done and on a larger scale and have more impact. With all the start-ups out there, I think this is a time like the Renaissance. Not just one person doing great work, but so many feeding off one another. If you lived then, wouldn't you go out and paint? ~ Allegra Goodman,
134:A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out. ~ Jaron Lanier,
135:We are unaware of them. As a result, many of our beliefs about how people behave—including beliefs about ourselves—are wrong. That is why we have the multiple social and behavioral sciences, with a good dash of mathematics, economics, computer science, information science, and neuroscience. Consider the following simple experiment. Do all three steps:
1. Wiggle the second finger of your hand.
2. Wiggle the third finger of the same hand.
3. Describe what you did differently those two times. ~ Donald A Norman,
136:Computer science has some of the most colorful language of any field. In what other field can you walk into a sterile room, carefully controlled at 68°F, and find viruses, Trojan horses, worms, bugs, bombs, crashes, flames, twisted sex changers, and fatal errors? ~ Anonymous,
138:adventure, one usually found me, and now I weave those tales into my stories. I am blessed to have written the bestselling Jack Stratton mystery series. The collection includes And Then She Was Gone, Girl Jacked, Jack Knifed, Jacks Are Wild, Jack and the Giant Killer, and Data Jack. My background is an eclectic mix of degrees in theatre, communications, and computer science. Currently I reside in Massachusetts with my lovely wife and two fantastic children. My wife, Katherine Greyson, who is my chief content editor, is an author of her own romance ~ Christopher Greyson,
139:It’s fairly intuitive that never exploring is no way to live. But it’s also worth mentioning that never exploiting can be every bit as bad. In the computer science definition, exploitation actually comes to characterize many of what we consider to be life’s best moments. A family gathering together on the holidays is exploitation. So is a bookworm settling into a reading chair with a hot cup of coffee and a beloved favorite, or a band playing their greatest hits to a crowd of adoring fans, or a couple that has stood the test of time dancing to “their song.” ~ Brian Christian,
140:One of the first people I interviewed was Alvy Ray Smith, a charismatic Texan with a Ph.D. in computer science and a sparkling resume that included teaching stints at New York University and UC Berkeley and a gig at Xerox PARC, the distinguished R&D lab in Palo Alto. I had conflicting feelings when I met Alvy because, frankly, he seemed more qualified to lead the lab than I was. I can still remember the uneasiness in my gut, that instinctual twinge spurred by a potential threat: This, I thought, could be the guy who takes my job one day. I hired him anyway. ~ Ed Catmull,
141:What is the central core of the subject [computer science]? What is it that distinguishes it from the separate subjects with which it is related? What is the linking thread which gathers these disparate branches into a single discipline. My answer to these questions is simple -it is the art of programming a computer. It is the art of designing efficient and elegant methods of getting a computer to solve problems, theoretical or practical, small or large, simple or complex. It is the art of translating this design into an effective and accurate computer program. ~ Tony Hoare,
143:This is because computer science has traditionally been all about thinking deterministically, but machine learning requires thinking statistically. If a rule for, say, labeling e-mails as spam is 99 percent accurate, that does not mean it’s buggy; it may be the best you can do and good enough to be useful. This difference in thinking is a large part of why Microsoft has had a lot more trouble catching up with Google than it did with Netscape. At the end of the day, a browser is just a standard piece of software, but a search engine requires a different mind-set. ~ Pedro Domingos,
144:In a book called Computer Power and Human Reason, a professor of computer science at MIT named Joseph Weizenbaum writes of a malady he calls “the compulsion to program.” He describes the afflicted as “bright young men of disheveled appearance, often with sunken, glowing eyes,” who play out “megalomaniacal fantasies of omnipotence” at computer consoles; they sit at their machines, he writes, “their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice. ~ Tracy Kidder,
145:I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. ~ Alan Perlis,
146:40. Be Defiant In our opinion, most search engine optimization (SEO) is bullshit. It involves trying to read Google’s mind and then gaming the system to make Google find crap. There are three thousand computer science PhDs at Google trying to make each search relevant, and then there’s you trying to fool them. Who’s going to win? Tricking Google is futile. Instead, you should let Google do what it does best: find great content. So defy all the SEO witchcraft out there and focus on creating, curating, and sharing great content. This is what’s called SMO: social-media optimization. ~ Guy Kawasaki,
147:the groundbreakers in many sciences were devout believers. Witness the accomplishments of Nicolaus Copernicus (a priest) in astronomy, Blaise Pascal (a lay apologist) in mathematics, Gregor Mendel (a monk) in genetics, Louis Pasteur in biology, Antoine Lavoisier in chemistry, John von Neumann in computer science, and Enrico Fermi and Erwin Schrodinger in physics. That’s a short list, and it includes only Roman Catholics; a long list could continue for pages. A roster that included other believers—Protestants, Jews, and unconventional theists like Albert Einstein, Fred Hoyle, and Paul Davies—could fill a book. ~ Scott Hahn,
148:Andy: ugh I’ve never felt so old and slimy.
Sinter: You’re 25. That isn’t old to a 19 year old
Andy: Old. And slimy. Slimy like a slug. Like seaweed.
Sinter: Are you done with your metaphors
Andy: I think these are similes

Though he’d been a computer science major, he’d also been, like me, an English minor. It made him remarkably hot at moments such as this.

Sinter: Right, you’re right
Andy: And no. There are many slimy things and I’m like them all. Slimy like mayo
Sinter: Gross
Andy: Exactly, I am gross
Sinter: Ha no, mayo is gross
Andy: Slimy like a dog’s tongue.
Sinter: Seriously stop. ~ Molly Ringle,
149:The mind is more difficult to comprehend than actions. Most of us start by believing we already understand both human behavior and the human mind. After all, we are all human: we have all lived with ourselves all of our lives, and we like to think we understand ourselves. But the truth is, we don’t. Most of human behavior is a result of subconscious processes. We are unaware of them. As a result, many of our beliefs about how people behave—including beliefs about ourselves—are wrong. That is why we have the multiple social and behavioral sciences, with a good dash of mathematics, economics, computer science, information science, and neuroscience. ~ Donald A Norman,
150:I returned to our surveillance. The houses around us reminded me of Ryan Kessler’s place. About every fifth one was, if not identical, then designed from the same mold. We were staring through bushes at a split-level colonial, on the other side of a dog-park-cum-playground. It was the house of Peter Yu, the part-time professor of computer science at Northern Virginia College and a software designer for Global Software Innovations. The company was headquartered along the Dulles “technology corridor,” which was really just a dozen office buildings on the tollway, housing corporations whose claim to tech fame was mostly that they were listed on the NASDAQ stock exchange. ~ Jeffery Deaver,
151:Underlying our approach to this subject is our conviction that "computer science" is not a science and that its significance has little to do with computers. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology—the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects. Mathematics provides a framework for dealing precisely with notions of "what is". Computation provides a framework for dealing precisely with notions of "how to". ~ Harold Abelson,
152:Livingston: Why did users like Viaweb? Graham: I think the main thing was that it was easy. Practically all the software in the world is either broken or very difficult to use. So users dread software. They've been trained that whenever they try to install something, or even fill out a form online, it's not going to work. I dread installing stuff, and I have a PhD in computer science. So if you're writing applications for end users, you have to remember that you're writing for an audience that has been traumatized by bad experiences. We worked hard to make Viaweb as easy as it could possibly be, and we had this confidence-building online demo where we walked people through using the software. That was what got us all the users. ~ Jessica Livingston,
153:Just three or four decades ago, if you wanted to access a thousand core processors, you’d need to be the chairman of MIT’s computer science department or the secretary of the US Defense Department. Today the average chip in your cell phone can perform about a billion calculations per second. Yet today has nothing on tomorrow. “By 2020, a chip with today’s processing power will cost about a penny,” CUNY theoretical physicist Michio Kaku explained in a recent article for Big Think,23 “which is the cost of scrap paper. . . . Children are going to look back and wonder how we could have possibly lived in such a meager world, much as when we think about how our own parents lacked the luxuries—cell phone, Internet—that we all seem to take for granted. ~ Peter H Diamandis,
154:Usenet bulletin-board posting, August 21, 1994: Well-capitalized start-up seeks extremely talented C/C++/Unix developers to help pioneer commerce on the Internet. You must have experience designing and building large and complex (yet maintainable) systems, and you should be able to do so in about one-third the time that most competent people think possible. You should have a BS, MS, or PhD in Computer Science or the equivalent. Top-notch communication skills are essential. Familiarity with web servers and HTML would be helpful but is not necessary. Expect talented, motivated, intense, and interesting co-workers. Must be willing to relocate to the Seattle area (we will help cover moving costs). Your compensation will include meaningful equity ownership. Send resume and cover letter to Jeff Bezos. ~ Brad Stone,
155:My son Aaron, who is a professor of computer science, encountered just such a careless signal when he was on the admissions committee at Carnegie Mellon University. One Ph.D. applicant submitted a passionate letter about why he wanted to study at CMU, writing that he regarded CMU as the best computer science department in the world, that the CMU faculty was best equipped to help him pursue his research interests, and so on. But the final sentence of the letter gave the game away: I will certainly attend CMU if adCMUted. It was proof that the applicant had merely taken the application letter he had written to MIT and done a search-and-replace with “CMU” . . . and hadn’t even taken the time to reread it! Had he done so, he would have noticed that every occurrence of those three letters had been replaced. ~ Alvin E Roth,
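The search-and-replace slip in the letter above is easy to reproduce. A minimal sketch in Python (the surrounding sentence is my reconstruction for illustration, not the applicant's actual letter):

```python
import re

# The applicant's original MIT letter (wording assumed for illustration).
letter = "I will certainly attend MIT if admitted."

# A blind, case-insensitive replace of "MIT" with "CMU" also hits
# the "mit" hiding inside "admitted":
botched = re.sub("MIT", "CMU", letter, flags=re.IGNORECASE)
print(botched)  # I will certainly attend CMU if adCMUted.
```

A word-boundary pattern such as `r"\bMIT\b"` would have replaced only the standalone acronym and left "admitted" intact.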
156:Throw in the valley’s rich history of computer science breakthroughs, and you’ve set the stage for the geeky-hippie hybrid ideology that has long defined Silicon Valley. Central to that ideology is a wide-eyed techno-optimism, a belief that every person and company can truly change the world through innovative thinking. Copying ideas or product features is frowned upon as a betrayal of the zeitgeist and an act that is beneath the moral code of a true entrepreneur. It’s all about “pure” innovation, creating a totally original product that generates what Steve Jobs called a “dent in the universe.” Startups that grow up in this kind of environment tend to be mission-driven. They start with a novel idea or idealistic goal, and they build a company around that. Company mission statements are clean and lofty, detached from earthly concerns or financial motivations. ~ Kai Fu Lee,
157:In terms of funding, Google dwarfs even its own government: U.S. federal funding for math and computer science research amounts to less than half of Google’s own R&D budget. That spending spree has bought Alphabet an outsized share of the world’s brightest AI minds. Of the top one hundred AI researchers and engineers, around half are already working for Google. The other half are distributed among the remaining Seven Giants, academia, and a handful of smaller startups. Microsoft and Facebook have soaked up substantial portions of this group, with Facebook bringing on superstar researchers like Yann LeCun. Of the Chinese giants, Baidu went into deep-learning research earliest—even trying to acquire Geoffrey Hinton’s startup in 2013 before being outbid by Google—and scored a major coup in 2014 when it recruited Andrew Ng to head up its Silicon Valley AI Lab. ~ Kai Fu Lee,
158:The spectacle's instruction and the spectators' ignorance are wrongly seen as antagonistic factors when in fact they give birth to each other. In the same way, the computer's binary language is an irresistible inducement to the continual and unreserved acceptance of what has been programmed according to the wishes of someone else and passes for the timeless source of a superior, impartial and total logic. Such progress, such speed, such breadth of vocabulary! Political? Social? Make your choice. You cannot have both. My own choice is inescapable. They are jeering at us, and we know whom these programs are for. Thus it is hardly surprising that children should enthusiastically start their education at an early age with the Absolute Knowledge of computer science; while they are still unable to read, for reading demands making judgements at every line; and is the only access to the wealth of pre-spectacular human experience. Conversation is almost dead, and soon so too will be those who knew how to speak. ~ Guy Debord,
159:Everyone is talking about what’s going on with sales, and Sergey was paying no attention, just pushing buttons on the AV system and trying to unscrew a panel to understand it,” says Levick. “And I remember thinking, this man does not give a rat’s ass about this part of the business. He doesn’t get what we do. He never will. That set the tone for me very early in terms of the two Googles—the engineering Google and this other Google, the sales and business side.” No matter how much you exceeded your sales quota, a salesperson wouldn’t be coddled as much as a guy with a computer science degree who spent all day creating code. And some tried-and-true sales methods were verboten. For instance, golf outings. “Larry and Sergey hate golf,” says Levick. “Google has never sponsored a golf event and never will.” There would be days when Google salespeople would call agencies and discover that everybody was off on a golf retreat with Yahoo. But Tim Armstrong would tell his troops, “They have to take people on golf outings because they have nothing else. ~ Steven Levy,
160:In the 21st century, intellectual capital is what will matter in the job market and will help a country grow its economy. Investments in biosciences, computers and electronics, engineering, and other growing high-tech industries have been the major differentiator in recent decades. More careers than ever now require technical skills so in order to be competitive in those fields, a nation must invest in STEM studies. Economic growth has slowed and unemployment rates have spiked, making employers much pickier about qualifications to hire. There is now an overabundance of liberal arts majors. A study from Georgetown University lists the five college majors with the highest unemployment rates (crossed against popularity): clinical psychology, 19.5 percent; miscellaneous fine arts, 16.2 percent; U.S. history, 15.1 percent; library science, 15 percent; and (tied for No. 5) military technologies and educational psychology, 10.9 percent each. Unemployment rates for STEM subjects hovered around 0 to 3 percent: astrophysics/astronomy, around 0 percent; geological and geophysics engineering, 0 percent; physical science, 2.5 percent; geosciences, 3.2 percent; and math/computer science, 3.5 percent. ~ Philip G Zimbardo,
161:Even more controversial was Google’s insistence on relying on academic metrics for mature adults whose work experience would seem to make college admission test scores and GPAs moot. In her interview for Google’s top HR job, Stacy Sullivan, then age thirty-five, was shocked when Brin and Page asked for her SAT scores. At first she challenged the practice. “I don’t think you should ask something from when people were sixteen or seventeen years old,” she told them. But Page and Brin seemed to believe that Google needed those … data. They believed that SAT scores showed how smart you were. GPAs showed how hard you worked. The numbers told the story. It never failed to astound midcareer people when Google asked to exhume those old records. “You’ve got to be kidding,” said R. J. Pittman, thirty-nine years old at the time, to the recruiter who asked him to produce his SAT scores and GPA. He was a Silicon Valley veteran, and Google had been wooing him. “I was pretty certain I didn’t have a copy of my SATs, and you can’t get them after five years or something,” he says. “And they’re, ‘Well, can you try to remember, make a close guess?’ I’m like, ‘Are you really serious?’ And they were serious. They will ask you questions about a grade that you got in a particular computer science class in college: Was there any reason why that wasn’t an A? And you think, ‘What was I doing way back then? ~ Steven Levy,
162:The best entrepreneurs don’t just follow Moore’s Law; they anticipate it. Consider Reed Hastings, the cofounder and CEO of Netflix. When he started Netflix, his long-term vision was to provide television on demand, delivered via the Internet. But back in 1997, the technology simply wasn’t ready for his vision—remember, this was during the era of dial-up Internet access. One hour of high-definition video requires transmitting 40 GB of compressed data (over 400 GB without compression). A standard 28.8K modem from that era would have taken over four months to transmit a single episode of Stranger Things. However, there was a technological innovation that would allow Netflix to get partway to Hastings’s ultimate vision—the DVD. Hastings realized that movie DVDs, then selling for around $20, were both compact and durable. This made them perfect for running a movie-rental-by-mail business. Hastings has said that he got the idea from a computer science class in which one of the assignments was to calculate the bandwidth of a station wagon full of backup tapes driving across the country! This was truly a case of technological innovation enabling business model innovation. Blockbuster Video had built a successful business around buying VHS tapes for around $100 and renting them out from physical stores, but the bulky, expensive, fragile tapes would never have supported a rental-by-mail business. ~ Reid Hoffman,
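The modem arithmetic in the quote above is easy to check. A rough sketch, where the tape count and drive time for the station wagon are illustrative assumptions of mine, not figures from the text:

```python
def transfer_days(gigabytes: float, bits_per_second: float) -> float:
    """Days needed to move a payload of the given size at a given line rate."""
    bits = gigabytes * 1e9 * 8
    return bits / bits_per_second / 86_400  # 86,400 seconds per day

# One compressed HD episode (40 GB) over a 28.8 kbps modem:
days = transfer_days(40, 28_800)
print(f"{days:.0f} days (~{days / 30:.1f} months)")  # ≈ 129 days, over 4 months

# "Bandwidth" of a station wagon: assume 500 backup tapes of 10 GB each,
# driven across the country in 48 hours.
payload_bits = 500 * 10 * 1e9 * 8
wagon_bps = payload_bits / (48 * 3600)
print(f"{wagon_bps / 1e6:.0f} Mbit/s")  # ≈ 231 Mbit/s
```

Even with these modest assumptions, the station wagon beats the modem by four orders of magnitude, which is the point of the classic exercise.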
163:Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9 Based on this report, Congress established the National Science Foundation. ~ Walter Isaacson,
164:Where people were once dazzled to be online, now their expectations had soared, and they did not bother to hide their contempt for those who sought to curtail their freedom on the Web. Nobody was more despised than a computer science professor in his fifties named Fang Binxing. Fang had played a central role in designing the architecture of censorship, and the state media wrote admiringly of him as the “father of the Great Firewall.” But when Fang opened his own social media account, a user exhorted others, “Quick, throw bricks at Fang Binxing!” Another chimed in, “Enemies of the people will eventually face trial.” Censors removed the insults as fast as possible, but they couldn’t keep up, and the lacerating comments poured in. People called Fang a “eunuch” and a “running dog.” Someone Photoshopped his head onto a voodoo doll with a pin in its forehead. In digital terms, Fang had stepped into the hands of a frenzied mob. Less than three hours after Web users spotted him, the Father of the Great Firewall shut down his account and recoiled from the digital world that he had helped create. A few months later, in May 2011, Fang was lecturing at Wuhan University when a student threw an egg at him, followed by a shoe, hitting the professor in the chest. Teachers tried to detain the shoe thrower, a science student from a nearby college, but other students shielded him and led him to safety. He was instantly famous online. People offered him cash and vacations in Hong Kong and Singapore. A female blogger offered to sleep with him. ~ Evan Osnos,
165:I work in theoretical computer science: a field that doesn’t itself win Fields Medals (at least not yet), but that has occasions to use parts of math that have won Fields Medals. Of course, the stuff we use cutting-edge math for might itself be dismissed as “ivory tower self-indulgence.” Except then the cryptographers building the successors to Bitcoin, or the big-data or machine-learning people, turn out to want the stuff we were talking about at conferences 15 years ago—and we discover to our surprise that, just as the mathematicians gave us a higher platform to stand on, so we seem to have built a higher platform for the practitioners. The long road from Hilbert to Gödel to Turing and von Neumann to Eckert and Mauchly to Gates and Jobs is still open for traffic today.

Yes, there’s plenty of math that strikes even me as boutique scholasticism: a way to signal the brilliance of the people doing it, by solving problems that require years just to understand their statements, and whose “motivations” are about 5,000 steps removed from anything Caplan or Bostrom would recognize as motivation. But where I part ways is that there’s also math that looked to me like boutique scholasticism, until Greg Kuperberg or Ketan Mulmuley or someone else finally managed to explain it to me, and I said: “ah, so that’s why Mumford or Connes or Witten cared so much about this. It seems … almost like an ordinary applied engineering question, albeit one from the year 2130 or something, being impatiently studied by people a few moves ahead of everyone else in humanity’s chess game against reality. It will be pretty sweet once the rest of the world catches up to this. ~ Scott Aaronson,
166:Soon, I found myself criss-crossing the country with Steve, in what we called our “dog and pony show,” trying to drum up interest in our initial public offering. As we traveled from one investment house to another, Steve (in a costume he rarely wore: suit and tie) pushed to secure early commitments, while I added a professorial presence by donning, at Steve’s insistence, a tweed jacket with elbow patches. I was supposed to embody the image of what a “technical genius” looks like—though, frankly, I don’t know anyone in computer science who dresses that way. Steve, as pitch man, was on fire. Pixar was a movie studio the likes of which no one had ever seen, he said, built on a foundation of cutting-edge technology and original storytelling. We would go public one week after Toy Story opened, when no one would question that Pixar was for real. Steve turned out to be right. As our first movie broke records at the box office and as all our dreams seemed to be coming true, our initial public offering raised nearly $140 million for the company—the biggest IPO of 1995. And a few months later, as if on cue, Eisner called, saying that he wanted to renegotiate the deal and keep us as a partner. He accepted Steve’s offer of a 50/50 split. I was amazed; Steve had called this exactly right. His clarity and execution were stunning. For me, this moment was the culmination of such a lengthy series of pursuits, it was almost impossible to take in. I had spent twenty years inventing new technological tools, helping to found a company, and working hard to make all the facets of this company communicate and work well together. All of this had been in the service of a single goal: making a computer-animated feature film. And now, we’d not only done it; thanks to Steve, we were on steadier financial ground than we’d ever been before. For the first time since our founding, our jobs were safe. ~ Ed Catmull,
167:So far this does not tell us anything very general about structure except that it is hierarchical. But we can say more. Each assembly or subassembly or part has a task to perform. If it did not it would not be there. Each therefore is a means to a purpose. Each therefore, by my earlier definition, is a technology. This means that the assemblies, subassemblies, and individual parts are all executables-are all technologies. It follows that a technology consists of building blocks that are technologies, which consist of further building blocks that are technologies, which consist of yet further building blocks that are technologies, with the pattern repeating all the way down to the fundamental level of elemental components. Technologies, in other words, have a recursive structure. They consist of technologies within technologies all the way down to the elemental parts.

Recursiveness will be the second principle we will be working with. It is not a very familiar concept outside mathematics, physics, and computer science, where it means that structures consist of components that are in some way similar to themselves. In our context of course it does not mean that a jet engine consists of systems and parts that are little jet engines. That would be absurd. It means simply that a jet engine (or more generally, any technology) consists of component building blocks that are also technologies, and these consist of sub-parts that are also technologies, in a repeating (or recurring) pattern.

Technologies, then, are built from a hierarchy of technologies, and this has implications for how we should think of them, as we will see shortly. It also means that whatever we can say in general about technologies-singular must hold also for assemblies or subsystems at lower levels as well. In particular, because a technology consists of main assembly and supporting assemblies, each assembly or subsystem must be organized this way too. ~ W Brian Arthur,
168:THE AUTHORS Neal Lathia is a research associate in the Computer Laboratory at the University of Cambridge. His research falls at the intersection of data mining, mobile systems, and personalization/recommender systems. Lathia has a PhD in computer science from University College London. Contact him at neal.lathia@ Veljko Pejovic is a postdoctoral research fellow at the School of Computer Science at the University of Birmingham, UK. His research focuses on adaptive wireless technologies and their impact on society. Pejovic received a PhD in computer science from the University of California, Santa Barbara. Contact him at Kiran K. Rachuri is a PhD student in the Computer Laboratory at the University of Cambridge. His research interests include smartphone sensing systems, energy-efficient sensing, and sensor networks. Rachuri received an MS in computer science from the Indian Institute of Technology Madras. Contact him at Cecilia Mascolo is a reader in mobile systems in the Computer Laboratory at the University of Cambridge. Her interests are in the area of mobility modeling, sensing, and social network analysis. Mascolo has a PhD in computer science from the University of Bologna. Contact her at Mirco Musolesi is a senior lecturer in the School of Computer Science at the University of Birmingham, UK. His research interests include mobile sensing, large-scale data mining, and network science. Musolesi has a PhD in computer science from University College London. Contact him at m.musolesi@ Peter J. Rentfrow is a senior lecturer in the Psychology Department at the University of Cambridge. His research focuses on behavioral manifestations of personality and psychological processes. Rentfrow earned a PhD in psychology from the University of Texas at Austin. Contact him at CS articles and columns are also available for free at http ~ Anonymous,
169:In fact, the same basic ingredients can easily be found in numerous start-up clusters in the United States and around the world: Austin, Boston, New York, Seattle, Shanghai, Bangalore, Istanbul, Stockholm, Tel Aviv, and Dubai. To discover the secret to Silicon Valley’s success, you need to look beyond the standard origin story. When people think of Silicon Valley, the first things that spring to mind—after the HBO television show, of course—are the names of famous start-ups and their equally glamorized founders: Apple, Google, Facebook; Jobs/Wozniak, Page/Brin, Zuckerberg. The success narrative of these hallowed names has become so universally familiar that people from countries around the world can tell it just as well as Sand Hill Road venture capitalists. It goes something like this: A brilliant entrepreneur discovers an incredible opportunity. After dropping out of college, he or she gathers a small team who are happy to work for equity, sets up shop in a humble garage, plays foosball, raises money from sage venture capitalists, and proceeds to change the world—after which, of course, the founders and early employees live happily ever after, using the wealth they’ve amassed to fund both a new generation of entrepreneurs and a set of eponymous buildings for Stanford University’s Computer Science Department. It’s an exciting and inspiring story. We get the appeal. There’s only one problem. It’s incomplete and deceptive in several important ways. First, while “Silicon Valley” and “start-ups” are used almost synonymously these days, only a tiny fraction of the world’s start-ups actually originate in Silicon Valley, and this fraction has been getting smaller as start-up knowledge spreads around the globe. Thanks to the Internet, entrepreneurs everywhere have access to the same information.
Moreover, as other markets have matured, smart founders from around the globe are electing to build companies in start-up hubs in their home countries rather than immigrating to Silicon Valley. ~ Reid Hoffman,
170:But come on—tell me the proposal story, anyway.”

She raised an eyebrow. “Really?”

“Really. Just keep in mind that I’m a guy, which means I’m genetically predisposed to think that whatever mushy romantic tale you’re about to tell me is highly cheesy.”

Rylann laughed. “I’ll keep it simple, then.” She rested her drink on the table. “Well, you already heard how Kyle picked me up at the courthouse after my trial. He said he wanted to surprise me with a vacation because I’d been working so hard, but that we needed to drive to Champaign first to meet with his former mentor, the head of the U of I Department of Computer Sciences, to discuss some project Kyle was working on for a client.” She held up a sparkly hand, nearly blinding Cade and probably half of the other Starbucks patrons. “In hindsight, yes, that sounds a little fishy, but what do I know about all this network security stuff? He had his laptop out, there was some talk about malicious payloads and Trojan horse attacks—it all sounded legitimate enough at the time.”

“Remind me, while I’m acting U.S. attorney, not to assign you to any cybercrime cases.”

“Anyhow. . . we get to Champaign, which, as it so happens, is where Kyle and I first met ten years ago. And the limo turns onto the street where I used to live while in law school, and Kyle asks the driver to pull over because he wants to see the place for old times’ sake. So we get out of the limo, and he’s making this big speech about the night we met and how he walked me home on the very sidewalk we were standing on—I’ll fast-forward here in light of your aversion to the mushy stuff—and I’m laughing to myself because, well, we’re standing on the wrong side of the street. So naturally, I point that out, and he tells me that nope, I’m wrong, because he remembers everything about that night, so to prove my point I walk across the street to show him and”—she paused here—“and I see a jewelry box, sitting on the sidewalk, in the exact spot where we had our first kiss. Then I turn around and see Kyle down on one knee.”

She waved her hand, her eyes a little misty. “So there you go. The whole mushy, cheesy tale. Gag away.”

Cade picked up his coffee cup and took a sip. “That was actually pretty smooth.”

Rylann grinned. “I know. Former cyber-menace to society or not, that man is a keeper ~ Julie James,
171:a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.”28 This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.”29 It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. 
There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone? ~ Walter Isaacson,
172:The breakthrough came in the early 1980s, when Judea Pearl, a professor of computer science at the University of California, Los Angeles, invented a new representation: Bayesian networks. Pearl is one of the most distinguished computer scientists in the world, his methods having swept through machine learning, AI, and many other fields. He won the Turing Award, the Nobel Prize of computer science, in 2012. Pearl realized that it’s OK to have a complex network of dependencies among random variables, provided each variable depends directly on only a few others. We can represent these dependencies with a graph like the ones we saw for Markov chains and HMMs, except now the graph can have any structure (as long as the arrows don’t form closed loops). One of Pearl’s favorite examples is burglar alarms. The alarm at your house should go off if a burglar attempts to break in, but it could also be triggered by an earthquake. (In Los Angeles, where Pearl lives, earthquakes are almost as frequent as burglaries.) If you’re working late one night and your neighbor Bob calls to say he just heard your alarm go off, but your neighbor Claire doesn’t, should you call the police? Here’s the graph of dependencies: If there’s an arrow from one node to another in the graph, we say that the first node is a parent of the second. So Alarm’s parents are Burglary and Earthquake, and Alarm is the sole parent of Bob calls and Claire calls. A Bayesian network is a graph of dependencies like this, together with a table for each variable, giving its probability for each combination of values of its parents. For Burglary and Earthquake we only need one probability each, since they have no parents. For Alarm we need four: the probability that it goes off even if there’s no burglary or earthquake, the probability that it goes off if there’s a burglary and no earthquake, and so on. For Bob calls we need two probabilities (given alarm and given no alarm), and similarly for Claire. 
Here’s the crucial point: Bob calling depends on Burglary and Earthquake, but only through Alarm. Bob’s call is conditionally independent of Burglary and Earthquake given Alarm, and so is Claire’s. If the alarm doesn’t go off, your neighbors sleep soundly, and the burglar proceeds undisturbed. Also, Bob and Claire are independent given Alarm. Without this independence structure, you’d need to learn 2^5 = 32 probabilities, one for each possible state of the five variables. (Or 31, if you’re a stickler for details, since the last one can be left implicit.) With the conditional independencies, all you need is 1 + 1 + 4 + 2 + 2 = 10, a savings of 68 percent. And that’s just in this tiny example; with hundreds or thousands of variables, the savings would be very close to 100 percent. ~ Pedro Domingos,
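The parameter arithmetic in the passage above is easy to check with a short script. The variable names mirror the quote's burglar-alarm network; the rest is a minimal sketch, counting one probability per joint setting of each binary variable's parents:

```python
# parents[X] lists the parents of binary variable X in the Bayesian network
# from the burglar-alarm example.
parents = {
    "Burglary": [],
    "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "BobCalls": ["Alarm"],
    "ClaireCalls": ["Alarm"],
}

# Each binary variable needs one probability for every combination of
# values of its parents: 2 ** (number of parents).
params_with_structure = sum(2 ** len(ps) for ps in parents.values())

# Without the conditional independencies: one probability per joint state
# of all five binary variables (32, or 31 if the last is left implicit).
params_full_joint = 2 ** len(parents)

print(params_with_structure)  # 1 + 1 + 4 + 2 + 2 = 10
print(params_full_joint)      # 32
```

This reproduces the quote's count: 10 parameters with the network structure versus 32 for the full joint distribution, and the gap widens exponentially as variables are added.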

--- IN CHAPTERS (in Dictionaries, in Quotes, in Chapters)


03.01_-_The_Pursuit_of_the_Unknowable, #Savitri, #Sri Aurobindo, #Integral Yoga
  object:programs (Computer Science)
  class:Computer Science
  see also ::: Computer Science

class, #unset, #Sri Aurobindo, #Integral Yoga
     11 Plato
     11 Computer Science
     11 adverb
