Significant new inventions in computing since 1980
The Internet itself pre-dates 1980, but the World Wide Web ("distributed hypertext via simple mechanisms") as proposed and implemented by Tim Berners-Lee started in 1989/90.
While the idea of hypertext had existed before (Nelson's Xanadu had tried to implement a distributed scheme), the WWW was a new approach for implementing a distributed hypertext system. Berners-Lee combined a simple client-server protocol, markup language, and addressing scheme in a way that was powerful and easy to implement.
I think most innovations are created by re-combining existing pieces in an original way. Each of the pieces of the WWW had existed in some form before, but the combination was obvious only in hindsight.
And I know for sure that you are using it right now.
Free Software Foundation (Established 1985)
Even if you aren't a wholehearted supporter of their philosophy, the ideas that they have been pushing, of free software and open source, have had an amazing influence on the software industry and content in general (e.g. Wikipedia).
I think it's fair to say that in 1980, if you were using a computer, you were either getting paid for it or you were a geek... so what's changed?
Printers and consumer-level desktop publishing meant you didn't need a printing press to make high-volume, high-quality printed material. That was big - of course, nowadays we completely take it for granted, and mostly we don't even bother with the printing part because everyone's online anyway.
Colour. Seriously. Colour screens made a huge difference to non-geeks' perception of games & applications. Suddenly games seemed less like hard work and more like watching TV, which opened the doors for Sega, Nintendo, Atari et al to bring consumer gaming into the home.
Media compression (MP3s and video files). And a whole bunch of things - like TiVo and iPods - that we don't really think of as computers any more because they're so ubiquitous and so user-friendly. But they are.
The common thread here, I think, is stuff that was once impossible (making printed documents; reproducing colour images accurately; sending messages around the world in real time; distributing audio and video material), was then expensive because of the equipment and logistics involved, and is now consumer-level. So - what are big corporates doing now that used to be impossible but might be cool if we can work out how to do it small & cheap?
Anything that still involves physical transportation is interesting to look at. Video conferencing hasn't replaced real meetings (yet) - but with the right technology, it still might. Some recreational travel could be eliminated by a full-sensory immersive environment - home cinema is a trivial example; another is the "virtual golf course" in an office building in Soho, where you play 18 holes of real golf on a simulated course.
For me, though, the next really big thing is going to be fabrication. Making things. Spoons and guitars and chairs and clothing and cars and tiles and stuff. Things that still rely on a manufacturing and distribution infrastructure. I don't have to go to a store to buy a movie or an album any more - how long until I don't have to go to the store for clothing and kitchenware?
Sure, there are interesting developments going on with OLED displays and GPS and mobile broadband and IoC containers and scripting and "the cloud" - but it's all still just new-fangled ways of putting pictures on a screen. I can print my own photos and write my own web pages, but I want to be able to fabricate a linen basket that fits exactly into that nook beside my desk, and a mounting bracket for sticking my guitar FX unit to my desk, and something for clipping my cellphone to my bike handlebars.
Not programming related? No... but in 1980, neither was sound production. Or video distribution. Or sending messages to your relatives in Zambia. Think big, people... :)
Package management and distributed revision control.
These patterns in the way software is developed and distributed are quite recent, and are still just beginning to make an impact.
Ian Murdock has called package management "the single biggest advancement Linux has brought to the industry". Well, he would, but he has a point. The way software is installed has changed significantly since 1980, but most computer users still haven't experienced this change.
Joel and Jeff have been talking about revision control (or version control, or source control) with Eric Sink in Podcast #36. It seems most developers haven't yet caught up with centralized systems, and DVCS is widely seen as mysterious and unnecessary.
From the Podcast 36 transcript:
0:06:37
Atwood: ... If you assume -- and this is a big assumption -- that most developers have kinda sorta mastered fundamental source control -- which I find not to be true, frankly...
Spolsky: No. Most of them, even if they have, it's the check-in, check-out that they understand, but branching and merging -- that confuses the heck out of them.
BitTorrent. It completely turns what previously seemed like an obviously immutable rule on its head - the time it takes for a single person to download a file over the Internet grows in proportion to the number of people downloading it. It also addresses the flaws of previous peer-to-peer solutions, particularly around 'leeching', in a way that is organic to the solution itself.
BitTorrent elegantly turns what is normally a disadvantage - many users trying to download a single file simultaneously - into an advantage, distributing the file geographically as a natural part of the download process. Its strategy for optimizing the use of bandwidth between two peers discourages leeching as a side-effect - it is in the best interest of all participants to enforce throttling.
It is one of those ideas which, once someone else invents it, seems simple, if not obvious.
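As a rough illustration only (not part of the original answer), here is a minimal Python sketch of the tit-for-tat idea behind that throttling: a peer keeps uploading only to the counterparts that have recently uploaded the most to it, so leeching is discouraged as a side-effect. The peer names, rates and slot count are made up; this is nothing like the real BitTorrent wire protocol.

    # Toy "choking" decision: upload only to the peers that reciprocate best.
    def choose_unchoked_peers(upload_rates_to_me, slots=4):
        """Return the peers we keep uploading to, ranked by what they upload to us."""
        ranked = sorted(upload_rates_to_me, key=upload_rates_to_me.get, reverse=True)
        return ranked[:slots]

    rates = {"peer_a": 120, "peer_b": 0, "peer_c": 45, "peer_d": 300, "peer_e": 10}
    print(choose_unchoked_peers(rates, slots=3))   # ['peer_d', 'peer_a', 'peer_c']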
Damas-Milner type inference (often called Hindley-Milner type inference) was published in 1983 and has been the basis of every sophisticated static type system since. It was a genuinely new idea in programming languages (admittedly based on ideas published in the 1970s, but not made practical until after 1980). In terms of importance I put it up with Self and the techniques used to implement Self; in terms of influence it has no peer. (The rest of the OO world is still doing variations on Smalltalk or Simula.)
Variations on type inference are still playing out; the variation I would single out the most is Wadler and Blott's type class mechanism for resolving overloading, which was later discovered to offer very powerful mechanisms for programming at the type level. The end to this story is still being written.
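For readers who have never seen it, the heart of Damas-Milner inference is first-order unification of type terms. The following is only a minimal Python sketch of that core under simplifying assumptions (no occurs check, no let-generalization, a made-up mini type language), not a full Algorithm W:

    class TypeVar:
        """A fresh, unknown type to be solved for."""
        _count = 0
        def __init__(self):
            TypeVar._count += 1
            self.name = f"t{TypeVar._count}"
        def __repr__(self):
            return self.name

    class TypeCon:
        """A type constructor, e.g. ('->', [a, b]) or ('int', [])."""
        def __init__(self, name, args=()):
            self.name, self.args = name, list(args)
        def __repr__(self):
            return self.name if not self.args else f"({self.name} {' '.join(map(str, self.args))})"

    def resolve(t, subst):
        """Follow substitution links until an unbound variable or a constructor."""
        while isinstance(t, TypeVar) and t in subst:
            t = subst[t]
        return t

    def unify(a, b, subst):
        """Extend subst so that a and b become equal, or raise TypeError.
        (Occurs check omitted for brevity.)"""
        a, b = resolve(a, subst), resolve(b, subst)
        if a is b:
            return subst
        if isinstance(a, TypeVar):
            subst[a] = b
            return subst
        if isinstance(b, TypeVar):
            subst[b] = a
            return subst
        if a.name != b.name or len(a.args) != len(b.args):
            raise TypeError(f"cannot unify {a} with {b}")
        for x, y in zip(a.args, b.args):
            subst = unify(x, y, subst)
        return subst

    # Example: applying a function of type (a -> a) where an (int -> ?) is expected forces a = int.
    a = TypeVar()
    fn = TypeCon("->", [a, a])
    expected = TypeCon("->", [TypeCon("int"), TypeVar()])
    s = unify(fn, expected, {})
    print(resolve(a, s))   # int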
Here's a plug for Google map-reduce, not just for itself, but as a proxy for Google's achievement of running fast, reliable services on top of farms of unreliable, commodity machines. Definitely an important invention and totally different from the big-iron mainframe approaches to heavyweight computation that ruled the roost in 1980.
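To make the programming model concrete, here is a toy single-process word-count sketch in Python; the real system's contribution is the partitioning, scheduling and fault tolerance across thousands of commodity machines, none of which appears here:

    from collections import defaultdict
    from itertools import chain

    def map_phase(documents, mapper):
        """Apply the user's mapper to every input record, yielding (key, value) pairs."""
        return chain.from_iterable(mapper(doc) for doc in documents)

    def shuffle(pairs):
        """Group all values emitted for the same key."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups, reducer):
        """Apply the user's reducer to each key's group of values."""
        return {key: reducer(key, values) for key, values in groups.items()}

    # Word count, the canonical example.
    docs = ["the quick brown fox", "the lazy dog"]
    mapper = lambda doc: [(word, 1) for word in doc.split()]
    reducer = lambda word, counts: sum(counts)
    print(reduce_phase(shuffle(map_phase(docs, mapper)), reducer))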
Tagging, the way information is categorized. Yes, the little boxes of text under each question.
It is amazing that it took about 30 years to invent tagging. We used lists and tables of contents; we used things which are optimized for printed books.
However, 30 years is much shorter than the time people needed to realize that printed books could come in a smaller format that people can hold in their hands.
I think the tagging concept is underestimated among core CS people. All the research is focused on natural language processing (a top-down approach). But tagging is the first language that both computers and people can understand well. It is a bottom-up approach that makes computers use natural languages.
I think we are looking at this the wrong way and drawing the wrong conclusions. If I get this right, the cycle goes:
Idea -> first implementation -> minority adoption -> critical mass -> commodity product
From the very first idea to the commodity, you often have centuries, assuming the idea ever makes it to that stage. Da Vinci may have drawn some kind of helicopter in 1493 but it took about 400 years to get an actual machine capable of lifting itself off the ground.
From William Bourne's first description of a submarine in 1580 to the first implementation in 1800, you have 220 years, and current submarines are still at an infancy stage: we know almost nothing about underwater travel (with 2/3 of the planet under sea, think of the potential real estate ;).
And there is no telling that there weren't earlier, much earlier ideas that we just never heard of. Based on some legends, it looks like Alexander the Great used some kind of diving bell in 332 BC (which is the basic idea of a submarine: a device to carry people and an air supply below the sea). Counting that, we are looking at 2000 years from idea (even with a basic prototype) to product.
What I am saying is that looking today for implementations, let alone products, that were not even ideas prior to 1980 is ... I betcha the "quick sort" algorithm was used by some no-name file clerk in ancient China. So what?
There were networked computers 40 years ago, sure, but that didn't compare with today's Internet. The basic idea/technology was there, but regardless you couldn't play a game of Warcraft online.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
Historically, we have never been able to "find them" that close to the idea, that fast. I think the cycle is getting faster, but computing is still darn young.
Currently, I am trying to figure out how to make a hologram (the Star Wars kind, without any physical support). I think I know how to make it work. I haven't even gathered the tools, materials, or funding, and yet even if I were to succeed to any degree, the actual idea would already be several decades old, at the very least, and related implementations/technologies have been used for just as long.
As soon as you start listing actual products, you can be pretty sure that concepts and first implementations existed a while ago. Doesn't matter.
You could argue with some reason that nothing is new, ever, or that everything is new, always. That's philosophy and both viewpoints can be defended.
From a practical viewpoint, truth lies somewhere in between. Truth is not a binary concept, boolean logic be damned.
The Chinese may have come up with the printing press a while back, but it's only been about 10 years that most people can print decent color photos at home for a reasonable price.
Invention is nowhere and everywhere, depending on your criteria and frame of reference.
Google's PageRank algorithm. While it could be seen as just a refinement of web crawling search engines, I would point out that they too were developed post-1980.
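For illustration, a small power-iteration sketch of the PageRank idea in Python. The link graph here is invented, the damping factor of 0.85 follows the original paper, and a production crawler/indexer obviously looks nothing like this:

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / n for p in pages}
            for page, outgoing in links.items():
                if not outgoing:              # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:
                        new_rank[target] += share
            rank = new_rank
        return rank

    graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))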
DNS, 1983, and dependent advances like email host resolution via MX records instead of bang-paths. *shudder*
Zeroconf working on top of DNS, 2000. I plug my printer into the network and my laptop sees it. I start a web server on the network and my browser sees it. (Assuming they broadcast their availability.)
NTP (1985) based on Marzullo's algorithm (1984). Accurate time over jittery networks.
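The core of Marzullo's algorithm is an interval-intersection sweep: find the narrowest time interval consistent with the largest number of sources. A minimal Python sketch of just that step follows (the sample intervals are made up, and real NTP adds clock filtering, selection and discipline on top):

    def marzullo(intervals):
        """intervals: list of (low, high) time estimates from different sources.
        Returns (low, high, count): the tightest range agreed on by the most sources."""
        edges = []
        for lo, hi in intervals:
            edges.append((lo, -1))   # interval opens
            edges.append((hi, +1))   # interval closes
        edges.sort()                 # at equal offsets, openings (-1) sort first
        best, count = 0, 0
        best_lo = best_hi = None
        for i, (offset, kind) in enumerate(edges):
            count -= kind            # opening an interval raises the overlap count
            if count > best:
                best, best_lo = count, offset
                best_hi = edges[i + 1][0]   # the overlap ends at the next edge
        return best_lo, best_hi, best

    print(marzullo([(8, 12), (11, 13), (10, 12)]))   # -> (11, 12, 3)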
The mouse scroll wheel, 1995. Using mice without it feels so primitive. And no, it's not something that Engelbart's team thought of and forgot to mention. At least not when I asked someone who was on the team at the time. (It was at some Engelbart event in 1998 or so. I got to handle one of the first mice.)
Unicode, 1987, and its dependent advances for different types of encoding, normalization, bidirectional text, etc.
Yes, it's pretty common for people to use all 5 of these every day.
Are these "really new ideas?" After all, there were mice, there were character encodings, there was network timekeeping. Tell me how I can distinguish between "new" and "really new" and I'll answer that one for you. My intuition says that these are new enough.
In smaller domains there are easily more recent advances. In bioinformatics, for example, Smith-Waterman (1981) and more especially BLAST (1990) effectively make the field possible. But it sounds like you're asking for ideas which are very broad across the entire field of computing, and the low-hanging fruit gets picked first. Thus is it always with a new field.
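For a sense of how small the core of Smith-Waterman is, here is a compact Python sketch of the local-alignment scoring recurrence. The match/mismatch/gap values are illustrative, and real tools (and BLAST especially) add substitution matrices, affine gaps and heuristics for speed:

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
        """Return the best local alignment score between sequences a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                # Local alignment never goes below zero: restart instead.
                H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman_score("ACACACTA", "AGCACACA"))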
What about digital cameras?
According to Wikipedia, the first true digital camera appeared in 1988, with mass-market digital cameras becoming affordable in the late 1990s.
Modern shading languages and the prevalence of modern GPUs.
The GPU is also a low-cost parallel supercomputer, with tools like CUDA and OpenCL for blazing fast high-level parallel code. Thank you to all those gamers out there driving down the prices of these increasingly impressive hardware marvels. In the next five years I hope every new computer sold (and iPhones too) will have the ability to run massively parallel code as a basic assumption, much like 24-bit color or 32-bit protected mode.
JIT compilation was invented in the late 1980s.
To address the two questions, "Why the death of new ideas?" and "What to do about it?":
I suspect a lot of the lack of progress is due to the massive influx of capital and entrenched wealth in the industry. Sounds counterintuitive, but I think it's become conventional wisdom that any new idea gets one shot; if it doesn't make it at the first try, it can't come back. It gets bought by someone with entrenched interests, or just FAILs, and the energy is gone. A couple of examples are tablet computers and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton-Tate's Framework; but I'm still stuck with Word and Excel).
What to do? The first thing that comes to mind is Wm. Shakespeare's advice: "Let's kill all the lawyers." But now they're too well armed, I'm afraid. I actually think the best alternative is to find an Open Source initiative of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough so that some kind of organic collaborative mechanism is necessary to get traction.
I also think that there's a dynamic that says that the entrenched interests (especially platforms) require a substantial amount of change - churn - to justify continuing revenue streams; and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part work fine already. It's not because they are evil, it's just built into the industry. There's no such thing as Stable Equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Did you ever see a feature withdrawn, or a change retracted?)
The other clue that has been discussed on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): real innovation in large organizations almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled exclusively by individual or small group initiative (and more often than not opposed by formal management hierarchies). So: Question Authority, Buck the System.
One thing that astounds me is the humble spreadsheet. Non-programmer folk build wild and wonderful solutions to real-world problems with a simple grid of formulae. Replicating their efforts in a desktop application often takes 10 to 100 times longer than it took to write the spreadsheet, and the resulting application is often harder to use and full of bugs!
I believe the key to the success of the spreadsheet is automatic dependency analysis. If the user of the spreadsheet was forced to use the observer pattern, they'd have no chance of getting it right.
So, the big advance is automatic dependency analysis. Now why hasn't any modern platform (Java, .Net, Web Services) built this into the core of the system? Especially in a day and age of scaling through parallelization - a graph of dependencies leads to parallel recomputation trivially.
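As a sketch of what "automatic dependency analysis in the core of the platform" could look like, here is a toy spreadsheet engine in Python that recomputes cells in dependency order. The Sheet class and cell names are made up for illustration, it ignores cycles and incremental recalculation, and it relies on graphlib (Python 3.9+):

    from graphlib import TopologicalSorter

    class Sheet:
        def __init__(self):
            self.formulas = {}   # cell -> (function, list of input cells)
            self.values = {}

        def define(self, cell, func, inputs=()):
            self.formulas[cell] = (func, list(inputs))

        def recalculate(self):
            # Build the dependency graph automatically from the declared inputs.
            graph = {cell: inputs for cell, (_, inputs) in self.formulas.items()}
            for cell in TopologicalSorter(graph).static_order():
                func, inputs = self.formulas[cell]
                self.values[cell] = func(*(self.values[c] for c in inputs))
            return self.values

    sheet = Sheet()
    sheet.define("A1", lambda: 10)
    sheet.define("A2", lambda: 32)
    sheet.define("A3", lambda x, y: x + y, ["A1", "A2"])   # =A1+A2
    print(sheet.recalculate()["A3"])   # 42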
Edit: Dang - just checked. VisiCalc was released in 1979 - let's pretend it's a post-1980 invention.
Edit 2: Seems that the spreadsheet is already noted by Alan anyway - if the question that brought him to this forum is correct!
Software:
Virtualization and emulation
P2P data transfers
community-driven projects like Wikipedia, SETI@home ...
web crawling and web search engines, i.e. indexing information that is spread out all over the world
Hardware:
the modular PC
E-paper
The rediscovery of the monad by functional programming researchers. The monad was instrumental in allowing a pure, lazy language (Haskell) to become a practical tool; it has also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).
Moggi's "A category-theoretic account of program modules" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (for example, "Imperative functional programming" (1993)) presented monads as a practical tool.
Shrinkwrap software
Before 1980, software was mostly specially written. If you ran a business and wanted to computerize, you'd typically get a computer and compiler and database, and get your own stuff written. Business software was typically written to adapt to business practices. This is not to say there was no canned software (I worked with SPSS before 1980), but it wasn't the norm, and what I saw tended to be infrastructure and research software.
Nowadays, you can go to a computer store and find, on the shelf, everything you need to run a small business. It isn't designed to fit seamlessly into whatever practices you used to have, but it will work well once you learn to work more or less according to its workflow. Large businesses are a lot closer to shrinkwrap than they used to be, with things like SAP and PeopleSoft.
It isn't a clean break, but after 1980 there was a very definite shift from expensive custom software to low-cost off-the-shelf software, and flexibility shifted from software to business procedures.
It also affected the economics of software. Custom software solutions can be profitable, but they don't scale. You can only charge one client so much, and you can't sell the same thing to multiple clients. With shrinkwrap software, you can sell lots and lots of the same thing, amortizing development costs over a very large sales base. (You do have to provide support, but that scales. Just consider it a marginal cost of selling the software.)
Theoretically, where there are big winners from a change, there are going to be losers. So far, the business of software has kept expanding, so that as areas become commoditized other areas open up. This is likely to come to an end sometime, and moderately talented developers will find themselves in a real crunch, unable to work for the big boys and crowded out of the market. (This presumably happens for other fields; I suspect the demand for accountants is much smaller than it would be without QuickBooks and the like.)
Outside of hardware innovations, I tend to find that there is little or nothing new under the sun. Most of the really big ideas date back to people like von Neumann and Alan Turing.
A lot of things that are labelled 'technology' these days are really just a program or library somebody wrote, or a retread of an old idea with a new metaphor, acronym, or brand name.
Computer worms were researched in the early eighties of the last century in the Xerox Palo Alto Research Center.
From John Shoch and Jon Hupp's "The 'Worm' Programs - Early Experience with a Distributed Computation" (Communications of the ACM, March 1982, Volume 25, Number 3, pp. 172-180):
In The Shockwave Rider, J. Brunner developed the notion of an omnipotent "tapeworm" program running loose through a network of computers - an idea which may seem rather disturbing, but which is also quite beyond our current capabilities. The basic model, however, remains a very provocative one: a program or a computation that can move from machine to machine, harnessing resources as needed, and replicating itself when necessary.
In a similar vein, we once described a computational model based upon the classic science-fiction film, The Blob: a program that started out running in one machine, but as its appetite for computing cycles grew, it could reach out, find unused machines, and grow to encompass those resources. In the middle of the night, such a program could mobilize hundreds of machines in one building; in the morning, as users reclaimed their machines, the "blob" would have to retreat in an orderly manner, gathering up the intermediate results of its computation. Holed up in one or two machines during the day, the program could emerge again later as resources became available, again expanding the computation. (This affinity for nighttime exploration led one researcher to describe these as "vampire programs.")
Quoting Alan Kay: "The best way to predict the future is to invent it."
Better user interfaces.
Today's user interfaces still suck. And I don't mean in small ways but in large, fundamental ways. I can't help but notice that even the best programs still have interfaces that are either extremely complex or that require a lot of abstract thinking in other ways, and that just don't approach the ease of conventional, non-software tools.
Granted, this is due to the fact that software allows you to do so much more than conventional tools. That's no reason to accept the status quo though. Additionally, most software is simply not well done.
In general, applications still lack a certain "just works" feeling and are too much oriented by what can be done, rather than what should be done. One point that has been raised time and again, and that is still not solved, is saving. Applications crash, destroying hours of work. I have the habit of pressing Ctrl+S every few seconds (of course, this no longer works in web applications). Why do I have to do this? It's mind-numbingly stupid. This is clearly a task for automation. Of course, the application also has to save a diff for every modification I make (basically an infinite undo list) in case I make an error.
Solving this problem isn't even actually hard. It would just be hard to implement it in every application since there is no good API to do this. Programming tools and libraries have to improve significantly before allowing an effortless implementation of such efforts across all platforms and programs, for all file formats, with arbitrary backup storage and no required user interaction. But it is a necessary step before we finally start writing "good" applications instead of merely adequate ones.
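A rough sketch of the kind of always-save-with-history behaviour this paragraph asks for, using Python's standard difflib to keep a diff per modification. The class and its policy (save on every change, unbounded history) are made up for illustration; a real API would also need storage, conflict and retention decisions:

    import difflib

    class DocumentHistory:
        def __init__(self, text=""):
            self.current = text
            self.diffs = []              # one unified diff per modification

        def modify(self, new_text):
            diff = list(difflib.unified_diff(
                self.current.splitlines(), new_text.splitlines(),
                fromfile="before", tofile="after", lineterm=""))
            self.diffs.append(diff)      # effectively an unbounded undo list
            self.current = new_text      # "saving" happens on every change

    doc = DocumentHistory("hello world")
    doc.modify("hello brave new world")
    print("\n".join(doc.diffs[-1]))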
I believe that Apple currently approximates the "just works" feeling best in some regards. Take for example their newest version of iPhoto, which features face recognition that automatically groups photos by the people appearing in them. That is a classical task that the user does not want to do manually and doesn't understand why the computer doesn't do it automatically. And even iPhoto is still a very long way from a good UI, since said feature still requires ultimate confirmation by the user (for each photo!), since the face recognition engine isn't perfect.
HTM systems (Hierarchical Temporal Memory).
A new approach to Artificial Intelligence, initiated by Jeff Hawkins through the book "On Intelligence".
Now active as a company called Numenta, where these ideas are put to the test through development of "true" AI, with an invitation to the community to participate by using the system through SDKs.
It's more about building machine intelligence from the ground up, rather than trying to emulate human reasoning.
The use of physics in human-computer interaction to provide an alternative, understandable metaphor. This, combined with gestures and haptics, will likely result in a replacement for the current common GUI metaphor invented in the 70's and in common use since the mid-to-late 80's.
The computing power wasn't present in 1980 to make that possible. I believe games likely led the way here. An example can easily be seen in the interaction of list scrolling on the iPod Touch/iPhone. The interaction mechanism relies on the intuition of how momentum and friction work in the real world to provide a simple way to scroll a list of items, and the usability relies on the physical gesture that causes the scroll.
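A back-of-the-envelope Python sketch of the momentum-plus-friction scrolling just described: flick velocity decays by a constant friction factor each frame until the list comes to rest. All constants are invented for illustration; real implementations tune these curves and add edge bouncing:

    def simulate_fling(initial_velocity, friction=0.95, dt=1/60, stop_below=1.0):
        """Return successive scroll offsets after a flick with the given velocity
        (pixels/second), decaying by a constant friction factor each frame."""
        offset, velocity = 0.0, initial_velocity
        positions = []
        while abs(velocity) > stop_below:
            offset += velocity * dt
            velocity *= friction          # friction bleeds off momentum each frame
            positions.append(offset)
        return positions

    frames = simulate_fling(1200.0)
    print(f"came to rest after {len(frames)} frames at offset {frames[-1]:.1f}px")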
I believe Unit Testing, TDD and Continuous Integration are significant inventions after 1980.
Mobile phones.
While the first "wireless phone" patent was in 1908, and they were cooking for a long time (0G in 1945, 1G launched in Japan in 1979), modern 2G digital cell phones didn't appear until 1991. SMS didn't exist until 1993, and Internet access appeared in 1999.
I started programming on Jan 2nd, 1980. I've tried to think of significant new inventions over my career. I struggle to think of any. Most of what I consider significant was actually invented prior to 1980 but then wasn't widely adopted or improved until after.
- Graphical User Interface.
- Fast processing.
- Large memory (I paid $200.00 for 16k in 1980).
- Small sizes - cell phones, pocket pc's, iPhones, Netbooks.
- Large storage capacities. (I've gone from carrying a large 90k floppy to an 8 gig USB thumb drive.)
- Multiple processors. (Almost all my computers have more than one now, software struggles to keep them busy).
- Standard interfaces (like USB) to easily attach hardware peripherals.
- Multiple Touch displays.
- Network connectivity - leading to the mid 90's internet explosion.
- IDE's with Intellisense and incremental compiling.
While the hardware has improved tremendously, the software industry has struggled to keep up. We are light years ahead of 1980, but most improvements have been refinements rather than inventions. Since 1980 we have been too busy applying what the advancements let us do rather than inventing. By themselves most of these incremental inventions are not important or powerful, but when you look back over the last 29 years they are quite powerful.
We probably need to embrace the incremental improvements and steer them. I believe that truly original ideas will probably come from people with little exposure to computers, and they are becoming harder to find.
Nothing.
I think it's because people have changed their attitudes. People used to believe that if they would just find that "big idea", then they would strike it rich. Today, people believe that it is the execution and not the discovery that pays out the most. You have mantras such as "ideas are a dime a dozen" and "the second mouse gets the cheese". So people are focused on exploiting existing ideas rather than coming up with new ones.
Open Source community development.
The iPad (released April 2010): surely such a concept is absolutely revolutionary!
[image: the Apple iPad]
No way Alan Kay saw that coming from the 1970's!
Imagine such a "personal, portable information manipulator"...
...
Wait? What!? The Dynabook you say?
Thought out by Alan Kay as early as 1968, and described in great detail in this 1972 paper??
NOOOoooooooo....
Oh well... never mind.
Ideas around Social Computing have had advances since 1980. The Well started in 1985. While I'm sure there were online communities before, I believe some of the true insights in the area have happened post-1980. The adverse dynamic aspects of social communities and their interaction on a software system are much like the disasters of the Tacoma Narrows Bridge.
I think Clay Shirky's work in the area illuminates those effects and how to mitigate them. I'd say interesting real-world examples of social software insights include things like reCAPTCHA and Wikipedia, where significant valuable work is done by the participants, mediated by the software.
I think the best ideas invented since the 1980's will be the ones that we're not aware of. Either because they are so small and ubiquitous as to be unnoticeable, or because their popularity hasn't really taken off.
One example of the former is clicking and dragging to select a portion of text. I believe this first appeared on the Macintosh in 1984. Before that you had separate buttons for picking the beginning of a selection and the end of a selection. Quite onerous.
An example of the latter is (maybe) visual programming languages. I'm not talking about something like HyperCard, I mean like Max/MSP, Prograph, Quartz Composer, Yahoo Pipes, etc. At the moment they are really niche, but the way I see it, there's really nothing stopping them from being just as expressive and powerful as a standard programming language, except for mindshare.
Visual programming languages effectively enforce the functional programming paradigm of referential transparency. This is a really useful property for code to have. The way they enforce this isn't artificial either - it's simply by virtue of the metaphor they use.
VPLs make programming accessible to people who would not otherwise be able to program, such as people with language difficulties, like dyslexia, or even just laymen that need to whip up a simple time-saver. Professional programmers may scoff at this, but personally, I think it would be great if programming became a really ubiquitous skill, like literacy.
As it stands though, VPLs are really a niche interest, and haven't really gone particularly mainstream.
What we should do differently
All computer science majors should be required to double major, coupling the CS major with one of the humanities: painting, literature, design, psychology, history, English, whatever. A lot of the problem is that the industry is populated with people that have a really narrow and unimaginative understanding of the world, and therefore can't begin to imagine a computer working any significantly differently than it already does. (If it helps, you can imagine that I'm talking about someone other than you, the person reading this.) Mathematics is great, but in the end it's just a tool for achieving. We need experts who understand the nature of creativity, who also understand technology.
But even if we have them, there needs to be an environment where there's a possibility that doing something new would be worth the risk. It's 100 times more likely that anything truly new gets rejected out of hand, rather viciously. (The Newton is an example of this.) So we need a much higher tolerance for failure. We should not be afraid to try an idea which has failed in the past. We should not fully reject our own failures - and we should learn to recognize when we have failed. We should not see failure as a bad thing, and so we shouldn't lie to ourselves or to others about it. We should just get used to it, because it is just about the only constant in this ever changing industry. Post mortems are useful in this regard.
One of the more interesting things about Smalltalk, I think, was not the language itself, but the process that was used to arrive at the design of Smalltalk. The iterative design process, going through many, many revisions - but also very carefully and critically identifying the flaws of the existing system, and finding solutions in the next one. The more perspectives, and the broader the perspectives we have on the situation, the better we can judge where the mistakes and problems are. So don't just study computer science. Study as many other academic subjects as you can get yourself to be interested in.
The pre-1980 days were, of course, the glory days of Xerox PARC. Back when the GUI, the mouse, the laser printer, the internet, and the personal computer were all being created. (Seeing as I'm too young to have been alive back then, and you were pretty much working on inventing all of those, I can't tell you anything about 1980 that you don't already know, so let's move on.)
The thing is, though, that the pre-1980 days were a lot more vibrant in terms of truly disruptive new technologies. That's the way it is with any new field -- how many game-changing technology advances have you seen in railroads in the past 100 years? How many have you seen in lightbulbs? In the printing press? Once something ignites a hype in the right circles, there is an explosive period of invention, followed by a long period of maturing. After that, you're not going to see the same kind of completely radical changes again UNLESS the basic circumstances change.
Luckily, that might be happening in a number of fields, and it has already happened in a few others:
Mobility - smart phones bring computing to a truly portable platform, which will soon include location-based services and proximity-based ad-hoc networks. It's a completely new paradigm that's potentially as game-changing as the GUI has been
The WWW (HTTP, HTML and DNS) has already been mentioned and is an obvious addition to the list, since it is enabling global, inexpensive, mainstream rich communication across the globe - all thanks to a computing platform
On the interface side, both touch, multitouch (Jeff Han comes to mind) and the Wiimote need mentioning. Currently, they are basically curiosities, but so were the early GUIs.
OOP design patterns -- higher-level solutions as best practices to hard problems. Depending on your definition of 'computing', it may or may not belong on the list, but if you count OOP as a significant advance pre-1980 (I certainly do), I think design patterns and the GoF deserve a mention too
Google's PageRank and MapReduce algorithms - I am pleased to notice I wasn't the first to mention them, and seriously -- where would the world be without the principles of both of them? I vividly remember what the world looked like before them, and suffice it to say Google really IS my friend.
Non-volatile memory -- it's on the hardware side, but it is going to play a significant role in the future of computing - making bootup times a thing of the past, for example, and enabling us to use computers in entirely new ways
Semantic (natural language) search / analysis / classification / translation... We're not quite there yet, but companies like Powerset give the impression that we're on the brink.
On that note, intelligent HTMs should be on this list as well. I am yet another believer in Jeff Hawkins' model and approach, and if it works, it will mean a complete redefinition of what computers can do, what it means to be human, and where the world can go from here. Creating a real intelligence in that way (synthetically) would be bigger than anything the human race has accomplished before.
GNU + Linux
3D printing / rapid prototyping (and, in time, manufacturing)
P2P (which also led to VoIP etc.)
E-ink, once the technologies mature a bit more
RFID might belong on the list, but the verdict is still out on that one
Quantum computing is the most obvious element on the list, except we still haven't been able to get enough qubits to play along. However, my friends in the field tell me there's incredible progress going on even as we speak, so I'm holding my breath for that one.
And finally, I want to mention a personal favourite: distributed intelligence, or its other name: artificial artificial intelligence. The idea of connecting a huge number of people in a network and allowing them access to the combined minds of everyone else through some form of question-answering interface. It's been done a number of times recently, with Yahoo Answers, Askville, Amazon Mechanical Turk, and so on, but in my mind, those are all missing the mark by a LOT... much like the many implementations of distributed hypertext that came before Tim Berners-Lee's HTML, or the many web crawlers before Google. Seriously -- someone needs to build a search interface into 'the hive mind' to blow everyone else out of the water. IMHO - it is only a matter of time.
Reorganization is what we need, not reinvention.
We have all the hardware and software components we need right now to do amazing things for years to come.
I believe there is a disease in the sciences, where every participant is always trying to invent something new to distinguish themselves from others. This is in contrast to doing some of the messy work of cataloging or teaching older works.
People who build 'new' things are generally considered of a higher pedigree than people who reuse existing, sometimes almost ancient, works. (Ancient to, say, a 20-year-old, to whom something like Lisp was made more than double their lifetime in the past: 1958.)
Good old ideas need to be resurrected and propagated far and wide, and we need to stop trying to build businesses or programmer movements that effectively trample old works and systems in power-plays to be the next new thing - when in fact most 'new shiny' things are just aspects of old ideas resurrected.
Effective Parallelization and Quantum Computing - I think these are two areas where progress has been made and much more progress will be made to make very significant changes to our use of computing power.
Effective Parallelization meaning parallelizing and distributing processing without the need for special programming techniques, but where it is built into the compiler/framework.
Flying cars and hoverboards. Oh wait, those haven't been invented yet. But by 2015, we have to have them. Otherwise Back To The Future 2 will have been a big lie!
One thing that hasn't changed in mainstream computing is the hierarchical filesystem. That's a shame, IMO, since some work was being done in the late 1980s and 1990s to design new kinds of file systems more appropriate for modern, object-oriented operating systems - ones which are OO from the ground up.
The OO operating systems tended to have flat object stores that were expandable and flexible. I think the EROS Project was one built around that idea; PenPoint OS was a 1990s object-oriented OS; and Amazon S3, of course, is a contemporary, flat object store.
There are at least two ideas in OO, flat filesystems that I particularly liked:
The entire disk was essentially swap space. Objects exist in memory, get paged out when they are not needed, and brought back in when they are. There's no need for a hierarchical filesystem that's separate from virtual memory. Programs are "always running," in a sense.
A flat file/object store allows content to be indexed and searched, rather than forcing the user to decide -- ahead of time -- where the content will live in relation to other content and what its name shall be. A hierarchical system could be built on top of the flat storage, but it's not required.
As Alan Cooper states in his book, About Face, hierarchical filesystems are a kludge, designed for the computers of the 1960s and 1970s with limited memory and disk storage. Sadly, the popularity of Windows and Unix has guaranteed the dominance of the hierarchical filesystem to this day.
Pretty much everything important in modern 3D computer graphics. Ray-tracing (in the computer graphics sense) got its jump start from Whitted's 1980 paper. Marching cubes ('87) is the standard way to extract an isosurface from 3D data.
Virtual Worlds in which you are represented by a virtual alter ego (aka Avatar), for socializing and roleplaying.
Most commonly referred to as MMOs - Massive(ly) Multiplayer Online. Some popular examples include World of Warcraft, Everquest, Second Life.
PS: no, they still don't require the heavy headgear as typically depicted in geek movies of the 80s. It's a shame....
Touchscreens and Motion Sensing interfaces for human-computer interaction.
For example:
- Touchscreens for PDAs, iPhone or Nintendo DS
- Motion Sensing, Nintendo Wii Controller or (to a lesser degree) SixAxis controller for Playstation 3.
Only question is ... are these technologies really post-80s?
As for programming concepts, IoC / Dependency Injection in 1988, with roots in 1983. Fowler has some notes on the history of the concept on his Bliki.
Access to massive data.
The sheer size and scale of the data we have available these days is massive compared to what it used to be in the 80s. We've had to make a large number of changes to both our hardware and software to be able to store and display this stuff. One day, we'll actually learn how to qualify and mine it for something useful. Someday.
Paul.
Premise: virtually no new inventions since 1980.
The first thing to do is define invention, or else you'll get off on the wrong track. The second definition of invention from Dictionary.com says:
U.S. Patent Law. a new, useful process, machine, improvement, etc., that did not exist previously and that is recognized as the product of some unique intuition or genius, as distinguished from ordinary mechanical skill or craftsmanship.
Thus, since 1980, there have been very few new inventions in computing. What has there been? Obviously there have been large amounts of new technologies and new things coming about, but what are they?
We aren't inventing any more, we are improving what primarily exists already.
A simple example:
The CD, or compact disc, was first started in 1977, though it wasn't accepted by industry until 1982. At this time the first factory for pressing CDs just came into readiness. Eventually, by 1985, the CD-ROM (Read-Only Memory) was accepted as a medium. The CD-RW followed 5 years later. (Source: Wikipedia)
Now what? Well, given that we have larger hard drives (still just an improvement on the paradigm) we need more space to be able to supplant the VHS market and make videos compatible with computers. Thus came about the DVD, though I am cutting out many improvements to the existing CD technology.
The DVD came about, was "invented", during the year 1995. (Source: Wikipedia)
Since then we have had:
- Writable, and ReWritable DVDs
- Dual-layer DVDs
- Triple- and Quad-layer DVDs (unreleased though feasible through a simple driver revision)
- HD-DVD
- Blu-ray Disc
Obviously this list isn't all-inclusive. But spot the new invention, remembering the definition I gave above, in that list. You can't! They're all just variations on the concept of an optical disc, all just variations on the same hardware, and all just variations on existing software.
WHY?
Cost. See, it's cheaper economically to make incremental improvements to an existing product. If I can sell you an HD DVD or a Blu-ray Disc because you believe it to be necessary or cool, then I have no need to release my plans for the triple- or quad-layer DVDs. In fact, I can charge you through the nose just to get the new technology because you are an early adopter and you need my "new and improved!" hardware.
This is called either marketing, or product relations.
But what about software?
What about it? Pre-1980 there was a lot of software inventiveness going on, but since then it has mostly just been improvements on what already exists or reinvention of the wheel. Look at any OS or office package to see this.
Conclusion
As far as I'm concerned, there have been virtually no new inventions in the past 29 years. I could wax long and cross a great many industries, but why should I bother? Once you start thinking about it, and start comparing an "invention" to a prior, similar product ... you'll find it is so similar that it isn't even funny. Even the internal combustion engine has been around since 1906 with no new inventions in that field since then; many improvements and variations of this "wheel", yes, but no new inventions.
Not even that new weapon America deployed in Iraq -- the one that uses microwaves to make a person feel shocked like they touched a lightbulb -- is new. The same idea was used in security systems, then classified and taken off the market, with ultrasound to make an intruder feel physically ill. This is a directed form of the weapon with a different wavelength and application, not a new invention.
Electrically Erasable Programmable Memory, generalized into non-volatile read/write memory, the most well known and ubiquitous form currently being Flash. http://en.wikipedia.org/wiki/EEPROM lists this as being invented in 1984.
By giving the storage medium the same general physics, power requirements, size and stability as the processing units, we remove this as a limiting factor in designs for where we place processors. This expands the possibilities for how and where we place 'intelligence' to such a plethora of smart devices (and things that would previously never have been candidates for being considered smart at all) that we are still caught up in the surge. MP3 players are really just a fraction of this.
Optical computing. Seems like it should have been around longer, but I can't currently find any references pre-dating 1982 or so (and the relevant piece of technology, the optical transistor, didn't pop up until 1986).
Well, the World Wide Web has already been mentioned, but more basically, I would say "DNS". It seems that it was invented in 1983 (http://en.wikipedia.org/wiki/Domain_Name_System) and IMHO we can consider it the mandatory link between the invention of the internet protocol and the capability to spread what is now called the web all over the world.
Still in the "network" section, I would add WiFi. It was invented in the 90's (but I agree it's not exactly "computing", it's more related to hardware).
In a stricter "algorithmic" section, I think of turbo codes (dated 1993); some say they only close in on the limit defined by Shannon's signal theory, but wouldn't this argument reject all other answers with "everything was already in seed in Lovelace, Babbage and Turing's writings"?
In the field of cryptography, I would add the PGP program from P. Zimmermann (dated 1991), which brought a quite robust (at the time) free encryption program to the citizen, and contributed to shaking the government's posture on encryption a little. In fact I think it was one of the factors in the "liberalization" of cryptography, which was a prerequisite for developing e-commerce.
The changes to infrastructure to allow accessible internet from home and office.
Documented and accepted standards from the W3C through to APIs.
Apart from that, most of what we'd think of as new dates back a lot longer than you'd think (e.g. GUI, OOP).
I think the laptop was invented around 1980, and I also think that the development of laptops and portable computing changed a lot of people's lives - certainly those of us who work in IT, or who use computers and travel.
I'd say the biggest trend is an ever-increasing lack of location dependence and growing pervasiveness. An interesting philosophical exercise these days is to count the computers in your immediate area. They're everywhere: desktops, keyboards, microwaves, radios, televisions, cell phones, etc. My grandmother is computer illiterate, yet her life is as infested with small computers as everyone else's. She can make a call to me from the middle of an empty field. I can then answer that call zipping down the highway.
Declarative Programming.
In 1979 "computer programs" were 27 imperative. The programmer was expected 26 to instruct the compiler on both what to do 25 and how to do it. (N1)
Today, ASP.NET WebForms and WPF programmers 24 regularly write code without knowing or 23 caring how it will be implemented. Wikipedia has other, less 22 mainstream examples. Additionally, all of 21 the SGML-derived "markup" languages 20 are declarative, and I doubt many of the 19 programmers of 1979 would have predicted 18 their importance or ubiquity in 30 years.
Although 17 the concept of declarative programming existed 16 before 1980 (see this paper from 1975), it's invention took 15 place with the introduction of Caml in 1985 14 (debatable) or Haskell in 1990 (less debatable). (N2) Since 13 then, declarative programming has increased 12 greatly in popularity. And, when massively 11 multicore processors finally arrive, we'll 10 all be declarative programmers.
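A tiny illustration of the imperative/declarative split, using Python for both styles (the example itself is made up; it only shows stating what the result is versus spelling out how to build it):

    numbers = [3, 1, 4, 1, 5, 9, 2, 6]

    # Imperative: spell out *how* to build the result, step by step.
    squares_of_evens = []
    for n in numbers:
        if n % 2 == 0:
            squares_of_evens.append(n * n)

    # Declarative: state *what* the result is and let the language decide how.
    squares_of_evens_decl = [n * n for n in numbers if n % 2 == 0]

    assert squares_of_evens == squares_of_evens_decl
    print(squares_of_evens_decl)   # [16, 4, 36]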
--
Notes:
(N1) I can't vouch for this firsthand, since I was a fetus in 1979.
(N2) From other answers, it seems like people are confusing conception with invention. Da Vinci conceived of a helicopter, but he didn't invent it. The question is specifically about inventions in computing.
(N3) Please don't mention Prolog (rel. 1975) in the comments unless you have actually built an app in it.
Podcasting. It allows for an informative way to distribute information and debate. I find it to be more interactive than standard interviews but to have less noise than blog comments.
Instant Messaging has been around for a long time (mid-to-late 60s), but IRC did not come before 1988.
Video communication on top of that (as in, for instance, Windows Live Messenger, or Skype, or ...) really did change the way we are communicating ;) and is much more recent.
<correction>
(see Video Conferencing: 1968, [image: NLS videoconferencing at FJCC 1968], as Alan Kay himself points out in the comment:
Again, please check out what Engelbart demoed in 1968 (including live video chatting and screen sharing). IOW, guessing really doesn't work as well as looking things up. This is why most people make weak assumptions about when things were invented.)
Take that in my face ;), and rightfully so.
Note: the "webcam" (video setup) of those times was not exactly made for your average living-room ;)
</correction>
[... resuming the answer:]
The generalization of the webcam [image: Logitech QuickCam Pro 4000] helped too (started in 1991, the first such camera, called the CoffeeCam, was pointed at the Trojan Room coffee pot in the computer science department of Cambridge University).
So: Post-1980: 2 out of 3: IRC and the webcam.
"Americans have no past and no future; they live in an extended present." This describes the state of computing. We live in the 80's extended into the 21st century. The only thing that's changed is the size. - Alan Kay
The memristor.
While the idea is not newer than 1980, I believe a working model was not created until 2008. Should it make it past R&D, it will be the most significant advance in computer hardware since the transistor; at the very least, obviating secondary memory.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
The way that I see it, we have not had so many new ideas in computing because we largely haven't needed them. We have been milking the old ideas, and getting so much out of them, such as the phenomenal growth of CPU speed.
When we need new ideas because the "well has run dry", so to speak, then we will see that necessity is the mother of invention.
The one activity I can think of that wasn't there in 1980 was Global Searching Across Disjoint Domains, i.e. Google and a (very few) predecessors - all of which were well post-1980. Associated with conventions for syntactic markup, I think it qualifies as a "new idea"; but I think it also has only just begun; there's a lot of overhead space to build up into.
One device that has the potential to accelerate this already lightning-speed vector will soon emerge as the combination camera/GIS/phone/network. It creates the opportunity to automatically collect, classify, and aggregate datapoints in four-dimensional space for the first time. Even tedious manual collections of this type of data are sprouting; imagine when it's done by default.
For better or worse.
Design Patterns, which brought computer science closer to computer engineering. GPS and internet address lookup for location-based interactions. Service Oriented Architecture (SOA).
Open PC design that led to affordable components (except from Apple :-) and competition that drove innovation and lower prices. This caused the big change from the user going to the computer - where there was a terminal to use - to the computer coming to the user, appearing at home and even in one's lap.
Games With a Purpose - collective intelligence tools like the ones Luis von Ahn and his team are developing might have been a dream before 1980, but there wasn't a widely deployed network with millions of people available, and a need (e.g. reCAPTCHA), to actually make it happen.
IP Multicast (1991) and Van Jacobson's Dissemination Networking (2006) are the biggest inventions since 1989.
This is a negative result, which is odd as a 'fundamental innovation', but I think it applies since it opened new areas of research and closed off useless ones.
The impossibility of distributed consensus: PODC Influential Paper Award: 2001
We assumed that the main value of our impossibility result was to close off unproductive lines of research on trying to find fault-tolerant consensus algorithms. But much to our surprise, it opened up entirely new lines of research. There has been analysis of exactly what assumptions about the distributed system model are needed for the impossibility proof. Many related distributed problems to which the proof also applies have been found, together with seemingly similar problems which do have solutions. Eventually a long line of research developed in which primitives were classified based on their ability to implement wait-free fault-tolerant consensus.
Low-cost/home computing. Something that (at least here in Blighty) wasn't really heard of until the early 1980s. Without home computing, how many people posting here would have got into computing as a career? Or even as a hobby1?
Myself, had my folks not got Clive Sinclair's humble rubber-keyed ZX Spectrum back in 1982/1983, I probably wouldn't be here now. And it wasn't just the Speccy: the C64, Vic-20, Acorn Electron, BBC A/B/Master, Oric-1, Dragon-32, etc. all fuelled the home computer market and made programmers out of every 8-year-old boy and girl who had access to one.
If that wasn't a revolution in terms of computing and programming, I don't know what was...!
1 Curious aside: what is the breakdown of hobbyists vs pro programmers on this site? I realise these stats aren't collated, but it could be interesting to know.
Augmented Reality. This hasn't really taken off yet, but as ideas go I think it is huge, from being able to paint virtual arrows on the ground to help you find your destination, to decorating everything around you with useful information or aesthetic fancies.
Imagine your phone ringing across the room; you look at it and an information bubble pops up above it to tell you who is calling. How cool would that be? AR will bring massive changes in the way we think about and interact with technology.
Haunted houses would probably get significantly scarier too.
I also wanted to mention electroencephalography for brain-computer interfacing, but apparently this was first invented in the 1970s.
Virtualization?
Applications like VirtualBox OSE or VMware have saved me many hours.
Adoption of Object Orientation.
The idea was around earlier (e.g. Simula), but it became mainstream in the 1990s. (IMHO, one of its greatest benefits is providing a common vocabulary amongst developers, so its widespread adoption made it much more valuable.)
I would also nominate the 3D mouse. There have been several variants in existence since the early 1990s. For anyone working with 3D, things like the SpaceNavigator make life much easier. (Disclaimer: I'm not affiliated with 3Dconnexion in any way, just a satisfied and now RSI-free user.)
I believe that nothing important was invented, but the perspective on software has changed a lot since the '80s. Back then there were more theoreticians involved in this thing, and now you are asking this question on a programmers' 'forum'.
Most of the ideas back then didn't get implemented, or when implemented they didn't have any real importance, as the software industry did not exist, nor did marketing or HR or development stages or alpha versions :).
Another reason for this lack of inventions is the fact that most people use Windows :) Don't get me wrong, I do hate M$, but look at it this way: you have a perfectly working interface, with nothing new to add to it, maybe just some new colored buttons. It's also closed enough so you won't be able to do anything with it without breaking it. That's why I prefer open apps; this way you get more "open" people, to whom you can actually talk, ask questions, and propose new ideas that actually get implemented, or at least put on an open todo-list, and thus you get some kind of "evolution". You don't really see anything new because you are stuck with the same basic interface "invented" lots of years ago... has anyone actually tried the ION window manager in a production environment? It has a new kind of interface, and actually lets you do things faster, even if it looks quirky.
M$, Adobe... you name it, hold lots of patents, so you won't be able to base your work on them or on derivatives (you also won't know what kind of undeveloped technologies they hold). Look at MP3 and GIF as examples (I believe that they are both free formats now, but they are also kinda dead...). MP3 is the 'king' of audio even if there are a few algorithms out there much better than it, but they didn't get enough traction because they weren't pushed on the consumer market. The GIF... come on, 256 colors??? From this point of view I'm curious how many people from this thread are working on something "open" that will get to be reused in some other projects, and how many on "closed" projects protected by NDAs?
Even if it sounds like a kinda "free Willy" speech, back in the 80's the software was free, you got documentation for everything, and all hardware was more simple and easier to work with... and also more limited, so people didn't actually waste time implementing 3D games or web pages but worked on real algorithms.
Ctrl-C + Ctrl-V + Ctrl-X combo :)
The first true multimedia personal computer, the Amiga: the first 32-bit preemptive multitasking personal computer, the first with hardware graphics acceleration, the first with multichannel sound, and in many ways a far more useful and capable machine than the multicore, multigigahertz Windows boxen that proliferate today.
The Bazaar style of development (as described in http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ by Eric S. Raymond). Raymond credits Linus Torvalds' release of the Linux kernel in 1991 as the first use of the Bazaar style of development.
Sensor networks: very tiny (nano-scale) computers form ad-hoc P2P networks and transmit "sensory" information.
3D printing: a Star Trek replicator for physical objects (no Earl Grey tea yet).
DNA computing: massively parallel computing for some types of problems.
Translation software with community support to make manual corrections and recommendations, followed up with an AI bot to form patterns to eventually distinguish and correctly predict ambiguity in different translations and contexts.
While it's true Google Translate might not be that beast, it is the mother, or perhaps the grandmother, of a system just waiting to be developed.
If you think about it - textual language is really input to the brain: the eyes see the text and send images to the brain, which then translates this into understanding.
While it's true communication (especially human communication) is an advanced topic, the basics are input (with context) -> translation -> understanding.
Why do we still have no really good way to send emails to distant co-workers, or partners who don't speak our language? This is obviously Phase 1.
Once this is complete, we can move onto stuff like real-time phone call translation.
Instead, month after month our greatest intellectual assets are involved in other more crucial projects, like space research and meteor detection, or trying to prove the Bible wrong (yawn).
How about we dedicate more time to basic practical communication?
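Purely as an illustration of the input -> translation -> understanding pipeline sketched above, a community-corrected translation layer might look something like the toy code below; machine_translate is a hypothetical stand-in for whatever machine-translation backend you have, and the correction store is just a dict.

    # Hypothetical sketch only: community corrections layered over a machine-translation backend.

    def machine_translate(text: str, source: str, target: str) -> str:
        """Stand-in for a real MT backend (e.g. some web API); not an actual implementation."""
        raise NotImplementedError("plug a real translation service in here")

    class CommunityTranslator:
        def __init__(self, backend=machine_translate):
            self.backend = backend
            # (source language, target language, original sentence) -> community-approved translation
            self.corrections = {}

        def suggest_correction(self, text, source, target, better):
            """A community member overrides a bad machine translation."""
            self.corrections[(source, target, text)] = better

        def translate(self, text, source, target):
            # Prefer a human-approved correction; otherwise fall back to the machine.
            key = (source, target, text)
            return self.corrections.get(key) or self.backend(text, source, target)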
USB Keys/Thumb drives
USB keys were the effective replacement for the floppy, which was still superior to the CD or DVD for simple file transfer.
I think a very important invention for computing in the past 50 years was GOOGLE. The internet means nothing without a good tool to search it. The advent of the search engine revolutionized the internet and enabled it to be monetized by the little guy.
RAID (1988).
Arguably this is just an application of error correction codes from years gone by, but then arguably everything in computer science can be reduced to basic mathematics, which has been around for millennia.
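For the curious, here is a tiny sketch (my own illustration, not any particular RAID implementation) of the XOR parity idea behind RAID levels 4 and 5: keep one parity block per stripe, and any single lost data block can be rebuilt from the survivors.

    from functools import reduce

    def xor_blocks(blocks):
        """Byte-wise XOR of equally sized blocks."""
        return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

    # Three data blocks striped across disks, plus one parity block.
    data = [b"disk0data", b"disk1data", b"disk2data"]
    parity = xor_blocks(data)

    # Disk 1 dies; its block is recovered from the surviving blocks plus parity.
    recovered = xor_blocks([data[0], data[2], parity])
    assert recovered == data[1]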
Augmented Reality
Where a view of the real world is combined with virtual elements in some way.
The term Virtual Reality was coined in 1989, a few years before the term "Augmented Reality" came into existence.
Some early enabling technologies were invented before 1980, but the concept itself dates from the early nineties (at least that's what Wikipedia says).
Maybe a forum of science fiction authors would give you more interesting answers? ;-)
I suspect there's a bit of a fallacy at work here: you're viewing the history of technology and science as a steady march of progress, as a linear phenomenon. I suspect it is in fact a process of fits and starts, context, economics, serendipity and plain ole randomness.
You should feel fortunate that you were at the centre of one of the great waves of history; most people will never have that experience.
A few answers mention quantum computers as if they're still far in the future, but I beg to differ.
There were vague mentions of the possibility of quantum computers in the 1970s and 1980s (see the timeline on Wikipedia); however, the first "working" 3-qubit NMR quantum computer was built in 1998. The field is still in its infancy, and almost all progress is still theoretical and confined to academia, but in 2007 a company called D-Wave Systems presented a prototype of a working 16-qubit, and later during the year a 28-qubit, adiabatic quantum computer. Their effort is notable since they claim that their technology is commercially viable and scalable. As of 2010, they have 7 rigs; the current generation of their chips has 128 qubits. They seem to have partnered with Google to find interesting problems to test their hardware on.
I recommend this short 24-minute video and the Wikipedia article on D-Wave for a quick overview, and there are a lot more resources on this blog written by the D-Wave founder and CFO.
MPI and PVM for parallelization.
0
Utilization of functional programming/languages within OS core development.
'Singularity', and all projects like it, i.e. the development of operating systems in managed code.
Not sure about 1980, but the AI community has been an idea-generator for decades, and they're still at it.
To answer a slightly different question: I think we need big ideas in the areas of Privacy, Trust and Reputation. My computer has the ability to capture almost everything about me - where I am, what I say, what I type, what I see... A huge amount of information, with an equally large number of entities (people, shops, sites, services) with whom I might want to share some of that information, even if it's just a single piece of data.
My information needs to be mine (not Google's, Facebook's or Apple's). My computer needs to use it on my behalf, and so trust needs to be end-to-end. Then we can dis-intermediate the new information middlemen.
(Widespread) Encryption. Without encryption no financial transaction would ever take place. And this is still an area which can use more innovation and user-friendliness.
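As a hedged illustration only (it assumes the third-party cryptography package and is not tied to any real payment system), authenticated symmetric encryption of a transaction message can be as short as this:

    # Sketch assuming the `cryptography` package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # shared secret; in practice negotiated/protected elsewhere
    f = Fernet(key)

    token = f.encrypt(b"pay alice 42.00 EUR")   # ciphertext with built-in authentication
    plaintext = f.decrypt(token)                # raises InvalidToken if the message was tampered with
    assert plaintext == b"pay alice 42.00 EUR"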
Multi-Agent Systems.
You can go back to distributed artificial intelligence roots, and I think still stay safely this side of the 80s.
There are many components to multi-agent systems, with lots of studies going into speech acts or cooperation, so it's rather difficult to point and say "See, here, this is different, innovative and important!" But I'll try anyway. :-)
I think the Belief-Desire-Intention model is particularly noteworthy. Agents have internally constructed models of the world. They have particular desires, or goals, and formulate plans on how to interact with the world as they know it to achieve those goals, thereby making up intentions.
Or, to use an analogy, the characters in Tron, the movie, had a certain understanding of how the world around them worked. They did not KNOW the whole world, and they could be mistaken about parts of it. But they had desires and goals, and they came up with plans to try to further them. If you saw Tron, I'm sure you'll get the analogy.
It hasn't had much of an impact on computing YET. But, see, things that have an impact on computing seem to take a few decades anyway. See: OOP, GC, bytecode compilation.
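To make the belief-desire-intention loop above concrete, here is a toy, non-authoritative sketch; the class and method names are mine, not taken from any BDI framework.

    class BDIAgent:
        """Toy belief-desire-intention loop, for illustration only."""

        def __init__(self, beliefs, desires):
            self.beliefs = dict(beliefs)   # the agent's (possibly wrong) model of the world
            self.desires = list(desires)   # states of the world the agent would like to bring about
            self.intentions = []           # plans the agent has committed to

        def perceive(self, observations):
            self.beliefs.update(observations)   # revise beliefs from new input

        def deliberate(self):
            # Keep only the desires that are not already true according to current beliefs.
            return [d for d in self.desires if not self.beliefs.get(d, False)]

        def plan(self, goal):
            # Stub planner: a real agent would search for a sequence of actions here.
            return ["do something that achieves " + goal]

        def step(self, observations):
            self.perceive(observations)
            self.intentions = [self.plan(g) for g in self.deliberate()]
            return self.intentions

    agent = BDIAgent(beliefs={"door_open": False}, desires=["door_open"])
    print(agent.step({"door_open": False}))   # [['do something that achieves door_open']]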
The massive increases in processor speed that have occurred over the last 30 years can't be overlooked. All manner of clever ideas such as pipelining and pre-emptive branching, as well as improvements on the electronic side of processor design, mean that programmers today can worry more about the design and maintainability of their programs and worry less about counting clock-cycles.
The mouse - There have been posts about human interaction. To me, the mouse was the gateway to human interaction. Without it, we'd still be typing and not clicking and dragging, even with our fingers.
GUI - Complemented the mouse perfectly. I work in an environment where an AS/400 is the backend of one of our major apps. Yeah... interesting stuff, but it just reminds me of the screens 'Bill Gates' is working on in the movie 'Pirates of Silicon Valley', even though that's not what it was. To me, 1 and 2 are the reason anybody, including grandpas and grandmas, can use a computer.
Excel / spreadsheets - Someone mentioned this before, but it's worth mentioning again. It's so user-friendly and is a great entry point for non-technical users to try their hand at simple programming concepts when performing calculations on cells. Granted it came out before 1980, but the versions post-1980 are when the technology in spreadsheets evolved.
Internet (of course) - Not sure how people wrote code without it! Don't flame me for repeating, because this belongs on every list.
INTELLISENSE - LOVE IT LOVE IT LOVE IT!!!!
The successful integration of different programming paradigms into single programming environments.
The exemplar of this (for me) is the Mozart/Oz programming system, which integrates functional, OO, logic, concurrent and distributed programming mechanisms into a coherent whole. There are other examples though.
The rise of motion sensors in gaming, which does away with traditional game joysticks and brings the user very close to the game itself. This complements our ever-changing urban landscape and lifestyle, where we have limited physical activity. This advancement in gaming definitely induces at least some physical activity while doing something that one enjoys, and it is definitely better than doing the same mundane reps at your gym.
I think most concepts in computing have mostly been undergoing refinement, but there have been some new developments, particularly in distributed computing.
- Robustness against failure and defection, and failure recovery, i.e. Paxos, Byzantine Fault Tolerance, etc.
- I know people have mentioned P2P, and that P2P communication was happening in the 70s, but with all due respect I don't think it was of the same nature as is commonplace today, with distributed hash tables, efficient dynamic ad-hoc networks, and most importantly, anonymity (a la Freenet, Tor). (A rough sketch of the hashing idea behind DHTs follows after the next paragraph.)
The majority of work has been refinement, and while many modern systems are little better than the original concepts first described in the 60s or earlier, some are orders of magnitude better.
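As promised above, here is a minimal sketch of consistent hashing, one building block behind distributed hash tables. It is my own toy illustration, not the algorithm of any particular DHT (Chord, Kademlia and friends add routing, replication and much more).

    import bisect
    import hashlib

    def node_id(key):
        """Map a string onto a large circular identifier space."""
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    class HashRing:
        """Toy consistent-hash ring: each key is owned by the next node clockwise."""
        def __init__(self, nodes):
            self._ring = sorted((node_id(n), n) for n in nodes)

        def owner(self, key):
            ids = [i for i, _ in self._ring]
            idx = bisect.bisect(ids, node_id(key)) % len(self._ring)
            return self._ring[idx][1]

    ring = HashRing(["node-a", "node-b", "node-c"])
    print(ring.owner("some-file.txt"))   # the node responsible for storing this key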
I would say that CDMA was/is an important and powerful new idea that was created after 1980.
C++ programming language (1983); template metaprogramming (1994).
X.500 and the X.500 series of standards (circa 1988). While the X.500 standards were inspired by telco standards dating back decades, they are significant as they paved the way for the widespread use of LDAP/AD and our current incarnation of X.509 certificates, to name a few.
A really hard question, since aside from ridiculously improved hardware there are few things that have been significantly positive inventions after that time. There are, though, many significant inventions from before 1980 that affect people only now, because they were infeasible back then.
Heck. Descent
The Enterprise Service Bus would appear to be a fairly recent 'invention', though of course it is based on much older technologies.
Also worth mentioning is the use of the Lengauer-Tarjan dominator tree algorithm for memory usage analysis.
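For context: in heap analysis, the objects an object dominates are exactly the ones that would become unreachable if it were freed, which is why a dominator tree is useful for memory analysis. Below is a sketch of the simple iterative dataflow algorithm rather than Lengauer-Tarjan itself (which is faster but considerably longer); the graph format is my own assumption.

    def dominators(succ, root):
        """Simple iterative dominator computation (not Lengauer-Tarjan).

        succ maps each node to a list of successors; returns node -> set of dominators.
        """
        nodes = set(succ) | {s for ss in succ.values() for s in ss}
        preds = {n: set() for n in nodes}
        for n, ss in succ.items():
            for s in ss:
                preds[s].add(n)

        dom = {n: set(nodes) for n in nodes}
        dom[root] = {root}
        changed = True
        while changed:
            changed = False
            for n in nodes - {root}:
                incoming = [dom[p] for p in preds[n]] or [set()]
                new = {n} | set.intersection(*incoming)
                if new != dom[n]:
                    dom[n] = new
                    changed = True
        return dom

    # Tiny object graph: root -> a -> c and root -> b -> c.
    print(dominators({"root": ["a", "b"], "a": ["c"], "b": ["c"], "c": []}, "root"))
    # c is dominated only by root (and itself), since it is reachable via either a or b.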
Digital music synthesizers.
I think the whole music scene was affected by the availability of cheap polyphonic synths. The early polyphonic synths were effectively multiple analog synths (discrete or using CEM or SSM chips). They were both expensive and very limited. During the 80's, the first digital systems arrived (I am not sure, but I think Kurzweil was one of the first). Today, mostly all are digital - even the analog ones are typically "virtual analog".
regards
EDIT: oops - I just found out that the Fairlight CMI was invented in 1978. So forget the above - sorry.
I'm not qualified to answer this in the general sense, but restricted to computer programming? Not much.
Why? I've been thinking about this for a while and I think we lack two things: a sense of history and a way to objectively judge everything we've produced. This isn't true in all cases, but it is in general.
For history, I think it's just something not emphasized enough in popular writing or computer science programs. Take language features, for example. A canonical source might be HOPL, but it's definitely not common knowledge among programmers to be able to mark the point in time, or the language, in which a feature like GC or closures first appeared. And of course after that there's knowledge of progression over time: how has OOP changed since Simula? Compare and contrast our sense of history with that of other fields like maybe political science or philosophy.
As for judgement, this is really a failure on our part to seek objective measures of success. Given foobar, in what measurable way has it improved some aspect of the act of programming, where foobar is any of design patterns, agile methodology, TDD, etc. etc.? Have we even tried to measure this? What do we even want to measure? Correctness, programmer productivity, code legibility, etc.? How? Software engineering should really be picking away at these questions, but I've yet to see it.
I think part of the problem with these answers is that they are either not well researched or are pointing to a new implementation of some technology that has seen significant "improvements." However, that is not a significant invention. For instance, anything talking about functional programming or object-oriented programming just fails; most of these ideas have been circulating since before most of the participants of SO were born.
In order to start thinking about this, I need a model for what "innovation" means.
The best model I've seen is the Technology Adoption Life Cycle. You can get an overview at this Wikipedia article.
Using this model, I began to ask myself... at what stage of the life cycle is software itself? We can think of "software" as a distinct technology from machinery, going all the way back to Babbage, or perhaps more precisely, to Lady Ada Lovelace.
But it surely remained at the very early pioneering stage at least until about 1951. That's the year programmed computers "went commercial" in terms of selling a model for a computer product, and building lots of units of that model. I'm thinking of the machine that Univac sold to the Census Bureau.
From 1951 to about 1985, software innovations were numerous. They mostly had to do with extending the span of computing to an ever wider field of endeavor. In parallel, mass marketing and mass production kept bringing the cost of entry down till the Apple and IBM PC made a programmable device a commonplace appliance.
Somewhere between 1980 and 1985, I'd say that software passed from the Innovator's domain to the "Early majority" domain. Sorry, guys, but that makes all of you that participated in MS-DOS, the Mac, Windows, C++ and Java early majority rather than innovators. That doesn't preclude your having done significant innovation on your own turf and in your own projects. It just means that the field itself had moved on from the earliest stage.
While the Internet's precursor had been around since the 1970s, it wasn't until Al Gore invented the internet (sorry) that everybody hooked up. At that stage, software passed from the early majority to the late majority. This shift was subtle, as the top of the bell curve suggests. Not every shop moved from early majority to late majority at the same time.
I don't think software has quite passed into the "laggard" stage yet, but I think that real innovators are tackling the problem of producing progress on different fronts today.
Two fronts that I can think of are Bioengineering and Information Appliances. Both of these fields require software, but the main thrust is not software innovation. It's applying software to uncharted territory. There are probably lots of other fronts that I'm not even aware of.
I would vote, as a Debian user, for package management. It makes OSX and Windows 7 look like primitive amateurish playthings.
But since package management was already mentioned, I will vote for X. The network-transparent window server has made a lot of applications possible. It's wonderful to be able to seamlessly summon programs running on different computers side by side on the same screen.
And that was a tad more impressive in the late 80s.
Bitcoin's solution to the double-spending problem. It was used to create a decentralized electronic currency. A variant called Namecoin uses the same technology to build a decentralized naming system (similar to DNS).
There were attempts to create cryptocurrency in the past (and the idea is certainly not new), but Bitcoin seems to be the first implementation which took off. Its unique P2P algorithm solves the double-spending problem without relying on any trusted authority.
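One ingredient of that algorithm is proof-of-work, which makes rewriting the transaction history prohibitively expensive. Below is only a toy sketch of the hash-below-target idea; real Bitcoin hashes a full block header with double SHA-256, and preventing double-spending also needs the longest-chain rule.

    import hashlib
    from itertools import count

    def proof_of_work(block_data, difficulty_bits):
        """Find a nonce whose hash falls below a target (toy difficulty scheme)."""
        target = 1 << (256 - difficulty_bits)
        for nonce in count():
            digest = hashlib.sha256(block_data + str(nonce).encode()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce

    nonce = proof_of_work(b"alice pays bob 1 coin", difficulty_bits=16)
    print("found nonce:", nonce)   # expensive to find, but cheap for anyone to verify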
Protected memory. Before protected memory, if your program made a mistake, you could start executing code anywhere - virtually always hanging the entire machine. That's right, reboot time!
Low cost of hardware. My first computer cost $500 in 1978 - a huge sum at the time. Lowering costs put PCs on every desk.
Natural Language Processing. The first time I encountered this was in the early 1990s with a program from Symantec called Q&A that let you query the database by typing English queries. I am still impressed by it to this day.
StackOverFlow.com
Paxos protocol. It's difficult to describe how valuable it is in the internet era.
Computer Graphics, Special Effects, and 3D Animation
I do not know if somebody has already answered "machine learning" - a significant new development that is developing fast, with intelligent spam filtering, stock market predictions, intelligent machines like robots, ...
Maybe machine intelligence will be the next big thing.
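As a small, hedged illustration of the spam-filtering example (it assumes scikit-learn is installed, and the four training emails are obviously made up), the classic bag-of-words plus naive Bayes recipe looks like this:

    # Sketch assuming scikit-learn (pip install scikit-learn); toy data for illustration only.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    emails = ["win a free prize now", "cheap pills online",
              "meeting at 10am tomorrow", "lunch with the team"]
    labels = [1, 1, 0, 0]   # 1 = spam, 0 = ham

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(emails)        # bag-of-words features
    model = MultinomialNB().fit(X, labels)

    print(model.predict(vectorizer.transform(["free prize meeting"])))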
Let's see, Connection Machines (Massive Parallelism) for one.
Anyway, this whole question seems like an egoboo for Alan Kay, since he invented everything.
The mathematics for quantum computing has been around since before 1980, but the hardware isn't here yet and may be physically and economically infeasible for many years to come.
The Personal Computer.
Hands down, the most important part of computing in the last thirty years is that everyone is now part of it. Computers for home use only date to 1977 or so, and widespread adoption took until well into the 80's. Now kindergartens, senior centers, and every next-door neighbor you'll ever have owns one.
The Internet.
That's it.
I'd have to say that the biggest invention in computing since 1980 is Moore's law. There were tons of really cool, innovative things created in the 1960s and 1970s - but they were insanely expensive one-off projects. And most of these projects are lost in the mists of time.
Today, the cool, innovative project gets a couple rounds of funding and is available on everybody's desktop or web browser in 6 months or so.
If that's not innovative, what is?
I would say Linux and the reification of the worse-is-better philosophy, but you can argue that those are older. So I'd say: quantum, chemical, peptide, DNA, and membrane computing; (re)factoring in a non-ad-hoc and automated fashion; aspects; generic programming; some types of type inference; some types of testing.
The reason why we have no new ideas: software patents (this comes from the late 60s...), corporations and education.
Personal Broadcast Communication
Facebook, Twitter, Buzz, Qaiku... the implementations are varying, focusing on different aspects - managed audience, conciseness, discussions. The specific services come and go, but the new concept of communication remains. Blogs are of course what started this, but the new services have made the communication socially connected, which is an essential difference.
Not quite sure if this exactly goes under the subject of computing, but it's something that's significant, and only made possible by computing and networks.
Open Croquet http://www.opencroquet.org - a Squeak, Smalltalk-based 3D environment which lets multiple users interact with and program the environment from inside itself. It has its own object replication protocol for sharing environments efficiently and scalably over the internet. It's difficult to describe because there just isn't anything else remotely like it...
1) I'm proposing this because when I try to explain to other people what it is, I find them expecting me to compare it to other things... and I still haven't found anything remotely like it, although there are many elements present from other systems (e.g. Smalltalk, OpenGL, etoys, virtual worlds, remote collaboration, object-oriented replication architectures), the whole seems to be much more than the parts...
2) Unlike many of the technologies mentioned here, it hasn't settled down into a widely exploited commercial niche...
Both points are signs of an early-stage technology.
I suspect that when Alan Kay started work on it, he might have been thinking about the theme of this question in the first place.
Fast clustering algorithms (O(n log n) in the number of data points), such as DBSCAN (from 1996), seem to all date from after 1980.
These have been part of a general wave of progress in data-mining techniques.
Contrast this with the lack of progress in line-finding, for which poorly scaling techniques like the Hough transform still seem to represent the state of the art.
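For anyone curious, a minimal sketch of the DBSCAN usage mentioned above, assuming scikit-learn is available (the points and parameters are arbitrary):

    # Sketch assuming scikit-learn; two tiny blobs plus an outlier, just to show the API.
    import numpy as np
    from sklearn.cluster import DBSCAN

    points = np.array([[0, 0], [0.1, 0.2], [0.2, 0.1],    # cluster A
                       [5, 5], [5.1, 5.2], [5.2, 4.9],    # cluster B
                       [9, 0]])                           # noise
    labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(points)
    print(labels)   # e.g. [ 0  0  0  1  1  1 -1]; -1 marks noise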