Who In The Heck Are These Smart People. And How Did They Learn To Hack & Manipulate The Internet

26 replies
  • OFF TOPIC
Okay folks. I have a question. Who are the people that were the brains behind the technical workings of the internet? And how and where did they acquire such knowledge to be able to build the software, algorithms, or hacks that changed the internet forever?


These were obviously some geniuses that had studied this from the beginning of Internet times.

So who started computer science, and how in the world were people smart enough to learn the actual workings of it in order to build their systems?

So does anyone know how the unknown computer geeks were able to accomplish starting the internet and manipulating software for all purposes?
  • Profile picture of the author ForumGuru
    Banned
    Originally Posted by Dano101 View Post

So does anyone know how the unknown computer geeks were able to accomplish starting the internet and manipulating software for all purposes?
    This is a decent place to start regarding the internet...

    https://en.wikipedia.org/wiki/ARPANET

Check out the Internet History Timeline: ARPANET to the World Wide Web below...

    http://www.livescience.com/20727-internet-history.html

    As far as computer science goes...the first Department of Computer Science in the United States was established at Purdue University in 1962.

    https://www.cs.purdue.edu/history/history.html

    Cheers

    -don
  • Profile picture of the author Dano101
I have another question. Is that Google movie The Internship with Vince Vaughn and Owen Wilson realistic in certain parts?

Does Google actually find the best of the best through an intern program?

Also, where and how did these candidates learn how to make the internet or Google more efficient?
  • Profile picture of the author ForumGuru
    Banned
    These are the schools Apple, Microsoft, IBM, Google, Facebook, Yahoo and Twitter turned to most often for recruits (from: Wired - class of 2014):



    Cheers

    -don
  • Profile picture of the author socialentry
    In the Marines.
  • Profile picture of the author seasoned
    Originally Posted by Dano101 View Post

Okay folks. I have a question. Who are the people that were the brains behind the technical workings of the internet? And how and where did they acquire such knowledge to be able to build the software, algorithms, or hacks that changed the internet forever?


These were obviously some geniuses that had studied this from the beginning of Internet times.

So who started computer science, and how in the world were people smart enough to learn the actual workings of it in order to build their systems?

So does anyone know how the unknown computer geeks were able to accomplish starting the internet and manipulating software for all purposes?
The computer industry as we know it started with electronics engineers. It was built from the ground up, and even UNIX effectively had hundreds of people working on it.

    "computer science" is now a nebulous term, but it is basically a number of theories that should seem pretty obvious, and info on what was created. So electronics and various theories creates a new piece of hardware, or software, and, if it becomes popular enough and is unique enough, it may be taught in computer science.

As for fame? That is the way it often works. Even some STARS don't get credit! On Laurel and Hardy, Laurel is almost always the one portrayed as a fool, but he had a lot more to do with the show than merely starring on it. The Three Stooges could have shown more talent, etc., but they had the show they did.

And Jobs? He fired a lot of talent, and demanded that people gamble everything on a new computer. It was a total flop. Overpriced, incompatible, etc... They did everything wrong. Meanwhile, Wozniak's name may become a footnote, even though he had so much to do with the company.

OH, and as far as colleges? ForumGuru might be right about the first degree being at Purdue in 1962, but computer science predates that a LOT. The first computer that may still be viable for direct application to work here was likely one announced in 1964 and delivered in 1965. And NOPE, it wasn't the first. HOW did people learn about that IBM computer? They either had to study at a place like IBM, or they had to work at a large company that trusted them enough to even give them access.

Most people that graduated from that class at Purdue would have been lost dealing with the internet. They would have had to learn a unique language created in 1969, and a lot of other things created in the 70s.

One thing is for sure: it is always evolving. Colleges have to constantly update courses to keep up, and they don't always do that. Computer courses often start with some history. They might talk about a computer that used no electricity, one that used no electronic theory (the one that actually gave us the term BUG), one that used electronics but no silicon, etc.... They might talk about languages many consider to be dead, or about a file access method that few today ever really think about. These may seem antiquated now, but they were once state of the art.

As for where people find out how to speed up processes? THAT is where the computer science comes in - like those theories and even tools I spoke of. You could have hundreds of people trying to improve various parts.

    Steve
    • Profile picture of the author ForumGuru
      Banned
      Originally Posted by seasoned View Post

OH, and as far as colleges? ForumGuru might be right about the first degree being at Purdue in 1962, but computer science predates that a LOT. The first computer that may still be viable for direct application to work here was likely one announced in 1964 and delivered in 1965. And NOPE, it wasn't the first. HOW did people learn about that IBM computer? They either had to study at a place like IBM, or they had to work at a large company that trusted them enough to even give them access.
Actually, the first M.S. degrees were awarded in 1964, and the first Ph.D. degrees were awarded in 1966.

      Originally, only four computers connected to ARPAnet in 1969, all of them in their respective computer research labs. The four nodes were:

UCLA (Honeywell DDP-516 computer)
Stanford Research Institute (SDS 940 computer)
UC Santa Barbara (IBM 360/75)
University of Utah (DEC PDP-10)

UCLA and SRI were the first two nodes to connect...

As the network grew, different models of computers were connected, creating compatibility issues... The solution was a better set of protocols: TCP/IP (Transmission Control Protocol/Internet Protocol), designed in the 1970s and adopted as the ARPANET standard on January 1, 1983, to alleviate the connectivity/compatibility issues.

      Several other major innovations occurred under/with ARPAnet:

      Email (1971)
      Telnet (1972)
      FTP (1973)

      A bit of Purdue history...

Purdue's computer science department acquired a VAX 11/780 in 1978, which was the FIRST VAX running outside the developer's sites, Berkeley and AT&T Bell Labs.



Introduced in 1977, the VAX-11/780 was (for a time) the standard in CPU benchmarks. It was supposed to be a one-MIPS machine, equivalent to an IBM System/360. It was fairly widely reported that the actual performance was about half of that. VAX is an acronym for Virtual Address eXtension.

A few notable Purdue alumni...

MSEE '62 and former computer science instructor Frank Greene Jr.'s company, ZeroOne Systems, successfully established the facility for the first Cray supercomputer installation at NASA (Ames Research Center).

BSIE '75, MSIE '78 Rocky Rhodes, along with a team led by Jim Clark, founded Silicon Graphics in 1982. Silicon Graphics became a leading manufacturer of high-end visual computing systems.

      Purdue is no stranger to new technology and cutting edge study.

BSEE '33 Edward Purcell - Nobel Laureate for Nuclear Magnetism (Physics, 1952). Purcell found a way to detect magnetism around the atomic nucleus, and his work contributed greatly to the development of the MRI. He was a science advisor for Presidents Eisenhower, Kennedy, and Johnson.

      The first and the last man to walk on the moon both graduated from Purdue.

      BSAE ’55 Neil Armstrong walked on the moon in 1969.

      BSEE ’56 Eugene Cernan walked on the moon in 1972.

Apollo 1 pioneers Roger Chaffee (BSAAE ’57) and Gus Grissom (BSME ’50) were killed in a fire aboard the Apollo Command Module; the subsequent changes made the module much safer for future astronauts.

      Purdue has put a bunch of peeps on the space shuttle as well...

      MSAAE ’66 John E. Blaha

      MSAAE ’66 Roy D. Bridges Jr.

      BSAAE ’73 Mark N. Brown

      MSAAE ’67 John H. Casper

      MSAAE ’69 Richard O. Covey

      BS ’89, Solid Earth Sciences, MS ’91, Geophysics Andrew J. Feustel

      MSAAE ’86 Guy S. Gardner

      BSAAE ’78 Gregory J. Harbaugh

      MSAAE ’72 Gary E. Payton

      Cheers

      -don
      • Profile picture of the author seasoned
        Originally Posted by ForumGuru View Post

Actually, the first M.S. degrees were awarded in 1964, and the first Ph.D. degrees were awarded in 1966.

Originally, only four computers connected to ARPAnet in 1969, all of them in their respective computer research labs. The four nodes were:

UCLA (Honeywell DDP-516 computer)
Stanford Research Institute (SDS 940 computer)
UC Santa Barbara (IBM 360/75)
University of Utah (DEC PDP-10)
        I might as well say I was referring to the IBM 360! As you can see, it is the third one you listed. The 4th doesn't count, as it was created AFTER the early 60s. Do the other 2 exist in any form in the internet era?

        Steve
        • Profile picture of the author ForumGuru
          Banned
          Originally Posted by seasoned View Post

          I might as well say I was referring to the IBM 360! As you can see, it is the third one you listed. The 4th doesn't count, as it was created AFTER the early 60s. Do the other 2 exist in any form in the internet era?

          Steve
The SDS 940 (from Scientific Data Systems) was the first machine to directly support time-sharing. Announced in 1966, it became a major part of Tymshare. The Stanford Research Institute's "oN-Line System" (NLS) was demo'd on this system.

ARPANET was connected to an SDS 940 at SRI in October 1969... these machines served some of the very early bulletin board projects, and Tymshare was the biggest buyer back in the day. When Xerox bought SDS, the machine became the XDS 940.

The DDP-516 was the basis of the "Interface Message Processors" (IMPs) used to connect the very first networked computers to the ARPANET. The first 516s were manufactured in 1966.

The PDP-10, manufactured from 1966 to 1980, is the machine that made time-sharing common; it was a top choice of universities and research labs back in the day.

          The IBM 360 was released in 1965.

Let us not forget that ARPANET's first 4 nodes were not all connected until December 1969... the first message was transmitted from UCLA's SDS Sigma 7 host computer to SRI's SDS 940 host computer.

          Below is an early sketch of the first four nodes...



          Cheers

          -don
          • Profile picture of the author David Beroff
            Originally Posted by ForumGuru View Post

            Below is an early sketch of the first four nodes...
And now we're at a point where a four-byte IP address (i.e., 2^(4*8) = 4+ billion nodes) isn't sufficient.
            Signature
            Put MY voice on YOUR video: AwesomeAmericanAudio.com
            • Profile picture of the author seasoned
              Originally Posted by David Beroff View Post

And now we're at a point where a four-byte IP address (i.e., 2^(4*8) = 4+ billion nodes) isn't sufficient.
Well, even with 4 nodes, it is kind of a misnomer. Because the idea was ALWAYS that you could have a loopback, group addressability, and unit addressability. So those 4 could have been hundreds.

              Still, when it was created, the world laughed at the idea of everyone having a computer. Do you realize that almost NOBODY has a phone anymore!?!?!?!?!? HECK, I don't! And the phone company doesn't provide many phone lines anymore. The last three companies I worked at, that were LARGE INTERNATIONAL companies, had NO phones!!!!!!!!!! They had NO phone lines!

YEAH, I know, you think I am nuts. Cell phones today, for example, are NOT phones, and don't use phone lines. They are handheld computers that convert the analog input to digital, and then transfer it over a network. And this means they end up using IP services. They CAN use unroutable addresses and DHCP to a degree, but it is still a lot of address usage.

The way things are going now, they may need 25 or more addresses per person. Assume a doubling of the world's population, because that is where it is headed, and that is about 300 billion addresses. If you assume that a business needs twice as many and 1 in 10 are businesses, that is like 10 billion more, so 330 billion. LUCKILY IPv6, the latest standard, is FAR larger. HOPEFULLY, in a few years, they will switch to it. ALL the latest Windows, Linux, and Unix versions now support it, and I believe all the new hardware does. So they just need to upgrade the drivers and old hardware, and set up all the software right.

BTW the 4 billion nodes IPv4 supposedly handles is actually LESS!!!! You don't get the full 2^(4*8), because whole blocks of addresses are reserved. WHY? 10.x.x.x is INTERNAL and used for other things, 127.x.x.x is loopback, and you have 192.168.x.x, etc.... They are non-routable addresses.

              Steve
  • Profile picture of the author ForumGuru
    Banned
    Computer Evolution...

The Difference Engine was first designed by Charles Babbage in 1822, with Difference Engine No. 2 designed by 1849. It was capable of computing several sets of numbers and making a hard copy of the result. A full-scale machine was not completed back then due to a lack of funding; the London Science Museum finished a working Difference Engine No. 2 in 1991, and its printing mechanism in 2000.


The Analytical Engine was Charles Babbage's design for the first general mechanical computer. In 1910 his son, Henry Babbage, completed a portion of the machine that was able to perform basic calculations.



The first programmable computer was the Z1, created by Konrad Zuse in 1936. It was the first electro-mechanical, binary, programmable computer.



Developed by Tommy Flowers in 1943, the Colossus was the first electronic programmable computer. It was used to help break encrypted German messages.



The Atanasoff-Berry Computer (ABC) was developed from 1937 to 1942 at Iowa State University. The ABC, not ENIAC, was the first electronic digital computer. The ABC used vacuum tubes...



ENIAC was invented at the University of Pennsylvania and completed in 1946. The behemoth weighed 50 tons, took up 1,800 square feet, and contained about 18,000 vacuum tubes. Many still consider it to be the first digital computer, as it was fully functional.



The first stored program ran on the Manchester SSEM, a 1948 British machine nicknamed "Baby"; the EDSAC (also British) followed in 1949.

In 1950, the UNIVAC 1101 (ERA 1101) became what is considered the first system capable of storing and running a program from memory.



The Z4 (Zuse), begun in 1942, became the first commercial computer when it was sold in 1950.

In 1953 IBM introduced its first commercial scientific computer, the 701.

MIT's Whirlwind machine, operational in 1951, was the first digital computer with real-time graphics; magnetic-core RAM was added to it in 1953.



In 1956 the TX-0, an MIT project, became the first transistorized computer.

The first minicomputer, the PDP-1, was released by DEC in 1960.

The first mass-market computer was the Programma 101, introduced at the 1964 World's Fair. The computer was invented by Pier Giorgio Perotto... 44,000 units were sold at $3,200 each.

In 1968 HP released the HP 9100A... considered to be the first mass-marketed desktop computer.

In 1974 Xerox demoed the first workstation, the Alto (first built in 1973); it was never sold to the public.

The first microprocessor, the Intel 4004, was released in 1971.

The first microcomputer was the Micral, built around the Intel 8008 processor and released in 1973. The Micral was developed by François Gernelle and André Truong Trong Thi. It was the first commercial non-kit (pre-assembled) computer.

The first personal computer was the Altair 8800 in 1975... Ed Roberts invented it and coined the term "personal computer". This rig sold for $750.



In 1975 the first portable computer was released: the IBM 5100. For $8,975 (base model) you got a 55 lb machine with a 1.9 MHz processor and 16 KB of RAM, expandable to 64 KB.

In 1976 the first Cray-1 system was installed at Los Alamos National Laboratory. The $8.8 million machine boasted 160 million floating-point operations per second and 8 MB of main memory.

The Apple I was a computer kit developed by Steve Wozniak in 1976. The first Apple as a stock machine ran at 1.023 MHz and came with 4 KB of RAM.

The first IBM personal computer, codenamed "Acorn" (the model 5150), was released in 1981.

Among the first multimedia computers (based on the MPC standard) were the Tandy M2500 and M4020, released in 1992.

    Sorry, I became bored with running down pix to match the machines, but this is a decent evolutionary overview. Yes, I missed a few notables, but this should be comprehensive enough for off-topic work.

    Cheers

    -don
    • Profile picture of the author seasoned
Yeah, Babbage was the non-electric one I spoke of. The Mark 1 was the first non-electronic but electric one, and ENIAC the first non-silicon electronic one. But yeah, I WAS talking about what they may cover in the US. Machines like Flowers' were supposedly top secret. Of course it would be silly to keep it secret TODAY.

      Steve
      • Profile picture of the author ForumGuru
        Banned
        Originally Posted by seasoned View Post

Yeah, Babbage was the non-electric one I spoke of. The Mark 1 was the first non-electronic but electric one, and ENIAC the first non-silicon electronic one.
        Steve
Actually, the first non-silicon electronic computer was the Atanasoff-Berry Computer (Iowa State).

        The Atanasoff-Berry Computer (ABC)

In 1973, US Federal Judge Earl R. Larson ruled that the ENIAC patent by Eckert and Mauchly was invalid, and Atanasoff was recognized as the inventor of the electronic digital computer.

        Cheers

        -don
  • Profile picture of the author seasoned
Universities used the PDP-10 in 1980? UNREAL! ADMITTEDLY I was immersed in DEC (working for a DEC OEM) only from about 1979, but from some time before that until even like 1997, when DEC's year-2000 problem first reared its head and may have played a part in their downfall, the PDP-11 seemed to be the most popular. HECK! By 1980, the PDP-11/23 seemed to be the SMALLEST system they had. I'll never understand why the DECmate used a PDP-8.

    I was talking about the relevance of that early degree to the internet.

    Steve
    • Profile picture of the author ForumGuru
      Banned
      Originally Posted by seasoned View Post

Universities used the PDP-10 in 1980? UNREAL!
      They sure did...

It must have been 1980. I was about 17, went to school in Darmstadt, Germany, and was 2 years before my university-entrance diploma.

      >snip<

But then my school participated in an early "computers for schools" project.

They gave us a single DECwriter connected over a 300 baud modem to the computer center of the Technical University in Darmstadt. The machine on the other end was a PDP-10 under TOPS-10, where the whole school had one single user account.

A physics teacher tried to build up a class teaching BASIC, but soon the project consisted merely of five nerds fighting for time at the DECwriter.


      PDP-10 KI10: Personal memories
...TU Darmstadt (Germany, founded 1877) was the first university in the world to set up a chair in electrical engineering (1882), which led to the foundation of the electrical engineering faculty in 1883.

      Cheers

      -don
  • Profile picture of the author Christopher Fox
    Originally Posted by Dano101 View Post

So does anyone know how the unknown computer geeks were able to accomplish starting the internet and manipulating software for all purposes?
    It wasn't unknown computer geeks - it was the Military/Defense Contractors/University Research Departments.

    The Internet's FIRST purpose, why it was invented, was to serve the Military. Then, they gave those 'unknown computer geeks' their hand-me-down communications infrastructure ...
    Signature
    One man alone can be pretty dumb sometimes, but for real bona fide stupidity, there ain't nothing can beat teamwork.

    - Seldom Seen Smith
    • Profile picture of the author seasoned
      Originally Posted by Christopher Fox View Post

      It wasn't unknown computer geeks - it was the Military/Defense Contractors/University Research Departments.

      The Internet's FIRST purpose, why it was invented, was to serve the Military. Then, they gave those 'unknown computer geeks' their hand-me-down communications infrastructure ...
      ACTUALLY, the language, O/S, and networking were created by PRIVATE industry! The military is still crowing about Grace Hopper and COBOL.

      UNIX | operating system | Britannica.com (OS)(late 1960s) AT&T
      https://en.wikipedia.org/wiki/Ethernet (MEDIA/NETWORKING)(first COMMERCIALLY used in 1980, but created in 1973) XEROX

      https://en.wikipedia.org/wiki/C_(programming_language) (LANGUAGE)(1969-1973) BELL LABS

      As for computer languages?

A-0, 1951, Grace Hopper. I have never seen this language (and can't seem to find a sample of it anywhere), but Wikipedia describes it as more of a kind of linker than a compiler. That would make it like FORTH without the ability to make new "commands". It may not even have had branches! Still, it was a low-level language. The other languages I list here are high-level.
      https://en.wikipedia.org/wiki/Grace_Hopper
FORTRAN, 1957, IBM
COBOL, 1959, designed by the CODASYL committee (building on Hopper's FLOW-MATIC). Frankly, it had a lot of CRAZY problems, as far as I am concerned: WORDY, periods that you had to REALLY worry about, often strict on positions of statements, etc....
LISP, 1958, McCarthy of MIT
PASCAL, 1968, Niklaus Wirth
FORTH, designed by Charles "Chuck" Moore in 1971
C, developed in 1972 by Dennis Ritchie while working at Bell Labs in New Jersey (according to another site)

Anyway, the only non-proprietary OS that handled networking, had the defined standards, etc., was UNIX, and that is what most of the internet runs on today. Linux is as much a UNIX as many of the other derivatives. Linus gave up on a lot of the big changes, and on disregarding UNIX standards, after he tried to get X Windows to work. It was too complicated, so he ended up actually making Linux more UNIX-like. C was UNIX's first language. Last I knew, UNIX and Linux were still written in it. Most UNIX/Linux utilities are also. And Ethernet is still the main wired standard for networks.

      Even HTTP came out to the public quickly, and as opensource.

BTW, COBOL was for business. Just the paper for program printouts, and the cards to program it, probably cost a FORTUNE. PASCAL was built to teach programming, but is still a nice language. C is actually nicknamed "the compiler language", because so many compilers and the like are written in it, and it can allocate memory better than most languages can.

Still, nobody hacked the internet to learn it unless they were just dumb. Hacking has only been done to tweak, or to try to break things. Even THEN, any commercial undertaking has to be done with an understanding of the documented standards, because failure to do so can cause the changes to become worthless, or cause problems with other stuff.

      Steve
      • Profile picture of the author ForumGuru
        Banned
        Originally Posted by seasoned View Post


Still, nobody hacked the internet to learn it unless they were just dumb. Hacking has only been done to tweak, or to try to break things. Even THEN, any commercial undertaking has to be done with an understanding of the documented standards, because failure to do so can cause the changes to become worthless, or cause problems with other stuff.

        Steve
        Interestingly enough, tons of companies, universities, private and public entities, and even countries hold "hackathons".

Many of those are specifically ---> exploratory programming jam sessions. A hackathon, or hacking marathon, usually refers to exploratory programming, quite often employed to spur new product development and/or feature innovation, as well as the development of young programming talent.

Sun Microsystems first referred to a 5-day "hackathon" event in 1999... At that event, John Gage challenged the attendees to write a Java program that would let the Palm V communicate with other Palm users via the infrared port and register it on the net.

The famous Facebook Like button is the product of a Facebook hackathon.

Hackathons are practiced quite often among many of the industry bigs: Facebook, Google, etc....

        https://en.wikipedia.org/wiki/Hackathon

Events like HackIllinois take place all over the country. 2016 API sponsors included: Apple, Microsoft, IBM, Intel, John Deere, Go Daddy, Facebook, Google, AT&T, Yahoo, etc. etc.

        https://www.hackillinois.org/api

        In the span of 36 hours, our hackers are challenged to innovate, design and achieve.

        At HackIllinois, we believe in three core values:

        Encouraging students to explore their creativity
        Empowering students to create meaningful additions to the field of technology
        Building a community of peers and mentors


        https://www.hackillinois.org/
        A few of the hacks from HackIllinois 2015...

        http://news.mlh.io/favourite-hacks-h...015-03-18-2015

Facebook held its 32nd Hackathon way back in July of 2012...

        http://www.wired.com/2012/07/faceboo...amp-hackathon/

Anyway, some folks may think the term "hack" only applies to something malicious or with criminal intent, but as we know, a hackathon often refers to a collaborative computer programming session that takes place over an extended period of time. When introduced to the startup world, hackathons were a way employees could take time off from their everyday job to code and develop new ideas, projects, and innovations.

        Cheers

        -don
        • Profile picture of the author seasoned
          Originally Posted by ForumGuru View Post

          Interestingly enough, tons of companies, universities, private and public entities, and even countries hold "hackathons".

Many of those are specifically ---> exploratory programming jam sessions. A hackathon, or hacking marathon, usually refers to exploratory programming, quite often employed to spur new product development and/or feature innovation, as well as the development of young programming talent.

Sun Microsystems first referred to a 5-day "hackathon" event in 1999... At that event, John Gage challenged the attendees to write a Java program that would let the Palm V communicate with other Palm users via the infrared port and register it on the net.

The famous Facebook Like button is the product of a Facebook hackathon.

Hackathons are practiced quite often among many of the industry bigs: Facebook, Google, etc....

          https://en.wikipedia.org/wiki/Hackathon

Events like HackIllinois take place all over the country. 2016 API sponsors included: Apple, Microsoft, IBM, Intel, John Deere, Go Daddy, Facebook, Google, AT&T, Yahoo, etc. etc.

          https://www.hackillinois.org/api



          A few of the hacks from HackIllinois 2015...

          Our Favorite Hacks: HackIllinois 2015 Edition

Facebook held its 32nd Hackathon way back in July of 2012...

          Deep Inside a Facebook Hackathon, Where the Future of Social Media Begins | WIRED

Anyway, some folks may think the term "hack" only applies to something malicious or with criminal intent, but as we know, a hackathon often refers to a collaborative computer programming session that takes place over an extended period of time. When introduced to the startup world, hackathons were a way employees could take time off from their everyday job to code and develop new ideas, projects, and innovations.

          Cheers

          -don
Yeah, I guess I was using the term "hack" in a way more like getting into a system, or using a feature for another purpose. Some will refer to iterative programming as hacking, even though most programming ends up being that. Even some things like the "SDLC" plans, like AGILE, talk about it like it is some recent revelation, even though it isn't.

Using the infrared port, as you mentioned, would fall into the case where I said:

          "Hacking has only been done to tweak, .... Even THEN, any commercial undertaking has to be done with the understanding of documented standards because failure to do so can cause the changes to become worthless, or cause problems with other stuff.

If they did things right, they could use it to do so much more. If they did it wrong, it could interfere with other software on the system.

          Steve
  • Profile picture of the author ForumGuru
    Banned
    Originally Posted by Christopher Fox View Post

    It wasn't unknown computer geeks - it was the Military/Defense Contractors/University Research Departments.

    The Internet's FIRST purpose, why it was invented, was to serve the Military. Then, they gave those 'unknown computer geeks' their hand-me-down communications infrastructure ...
    Erhmmmm... ---> I thought I may have covered this in sufficient detail with the ARPANET and LiveScience links I posted in the very first reply to this thread.

Yes, the Internet was born from a military and academic research project (ARPANET), but it is more a product of west coast culture than of the Pentagon. Yes, many know/believe the stated objective was to give the DoD a way to communicate between the different branches of the military during and after a nuclear attack.

ARPANET was originally designed and used to allow many different locations to time-share computer resources, basically the sharing of computer resources between scientific universities. Contrary to popular belief, its design was not necessarily military in nature, but scientific. Furthermore, ARPANET was not originally intended to be used to link people, send messages, or be a communications/information facility. (See: Bob Taylor (Pentagon) & Larry Roberts (ARPA))

Anyway, below I have compiled a brief historical overview of the internet and WWW, and a few of the important folks who helped create them....

1961 - Leonard Kleinrock published "Information Flow in Large Communication Nets". Kleinrock was at MIT at the time... he then joined the faculty at UCLA, where he remains today.

1962 - J.C.R. Licklider (BBN) became the first director of the IPTO. Licklider, Kleinrock, and Robert Taylor (NASA at the time) helped create the idea of a network... that network became ARPANET. The Information Processing Techniques Office (IPTO) was part of the United States Department of Defense's Advanced Research Projects Agency (ARPA, later DARPA).

1965 - Two MIT computers communicate using packet-switching technology. Lawrence Roberts and Thomas Marill connect a TX-2 at MIT Lincoln Lab with a Q-32 at System Development Corporation in Santa Monica, CA. The connection used a leased line provided by Western Union.

1968 - The Network Working Group met for the first time... Elmer Shapiro (Stanford Research Institute) chaired, and others at the meeting included Steve Carr, Ron Stoughton, Steve Crocker and Jeff Rulifson.

1968 - Shapiro released the report titled "A Study of Network Design Parameters". Based largely on Shapiro's, Paul Baran's, and Thomas Marill's work, Lawrence Roberts (MIT & DARPA) and Barry Wessler (DARPA) created the Interface Message Processor (IMP) specifications. Bolt Beranek and Newman Inc. (BBN) received the contract to design and build the IMP subnet.

1969 - On July 3, UCLA publishes a press release introducing the internet to the public.

1969 - On Aug 29, the first network switch, an IMP, is sent to UCLA.

1969 - On Sept 2, the first data is transferred from the UCLA host to the switch.

1969 - On Oct 29, the first internet message was sent from Kleinrock's lab at UCLA to SRI... this link became the internet's original backbone. Below is the original IMP log from the very first internet transmission.



1971 - Ray Tomlinson (Bolt, Beranek and Newman) sends the first network email. Of course BBN was instrumental in the development of ARPANET. Sadly, Tomlinson passed away last month.

1973 - Bob Metcalfe (Xerox) develops the Ethernet idea.

1973 - Vinton Cerf (Cal, Stanford & DARPA) and Robert Kahn (DARPA) design TCP. For most folks, Kahn and Cerf are considered the inventors (or fathers) of the internet.

1974 - The first commercial network, Telenet (BBN), is born; it is considered the first Internet Service Provider.

1977 - Dennis Hayes and Dale Heatherington release the 80-103A modem. This and their other modems were a popular choice for home users going online.

1978 - Danny Cohen, John Shoch and David Reed split TCP into TCP and IP to support real-time traffic. The split paved the way for UDP, and TCP/IP was standardized on ARPANET in 1983. (A minimal UDP sketch follows this timeline.)

1984 - Jon Postel and Paul Mockapetris introduce DNS, the domain name system. The first internet domain name, symbolics.com, was registered on March 15, 1985.

1989 - Commercial dial-up is born as "The World" is introduced... I believe they still serve a few dialup customers to this day.

1990 - Tim Berners-Lee (CERN) develops the HTML prototype.

1991 - Berners-Lee introduces the WWW to the public on Aug 6.

1993 - Mosaic becomes the most widely used graphical WWW browser.

1995 - Java is developed by James Gosling and others at Sun and is released to the public.

1995 - JavaScript is developed by Brendan Eich (Netscape) and is first introduced with Netscape Navigator 2.xxx. Eich later co-founds the Mozilla project.

1997 - I registered my first domain name.

1998 - Google is born.

1999 - Napster arrives.

2003 - I made my first sale from the internet.

2004 - Facebook hits the web.

2005 - YouTube launches.

2009 - The internet's 40th anniversary.

2012 - I joined Warrior Forum.

Vinton Cerf and Robert Kahn are widely recognized as two of the founding fathers of the internet. The WWW was invented by Tim Berners-Lee. Al Gore, you may ask? I believe he may have popularized, or coined, the term --> information superhighway.

    The earliest written use of the word "internetworking" appears to be by Vint Cerf in 1974.

    Cheers

    -don
  • Profile picture of the author barbling
    If you want a really great read from prior to that time, check out:

Hackers: Heroes of the Computer Revolution - 25th Anniversary Edition
It talks about MIT's Tech Model Railroad Club and other pre-Internet thingees... utterly fascinating read!

    And... 'way back in 1992 when I worked at Bell Labs, this happened:

    An Evening with Berferd
    In Which a Cracker is Lured, Endured, and Studied

    http://www.cheswick.com/ches/papers/berferd.pdf

    We knew how to have fun back then!

    Good times, good times....
    • Profile picture of the author ForumGuru
      Banned
Hey, Barb... Dang it, I completely forgot about getting back to this thread a few weeks ago. For $4 shipped, I'm going to add that one to my collection. Thanks.

Originally Posted by barbling View Post

      And... 'way back in 1992 when I worked at Bell Labs, this happened:

      An Evening with Berferd
      In Which a Cracker is Lured, Endured, and Studied

      http://www.cheswick.com/ches/papers/berferd.pdf

      We knew how to have fun back then!

      Good times, good times....
      This reminds me a bit of when I was a radio frequency and crypto tech in the U.S. Navy aboard a guided missile cruiser back in the 80s.

Most would not believe the things a few crafty ETs could pull off on the ship... including fooling almost everyone on the ship, in the battle group, in the fleet, and at command at the highest levels, all the way back to the Pentagon.

It's just nuts what a couple of guys could do with a small transmitter, on a U.S. ship, off the coast of Iran just a couple of years after the Iranian hostage crisis ended.

In the process of providing a little "underground entertainment" for AM/FM listeners off the coast of Iran, in the Persian Gulf in the early 80s, a few RF ETs on my ship inadvertently fooled the entire command, and our entire battle group was going to be ordered to "act" on the new "intelligence" and move to a different location in the Gulf. Uh-oh...

Needless to say, the rogue radio station was shut down when our department head got wind that high command thought the broadcasts were real, originating from mainland Iran. He did not know who or what was broadcasting... but our division officer was a little brighter than the average bear, and he came to us telling us to shut it down immediately - if we had anything to do with the mysterious transmissions.

Yeah, the EWs in our battle group must not have been all that great back in those days. Not at all....

      Crazy stuff....thanks for your input!

      Cheers

      -don
  • Profile picture of the author yukon
    Banned
It's not like the internet started out like you see it today. It evolved, just like the telegraph (circa 1753) evolved into the telephone (circa 1876). That's a time span of 123 years.
  • Profile picture of the author seasoned
OH, and Stephen Wozniak may say he hacked the floppy disk controller. He DID tweak things to do what he wanted. STILL, the overall function, and the basic device he controlled, had things he had to understand. Trying to do it blindly could have taken DECADES! He did NOT study another controller at first. TIME WAS RUNNING OUT! He wanted to get it working properly, integrated with the system, mass-produced, and ADVERTISED BEFORE CHRISTMAS! He ended up looking at another controller, found he was doing too much, and hit his target. Who knows, that could have saved Apple.

IMAGINE if he had tried to reverse engineer everything through signals and all. He might STILL be trying to get it to work. And what if he had picked something that only worked with THAT drive?

    Steve
