The earliest known tool for use in computation was the abacus, which is thought to have been invented in Babylon circa 2400 BCE. It was originally used by drawing lines in sand and placing pebbles on them. This was the first known computing aid and the most advanced system of calculation of its time, preceding Greek methods by some 2,000 years. Abaci of a more modern design are still used as calculation tools today. In 1115 BCE, the South Pointing Chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also developed a more sophisticated abacus from around the 2nd century BCE, known as the Chinese abacus.
Around 1642, Blaise Pascal, a leading French mathematician, constructed the first mechanical adding device, based on a design described by the Greek mathematician Hero of Alexandria. Then, in 1672, Gottfried Wilhelm Leibniz invented the Stepped Reckoner. None of these early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.
Before the widespread internetworking that led to the Internet, most communication networks were limited by their nature to communication only among stations on the local network, and the prevalent computer networking method was based on the central mainframe computer model. Several research programs began to explore and articulate principles of networking between physically separate networks, leading to the development of the packet-switching model of digital networking. These research efforts included those of the laboratories of Vinton G. Cerf at Stanford University, Donald Davies, Paul Baran, and Leonard Kleinrock at MIT and at UCLA. The research led to the development of several packet-switched networking solutions in the late 1960s and 1970s, including the ARPANET, Telenet, and the X.25 protocols. Additionally, public-access and hobbyist networking systems grew in popularity, including Unix-to-Unix Copy (UUCP) and FidoNet.
The terms "bug" and "debugging" are both popularly attributed to Admiral Grace Hopper in the 1940s. While she was working on the Mark II computer at Harvard University, her associates discovered a moth stuck in a relay, impeding its operation, whereupon she remarked that they were "debugging" the system. However, the term "bug" in the sense of a technical error dates back at least to 1878 and Thomas Edison, and "debugging" seems to have been used as a term in aeronautics before entering the world of computers. Indeed, in an interview Grace Hopper remarked that she did not coin the term; the moth fit the already existing terminology, so she saved it.
The Internet was developed in the United States from the late 1950s through the 1970s by researchers and scientists at the newly formed Advanced Research Projects Agency (ARPA), created after the Soviet Union launched Sputnik. Realizing that the United States had suffered a great technological blow by allowing the USSR to achieve the first successful satellite launch, ARPA set out to create a brand-new technology unlike anything that had been done before, and the Internet was the result of that work.
ENIAC stands for Electronic Numerical Integrator and Computer. It was a secret World War II military project carried out by John Mauchly, a 32-year-old professor at Penn's Moore School of Electrical Engineering, and J. Presper Eckert Jr., a 24-year-old genius inventor and lab assistant. The challenge was to speed up the tedious mathematical calculations needed to produce artillery firing tables for the Army. ENIAC was not completed until after the war, but it performed until 1955 at Aberdeen, Maryland. ENIAC was enormous. It contained 17,500 vacuum tubes, linked by 500,000 soldered connections. It filled a 50-foot-long basement room and weighed 30 tons.
A computer program is a set of ordered instructions that enable a computer to carry out a specific task. A program is prepared by first formulating the task and then expressing it in an appropriate programming language. Programmers may work in machine language or in assembly languages.
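As a minimal illustration of this idea (the example itself is not from the original slides), the sketch below formulates a task first, then expresses it as an ordered set of instructions in Python:

```python
# Task (formulated first): compute the average of a list of numbers.
# The program below expresses that task as ordered instructions.

def average(numbers):
    """Return the arithmetic mean of a non-empty list of numbers."""
    total = 0
    for n in numbers:        # each instruction runs in order, once per item
        total += n
    return total / len(numbers)

print(average([2, 4, 6, 8]))  # prints 5.0
```

A programmer working in assembly or machine language would express the very same ordered steps, only at the level of individual processor instructions rather than high-level statements.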
The word "binary" describes a system that has only two possible digits. To understand this, first compare it to a system you are probably more familiar with, the decimal system, in which each digit position represents a power of ten. The binary system works essentially the same way, with the only difference that each position represents a power of two and there are only two digits. These are visually expressed by the digits 0 and 1. Every number expressed in the binary system is a combination of these two digits.
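The positional idea above can be sketched in a few lines of Python (a small illustrative example, not part of the original slides): each binary digit weights a power of 2, just as each decimal digit weights a power of 10.

```python
# Sketch: convert a string of binary digits to its decimal value by
# treating each position as a power of 2.

def binary_to_decimal(bits):
    """Convert a string of 0s and 1s to the integer it represents."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)  # shift left one place, add new digit
    return value

print(binary_to_decimal("1011"))  # 1*8 + 0*4 + 1*2 + 1*1 = 11
print(bin(11))                    # Python's built-in check: '0b1011'
```

Running the conversion on "1011" gives 11, and Python's built-in `bin` confirms the reverse mapping.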
The first fully automatic calculating machine was designed by British computing pioneer Charles Babbage, who first conceived the idea of an advanced calculating machine to calculate and print mathematical tables. Conceived by him in 1834, this machine, the Analytical Engine, was designed to evaluate any mathematical formula and to have even higher powers of analysis than his original Difference Engine of the 1820s.
In 1975, Ed Roberts designed the first microcomputer, the Altair 8800, which was produced by Micro Instrumentation and Telemetry Systems (MITS). The same year, two young hackers, William Gates and Paul Allen, approached MITS and promised to deliver a BASIC interpreter. They delivered it, and from its sale Microsoft was born.