Ministry of Development of Information Technologies and Communications




11) Computer graphics deals with generating images with the aid of computers. Today, computer graphics is a core technology in digital photography, film, video games, cell phone and computer displays, and many specialized applications. A great deal of specialized hardware and software has been developed, with the displays of most devices being driven by computer graphics hardware. It is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing. It is often abbreviated as CG, or typically in the context of film as computer generated imagery (CGI). The non-artistic aspects of computer graphics are the subject of computer science research.
Some topics in computer graphics include user interface design, sprite graphics, rendering, ray tracing, geometry processing, computer animation, vector graphics, 3D modeling, shaders, GPU design, implicit surfaces, visualization, scientific computing, image processing, computational photography, scientific visualization, computational geometry and computer vision, among others. The overall methodology depends heavily on the underlying sciences of geometry, optics, physics, and perception.
Computer graphics is responsible for displaying art and image data effectively and meaningfully to the consumer. It is also used for processing image data received from the physical world, such as photo and video content. Computer graphics development has had a significant impact on many types of media and has revolutionized animation, movies, advertising, and video games in general.
The term computer graphics has been used in a broad sense to describe "almost everything on computers that is not text or sound".[2] Typically, the term computer graphics refers to several different things:

  • the representation and manipulation of image data by a computer (see the sketch after this list)

  • the various technologies used to create and manipulate images

  • methods for digitally synthesizing and manipulating visual content (see the study of computer graphics)
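To make the first item above concrete, here is a minimal sketch in Python (the pixel values are invented for illustration) of image data as a computer represents it, together with one elementary manipulation:

# A tiny grayscale image represented as a grid of pixel intensities
# (0 = black, 255 = white in 8-bit grayscale).
image = [
    [  0,  64, 128],
    [ 64, 128, 192],
    [128, 192, 255],
]

# One elementary manipulation: invert every pixel to produce a negative.
negative = [[255 - pixel for pixel in row] for row in image]

for row in negative:
    print(row)

Everything else in computer graphics, from rendering to image processing, ultimately operates on representations of this kind, only far larger and richer.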

Today, computer graphics is widespread. Such imagery is found in and on television, newspapers, weather reports, and in a variety of medical investigations and surgical procedures. A well-constructed graph can present complex statistics in a form that is easier to understand and interpret. In the media "such graphs are used to illustrate papers, reports, theses", and other presentation material.[3]
Many tools have been developed to visualize data. Computer-generated imagery can be categorized into several different types: two-dimensional (2D), three-dimensional (3D), and animated graphics. As technology has improved, 3D computer graphics have become more common, but 2D computer graphics are still widely used. Computer graphics has emerged as a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Over the past decade, other specialized fields have developed, such as information visualization and scientific visualization, the latter more concerned with "the visualization of three dimensional phenomena (architectural, meteorological, medical, biological, etc.), where the emphasis is on realistic renderings of volumes, surfaces, illumination sources, and so forth, perhaps with a dynamic (time) component".
12) The World Wide Web ("WWW" or "The Web") is a global information medium which users can access via computers connected to the Internet. The term is often mistakenly used as a synonym for the Internet itself, but the Web is a service that operates over the Internet, just as email and Usenet also do. The history of the Internet dates back significantly further than that of the World Wide Web.
The hypertext portion of the Web in particular has an intricate intellectual history; notable influences and precursors include Vannevar Bush's Memex,[3] IBM's Generalized Markup Language,[4] and Ted Nelson's Project Xanadu.[3] Paul Otlet's Mundaneum project has also been named as an early 20th-century precursor of the Web.[5]
The concept of a global information system connecting homes is prefigured in "A Logic Named Joe", a 1946 short story by Murray Leinster, in which computer terminals, called "logics", are present in every home. Although the computer system in the story is centralized, the story anticipates a ubiquitous information environment similar to the Web. The cultural impact of the Web was imagined even further back in a short story by E. M. Forster, "The Machine Stops", first published in 1909.
In 1980, Tim Berners-Lee, an English independent contractor at the European Organization for Nuclear Research (CERN) in Switzerland, built ENQUIRE as a personal database of people and software models, but also as a way to play with hypertext; each new page of information in ENQUIRE had to be linked to an existing page.[3]
Berners-Lee's contract in 1980 was from June to December, but in 1984 he returned to CERN in a permanent role, and considered its problems of information management: physicists from around the world needed to share data, yet they lacked common machines and any shared presentation software.
13) The ARPANET experiment was a complete novelty on the computer science scene. Most of the people involved in the day-to-day work with implementing hardware and software were graduate students, and the personal accounts provided by participants suggested a true spirit of invention, but also of confusion: "No one had clear answers, but the prospects seemed exciting. We found ourselves imagining all kinds of possibilities: interactive graphics, cooperating processes, automatic data base query, electronic mail, but no one knew where to begin". The most important task for the participants in this fledgling network was to ensure the stability of the communication protocol. During the following years the group's participants succeeded in creating a protocol scheme. The idea was to have an underlying protocol taking care of establishing and maintaining communication between the computers on the network and a set of protocols which performed a number of particular tasks. This scheme was successfully tested (only one of the 15 sites involved failed to establish a connection).
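The division of labor described above can be sketched in a few lines of Python. This is a toy simulation with invented message formats, not the actual ARPANET protocols: one underlying routine handles delivery between machines, and task-specific protocols are layered on top of it.

# Hypothetical illustration of protocol layering (invented formats).

def transport_send(payload: bytes, destination: str) -> bytes:
    # Underlying protocol: establishes and maintains delivery between
    # computers; here reduced to framing the payload with a header.
    header = f"TO={destination};LEN={len(payload)}|".encode()
    return header + payload

def mail_protocol(recipient: str, message: str) -> bytes:
    # A task-specific protocol (electronic mail) built on the layer below.
    return transport_send(f"MAIL {recipient}: {message}".encode(), "host-2")

def file_protocol(filename: str, data: bytes) -> bytes:
    # Another task-specific protocol (file transfer) reusing the same layer.
    return transport_send(b"FILE " + filename.encode() + b"\n" + data, "host-3")

print(mail_protocol("alice", "The protocol test succeeded"))
print(file_protocol("notes.txt", b"14 of 15 sites connected"))

This is the same split that later produced an underlying connection layer and the application protocols, such as mail and file transfer, that run over it.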
During the 1970s the ARPANET was constantly evolving in size and stability, and was the subject of a number of seminal developments, among which the most noteworthy were electronic mail and the establishment of a transatlantic connection. In addition, work was undertaken to improve the basic communication protocols and modernize them in step with the constant growth of the ARPANET. The military use of the network did not have any direct impact on the civilian use of the research network as such, but it highlights the fact that the Internet of today was conceived as a military communications tool.
The following years witnessed the birth of the Usenet. Developed by university students Tom Truscott and Jim Ellis, the Usenet turned out to be the ultimate expression of the physical anarchy of the ARPANET (no central command control, all connected computers being completely equal in their ability to transmit and receive packets). Truscott and Ellis created a hierarchy of computer user groups which were distributed among a growing number of academic institutions via modems and phone lines. This hierarchy soon came to accommodate a wide range of interests, from computer programming to car maintenance, and enabled participants to read and post information and opinions in what became known as the Usenet Newsgroups. Newsgroups can be described as discussion groups, each devoted to a particular topic.
At first the Usenet was a largely unofficial activity involving a number of graduate students, but it soon proved to be the network service that contributed most heavily to the international growth of the internetworking principle. Usenet connections were established with several European countries and Australia.
The creation of the ARPANET was followed by the creation of the NSFNET, a signal that universities had begun to consider networking an essential tool for research. A high-speed network connection, referred to as the "backbone", was established among the five supercomputing centers, which in turn made their facilities available to universities in their region, effectively making the network completely decentralized.
14) The years 1989-96 were another pivotal period for what had effectively become known as the Internet, underscoring the fact that the original ARPANET had been followed by a myriad of fast-growing sub-networks operating in the U.S. and internationally. In 1989 the ARPANET was decommissioned, and in April 1995 the NSFNET reverted to a pure research network, leaving a number of private companies to provide Internet backbone connectivity. At the same time the number of hosts as well as the network traffic grew at an enormous rate.
This veritable explosion in network use, beyond the fact that the personal computer became a household item over the same span of time, can be attributed largely to a research proposal submitted to the funding authorities of the European Laboratory for Particle Physics in Switzerland, CERN (a French abbreviation of Conseil Européen pour la Recherche Nucléaire). The title was "WorldWideWeb: Proposal for a HyperText Project," and the authors were Tim Berners-Lee and Robert Cailliau.
The World-Wide Web (also known as the WWW or Web) was conceived as a far more user-friendly and navigationally effective user interface than the previous UNIX-based text interfaces. The communications protocol devised for the WWW was termed HTTP (HyperText Transfer Protocol), hypertext being a navigational tool that links data objects, be they text or graphics, together by association in what is effectively a web of pages, hence the term "World-Wide Web." Berners-Lee and Cailliau describe the process as follows: "A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse....When you select a reference, the browser [the software used to access the WWW] presents you with the text which is referenced: you have made the browser follow a hypertext link."
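The mechanism they describe can be shown in miniature (the page content below is invented, and the request is a simplified HTTP/1.0-style GET rather than a full modern one):

# A hypertext page embeds references to other pages; in HTML the
# reference is an anchor tag. (Content invented for illustration.)
page = '<p>See the <a href="http://info.cern.ch/about.html">project page</a>.</p>'

# Following the link makes the browser send an HTTP request like this
# simplified one to the server named in the reference:
request = (
    "GET /about.html HTTP/1.0\r\n"
    "Host: info.cern.ch\r\n"
    "\r\n"
)
print(page)
print(request)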
The WWW prototype was first demonstrated in December 1990, and on May 17, 1991 the WWW went into service when HTTP access was granted to a number of central CERN computers. As soon as browser software became available for the more common operating systems such as Microsoft Windows and the Apple Macintosh, this new tool was immediately picked up by the Internet community. The World-Wide Web, the simplicity of Internet access for private individuals, and the increasing user-friendliness of the software needed to master the Internet protocols all contributed to the meteoric rise of network use in the 1990s. Browsing through the original WWW proposal reveals an irony very characteristic of the development of the Internet, in light of its authors' assertion that "the project will not aim to do research into multimedia facilities such as sound and video." In 1996 the present and future of the Internet, and the WWW in particular, point to a convergence of media types, and multimedia has indeed become the catchphrase of the day.
Despite serious limitations in contemporary network capacity where sound and video are concerned, new technologies constantly enable richer interactive network experiences. This development is supplemented by constant innovation in hardware; today's Internet backbones transmit data packets at speeds of up to 200 megabits per second (by comparison, the NSFNET backbone of 1986 ran at the blazing speed of 56 kilobits per second). Today the modems of most Internet users run at 28.8 kbit/s and a digital connection can deliver up to 128 kbit/s, but the possibility of using the fiber-optic cables that bring cable TV to millions of homes for Internet data transmission opens the way for private connections running at up to 10 Mbit/s.
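To put these figures in perspective, the short calculation below assumes an idealized link with no protocol overhead and shows how long a 10-megabyte file would take at each of the speeds just mentioned:

# Idealized transfer times for a 10 MB file (no protocol overhead).
FILE_BITS = 10 * 1_000_000 * 8  # 10 megabytes expressed in bits

links = {
    "1986 NSFNET backbone (56 kbit/s)": 56_000,
    "Typical modem (28.8 kbit/s)": 28_800,
    "Digital line (128 kbit/s)": 128_000,
    "Cable connection (10 Mbit/s)": 10_000_000,
    "Modern backbone (200 Mbit/s)": 200_000_000,
}

for name, bits_per_second in links.items():
    print(f"{name}: {FILE_BITS / bits_per_second:,.1f} seconds")

On the 28.8 kbit/s modem of the day such a file takes roughly 46 minutes; over a 10 Mbit/s cable connection, about eight seconds.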
Another new technology, ADSL, promises to use the existing telephone copper wires for even higher transmission speeds. But what will these network technologies deliver to the Internet user? In 1996 commercial Internet hosts overtook educational and governmental ones, and these commercial interests clearly consider the Internet, and the WWW in particular, a vehicle for online advertising and commerce. Hence the Net user of today can be described as a consumer. The Internet is still a powerful medium for communication, and has in many ways fulfilled the vision of interactive computing which fueled J.C.R. Licklider's imagination, but it remains to be seen whether it will be the democratizing medium of the 21st century, or merely become another static-filled television channel.
15) On October 4, 1957, the Soviet Union launched the world’s first manmade satellite into orbit. The satellite, known as Sputnik, did not do much: It relayed blips and bleeps from its radio transmitters as it circled the Earth. Still, to many Americans, the beach-ball-sized Sputnik was proof of something alarming: While the brightest scientists and engineers in the United States had been designing bigger cars and better television sets, it seemed, the Soviets had been focusing on less frivolous things—and they were going to win the Cold War because of it.
After Sputnik’s launch, many Americans began to think more seriously about science and technology. Schools added courses on subjects like chemistry, physics and calculus. Corporations took government grants and invested them in scientific research and development. And the federal government itself formed new agencies, such as the National Aeronautics and Space Administration (NASA) and the Department of Defense’s Advanced Research Projects Agency (ARPA), to develop space-age technologies such as rockets, weapons and computers.
Scientists and military experts were especially concerned about what might happen in the event of a Soviet attack on the nation’s telephone system. Just one missile, they feared, could destroy the whole network of lines and wires that made efficient long-distance communication possible. 
In 1962, a scientist from M.I.T. and ARPA named J.C.R. Licklider proposed a solution to this problem: a “galactic network” of computers that could talk to one another. Such a network would enable government leaders to communicate even if the Soviets destroyed the telephone system.
In 1965, another M.I.T. scientist developed a way of sending information from one computer to another that he called “packet switching.” Packet switching breaks data down into blocks, or packets, before sending it to its destination. That way, each packet can take its own route from place to place. Without packet switching, the government’s computer network—now known as the ARPAnet—would have been just as vulnerable to enemy attacks as the phone system.
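The idea is simple enough to demonstrate in a few lines of Python (a toy simulation; real packets also carry addressing, checksums, and routing information):

import random

# Split a message into fixed-size packets, each tagged with a sequence
# number so the receiver can reassemble them in order.
message = b"Packets can take different routes and still arrive intact."
PACKET_SIZE = 8

packets = [
    (seq, message[i:i + PACKET_SIZE])
    for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
]

# Each packet may take its own route, so arrival order is not guaranteed.
random.shuffle(packets)

# The receiver sorts by sequence number and reassembles the message.
reassembled = b"".join(chunk for _, chunk in sorted(packets))
assert reassembled == message
print(reassembled.decode())

Because no single route is essential, the network keeps working even when individual links fail, which is precisely what made the design attractive.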