What is the full form of CGI?


(i) CGI: Computer Generated Imagery


CGI stands for Computer Generated Imagery. It is an application of computer graphics (imaging software) used to create realistic-looking, often three-dimensional, still and animated visual content: anatomical models, architectural designs, video game art, special effects in movies and other electronic media, and more. In short, it allows you to create characters and motion that look real and that may not be achievable by other methods.

This technique manipulates the environment and creates photorealistic images for print and electronic media such as movies, videos, games, etc. As CGI visuals are more cost-effective than traditional photographic ones, they are widely used throughout the world. A single artist can produce content with CGI without using actors, set pieces, or props.

CGI is created with the help of wireframe models. Features such as reflection and illumination can be assigned to the wireframes and then adjusted as required to make the image or video look real. The quality of the visual effects produced with CGI is higher and more controllable than that of physical effects, such as building miniatures for effects shots or hiring extras for crowd scenes.
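As a purely illustrative sketch (not tied to any particular 3D package or tool named in this article), a wireframe model can be thought of as a set of vertices and the edges connecting them, with surface attributes such as reflectivity and illumination attached for the renderer to adjust:

```python
from dataclasses import dataclass, field

@dataclass
class Material:
    """Surface properties a renderer could use to shade the wireframe."""
    reflectivity: float = 0.1   # 0 = matte, 1 = mirror-like
    illumination: float = 1.0   # how strongly light sources affect the surface
    color: tuple = (200, 200, 200)

@dataclass
class WireframeModel:
    """A minimal wireframe: 3D points plus the edges that connect them."""
    vertices: list = field(default_factory=list)   # [(x, y, z), ...]
    edges: list = field(default_factory=list)      # [(i, j), ...] indices into vertices
    material: Material = field(default_factory=Material)

# A unit cube with a fairly reflective surface; tweaking the material
# changes how "real" the rendered result looks.
cube = WireframeModel(
    vertices=[(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)],
    edges=[(0, 1), (0, 2), (0, 4), (3, 1), (3, 2), (3, 7),
           (5, 1), (5, 4), (5, 7), (6, 2), (6, 4), (6, 7)],
    material=Material(reflectivity=0.6, illumination=0.9),
)
```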

How CGI Works:

First, artists create computer-generated graphics; then the texture, lighting, and color are adjusted to make the graphics look real rather than cartoonish. In live-action films, the graphics are mixed with previously filmed scenes, and the lighting on the graphics must match the lighting of the scene for the finished product to look seamless.

CGI was first used in a movie in 1973, in Michael Crichton's "Westworld". A few years later, it was used in "Star Wars", and in 1993 it was used in "Jurassic Park". Many other movies make good use of CGI, such as Avatar, The Lord of the Rings, Inception, Finding Nemo, The Matrix, and more.

History

The earliest examples of computer-generated imagery date to the 1950s, when mechanical computers were used to design patterns on animation cels that were then incorporated into a feature film. Alfred Hitchcock's Vertigo (1958) was the first movie to employ such imagery.

Even though Hitchcock may have started things off early with some 2D trickery, it was not until Edwin Catmull and Fred Parke's 1972 computer-animated short film A Computer Animated Hand that 3D computer graphics were shown to the world. To make it, Catmull drew 350 triangles and polygons in ink on his own hand, which were then converted to digital form and laboriously animated in a 3D animation program that Catmull himself developed.

A few years later, with Hollywood's support, CGI made another advancement. In 1973, Westworld featured the first 2D CGI scene in a feature film, showing "Gunslinger vision," a depiction of how a robot might see the world. The film was popular enough that a sequel was produced.


(ii) CGI: Common Gateway Interface

CGI stands for Common Gateway Interface. It is a standard that enables a web browser to submit forms and connect to programs through a web server. CGI can also be described as a set of rules by which a program or script can send data back to the web server, where it can be processed.

So, it is an interface for running executables via a web server. In general, it means taking an HTTP request and passing it to an application in order to deliver a dynamically generated HTML page back to a browser. However, any program that can run on a web server is usable as a CGI script. Generally, CGI programs are used to generate pages dynamically or to perform some other action when someone fills out an HTML form and clicks the submit button. CGI applications can be written in any programming language, some of which are Perl, PHP, and Python.
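As a sketch of what such a program might look like, the following Python CGI script reads a form field from the query string and writes a dynamically generated HTML page to standard output. The script name (greet.py), its location under /cgi-bin/, and the field name "name" are illustrative assumptions, not details from this article:

```python
#!/usr/bin/env python3
# Minimal CGI script: the web server passes request data via environment
# variables and expects the response (headers + body) on standard output.
import os
import html
from urllib.parse import parse_qs

# For a GET request, form data arrives in the QUERY_STRING environment variable.
query = parse_qs(os.environ.get("QUERY_STRING", ""))
name = query.get("name", ["world"])[0]

# The script must emit at least a Content-Type header, then a blank line,
# then the document body.
print("Content-Type: text/html")
print()
print(f"<html><body><h1>Hello, {html.escape(name)}!</h1></body></html>")
```

On a server configured to execute CGI scripts, requesting a URL such as /cgi-bin/greet.py?name=Alice would return a page greeting "Alice".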

How CGI Works


The browser sends a URL that causes the web server to use CGI to run a program. The browser runs on a client machine and exchanges information with the web server using the HyperText Transfer Protocol (HTTP). Depending on the type of request from the browser, the web server either serves a document from its own document directory or executes a CGI program, passing the input from the reader (browser) to the program and the output from the program back to the reader (browser). Thus, CGI works as a gateway between the web server and the program you write.

The steps involved in creating a dynamic HTML document on the fly through CGI are as follows:

  1. The client sends an HTTP request through a URL.
  2. From the URL, the Web server decides that it should activate the gateway program listed in the URL and send any parameters passed via the URL to that program.
  3. The gateway program processes the information and returns HTML text to the Web server. The Web server adds a MIME header and sends the HTML text to the Web browser.
  4. The web browser renders the document received from the web server.
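To make steps 2 and 3 concrete, the sketch below imitates what a web server does when it activates a gateway program: it sets the standard CGI environment variables, runs the program as a separate process, and captures the headers and HTML the program prints. The script path and query string are hypothetical and match the earlier greet.py sketch:

```python
import os
import subprocess

# Hypothetical request: GET /cgi-bin/greet.py?name=Alice HTTP/1.1
env = dict(os.environ)
env.update({
    "GATEWAY_INTERFACE": "CGI/1.1",
    "REQUEST_METHOD": "GET",
    "SCRIPT_NAME": "/cgi-bin/greet.py",
    "QUERY_STRING": "name=Alice",
    "SERVER_PROTOCOL": "HTTP/1.1",
})

# The server launches the gateway program with this environment and reads
# whatever it prints (headers, a blank line, then the HTML body).
result = subprocess.run(
    ["python3", "cgi-bin/greet.py"],   # path to the script from the earlier sketch
    env=env,
    capture_output=True,
    text=True,
)

# A real server would complete the HTTP status line and MIME headers
# before relaying this output to the browser.
print(result.stdout)
```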

Qualities of CGI

The following list includes some of the advantages of CGI:

  • It is a very well-supported and defined standard.
  • Typically, CGI scripts are written in Perl, C, or even just a straightforward shell script.
  • CGI works together with HTML: scripts receive form data and return HTML to the browser.
  • CGI is one of the fastest ways to build simple dynamic features such as page counters.
  • Because CGI runs entirely on the server, it works with virtually all modern browsers.

Benefits of CGI:

  • Currently, CGI is easier to use than Java for performing complex operations.
  • Using pre-written code is usually simpler than writing your own.
  • CGI allows programs to be written in any language and on any platform, as long as they adhere to the standard.
  • There are many ready-made CGI-based counters and CGI programs that carry out basic functions.

Disadvantages of CGI

The following list includes some of the disadvantages of CGI:

  • Because a CGI program must be loaded into memory for every page load, each request incurs overhead.
  • In general, it is difficult to cache data in memory between page loads.
  • A substantial amount of existing code is written in Perl.
  • Processing time for CGI is relatively high, since each request starts a new process.




