Thursday, October 29, 2009

RAM

Random-access memory (usually known by its acronym, RAM) is a form of computer data storage. Today, it takes the form of integrated circuits that allow stored data to be accessed in any order (i.e., at random). The word random thus refers to the fact that any piece of data can be returned in a constant time, regardless of its physical location and whether or not it is related to the previous piece of data.
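In program terms, "random access" is simply what array indexing gives you. The following minimal C sketch (the values and the access order are arbitrary, chosen only for illustration) reads elements in a scattered order, and each read costs the same: one address computation plus one load, regardless of what was read before.

/* A minimal sketch of random access: reading array elements in an
 * arbitrary order costs the same per element as reading them in
 * order, because each index maps directly to a memory address. */
#include <stdio.h>

int main(void)
{
    int data[8] = {10, 11, 12, 13, 14, 15, 16, 17};
    int order[4] = {6, 1, 7, 0};   /* an arbitrary access pattern */

    for (int i = 0; i < 4; i++) {
        /* data[order[i]] is one address computation plus one load,
         * independent of which element was read before it */
        printf("data[%d] = %d\n", order[i], data[order[i]]);
    }
    return 0;
}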

By contrast, storage devices such as magnetic discs and optical discs rely on the physical movement of the recording medium or a reading head. In these devices, the movement takes longer than data transfer, and the retrieval time varies based on the physical location of the next item.

The word RAM is often associated with volatile types of memory (such as DRAM memory modules), where the information is lost after the power is switched off. Many other types of memory are RAM as well, including most types of ROM and a type of flash memory called NOR flash.

Modern types of writable RAM generally store a bit of data either in the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM), EPROM, EEPROM and Flash. Some types have circuitry to detect and/or correct random faults, called memory errors, in the stored data, using parity bits or error correction codes. RAM of the read-only type, ROM, uses a metal mask to permanently enable or disable selected transistors, instead of storing a charge in them.
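As a concrete illustration of the parity idea, here is a minimal sketch in C (a toy example, not any particular memory controller's logic): a parity bit is stored alongside the data so that the total number of 1 bits is even, and a single flipped bit is detected when the recomputed parity no longer matches.

/* Even parity, sketched in software: the parity bit makes the total
 * number of 1 bits (data plus parity) even, so one flipped bit is
 * detectable because the recomputed parity disagrees. */
#include <stdint.h>
#include <stdio.h>

static uint8_t even_parity(uint8_t byte)
{
    uint8_t p = 0;
    while (byte) {
        p ^= byte & 1;   /* XOR in each data bit */
        byte >>= 1;
    }
    return p;            /* 1 if the byte has an odd number of 1s */
}

int main(void)
{
    uint8_t stored = 0x5A;                 /* 0101 1010: four 1 bits */
    uint8_t parity = even_parity(stored);  /* 0, count is already even */

    uint8_t corrupted = stored ^ 0x08;     /* flip a single bit */
    printf("error detected: %s\n",
           even_parity(corrupted) != parity ? "yes" : "no");
    return 0;
}

A single parity bit can only detect an odd number of flipped bits; error correction codes such as Hamming codes extend the idea with several check bits so that a single-bit error can also be located and corrected.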

As both SRAM and DRAM are volatile, other forms of computer storage, such as disks and magnetic tapes, have been used as persistent storage in traditional computers. Many newer products instead rely on flash memory to maintain data when not in use, such as PDAs or small music players. Certain personal computers, such as many rugged computers and netbooks, have also replaced magnetic disks with flash drives. With flash memory, only the NOR type is capable of true random access, allowing direct code execution, and is therefore often used instead of ROM; the lower cost NAND type is commonly used for bulk storage in memory cards and solid-state drives.

Similar to a microprocessor, a memory chip is an integrated circuit (IC) made of millions of transistors and capacitors. In the most common form of computer memory, dynamic random access memory (DRAM), a transistor and a capacitor are paired to create a memory cell, which represents a single bit of data. The capacitor holds the bit of information—a 0 or a 1. The transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state.
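The read/write mechanism described above can be sketched as a toy model in C. Everything here is illustrative: the leak rate, the refresh interval, and the 0.5 sense threshold are invented numbers rather than real DRAM parameters, but the structure mirrors the one-transistor, one-capacitor cell: the bit lives as charge, the charge leaks away, and a periodic refresh reads the bit back and rewrites it before it fades.

/* A toy model (not real hardware) of a DRAM cell: the capacitor's
 * charge stores the bit, reads compare against a threshold, and a
 * periodic refresh rewrites the charge before leakage destroys it. */
#include <stdbool.h>
#include <stdio.h>

struct dram_cell {
    double charge;                 /* fraction of full charge, 0.0..1.0 */
};

static void cell_write(struct dram_cell *c, bool bit)
{
    c->charge = bit ? 1.0 : 0.0;   /* charge or drain the capacitor */
}

static bool cell_read(struct dram_cell *c)
{
    return c->charge > 0.5;        /* sense against a midpoint threshold */
}

static void cell_leak(struct dram_cell *c)
{
    c->charge *= 0.8;              /* arbitrary leak rate per tick */
}

static void cell_refresh(struct dram_cell *c)
{
    cell_write(c, cell_read(c));   /* read the bit, write it back */
}

int main(void)
{
    struct dram_cell cell;
    cell_write(&cell, true);

    /* refreshing every other tick keeps the charge above the threshold;
     * remove the refresh and the stored 1 eventually reads back as 0 */
    for (int t = 1; t <= 10; t++) {
        cell_leak(&cell);
        if (t % 2 == 0)
            cell_refresh(&cell);
    }
    printf("bit after 10 ticks: %d\n", cell_read(&cell));
    return 0;
}

This also shows why DRAM is called dynamic, and why it is volatile: without the refresh loop (and without power to run it), the stored charge, and the bit with it, simply drains away.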

Source : http://en.wikipedia.org/wiki/RAM

Wednesday, October 28, 2009

Netbook

Netbooks (also called mini notebooks or subnotebooks) are a rapidly evolving category of small, light and inexpensive laptop computers suited for general computing and accessing web-based applications; they are often marketed as "companion devices," that is, to augment a user's other computer access. Walt Mossberg called them a "relatively new category of small, light, minimalist and cheap laptops." By August 2009, CNET called netbooks "nothing more than smaller, cheaper notebooks."

At their inception in late 2007, as smaller notebooks optimized for low weight and low cost, netbooks omitted key features (e.g., the optical drive), featured smaller screens and keyboards, and offered reduced specifications and computing power. Over the course of their evolution, netbooks have ranged in size from below 5" to over 13", and weigh around 1 kg (2-3 pounds). Often significantly less expensive than other laptops, by mid-2009 netbooks had even been offered to users "free of charge", with the purchase of an extended service contract.

In the short period since their appearance, netbooks have grown in size and features, now converging with new smaller, lighter notebooks. By mid-2009, CNET noted that "the specs are so similar that the average shopper would likely be confused as to why one is better than the other," adding that "the only conclusion is that there really is no distinction between the devices."

Source : http://en.wikipedia.org/wiki/Netbook

Monitor

A monitor or display (sometimes called a visual display unit) is a piece of electrical equipment which displays images generated by devices such as computers, without producing a permanent record. The monitor comprises the display device, circuitry, and an enclosure. The display device in modern monitors is typically a thin film transistor liquid crystal display (TFT-LCD), while older monitors use a cathode ray tube (CRT).

Source : http://en.wikipedia.org/wiki/Computer_monitor

Sunday, February 1, 2009

Mouse

In computing, a mouse (plural mouses, mice, or mouse devices) is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. Physically, a mouse consists of an object held under one of the user's hands, with one or more buttons. It sometimes features other elements, such as "wheels", which allow the user to perform various system-dependent operations, or extra buttons or features that add more control or dimensional input. The mouse's motion typically translates into the motion of a pointer on a display, which allows for fine control of a graphical user interface.
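How that translation works can be sketched in a few lines of C. This is a minimal illustration, not any real operating system's input API: the screen size and the sample motion reports are made up, but the core idea holds: the mouse sends relative (dx, dy) deltas, and the system accumulates them into an absolute pointer position clamped to the display.

/* A minimal sketch of relative mouse motion becoming an on-screen
 * pointer position: each report carries a (dx, dy) delta, which is
 * accumulated and clamped to the display bounds. */
#include <stdio.h>

#define SCREEN_W 1920   /* hypothetical display size */
#define SCREEN_H 1080

struct pointer { int x, y; };

static void apply_motion(struct pointer *p, int dx, int dy)
{
    p->x += dx;
    p->y += dy;

    /* keep the pointer inside the display */
    if (p->x < 0) p->x = 0;
    if (p->x >= SCREEN_W) p->x = SCREEN_W - 1;
    if (p->y < 0) p->y = 0;
    if (p->y >= SCREEN_H) p->y = SCREEN_H - 1;
}

int main(void)
{
    struct pointer p = { SCREEN_W / 2, SCREEN_H / 2 };
    int reports[3][2] = { {15, -4}, {-30, 12}, {200, 0} }; /* sample deltas */

    for (int i = 0; i < 3; i++) {
        apply_motion(&p, reports[i][0], reports[i][1]);
        printf("pointer at (%d, %d)\n", p.x, p.y);
    }
    return 0;
}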

The name mouse, which originated at the Stanford Research Institute, derives from the resemblance of early models (which had a cord attached to the rear part of the device, suggesting the idea of a tail) to the common mouse.

The first marketed integrated mouse – shipped as a part of a computer and intended for personal computer navigation – came with the Xerox 8010 Star Information System in 1981.

However, it was not until the appearance of the Apple Macintosh that the mass market really became aware of the device's existence: back in 1984, a prominent PC columnist commented on the release of this new computer with a mouse: “There is no evidence that people want to use these things.”

Mice now come with most computers, although they can also be bought separately.

http://en.wikipedia.org/wiki/Computer_mouse

Keyboard

After punch cards and paper tape, teletype-style keyboards became the main input device for computers. During the 1980s and 1990s almost all computers came equipped with them as the main form of interaction, and most users are familiar with using them.

There are several different keyboard technologies.

The most popular layout of keys on the modern English-language keyboard is the QWERTY design, based on the most popular typewriter keyboard layout. It was further extended into the standard 104-key PC keyboard layout with the addition of cursor keys, a calculator-style numeric keypad, two groups of special function keys, a key for the Windows Start menu (on IBM PCs and clones), and other modifier keys. Some computer manufacturers have added keys specifically related to the Internet and e-mail, but these have not become standard.

The fastest typists (as of 2007) use a stenograph, a kind of chorded keyboard used by most court reporters and closed caption reporters.

Despite the development of alternative input devices such as the mouse, touch sensitive screens, pen devices, character recognition, voice recognition, and improvements in computer speed and memory size, the keyboard remains the most commonly used and most versatile device used for direct human input into computers.

http://en.wikipedia.org/wiki/Alphanumeric_keyboard

CPU

A central processing unit (CPU) is a machine that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term "CPU" ever came into widespread usage. The term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961). The form, design and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.

Early CPUs were custom-designed as a part of a larger, sometimes one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are suited for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones to children's toys.

http://en.wikipedia.org/wiki/Central_processing_unit

The History

A computer is a machine that manipulates data according to a list of instructions.

The first devices that resemble modern computers date to the mid-20th century (1940–1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space. Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices that are used to control other devices; for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and children's toys.

The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.
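The stored-program idea can be made concrete with a toy sketch in C. The four-instruction machine below is entirely invented and far simpler than any real CPU, but it shows the property the paragraph describes: the program is just a list of instructions held in memory, and the same fetch-execute loop will run whatever program you store there.

/* A toy illustration of the stored-program concept: the "program" is
 * data in an array, and a loop fetches and executes each instruction
 * in turn. The four-opcode instruction set is invented for this sketch. */
#include <stdio.h>

enum op { LOAD, ADD, PRINT, HALT };

struct instr { enum op op; int arg; };

int main(void)
{
    /* a program stored in memory like any other data */
    struct instr program[] = {
        { LOAD, 2 },     /* acc = 2   */
        { ADD, 40 },     /* acc += 40 */
        { PRINT, 0 },    /* print acc */
        { HALT, 0 },
    };

    int acc = 0;
    for (int pc = 0; ; pc++) {         /* the fetch-execute loop */
        struct instr i = program[pc];
        if (i.op == LOAD)  acc = i.arg;
        if (i.op == ADD)   acc += i.arg;
        if (i.op == PRINT) printf("%d\n", acc);
        if (i.op == HALT)  break;
    }
    return 0;
}

Swapping in a different program array changes what the machine does without changing the machine itself; that separation of fixed hardware from stored instructions is what distinguishes a computer from a fixed-function calculator.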