Any device or mechanism created by man is built on certain patterns of operation that distinguish it through its features of application and functionality. The need to satisfy immediate needs is the main incentive for developing new types of machines, technologies, and so on. Such an opportunity is provided by the accumulation of knowledge in many fields of science and technology, the application of which first allows us to create the logical prerequisites of new fields of technology, for example, the logical foundations of the computer, and then to implement them in new types of equipment. In plain language this is called "technical progress".
The impetus for the emergence of the computer came from two motives: the need to process large volumes of information, and achievements in various fields of science and technology (electricity, mathematics, the physics and technology of semiconductors, metallurgy, and many others). The first samples of electronic computers confirmed the principles of computer operation and opened the era of rapid development of a new class of technical objects called "electronic computers".
To implement the technical idea of computing devices, the logical foundations of the computer were formulated using the algebra of logic, which determined a set of functions and a theoretical basis. The laws of the algebra of logic that define the logical foundations of the computer were formulated in the 19th century by the English mathematician George Boole. In fact, this is the theoretical basis of digital information processing systems. Its essence lies in the rules of logical relationships between values: conjunction, disjunction, and others, which are very similar to the well-known basic relationships between numbers in arithmetic, such as multiplication and addition. Numbers in Boolean algebra have a binary representation, i.e. they are written using only 1 and 0. Operations on them are described by the additional symbols of the algebra of logic. These elements of mathematics make it possible, by combining the simplest logical laws, to describe any computational task or control action with special symbols, that is, to "write a program". Using an input device, this program is "loaded" into the computer and serves as an "order" that the computer must carry out.
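The analogy between Boole's operations and ordinary arithmetic can be seen in a short sketch. The function names below are illustrative, not part of any standard library; the code simply tabulates conjunction and disjunction over the binary values 0 and 1, and shows that conjunction behaves exactly like multiplication.

```python
# Boole's basic operations on the binary digits 0 and 1.

def conjunction(a: int, b: int) -> int:
    return a & b      # logical AND

def disjunction(a: int, b: int) -> int:
    return a | b      # logical OR

def negation(a: int) -> int:
    return 1 - a      # logical NOT

# Print the truth table; note that conjunction(a, b) == a * b,
# the arithmetic parallel mentioned in the text.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", conjunction(a, b), "OR:", disjunction(a, b))
```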
The input device converts incoming characters into electrical signals in the form of binary code, while the operations on them, the transfers and transformations that realize arithmetic and logical actions, are performed by electronic devices called gates, adders, flip-flops (triggers), and so on. They make up the technical "stuffing" of the computer, where their number reaches tens of thousands of elements.
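To see how gates combine into an adder, here is a minimal sketch: a half adder produces a sum bit (an XOR gate) and a carry bit (an AND gate), two half adders plus an OR gate form a full adder, and chaining full adders adds multi-bit numbers the way hardware does. The function names and the 4-bit width are illustrative choices, not from the original text.

```python
# Building an adder from elementary gates.

def half_adder(a: int, b: int):
    return a ^ b, a & b           # (sum bit via XOR, carry bit via AND)

def full_adder(a: int, b: int, carry_in: int):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2            # (sum, carry out via OR)

def add_4bit(x: int, y: int) -> int:
    """Add two 4-bit numbers by rippling the carry through full adders."""
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_4bit(5, 6))  # → 11
```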
The design of a computer contains four main units: the CU, or control unit; RAM and ROM, the operational (random-access) and permanent (read-only) memory; the ALU, or arithmetic logic unit; and the input device. Of course, each of them follows the logical foundations of the computer. The workflow of the computer consists of loading into RAM or ROM a working program, written in special codes and stored on punched cards, magnetic tapes, magnetic and optical disks, or other storage media. This program directs the CU to manipulate streams of current, or working, information and obtain a programmed result, for example, displaying an image on the monitor or converting an audio signal to digital form. To this end, the CU performs a series of transfers of information blocks between all the devices in the computer.
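The workflow described above can be sketched as a toy simulation: a program is loaded into "RAM" as coded instructions, and a control-unit loop fetches each instruction in turn and delegates the arithmetic to the "ALU". The three-instruction set (LOAD/ADD/HALT) is purely illustrative and not taken from any real machine.

```python
# A toy fetch-decode-execute loop.

def run(program):
    ram = list(program)   # the program loaded into "RAM"
    accumulator = 0       # a single working register
    pc = 0                # program counter kept by the control unit
    while True:
        opcode, operand = ram[pc]     # fetch the next instruction
        pc += 1
        if opcode == "LOAD":          # decode and execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand    # work delegated to the "ALU"
        elif opcode == "HALT":
            return accumulator

result = run([("LOAD", 2), ("ADD", 3), ("HALT", None)])
print(result)  # → 5
```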
The main "think tank" of the computer is the ALU, the executor of all arithmetic and logical operations. Currently, the ALU's function is performed by a device called a processor, or microprocessor: a semiconductor device the size of a couple of matchboxes with an incredible number of functions. Gradually, functions for controlling external devices (monitors, printers, etc.) were added to the microprocessor. The latest developments in this field have made it possible to create microprocessors containing a full set of functional computer devices, thanks to which single-chip computers of pocket format with full-fledged computing capabilities have appeared. Surprisingly, the logical foundations of computers, developed in their time for the first computing devices, have not changed to this day.
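The ALU's role as the executor of arithmetic and logical operations can be summarized in one small sketch: a function that selects an operation by code and applies it to two operands. The operation codes here are illustrative, not drawn from any real instruction set.

```python
# A minimal ALU: select an arithmetic or logical operation by code.

def alu(op: str, a: int, b: int) -> int:
    operations = {
        "ADD": lambda: a + b,   # arithmetic
        "SUB": lambda: a - b,
        "AND": lambda: a & b,   # logical
        "OR":  lambda: a | b,
    }
    return operations[op]()

print(alu("ADD", 6, 7))  # → 13
print(alu("AND", 6, 3))  # → 2
```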