Answer:
A. Bit - A bit (binary digit) is the smallest unit of data on a computer. A bit can store only one of two values, 0 or 1, which correspond to the electrical states off and on.
B. Byte - The byte is a unit of digital data that usually consists of eight bits. The byte is the smallest addressable unit of memory in many computer architectures, since it was historically the number of bits used to encode a single character of text.
C. Kilobyte - The kilobyte is a multiple of the unit byte for digital data. In the International System of Units, the prefix kilo means 1,000, so one kilobyte is 1,000 bytes.
D. Megabyte - The megabyte is a multiple of the unit byte for digital data, with MB as the recommended unit symbol. In the International System of Units, the prefix mega is a multiplier of 1,000,000, so one megabyte is 1,000,000 bytes.
E. Gigabyte - A unit of data storage equal to about 1 billion (10^9) bytes in the decimal definition, pronounced with two hard Gs. In the binary sense it is instead two to the 30th power, or 1,073,741,824 bytes. Giga comes from a Greek word meaning "giant."
F. Terabyte - A terabyte is 10^12 bytes, or one trillion bytes. The terabyte (abbreviated "TB") equals 1,000 gigabytes and precedes the petabyte. Terabytes are now used to describe hard drives with a capacity of 1,000 GB or more (the decimal and binary multiples are compared in the short code sketch after this list).
G. Hertz - Hertz is the unit of frequency. The number of hertz (abbreviated Hz) is the number of cycles per second. Hertz can be used to express the frequency of any phenomenon with regular periodic fluctuations, but it is most often associated with alternating electric currents, electromagnetic waves (light, radar, etc.), and sound.
H. Kilohertz - One kilohertz (kHz) equals 1,000 hertz. Kilohertz is commonly used to measure the frequencies of sound waves, since the audible spectrum of sound frequencies is between 20 Hz and 20 kHz.
I. Megahertz - One megahertz (abbreviated MHz) is equal to 1,000 kilohertz, or 1,000,000 hertz. It can also be described as one million cycles per second. Megahertz is used to measure wave frequencies, as well as the speed of microprocessors.
J. Gigahertz - One gigahertz equals 1,000 megahertz (MHz), or 1,000,000,000 Hz. It is a common way of measuring how fast a machine can process information. For a long time, computer processor speeds were measured in megahertz, but after personal computers passed the 1,000 megahertz mark around the year 2000, gigahertz became the standard measurement unit.
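
Here is a minimal Python sketch (not part of the original definitions) that prints the standard decimal (SI) byte multiples described above next to the binary powers they are often confused with, plus the hertz multiples; all values are standard unit definitions, nothing is assumed about any particular system.

```python
# Minimal sketch: SI (decimal) byte multiples vs. the binary powers often
# quoted under the same names, plus the decimal hertz multiples.

si_bytes = {
    "kilobyte (KB)": 10**3,   # 1,000 bytes
    "megabyte (MB)": 10**6,   # 1,000,000 bytes
    "gigabyte (GB)": 10**9,   # 1,000,000,000 bytes
    "terabyte (TB)": 10**12,  # 1,000,000,000,000 bytes
}

binary_bytes = {
    "kibibyte (KiB)": 2**10,  # 1,024 bytes
    "mebibyte (MiB)": 2**20,  # 1,048,576 bytes
    "gibibyte (GiB)": 2**30,  # 1,073,741,824 bytes (the 2^30 figure above)
    "tebibyte (TiB)": 2**40,  # 1,099,511,627,776 bytes
}

# Print each decimal multiple beside its binary counterpart.
for (si_name, si_val), (bin_name, bin_val) in zip(si_bytes.items(), binary_bytes.items()):
    print(f"1 {si_name:<15} = {si_val:>17,} bytes   "
          f"1 {bin_name:<15} = {bin_val:>17,} bytes")

# Frequency units scale by the same decimal prefixes (cycles per second).
print(f"1 kHz = {10**3:,} Hz   1 MHz = {10**6:,} Hz   1 GHz = {10**9:,} Hz")
```

Running it shows, for example, that a decimal gigabyte (10^9 bytes) is slightly smaller than the binary figure of 1,073,741,824 bytes mentioned in the gigabyte entry.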
Explanation:
Hope this helps :))