In the year 1826, Georg Simon Ohm, a teacher of physics at Cologne, published a book containing details of some experiments he had made to investigate the relationship between the current passing through a wire and the potential difference between the ends of the wire. As a result of these experiments, he arrived at the following law. The current passing through a wire at constant temperature is proportional to the potential difference between its ends. A conductor for which this relationship is true is said to obey Ohm's law. This law may also be expressed as

V / I = constant (at constant temperature)
For a given potential difference, a high resistance will pass a small current and a low resistance a large current. Thus, the value of the constant in the above equation, which is high when the current is small and low when the current is large, can be used as a measure of the resistance of the wire. We may therefore write

R = V / I

In other words, the resistance of a conductor is the ratio of the potential difference across it to the current flowing through it. It must be appreciated that this relationship
defines the resistance of a conductor and applies whether Ohm's law is obeyed or not. We have already chosen the units of potential difference and current, namely, the volt and the ampere. The above definition enables us to define the unit of electric resistance. This is called the ohm and is defined as follows: The ohm (Ω) is the resistance of a conductor such that, when a potential difference of 1 volt is applied to its ends, a current of 1 ampere flows through it.
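The distinction drawn above, that R = V/I defines resistance while Ohm's law is the separate claim that this ratio stays constant, can be illustrated with a short sketch. The function name and the sample figures below are illustrative choices, not taken from the text:

```python
def is_ohmic(measurements, tolerance=1e-6):
    """Return True if the ratio V/I is the same for every
    (voltage, current) pair, i.e. the conductor obeys Ohm's law."""
    ratios = [v / i for v, i in measurements]
    return max(ratios) - min(ratios) <= tolerance

# A wire at constant temperature: doubling V doubles I, so V/I is fixed.
wire = [(2.0, 0.5), (4.0, 1.0), (6.0, 1.5)]   # V/I = 4 ohms throughout
# A conductor whose ratio changes (4 ohms, then 5 ohms) still has a
# well-defined resistance at each point, but does not obey Ohm's law.
filament = [(2.0, 0.5), (4.0, 0.8)]
print(is_ohmic(wire))      # True
print(is_ohmic(filament))  # False
```

Note that R = V/I can still be evaluated for the non-ohmic conductor at any single measurement; it simply gives a different value each time.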
The relationship may be written in three equivalent forms:

R = V / I,  I = V / R,  V = I R

Any one of these expressions may be derived algebraically from any other. The last one is usually the easiest to memorize.
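The point that the three forms are algebraic rearrangements of one relation can be checked numerically. This is a minimal sketch with hypothetical function names and sample values:

```python
def voltage(i, r):
    """V = I * R"""
    return i * r

def current(v, r):
    """I = V / R"""
    return v / r

def resistance(v, i):
    """R = V / I"""
    return v / i

# Start from one pair of values and confirm the three forms agree.
v, i = 12.0, 3.0
r = resistance(v, i)        # 4.0 ohms
print(voltage(i, r))        # recovers 12.0 volts
print(current(v, r))        # recovers 3.0 amperes
```

Each function simply rearranges the same equation, so computing any one quantity from the other two and substituting back reproduces the starting values.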