A person measures the depth of a well by measuring the time interval between dropping a stone and hearing the sound of its impact with the bottom of the well. The error in his measurement of time is $\delta \mathrm{T}=0.01$ seconds, and he measures the depth of the well to be $\mathrm{L}=20$ meters. Take the acceleration due to gravity $\mathrm{g}=10 \mathrm{~ms}^{-2}$ and the velocity of sound as $300 \mathrm{~ms}^{-1}$. Then the fractional error in the measurement, $\delta \mathrm{L} / \mathrm{L}$, is closest to
$0.2 \%$
$1 \%$
$3 \%$
$5 \%$
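A quick numerical check of this problem (a sketch, using only the stated values $g = 10$, $v = 300$, $L = 20$): the measured time is $T(L) = \sqrt{2L/g} + L/v$, so $\delta T = \delta L\,(1/\sqrt{2gL} + 1/v)$, which can be inverted to get $\delta L / L$.

```python
import math

g, v = 10.0, 300.0     # m/s^2, m/s (given)
L, dT = 20.0, 0.01     # m, s (given)

# T(L) = sqrt(2L/g) + L/v  =>  dT = dL * (1/sqrt(2*g*L) + 1/v)
dL = dT / (1.0 / math.sqrt(2 * g * L) + 1.0 / v)
frac = dL / L          # fractional error in L, ~0.94%, closest to 1%
```

This supports option 1% as the closest choice.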
The percentage error in the measurement of $g$ is $.....\%$ (Given that $g =\frac{4 \pi^2 L}{T^2}$, $L =(10 \pm 0.1)\,cm$, $T =(100 \pm 1)\,s$)
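The arithmetic here follows directly from the power rule: for $g = 4\pi^2 L/T^2$, the relative errors combine as $\delta g/g = \delta L/L + 2\,\delta T/T$. A minimal numeric sketch:

```python
L, dL = 10.0, 0.1    # cm (given)
T, dT = 100.0, 1.0   # s (given)

# g = 4*pi^2 * L / T^2  =>  dg/g = dL/L + 2*dT/T
dg_over_g = dL / L + 2 * dT / T
percent = 100 * dg_over_g   # 1% + 2% = 3%
```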
State the rule for the error produced in a result due to the addition or subtraction of quantities with errors.
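The rule in question can be illustrated numerically (a sketch with made-up values $A$ and $B$): for $Z = A \pm B$, the maximum absolute errors add, $\delta Z = \delta A + \delta B$, regardless of whether the quantities are added or subtracted.

```python
# Hypothetical measured quantities (illustrative values only)
A, dA = 5.0, 0.2
B, dB = 3.0, 0.1

# For Z = A + B or Z = A - B, maximum absolute errors add
dZ = dA + dB   # same bound, 0.3, for both the sum and the difference
```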
What is an error in measurement? What is a mistake in measurement?
In an experiment to measure the height of a bridge by dropping a stone into the water underneath, if the error in the measurement of time is $0.1\;s$ at the end of $2\;s$, then the error in the estimation of the height of the bridge will be
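A numeric sketch for this one, assuming $g = 10\;ms^{-2}$ (the problem does not state a value): from $h = \tfrac{1}{2}gt^2$, differentiating gives $\delta h = g\,t\,\delta t$.

```python
g = 10.0          # m/s^2, assumed value; not given in the problem
t, dt = 2.0, 0.1  # s (given)

h = 0.5 * g * t**2   # estimated height, 20 m
dh = g * t * dt      # error in the height estimate, 2 m
```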
Quantity $Z$ varies with $x$ and $y$ according to the equation $Z = x^2y - xy^2$, where $x = 3.0 \pm 0.1$ and $y = 2.0 \pm 0.1$. The value of $Z$ is
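A worked sketch using the usual maximum-error convention, in which the absolute errors of the two terms add even under subtraction. (A partial-derivative treatment that keeps $x$ and $y$ correlated between the terms would give the smaller bound $|2xy - y^2|\,\delta x + |x^2 - 2xy|\,\delta y = 1.1$.)

```python
x, dx = 3.0, 0.1
y, dy = 2.0, 0.1

Z = x**2 * y - x * y**2             # 18 - 12 = 6

# Term-wise maximum errors via relative-error rules
dA = x**2 * y * (2 * dx / x + dy / y)   # error of the x^2*y term: 2.1
dB = x * y**2 * (dx / x + 2 * dy / y)   # error of the x*y^2 term: 1.6
dZ = dA + dB                            # 3.7
```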