Relevant time durations for modern science range from the femtosecond of ultrafast electronics to the age of the universe, about 14 billion years, a span of roughly 32 orders of magnitude. Moreover, time can be controlled and measured with a fractional accuracy better than 10⁻¹⁴ by modern atomic clocks. By contrast, the accuracy obtainable with purely electronic circuits, such as integrated circuits, is only of the order of 10⁻³, because no combination of available electronic components (an RC time constant, for example) is more precise or more stable over time and temperature. A fractional error of 10⁻³ corresponds to an error of about 1.5 minutes per day, which is totally unacceptable for timekeeping applications. The same is true for modern telecommunications, which exploit the frequency spectrum up to 300 GHz.
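The figures quoted above follow from a simple conversion of fractional frequency error into accumulated time error per day. The short sketch below (illustrative only, not part of the original text) performs this arithmetic for the two accuracies mentioned:

```python
# Illustrative sketch: converting a fractional frequency error into the
# timing error accumulated over one day.

SECONDS_PER_DAY = 86_400

def drift_per_day(fractional_error: float) -> float:
    """Accumulated timing error (in seconds) over one day for a clock
    whose frequency is wrong by the given fractional amount."""
    return fractional_error * SECONDS_PER_DAY

# Purely electronic circuit, fractional accuracy ~1e-3:
print(f"Electronic circuit: {drift_per_day(1e-3):.1f} s/day (about 1.5 min)")

# Modern atomic clock, fractional accuracy ~1e-14:
print(f"Atomic clock:       {drift_per_day(1e-14) * 1e9:.2f} ns/day")
```

For the electronic circuit this gives 86.4 s, i.e. roughly 1.5 minutes per day, while the atomic clock accumulates less than a nanosecond per day.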