Embedded LINUX in a Soft Real-Time Task:
The Canadian Geological Survey Internet Seismometer
© September 2001
Seismology as a science has traditionally focused on the questions of how earthquakes occur, what their sources are, and whether there is any way to predict where and when an earthquake will strike, much as it is now possible to forecast the weather over the next days, weeks or even months.
While our understanding of the sources of earthquakes and of the dynamic processes in the earth's crust has taken large strides over the last century, advances in earthquake prediction, in the sense of forecasting their onset, have been rather disappointing.
On the other hand, techniques to assess the risk that a particular structure, be it a house, a hydro dam or a bridge, may be damaged during an earthquake provide a valuable basis for preparing for the aftermath of a large earthquake. Insurance companies use such risk assessments to calculate their earthquake insurance premiums, and engineers need the same kind of information to design buildings which can withstand the shaking of the largest earthquake likely to occur at a particular location.
In simple terms, earthquake hazard and risk assessment tells you the chances that you will be struck by an earthquake of a certain magnitude, and what the worst-case scenario you should prepare for may look like.
Earthquake hazard assessments are largely based on a good knowledge of the sources and defining tectonic structures in a region, as well as on statistical analysis of historical data. The general idea is that the likelihood of a particular place being affected by an earthquake can be estimated from the sequence of quakes that have affected that place in the past. Since our instrumental record of past earthquakes is rather short in comparison to the time scales over which tectonic features evolve, statistical assessment alone is often difficult and unreliable.
Additionally, scientists and engineers have more recently begun to realize that hazard assessment focused only on the earthquake sources does not tell the whole story. The actual earthquake damage sustained by structures in a larger city, for example, shows a rather complex pattern, and the effects of earthquakes of even moderate magnitudes can vary widely over the distance of a few city blocks (7).
Traditional earthquake hazard maps do not have that kind of resolution since local effects which can actually amplify (or attenuate) ground motion triggered by an earthquake, could only be accounted for in a limited way.
These local effects are in general related to the geological structure of the immediate subsurface and the topography of an affected area. While geo-technical models exist which can explain amplification and attenuation of seismic waves depending on the local subsurface, it was soon recognized that those effects should actually be directly measured.
A good (and rare) example of a highly detailed earthquake hazard map for the greater Victoria area (British Columbia, Canada) can be found under
Amplification effects appear to depend non-linearly on the amplitude of the original excitation, and actual measurements are very much needed to fine-tune prediction models.
A further complication, and a directly related problem, is the prediction of the actual damage the local shaking may cause to a particular building (5), since this "is a complex function of amplitude, frequency, and duration, and varies with the structure or component being considered" (9).
While a world-wide network of only a few hundred, highly sensitive instruments is sufficient to detect, record and localize almost any significant earthquake on earth, it is quite obvious that in order to study the highly localized effects of an earthquake on densely populated areas, a very dense network of instruments would be required.
Additionally, those instruments would, rather than being extremely sensitive, have to be able to record very strong and violent ground motion and survive an actual earthquake themselves.
Instruments of this kind are called strong motion seismometers, and a variety of them are commercially available. For a dense station network of several hundred instruments, however, current strong motion seismometers are not ideally suited: they lack built-in communication capabilities which would allow data to be retrieved remotely.
How important these communication capabilities are becomes apparent if these instruments are also to be used as real-time sensors in an actual major earthquake, in order to provide a basis for reconnaissance by the emergency response teams. In responding to a major earthquake disaster, the limited resources of fire fighters, ambulances and other disaster relief crews have to be prioritized.
A dense network of strong motion seismometers in an urban area could provide the data for the generation of a shake map which could, within minutes after the event, direct disaster response teams to the most affected areas within a city and save extremely valuable time, which would otherwise be needed to first assess the situation. The Taiwan Central Weather Bureau's Seismic Network (11) (10) and the Southern California TriNet (3) have successfully implemented the generation of shake maps based on strong-motion instrument networks. Japan has operated the Kyoshin strong motion network (6), with 1,000 seismic stations, since 1996.
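The core of shake-map generation is interpolating sparse station readings onto a map grid. The following is a minimal, purely illustrative sketch using inverse-distance weighting of peak ground acceleration (PGA) readings; the station coordinates and values are invented, and operational systems such as TriNet's ShakeMap additionally fold in attenuation relations and site-amplification corrections.

```python
# Toy shake-map gridding: interpolate peak ground acceleration (PGA)
# readings from a sparse station network onto a regular grid using
# inverse-distance weighting (IDW). All station data are hypothetical.

import math

# Hypothetical stations: (x_km, y_km, pga_g), PGA in units of g
STATIONS = [
    (0.0, 0.0, 0.35),
    (10.0, 0.0, 0.12),
    (0.0, 10.0, 0.20),
    (10.0, 10.0, 0.05),
]

def idw_pga(x, y, stations, power=2.0):
    """Inverse-distance-weighted PGA estimate at grid point (x, y)."""
    num = den = 0.0
    for sx, sy, pga in stations:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:              # grid point coincides with a station
            return pga
        w = 1.0 / d ** power
        num += w * pga
        den += w
    return num / den

def shake_map(stations, nx=5, ny=5, size=10.0):
    """Return an ny-by-nx grid of interpolated PGA values."""
    return [[idw_pga(i * size / (nx - 1), j * size / (ny - 1), stations)
             for i in range(nx)] for j in range(ny)]

grid = shake_map(STATIONS)
# The strongest shaking appears at the station reporting 0.35 g.
print(round(grid[0][0], 2))  # → 0.35
```

Because IDW is a weighted average, every interpolated value stays bounded by the strongest and weakest station readings; this keeps the toy map stable but, unlike a real shake map, cannot extrapolate shaking beyond what any station observed.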
With a view to its application as a system for instant damage estimation, it is evident that a seismic station within such a network has to be completely self-contained, robust, and able to establish data communication with a central facility in (almost) real time.
Practical considerations also mandate that each individual instrument can be monitored and serviced remotely. If up to 300 instruments are deployed in a city like Vancouver alone, it is practically impossible to attend to each individual instrument on site in order to change acquisition parameters or to download data from a seismic event.