2
Earthquake Simulators Development and Application
Rodolfo CONSOLE¹ and Roberto CARLUCCIO²
¹ CGIAM, Potenza, Italy
² INGV, Rome, Italy
In the last 20 years, thanks to the steady increase in computing power, many earthquake simulators have been developed that are capable of generating synthetic earthquake catalogs spanning up to hundreds of thousands of years. Physics-based simulators are useful supports for improving the overall testing procedures of earthquake forecasting. This chapter deals with the concepts on which earthquake simulators have been built and gives examples of applications of such simulators to real fault systems, comparing their output with observed seismicity. Section 2.2 reviews the most popular algorithms applied in earthquake simulators, as described in the available seismological literature. Earthquake simulators differ in the methodology they implement and in the geometry of the patches used to define the earthquake sources. Some simulators are essentially based on fitting the Gutenberg–Richter distribution, while others incorporate the stress interaction between faults, add rate- and state-dependent constitutive properties for the sliding strength of faults, and adopt more sophisticated loading conditions. Section 2.3 is devoted to the evolution and a detailed description of a particular simulator, developed with the aim of producing simulated earthquake catalogs that resemble real ones as closely as possible in terms of the size distribution of the events and their space-time patterns. Finally, section 2.4 gives an example of the application of this simulator to the Nankai megathrust fault system, with particular emphasis on the study of the stress evolution on the fault surface during numerous repeating earthquake cycles.
2.1. Introduction
Earthquake simulators were initially based on the elastic rebound theory of earthquakes, introduced by Reid (1910) in his study of the 1906 San Francisco earthquake. Starting from the pioneering work of Burridge and Knopoff (1967), the earthquake process has been simulated by a slider-block model in which each spatially coarse-grained site on a fault is represented by a block sliding on a frictional surface. Physically, the blocks are intended to represent the sticking points, or asperities, on the fault surface (Rundle et al. 2002). A slider-block model becomes more and more detailed as the number of blocks is increased (Figure 2.1). While the original slider-block model (Burridge and Knopoff 1967) specified massive blocks with inertia, more recent models are commonly of the stochastic cellular automaton type (Rundle and Jackson 1977; Rundle and Brown 1991). In the last 20 years, the increase in computing power has allowed the development of physics-based earthquake simulators built on very complex and realistic fault models containing thousands of blocks (also named "patches" or "cells" by different authors). These simulators can generate synthetic earthquake catalogs containing hundreds of thousands of events, spanning hundreds of thousands of years and covering a wide magnitude range. Earthquake simulators differ in the methodology they implement and in the geometry of the patches used to define the topology of a complex fault.
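To make the cellular-automaton idea concrete, the following minimal sketch simulates a one-dimensional array of fault patches with uniform tectonic loading, a failure threshold, and nearest-neighbor stress transfer. It is purely illustrative: it is not the Burridge–Knopoff model itself nor any published simulator, and the cell count, threshold, and transfer fraction are arbitrary assumptions.

```python
import numpy as np

# Illustrative cellular-automaton slider-block sketch (assumptions only, not a
# published simulator). Each cell carries a stress value; uniform loading
# raises stress until the most loaded cell reaches its strength and fails,
# dropping its stress to zero and passing a fraction of the drop to its
# neighbors. A cascade of failures is counted as one synthetic event.

rng = np.random.default_rng(0)

n_cells = 200        # number of fault patches ("blocks" or "cells"), assumed
strength = 1.0       # failure threshold, arbitrary units
transfer = 0.2       # fraction of dropped stress passed to each neighbor
n_events = 10_000    # number of synthetic events to generate

stress = rng.uniform(0.0, strength, n_cells)
event_sizes = []

for _ in range(n_events):
    # Tectonic loading: advance all cells until the most loaded one fails.
    stress += strength - stress.max()
    n_failed = 0
    unstable = np.flatnonzero(stress >= strength)
    while unstable.size > 0:
        for i in unstable:
            drop = stress[i]
            stress[i] = 0.0
            if i > 0:
                stress[i - 1] += transfer * drop   # load left neighbor
            if i < n_cells - 1:
                stress[i + 1] += transfer * drop   # load right neighbor
        n_failed += unstable.size
        unstable = np.flatnonzero(stress >= strength)
    event_sizes.append(n_failed)

# The cascade-size distribution is a crude proxy for the event-size
# statistics of a synthetic catalog.
print("largest event:", max(event_sizes), "failed cells")
```

Because only a fraction of the dropped stress (here 2 × 0.2 = 0.4) is passed on, the sketch is dissipative rather than conservative; that choice, like all the numbers above, is an assumption made for illustration.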
Of course, even as simulators have become more and more complex, they remain models that approximate, with many limitations, the infinitely more complex physical reality of the earthquake process. As a result, not all conclusions based on this kind of model can be tested in detail against actual earthquakes. Nevertheless, models can be useful in developing hypotheses to explain earthquake observations, such as well-known statistical relationships like the magnitude–frequency distribution, temporal relationships like the Omori law, and some properties of earthquake clustering. In this respect, there is general agreement on the usefulness of physics-based earthquake simulators for improving the overall testing procedures of earthquake forecasting (for example, Schultz et al. 2015; Christophersen et al. 2017; Field 2019).
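As one concrete illustration of how a synthetic catalog can be confronted with such a statistical relationship, the short sketch below estimates the Gutenberg–Richter b-value with the standard maximum-likelihood formula, b = log10(e) / (mean(M) − Mc), for magnitudes above a completeness level Mc. The catalog, the completeness magnitude, and the parameter values are invented for the example and are not taken from this chapter.

```python
import numpy as np

# Sketch: checking a catalog against the Gutenberg-Richter relation by
# estimating the b-value with the maximum-likelihood formula
# b = log10(e) / (mean(M) - Mc), applied to magnitudes M >= Mc.
# The catalog below is synthetic and purely illustrative.

rng = np.random.default_rng(1)

mc = 2.0        # assumed completeness magnitude
b_true = 1.0    # b-value used to generate the test catalog

# Under Gutenberg-Richter, magnitudes above Mc are exponentially distributed
# with rate b * ln(10).
mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

b_hat = np.log10(np.e) / (mags[mags >= mc].mean() - mc)
print(f"estimated b-value: {b_hat:.2f} (true value {b_true:.2f})")
```

The same estimator could be applied, under the same assumptions about completeness, to the event sizes produced by a simulator and to an observed catalog, giving one simple quantitative point of comparison.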
2.2. Development of earthquake simulators