Since the dawn of the modern science of earthquakes, a clear split has emerged between theoretical and statistical seismology and tectonics. Classical seismology explains what happens during the few seconds of fault slip following rock breakdown, with the consequent radiation of seismic waves, while statistical seismology describes how seismicity is distributed in space and time. Even though they investigate the same subject, the differences between them are so deep that connecting them requires significant effort. Statistical seismology is mainly grounded on two fundamental laws, the Gutenberg–Richter law and the Omori–Utsu law, both characterized by scaling behavior. In our model (Zaccagnino et al., 2022), we suggest that the brittle crust abides by a simple principle: it optimizes the energy needed to reach more stable configurations. This paradigm, which governs almost all natural processes, can be used to connect coseismic dynamics with large-scale seismic processes and tectonics, so far treated separately despite a flurry of evidence suggesting the opposite (e.g., Schorlemmer et al., 2005; Leonard, 2010). Each perturbation within a fault interface has a probability of growing into an earthquake or not, depending on the disorder within the fault zone and the energy accumulated in the adjoining volumes, which mainly control the evolution of seismic sequences. Moreover, the model implies a relationship between fracturing regimes, the “efficiency” of the seismic process, the duration of seismic sequences, and the geodynamic setting, with an outstanding potential impact on seismic hazard. Our model also suggests that the parameter p, which describes how the number of earthquakes decreases after a major seismic event, is positively correlated with the exponent b of the frequency-size distribution of seismicity according to the formula p ≈ 0.6 + 0.65 b, which is compatible with observations.
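The two scaling laws named above, and the proposed p–b relation, can be sketched numerically. The following is a minimal illustration, not the authors' implementation; the constants a, K, and c are arbitrary illustrative values, not fitted parameters from the paper:

```python
import numpy as np

def gutenberg_richter_count(M, a=5.0, b=1.0):
    """Gutenberg-Richter law: expected number of events with magnitude >= M,
    log10 N(>=M) = a - b*M. Values of a and b are illustrative only."""
    return 10.0 ** (a - b * M)

def omori_utsu_rate(t, K=100.0, c=0.1, p=1.0):
    """Omori-Utsu law: aftershock rate t days after a mainshock,
    n(t) = K / (c + t)**p. Values of K and c are illustrative only."""
    return K / (c + t) ** p

def p_from_b(b):
    """Relation proposed in the abstract linking the Omori decay exponent p
    to the Gutenberg-Richter b-value: p ~ 0.6 + 0.65 * b."""
    return 0.6 + 0.65 * b

# A typical crustal b-value of ~1 predicts an Omori exponent of ~1.25,
# within the range commonly reported for aftershock sequences.
print(p_from_b(1.0))
```

Note that both laws are power-law (scale-free) forms, which is the sense in which the abstract says they are "characterized by scaling behavior": halving the magnitude threshold or doubling the elapsed time rescales the count or rate by a constant factor, independent of the absolute scale.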