Presentations by Matthew Martino

American Schools of Oriental Research, 2016
Typologies are fraught with subjectivity; one characteristic is considered relevant, while another may go unnoticed entirely, sometimes leading to more than one typology being created for the same set of data, with each researcher claiming that theirs better represents the data. The creation of a typology, however, is a creative process inspired by leaps of intuition. Ultimately, intuition and its inherent subjectivity ought to be explained by scientific reasoning.
In the interest of removing subjectivity from typological analyses, archaeologists sometimes use statistical methods, yet such methods were originally created for the analysis of quantitative data rather than the qualitative data that archaeologists usually use to characterize objects. Here we will present a typology for a database of almost 2000 Chalcolithic and Early Bronze Age figurines from southeastern Europe and Anatolia. This typology was created by a computer program designed specifically for the creation of typologies for archaeological artifacts. This method was not only time-saving given the large data set; it also helped to assuage concerns about subjectivity, given the known spatial and unknown chronological separation of the figurines. Because all characteristics, both formal and technical, were weighted equally, significant factors emerged from the analysis rather than being defined by the researcher.
In our presentation we will explain the necessity of a collaborative process between programmer and archaeologist. We will also show how our program can quantify some of the remaining subjectivity of a typological analysis. Lastly, we will explain how the results from our program can quantify diversity in the data.
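The abstract does not describe how the program works internally; purely as an illustration of the general idea of weighting all qualitative characteristics equally and then quantifying the diversity of the resulting types, here is a minimal Python sketch. The attribute names, the example data, the simple-matching dissimilarity, the average-linkage clustering and the Shannon index are all assumptions made for this sketch, not features taken from the program described above.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Hypothetical qualitative descriptions of a few figurines.
    # Each characteristic (posture, arm position, decoration, material)
    # is treated as an equally weighted categorical attribute.
    figurines = [
        ("standing", "folded",   "incised", "clay"),
        ("standing", "folded",   "painted", "clay"),
        ("seated",   "extended", "incised", "marble"),
        ("seated",   "extended", "plain",   "marble"),
        ("standing", "extended", "incised", "clay"),
    ]

    def simple_matching_dissimilarity(a, b):
        """Fraction of characteristics on which two objects disagree;
        every characteristic contributes with equal weight."""
        return sum(x != y for x, y in zip(a, b)) / len(a)

    n = len(figurines)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = simple_matching_dissimilarity(figurines[i], figurines[j])
            dist[i, j] = dist[j, i] = d

    # Average-linkage hierarchical clustering on the condensed distance matrix.
    tree = linkage(squareform(dist), method="average")
    types = fcluster(tree, t=0.5, criterion="distance")  # cut height chosen arbitrarily here
    print("type assignments:", types)

    # Shannon index as one possible way to quantify the diversity of the resulting types.
    counts = np.bincount(types)[1:]
    p = counts / counts.sum()
    shannon = -np.sum(p[p > 0] * np.log(p[p > 0]))
    print("Shannon diversity of types:", round(float(shannon), 3))

The cut height of 0.5 is arbitrary in this sketch; in practice one would choose it, or the number of types, from the structure of the dendrogram rather than fixing it in advance.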
Papers by Matthew Martino

Physical Review D Particles and Fields, 2009
Modifications to the gravitational potential affect the nonlinear gravitational evolution of large scale structures in the Universe. To illustrate some generic features of such changes, we study the evolution of spherically symmetric perturbations when the modification is of Yukawa type; this is nontrivial, because we should not and do not assume that Birkhoff's theorem applies. We then show how to estimate the abundance of virialized objects in such models. Comparison with numerical simulations shows reasonable agreement: When normalized to have the same fluctuations at early times, weaker large scale gravity produces fewer massive halos. However, the opposite can be true for models that are normalized to have the same linear theory power spectrum today, so the abundance of rich clusters potentially places interesting constraints on such models. Our analysis also indicates that the formation histories and abundances of sufficiently low mass objects are unchanged from standard gravity. This explains why simulations have found that the nonlinear power spectrum at large k is unaffected by such modifications to the gravitational potential. In addition, the most massive objects in CMB-normalized models with weaker gravity are expected to be similar to the high-redshift progenitors of the most massive objects in models with stronger gravity. Thus, the difference between the cluster and field galaxy populations is expected to be larger in models with stronger large scale gravity.
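The abstract does not quote the explicit form of the modification; as a point of reference, one commonly used Yukawa-type parameterization of the potential around a mass m, and the standard route from a collapse threshold to a halo mass function, look schematically as follows (the parameters \alpha and r_s and the mass-function notation are assumptions for this sketch, not taken from the paper):

\[
\Phi(r) \;=\; -\frac{G m}{r}\left[1 + \alpha\left(1 - e^{-r/r_s}\right)\right],
\]

so gravity is effectively Newtonian on scales r \ll r_s and is stronger (\alpha > 0) or weaker (\alpha < 0) by a factor (1+\alpha) on scales r \gg r_s. Evolving a spherically symmetric perturbation in such a potential (without assuming Birkhoff's theorem) yields a modified collapse threshold \delta_c and growth history, which can then be inserted into a standard excursion-set style mass function,

\[
\frac{dn}{dm}\,dm \;=\; \frac{\bar\rho}{m}\, f(\nu)\, d\nu,
\qquad
\nu \equiv \frac{\delta_c^2(z)}{\sigma^2(m)},
\]

so that, at fixed early-time normalization, weaker large scale gravity lowers \sigma(m) on cluster scales and hence the abundance of massive halos, consistent with the statement above.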
We study the formation of voids in a modified gravity model in which gravity is generically stronger or weaker on large scales. We show that void abundances provide complementary information to halo abundances: if normalized to the CMB, models with weaker large-scale gravity have smaller large scale power, fewer massive halos and fewer large voids, although the scalings are not completely degenerate with $\sigma_8$. Our results suggest that, in addition to their abundances, halo and void density profiles may also provide interesting constraints on such models: stronger large scale gravity produces more concentrated halos and thinner void walls. This potentially affects the scaling relations commonly assumed to translate cluster observables to halo masses, potentially making these, too, useful probes of gravity.
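As background, in the standard-gravity excursion set description halos and voids correspond to two different barriers for the same random walk in the linearly extrapolated overdensity \delta as a function of scale,

\[
S \equiv \sigma^2(R), \qquad \delta_c \simeq 1.686 \ \ (\text{collapse}), \qquad \delta_v < 0 \ \ (\text{shell crossing of a void}),
\]

with a void of radius R identified with walks that reach \delta_v before \delta_c. Since CMB normalization fixes the fluctuations at early times, weaker large-scale gravity lowers \sigma(R) today on large scales, so fewer walks reach either barrier on those scales: fewer massive halos and fewer large voids, as stated above. (The two-barrier picture and the threshold quoted here are the standard-gravity ones, given for orientation rather than as results of this paper.)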

Monthly Notices of the Royal Astronomical Society, 2007
Halo model interpretations of the luminosity dependence of galaxy clustering assume that there is a central galaxy in every sufficiently massive halo, and that this central galaxy is very different from all the others in the halo. The halo model decomposition makes the remarkable prediction that the mean luminosity of the non-central galaxies in a halo should be almost independent of halo mass: the predicted increase is about twenty percent while the halo mass increases by a factor of more than twenty. In contrast, the luminosity of the central object is predicted to increase approximately linearly with halo mass at low to intermediate masses, and logarithmically at high masses. We show that this weak, almost non-existent mass dependence of the satellites is in excellent agreement with the satellite population in group catalogs constructed by two different collaborations. This is remarkable, because the halo model prediction was made without ever identifying groups and clusters. The halo model also predicts that the number of satellites in a halo is drawn from a Poisson distribution with a mean that depends on halo mass. This, combined with the weak dependence of satellite luminosity on halo mass, suggests that the Scott effect, in which the luminosities of very bright galaxies are merely the statistically extreme values of a general luminosity distribution, may better apply to the most luminous satellite galaxy in a halo than to BCGs. If galaxies are identified with dark halo substructure at the present time, then central galaxies should be about 4 times more massive than satellite galaxies of the same luminosity, whereas the differences between the stellar mass-to-light ratios should be smaller. Therefore, a comparison of the weak lensing signal from central and satellite galaxies of the same luminosity should provide useful constraints on these models. We also show how the halo model may be used to constrain the stellar mass associated with intracluster light: the mass fraction in the ICL is expected to increase with increasing halo mass.
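For orientation, the bookkeeping behind this decomposition can be written schematically as follows (this is the generic halo-model form, with notation chosen for this sketch rather than taken from the paper):

\[
\langle N_{\rm gal} \mid m \rangle \;=\; \langle N_{\rm cen} \mid m \rangle + \langle N_{\rm sat} \mid m \rangle,
\qquad
\langle L_{\rm tot} \mid m \rangle \;=\; L_{\rm cen}(m) + \langle N_{\rm sat} \mid m \rangle\, \bar L_{\rm sat}(m),
\]

with the number of satellites at fixed halo mass m drawn from a Poisson distribution of mean \langle N_{\rm sat} \mid m \rangle. In this notation, the prediction described above is that L_{\rm cen}(m) rises roughly linearly at low to intermediate m and logarithmically at high m, while \bar L_{\rm sat}(m) changes by only about twenty percent as m increases by a factor of more than twenty.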

Monthly Notices of the Royal Astronomical Society, 2009
In studies of the environmental dependence of structure formation, the large scale environment is often thought of as providing an effective background cosmology: e.g. the formation of structure in voids is expected to be just like that in a less dense universe with appropriately modified Hubble and cosmological constants. However, in the excursion set description of structure formation which is commonly used to model this effect, no explicit mention is made of the effective cosmology. Rather, this approach uses the spherical evolution model to compute an effective linear theory growth factor, which is then used to predict the growth and evolution of nonlinear structures. We show that these approaches are, in fact, equivalent: a consequence of Birkhoff's theorem. We speculate that this equivalence will not survive in models where the gravitational force law is modified from an inverse square, potentially making the environmental dependence of clustering a good test of such models.
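As background, the two descriptions being compared can be stated schematically (the notation here is assumed for illustration, and the shift property below holds for sharp-k filtered walks): in the conditional excursion set approach, structure in an environment of linear overdensity \delta_{\rm env} on scale S_{\rm env} = \sigma^2(R_{\rm env}) is described by random walks that start from that point, so the conditional first-crossing distribution of the collapse barrier is just the unconditional one with a shifted origin and barrier,

\[
f(S \mid S_{\rm env}, \delta_{\rm env}) \;=\; f\!\left(S - S_{\rm env};\ \delta_c - \delta_{\rm env}\right).
\]

The effective-cosmology view instead evolves the environment with the spherical model, reads off a modified expansion rate and linear growth factor, and applies the unconditional theory in that background. Because Birkhoff's theorem makes the interior evolution depend only on the enclosed mass, the two routes agree in standard gravity; the point above is that this agreement need not survive once the force law is no longer inverse square.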