List of past RPMs — 2018
Despite the tremendous success of the Standard Model (SM) of particle physics, it has become increasingly clear that the SM is far from complete. For example, the non-zero neutrino mass was the first solid evidence for physics beyond the SM, yet we still do not understand why neutrinos are so light, or whether they are their own antiparticles. We know that the mass of the universe is dominated by dark matter, but we do not understand its nature. Exploring these unknowns may lead to fundamental science discoveries and deepen our understanding of the universe.
Due to their feeble interactions with normal matter, both neutrinos and dark matter are studied in low-background environments in underground laboratories. This area of research is booming in China, with several underground facilities in operation or under construction. The first half of my talk will introduce the Jiangmen Underground Neutrino Observatory (JUNO), an experiment aiming to determine the neutrino mass ordering and to precisely measure oscillation parameters using a large liquid scintillator detector. I will then discuss the PandaX project, a series of experiments using dual-phase xenon detectors for dark matter direct detection in the China Jinping Underground Laboratory.
Bio:
Mengjiao Xiao received his Ph.D. from Shanghai Jiao Tong University in 2016 and is now a postdoc at the University of Maryland, College Park, working on the PandaX and JUNO experiments.
Cosmology in the next decade will be driven by data. Exploiting the information one can extract from ongoing and upcoming large surveys will give us the power to stress-test the LCDM model with unprecedented precision and open up windows for new physics. In this talk I will present some of our work in the Dark Energy Survey Collaboration and the Large Synoptic Survey Telescope Dark Energy Science Collaboration, analysing state-of-the-art galaxy survey data as well as preparing for the next generation of data. I will focus on topics surrounding weak lensing analyses, including cosmology from 2-point functions, generating weak lensing mass maps, and measuring the mass profiles at the outskirts of galaxy clusters.
I will discuss motivations for searching for di-Higgs production at the LHC. Recent results and projected sensitivities will be presented with emphasis on the dominant hh->4b channel.
The existence of dark matter is strong evidence for new physics beyond the Standard Model. While laboratory and collider searches for dark matter have advanced rapidly over the past several decades, astrophysical observations currently provide the only robust, positive, empirical measurement of dark matter. Astrophysical observables can be directly linked to the fundamental properties of dark matter, such as particle mass, self-interaction cross section, and self-annihilation rate. In this talk, I will discuss how the Fermi Large Area Telescope (LAT) and the Dark Energy Survey (DES) have advanced our understanding of dark matter from observations of the smallest and most dark-matter-dominated galaxies. In addition, I will discuss opportunities to build a cohesive dark matter program with the Large Synoptic Survey Telescope (LSST).
Music is nearly universal in human culture and yet it remains
mysterious. In order to help answer some of music’s fundamental
questions, we will briefly turn to archeology and early history before
examining some of music’s salient features from a physical and
mathematical perspective. Principles rooted in physics and pure
mathematics will link intercultural qualities of musical tone and melody to the deep role that symmetry plays in human
perception, thus shedding light on the questions that we set out to
answer. To enhance clarity and familiarity, various concepts will be
illustrated with animations and sound bites.
The next decade will be the golden age of cosmology with transients. In this talk, I will present new analyses of Type Ia Supernovae that mark the most precise measurement of dark energy to date. I will go over how this analysis ties together with the analysis of the local value of the Hubble constant, for which tension persists with the inferred value from the CMB – an exciting hint at possible departures from the standard cosmological model. I will then discuss the first measurements of the Hubble constant with kilonovae and gravitational waves. I will review the large amount of overlap between the issues that must be tackled for future progress using supernovae and kilonovae to measure cosmological parameters. Finally, I will discuss the roles that surveys like LSST and WFIRST will play and how we can harness the millions of transients discovered to make generation-defining cosmological measurements.
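For readers unfamiliar with the method, a widely used light-curve standardization (a generic sketch; the analyses above may use refined versions with additional corrections) relates the observed peak magnitude of each supernova to a distance modulus:

```latex
% Tripp-style standardization: stretch (x_1) and color (c) corrections
% turn the peak B-band magnitude m_B into a distance modulus mu, from
% which the expansion history and dark-energy parameters are inferred.
\[
  \mu = m_B - M_B + \alpha\,x_1 - \beta\,c ,
  \qquad
  \mu \equiv 5 \log_{10}\!\left(\frac{d_L}{10\ \mathrm{pc}}\right) .
\]
```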
The experiments at the Large Hadron Collider (LHC) at CERN are at the energy frontier of particle physics, searching for answers to fundamental questions of nature. In particular, dark matter (DM) presents strong evidence for physics beyond the standard model (SM). However, there is no experimental evidence of its non-gravitational interaction with SM particles. If DM has non-gravitational interactions with the SM particles, we could be producing the DM particles in the proton-proton collisions at the LHC. While the DM particles would not produce an observable signal in the detector, they may recoil with large transverse momentum against visible particles resulting in an overall transverse momentum imbalance in the collision event. In this talk, I will review the searches for DM particles in these missing momentum final states at the Compact Muon Solenoid (CMS) experiment. I will also discuss the prospects for discovering dark matter at the High Luminosity-LHC and other future experiments.
The top quark plays an important role in the collider phenomenology of the Higgs boson, and the study of the interplay between the top quark and the Higgs boson may provide key insights into some critical questions in particle physics. A milestone in the experimental study of the top-Higgs sector of the Standard Model will be the observation of Higgs boson production in association with top quarks (ttH), direct evidence for the top-Higgs Yukawa coupling. With 36.1 fb⁻¹ of pp collisions at 13 TeV, the ATLAS experiment performed searches for ttH production in several channels and found evidence for this process. In this talk, I start with a brief overview of the experimental understanding of the Higgs boson properties and motivate the study of the top-Higgs Yukawa coupling. I then present the ATLAS ttH searches with a focus on the diphoton channel. Finally, I discuss the prospects of the ttH study and its impact on other aspects of the LHC physics program.
In 2012 the ATLAS and CMS experiments discovered the Higgs Boson, firmly establishing the Standard Model as the dominant paradigm for subatomic particle interactions. Many expected the Higgs discovery to be one of several discoveries at the energy frontier, yet five years later it remains the single addition to the subatomic particle pantheon. Meanwhile, the LHC has begun to push the boundaries of the hadron collision intensity frontier, yielding large datasets for further understanding the Standard Model as well as continuing the increasingly difficult search for new physics. I will discuss a path through the intensity frontier, focusing on a hardware-based, real-time pattern recognition engine which will enable the ATLAS experiment to fully exploit the delivered data by playing the world's most complicated game of bingo.
The Dark Energy Spectroscopic Instrument is a multi-object spectrograph composed of a wide-field corrector, a system of 5000 robotically positioned fibers, and ten 3-arm spectrographs. The instrument is being installed this year on the Mayall 4-m diameter telescope at Kitt Peak, Arizona. Operations will start next year. In 5 years, DESI will measure spectra and redshifts of more than 30 million galaxies and quasars. This catalog will be used to measure the expansion history of the Universe and the growth rate of structure over the past 10 billion years with sub-percent precision. I will present the construction status and give some insight into the Lyman-alpha BAO analysis.
The redefined SI takes effect on May 20, 2019. In October 2017, the International Committee on Weights and Measures met
at the International Bureau of Weights and Measures near Paris and recommended a new
definition of the SI such that a particular set of constants would have certain values when
expressed in the new SI units. In particular, the new SI would be defined by the statement:
The International System of Units, the SI, is the system of units in which
- the unperturbed ground state hyperfine splitting frequency of the caesium-133 atom, ΔνCs, is 9 192 631 770 Hz,
- the speed of light in vacuum c is 299 792 458 m/s,
- the Planck constant h is 6.626 070 15 × 10⁻³⁴ J/Hz,
- the elementary charge e is 1.602 176 634 × 10⁻¹⁹ C,
- the Boltzmann constant k is 1.380 649 × 10⁻²³ J/K,
- the Avogadro constant NA is 6.022 140 76 × 10²³ mol⁻¹,
- the luminous efficacy Kcd of monochromatic radiation of frequency 540 × 10¹² hertz is 683 lm/W.
The numerical values of the constants were determined by a special CODATA adjustment of
the values of the constants using data in papers that were accepted for publication by July
1, 2017.
The Convention of the Meter (Convention du Mètre), a treaty that specifies international
agreement on how units are defined, was established in 1875 with 17 nations initially signing
on, including the U.S. The SI, established within the treaty in 1960, is more recent and
continues to evolve. Currently, the treaty is agreed to by fifty-eight Member States, including
all the major industrialized countries. Even though a majority of people in the U.S. still
use units such as inches and pounds, the official standards for these units are based on the
SI units, so the U.S. national measurement standards will also be redefined, although the
change will be imperceptible in every-day use.
The redefinition will have a significant impact on the fundamental constants when expressed in SI units. Not only will the defining constants be exact, but many others will also
be exact, and still others will have considerably reduced uncertainties. This reflects a shift
from macroscopic measurement standards to quantum based standards.
This talk will describe the new SI, review reasons for the change, and show how units
can be based on assigned values of certain physical constants.
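As a concrete illustration (a standard rearrangement of the defining values quoted above, not a new result), the kilogram can be written entirely in terms of the fixed constants h, ΔνCs, and c:

```latex
% The kilogram expressed through the exact defining values of h, Delta-nu_Cs, and c.
\[
  1\,\mathrm{kg}
  = \frac{(299\,792\,458)^{2}}
         {(6.626\,070\,15\times 10^{-34})\,(9\,192\,631\,770)}
    \,\frac{h\,\Delta\nu_{\mathrm{Cs}}}{c^{2}}
  \approx 1.475\,5214\times 10^{40}\,
    \frac{h\,\Delta\nu_{\mathrm{Cs}}}{c^{2}} .
\]
```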
I will discuss the synergies between upcoming redshift and CMB experiments and show that, thanks to sample variance cancellation techniques, a large improvement in constraining power is possible even at fixed volume. I will highlight the role of cross-correlations and velocity fields in increasing the statistical power of future surveys, while at the same time allowing for greater control of systematics. I will show how the combination of large-scale structure and CMB experiments holds great promise to reveal the secrets of our mysterious Universe.
Starting in 2019, the Dark Energy Spectroscopic Instrument (DESI) will increase this dataset by an order of magnitude. In this talk I will review the challenges that we will face in order to provide an exquisite measurement of the expansion over cosmic history, and the opportunities that we will have to study other fundamental questions: the sum of the mass of the neutrino species, properties of dark matter particles, and the shape of the primordial power spectrum of density fluctuations.
The recent observations of gravitational waves have been enabled by a new generation of LIGO detectors, Advanced LIGO, the most sensitive laser interferometers ever built. In my talk I will review the main scientific results from the first two Observing Runs, O1 and O2, and discuss the status of the Advanced LIGO detectors and plans for O3. I will also describe prospects for further extending the astrophysical reach of ground-based observations with future generations of detectors.
Though dismissed by most, we claim that strong gravitational lensing of the gravitational waves from merging black holes explains the high-mass binary black hole mergers observed by LIGO/Virgo, accounting for the apparent 30 M_Sun events better than any alternative model.
It turns out to be difficult to make large-mass black hole binaries in sufficient numbers to explain LIGO's results. However, strong gravitational lensing of cosmologically distant mergers can naturally explain them, since the redshifting of the orbital frequencies amplifies the observed apparent masses.
to novel mechanisms for natural suppression of scalar Flavor-Changing Neutral Currents
and their implementation in specific models. Some of the most salient implications of these models will be presented. The possibility of having realistic models of spontaneous CP violation will be studied.
The Belle experiment in Japan began taking data in the late 1990s and went on to record the world's largest sample of B-anti-B meson pairs produced in a quantum correlated state. This initial state allowed Belle, and the BaBar experiment at SLAC, to measure CP violation in B decays with high accuracy. These measurements contributed to the awarding of the 2008 Nobel Prize in Physics to Kobayashi and Maskawa for their theory of CP violation. However, other measurements have exhibited discrepancies with the Standard Model, e.g., measurements of |Vub| and |Vcb|, R(D) and R(D*), etc. Over the past several years, the Belle detector and accelerator complex have been rebuilt and significantly upgraded to become the Belle II experiment. Belle II is designed to record 50 times the data set that Belle recorded, and with much improved detector performance. This forthcoming data should resolve several discrepancies observed by Belle and BaBar. Here we review some recent results from Belle and discuss the physics program and current status of Belle II.
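For reference, the lepton-universality ratios mentioned above are conventionally defined as (with ℓ = e or μ):

```latex
% Ratios of semi-tauonic to semi-leptonic branching fractions; many
% experimental and theoretical uncertainties cancel in the ratio.
\[
  R(D^{(*)}) \equiv
  \frac{\mathcal{B}(B \to D^{(*)}\,\tau\,\nu_{\tau})}
       {\mathcal{B}(B \to D^{(*)}\,\ell\,\nu_{\ell})} ,
  \qquad \ell = e,\ \mu .
\]
```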
I will discuss the event generator GENEVA, which for the first time combines fixed order and resummed perturbative calculations with parton showering and hadronization. I will explain the basic physics concepts underlying GENEVA, and show how they can be used to achieve in principle any accuracy desired. After presenting physics results on the production of vector boson + jets at the LHC, I will finish by giving an overview of how to use GENEVA.
Recent measurements of the Hubble Constant (H_0) through distance
ladder techniques have revealed a noticeable tension with the
Planck H_0 value that was obtained under the assumption of the cosmological
“standard model”, i.e., a flat Lambda cold dark matter cosmology. Is this
tension an indication that modifications to the standard model are
necessary, or is it the sign of unknown systematic effects in one or
both of the techniques? To address this question requires additional
high-precision measurements with techniques that are independent of
the distance ladder. The time delay strong lensing technique, in
which gravitational lensing by a massive galaxy produces multiple
images of a time-variable quasar, fulfills these requirements. I will
present recent results from the H0LiCOW program, in which the analysis
of just three time-delay strong lenses has produced a 3.8% measurement
of H_0, and discuss the implications for other cosmological parameters,
including those describing dark energy. I will also discuss the
future prospects of this technique in the era of large sky surveys and
extremely large telescopes.
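For context, the method rests on a standard relation of time-delay cosmography (sketched here in generic notation rather than quoted from the H0LiCOW papers): the delay between lensed images i and j measures a time-delay distance that scales inversely with H_0:

```latex
% The Fermat-potential difference between images sets the delay through
% the time-delay distance, which is inversely proportional to H_0.
\[
  \Delta t_{ij} = \frac{D_{\Delta t}}{c}\,\Delta\phi_{ij} ,
  \qquad
  D_{\Delta t} \equiv (1+z_{\mathrm{d}})\,
    \frac{D_{\mathrm{d}}\,D_{\mathrm{s}}}{D_{\mathrm{ds}}}
  \;\propto\; \frac{1}{H_{0}} ,
\]
% with D_d, D_s, D_ds the angular-diameter distances to the deflector,
% to the source, and between deflector and source.
```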
Many fundamental properties of neural networks are still not well
understood. This talk studies two of these from an adversarial perspective.
I begin with my main line of research and examine the apparently-fundamental
susceptibility of neural networks to adversarial examples. I develop effective
algorithms for generating adversarial examples and find that most training
regimes are ineffective at increasing robustness. Then, I perform a brief
examination of neural network memorization, and demonstrate that training
data can be efficiently extracted from a trained model given only black-box
access to that model. I conclude with directions for future research.
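As a minimal illustration of what an adversarial example is, the sketch below implements the fast gradient sign method (FGSM) of Goodfellow et al. on a toy model; it is a much simpler attack than the optimization-based algorithms developed in this work, and the model, shapes, and epsilon value are placeholders.

```python
# Illustrative FGSM sketch (not the speaker's algorithm): perturb inputs
# in the direction that increases the classifier's loss.
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=0.03):
    """Return an FGSM-perturbed copy of the input batch x."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Step along the sign of the gradient; clip to the valid pixel range.
    x_adv = x + eps * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()

# Toy usage: a linear "classifier" on random data standing in for a real network.
model = torch.nn.Linear(784, 10)
x = torch.rand(8, 784)             # batch of 8 flattened 28x28 "images"
y = torch.randint(0, 10, (8,))     # arbitrary labels
x_adv = fgsm(model, x, y)
print((x_adv - x).abs().max())     # perturbation is bounded by eps
```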
Weak-scale Supersymmetry can help to resolve several puzzles presented by the Standard Model, including the nature of dark matter and the naturalness of the Higgs mass, and has motivated a robust search program at the Large Hadron Collider. Searches for direct production of Supersymmetric partners of weak bosons and leptons are particularly interesting as probes of dark matter and naturalness, but are challenged by small cross sections, low-momentum decay products, and multiple Standard Model backgrounds. I will review efforts by the ATLAS experiment to discover such new particles, with an emphasis on scenarios with “compressed” mass spectra containing nearly-mass-degenerate states. I will discuss recent LHC results, as well as prospects for discovery in Run 2 and beyond.
Cosmological inflation is the leading hypothesis to resolve the problems in the Big Bang theory.
It predicts that primordial gravitational waves were created during the inflationary era, which
then imprinted large-scale curl patterns in the cosmic microwave background (CMB)
polarization map called the B-modes.
Measurements of the CMB B-mode signals are known as the best probe
to detect the primordial gravitational waves.
LiteBIRD is a candidate for JAXA's strategic large-class mission to map the polarization of the CMB radiation over the full sky at large angular scales with unprecedented precision, which will offer us a crucial test of cosmic inflation. It will also serve as a first test of theories of quantum gravity such as superstring theory. Precise polarization maps from LiteBIRD will also provide valuable information on particle physics and astrophysics. In this talk, I will give an overview of the science and design of LiteBIRD.
A new era in astrometry has begun. Accurate positions, proper motions, and parallaxes for over a billion stars are now available at the sub-mas level. The properties of DR2 will be explained in the context of pre-Gaia data and the final Gaia results expected in a few years.
The impact of Gaia reaches far into almost all of astronomy, and a
few examples will be given.
This talk will present the first measurement of monoenergetic muon
neutrino charged current interactions. The MiniBooNE experiment at
Fermilab has been used to isolate and study 236 MeV muon neutrino
events originating from charged kaon decay at rest. The muon
kinematics and total cross section have been extracted from this data.
Notably, this result is the first known-energy, weak-interaction-only
probe of the nucleus to yield a measurement of omega (energy
transferred to the nucleus) using neutrinos, a quantity thus far only
accessible through electron scattering. I will discuss the
significance of this measurement, and these monoenergetic neutrinos in
general, for elucidating both the neutrino-nucleus interaction and
oscillations.
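For context, the 236 MeV figure and the quantity ω follow from elementary kinematics (a standard relation, stated here for completeness): a kaon decaying at rest to a muon and a neutrino produces a monoenergetic neutrino, and in a charged-current interaction the energy not carried away by the outgoing muon is transferred to the nucleus:

```latex
% Two-body kaon decay at rest (K+ -> mu+ nu_mu) fixes E_nu; omega is the
% energy transferred to the nucleus in the charged-current interaction.
\[
  E_{\nu} = \frac{m_{K}^{2} - m_{\mu}^{2}}{2\,m_{K}} \simeq 236\ \mathrm{MeV} ,
  \qquad
  \omega = E_{\nu} - E_{\mu} ,
\]
% where E_mu is the total (kinetic plus rest-mass) energy of the muon.
```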
The central puzzle that drives the field of quantum foundations is the measurement problem: what gives rise to the appearance of wave function collapse? While there is no single answer to this question that has wide acceptance, this is not for lack of available options. A variety of solutions to the measurement problem have been proposed, in the form of various interpretations or modifications of quantum mechanics. Some solutions eliminate collapse entirely, as in the many-worlds and de Broglie-Bohm interpretations. Others propose altering the dynamics of the theory to make collapse objective, as in GRW (stochastic collapse) and gravitational collapse theories. “Psi-epistemic” interpretations attempt to dissolve the measurement problem by claiming that the wave function isn’t something real in itself, but merely a representation of our knowledge of an underlying reality. Finally, Copenhagen-style interpretations claim that there is no “reality” to be talked about at all. Bounding this menagerie of interpretations are several important theorems that restrict the options available. Bell’s theorem is the most famous (and most misunderstood) of these, but it is not the only one. In this talk, I will briefly lay out the measurement problem and its history, go over several of the theorems that constrain the possible solutions, and discuss a few of the families of quantum interpretations and the open problems that remain for each of them.
The radiation environment at the High Luminosity LHC requires CMS to
replace its silicon tracker with one that is more radiation hard and granular.
The new tracker is designed twenty years after the original one, and has new unique
capabilities. It spans 8 units in rapidity, is less than half as heavy as the current one, and is capable of
participating in the first level of trigger decisions. In this talk, after sketching the
HL-LHC physics case, I will describe the detector design, FPGA-based track reconstruction,
and focus on one of many possible new opportunities it provides for new physics searches.
Current cosmological measurements have left us with deep questions about our Universe: What caused the expansion of the Universe at the earliest times? How did structure form? What is Dark Energy and does it evolve with time? New experiments like CHIME, HIRAX, and ACTPol are poised to address these questions through 3-dimensional maps of structure and measurements of the polarized Cosmic Microwave Background. In this talk, I will describe how we will use 21cm intensity measurements from CHIME and HIRAX to place sensitive constraints on Dark Energy between redshifts 0.8 and 2.5, a poorly probed era corresponding to when Dark Energy began to impact the expansion history of the Universe. I will also discuss how we will use data from new instruments on the ACT telescope to constrain cosmological parameters like the total neutrino mass and probe structure at late times.
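As a quick illustration of that redshift range (a back-of-the-envelope sketch, with the exact band edges set by each instrument), the 21 cm line of neutral hydrogen is simply observed at its rest frequency divided by (1 + z):

```python
# Redshifted 21 cm line: nu_obs = nu_rest / (1 + z).
REST_FREQ_MHZ = 1420.406  # hyperfine transition of neutral hydrogen

def observed_freq_mhz(z):
    """Observed frequency (MHz) of the 21 cm line emitted at redshift z."""
    return REST_FREQ_MHZ / (1.0 + z)

for z in (0.8, 2.5):
    print(f"z = {z}: {observed_freq_mhz(z):.0f} MHz")
# Roughly 789 MHz at z = 0.8 and 406 MHz at z = 2.5, i.e. an observing
# band of about 400-800 MHz.
```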
The identification of hadronically decaying tau leptons at the LHC is challenging. I will review the methods for tau reconstruction at ATLAS and how the identification of tau leptons has enabled several interesting and novel measurements in exotics and Higgs physics at the LHC. I will also look to the future and give some thoughts on next implementations for tau leptons at the LHC.
The anisotropies in the cosmic microwave background radiation have become our most important cosmological fossil. The study of these “echoes of gravity” has revolutionized cosmology, stringently tested our models and allowed precise measurement of a host of important cosmological parameters. I will discuss how far we’ve come since the early detections of CMB anisotropies and in particular the cosmological legacy of the Planck mission.
The detailed characterization of the intensity and polarization of the Cosmic Microwave Background (CMB) radiation provides a powerful tool to constrain the properties of the early Universe. The polarization imprinted by a stochastic gravitational wave background in this epoch would be a distinctive and measurable signature of primordial inflationary processes. Presently the field has a healthy contingent of ground-, balloon-, and space-based missions. In this talk I will discuss a brief history of CMB measurements, the current status of the field, and the outlook for the future. I will also discuss ongoing work at NASA Goddard Space Flight Center on the development of polarization-sensitive detectors for a future satellite CMB polarization mission and their application in ground-based instruments.
The Liquid Argon Time Projection Chamber (LArTPC), with its mm-scale position resolution and fully-active-volume imaging-aided calorimetry, is an excellent device for detecting accelerator neutrinos in the GeV energy range. This technology may hold the key to the search for new CP violation in the lepton sector, to determining the neutrino mass hierarchy, to precisely measuring neutrino mixing parameters, to searching for baryon number violation, and to searching for sterile neutrino(s). In this talk, I will review the current status of the detector development. In particular, the challenges in TPC signal processing and event reconstruction will be discussed among other subjects.
There has been much recent interest at the intersection of the fields of Quantum Information and Quantum Gravity. In this talk, I will go over some of the motivating principles and important work that has been done in this area, and explain some promising current research trends.
Upcoming new instruments to measure the polarized CMB promise to provide discriminatory limits on inflation, the number of light relic particles, and the sum of the neutrino masses, ushering in a new era of using the CMB as a probe of particle physics. Achieving these science goals requires highly sensitive instruments that are composed of enormous arrays of low noise detectors. In addition, systematic errors and foreground removal must be improved to lower the systematics floor below the statistical errors, necessitating dramatic improvements in calibration precision. In this talk, I will describe Simons Observatory (coming online in ~2021) and CMB-S4 (coming online in ~2027), their science goals, and how the twin requirements for sensitivity and systematics require a new approach to software for readout, data acquisition, and control systems. I will also discuss future directions for cosmology, including work in 21cm instruments to improve our understanding of the nature of Dark Energy.
The Higgs boson discovery at the LHC marked a historic milestone in the study of fundamental particles and their interactions. Over the last six years, we have begun measuring its properties, which are essential to build a deep understanding of the Higgs sector of the Standard Model and to potentially uncover new phenomena. The Higgs’ favored decay mode to beauty (b) quarks (~60%) had so far remained elusive because of the overwhelming background of b-quark production due to strong interactions. Observing the Higgs decay to b-quarks was one of the critical missing pieces of our knowledge of the Higgs sector. Measuring this decay is a fundamental step to confirm the mass generation for fermions and may also provide hints of physics beyond the Standard Model. The CMS observation of the decay of the SM Higgs boson into a pair of b-quarks exploiting an exclusive production mode (VH) is yet another major milestone. This experimental achievement at the LHC, considered nearly impossible in the past, makes use of several advanced machine learning techniques to identify the b-quark distinctive signature, improve the Higgs boson mass resolution, and discriminate the Higgs boson signal from background processes.
The radius of the proton, generally assumed to be a well-measured and well-understood quantity, has recently come under scrutiny due to highly precise, yet conflicting, experimental results. These new results have generated a host of interpretations, none of which are completely satisfactory.
I will discuss the existing results, focusing on the discrepancy between the various extractions. I will briefly discuss some theoretical attempts at resolution and focus on new scattering measurements, both planned and already underway, that are attempting to resolve the puzzle.
The Large Underground Xenon (LUX) and the LUX-ZEPLIN (LZ) detectors were designed to directly observe the interaction of dark matter (DM) with xenon target nuclei and thus probe much of the unexplored DM parameter space. I will discuss two novel direct detection methods, Bremsstrahlung and the Migdal effect, that were used by LUX to place limits on DM with masses of 0.4-5 GeV. I will also describe the development of a fully 3D model of the electric fields that varied during the detector's final 332 live days of data acquisition. Since direct measurement of the electric fields was not possible, this work enabled a thorough understanding of the detector throughout its operation. Lastly, I will present new results from the Xenon Breakdown Apparatus (XeBrA) built at LBNL to characterize dielectric breakdown under high voltage in liquid xenon and liquid argon. Results from XeBrA will inform LZ and the future of noble liquid detector engineering.
What is the roadmap for the discovery of new physics in the
post-Higgs-observation era? The concept of naturalness can provide a
useful guiding principle, considering the interplay between new, heavy
physics and the electroweak scale. I will describe a new search for
natural models of supersymmetry from the ATLAS experiment via the
production of light top squarks. I will also discuss the impact of
precision top-quark measurements on future searches, highlighting a
recent measurement of quantum interference in top-quark production.
Smoking gun signals for new physics at the LHC elude us.
What gaps remain under the lamppost? Where do we focus the next
generation of searches? This talk tackles these pressing problems by
highlighting new frontiers opened by recent theoretical surveys,
extending ATLAS beyond its design capability, and state-of-the-art
analysis strategies. Sensitivity to key supersymmetry targets
motivated by naturalness and WIMP dark matter is just surpassing
two-decade-old LEP limits. Striking blind spots remain around 100 GeV mass
scales that invite creative discovery strategies for future efforts.
Supersymmetry (SUSY) gives a solution to the naturalness problem of the Standard Model (SM) while providing a candidate for dark matter, addressing two mysteries in modern physics. As the LHC collects more data and sets strong constraints on SUSY in the strong-production sector within reach of the collider, new SUSY particles produced by electroweak (EWK) processes remain significantly less constrained. The EWK production of SUSY particles can be observed by their decay via the W and Z gauge bosons to final states with two or three leptons and missing transverse momentum from invisible particles. Two SUSY models are considered: one with a Wino next-to-lightest SUSY particle (NLSP) and Bino LSP, motivated by dark matter, and another with a Higgsino LSP, motivated by naturalness. The Higgsino LSP models are particularly challenging due to the small, or compressed, mass splittings, leading to low-energy W and Z bosons. This requires specialized techniques for triggering, optimization, and background estimation. I will present the latest ATLAS searches for both models, including the first result for Higgsino production since LEP, as well as the outlook for future studies.
Most searches for new physics at the Large Hadron Collider assume that a new particle produced in pp-collisions decays almost immediately, or is non-interacting and escapes the detector. However, a variety of new physics models predict particles which decay inside the detector at a discernible distance from the interaction point. Such long-lived particles would create spectacular signatures and evade many prompt searches. In this talk I will focus on a search for long-lived particles in events with a displaced vertex and a muon. I will also discuss challenges for the Muon Spectrometer in the face of increasing LHC luminosity.
Experiments at nuclear reactors have played a key role in determining the properties of the weakly-interacting neutrinos. PROSPECT is a next-generation experiment studying reactor neutrinos at very short baselines (< 10m) in an environment with limited shielding from cosmogenic backgrounds. Commissioned in March 2018, the compact, segmented detector unambiguously observed neutrinos in its first 2 hours of operation. In the months following, PROSPECT has performed a world-leading search for “sterile” neutrino oscillations and made the world-leading measurement of the uranium-235 antineutrino energy spectrum. This talk will detail the detector design, construction, and first physics results.
The LZ (LUX-ZEPLIN) experiment is a second-generation direct dark matter detector under construction at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA. LZ will use a 7 tonne central liquid xenon target, arranged in a dual-phase time projection chamber (TPC), to seek evidence for nuclear recoils from a hypothesized galactic flux of weakly interacting massive particles (WIMPs). Surrounding the LZ TPC will be an outer detector (OD) consisting of 17.3 tonnes of LAB-based gadolinium-loaded liquid scintillator (GdLS). The OD's primary functions will be to tag neutron single-scatter events in the liquid xenon which could mimic a WIMP dark matter signal and to characterize the radiation environment of LZ. In this talk, I summarize the expected performance of the OD and report on the design and results of the Screener, a small liquid scintillator detector consisting of approximately 23 kg of the GdLS to be used in the OD. The Screener was operated in the ultra-low-background environment of the former LUX water shield in the Davis Laboratory at SURF for radioassay of the GdLS. Careful selection of detector materials and use of ultra-low-background PMTs allows the measurement of a variety of radioimpurities. In particular, the 14C/12C ratio in the scintillator is measured, while the use of pulse shape discrimination allows the concentration of isotopes throughout the 238U, 235U, and 232Th chains to be measured by fitting the collected spectra from alpha and beta events. The GdLS is found to meet the requirements for LZ; however, more aggressive purification is being implemented for the final GdLS product to ensure the OD will successfully carry out its role in the hunt for WIMP dark matter.
MicroBooNE is a single-phase liquid argon time projection chamber (LArTPC) short-baseline accelerator neutrino experiment located at Fermilab on the Booster neutrino beamline. MicroBooNE's foremost scientific objective is to address the low-energy excess of single-shower electromagnetic events seen by the precursor MiniBooNE experiment. Leveraging the fine-grained drifted ionization charge signal from particle interactions, LArTPCs provide detailed topological and calorimetric information for physics analyses. By capitalizing on the interplay between scintillation light and 3D ionization charge imaging, a high-efficiency, low-background analysis is in development to address MiniBooNE's anomalous result. The status of this analysis is described.
A robust detection of neutrino masses is avowedly among the key goals of several upcoming Cosmic Microwave Background (CMB) and Large-Scale Structure (LSS) surveys. In this talk, I will describe recent progress in neutrino cosmology on three fronts. Firstly, I will illustrate the wealth of information on the sum of the neutrino masses obtainable from current cosmological probes, focusing on LSS data. Current upper limits begin favoring the normal neutrino mass ordering, emphasizing the need to develop statistical tools for quantifying this preference. Next, I will discuss galaxy bias as a limitation towards fully capitalizing on neutrino information hidden in LSS data, proposing a method for calibrating the scale-dependent galaxy bias using CMB lensing-galaxy cross-correlations. Moreover, in massive neutrino cosmologies the bias as usually defined is scale-dependent even on large scales: neglecting this effect will lead to incorrectly inferred parameters. Finally, I will take on a different angle and discuss degeneracies between neutrinos and other cosmological parameters. I will show how in certain physically motivated dynamical dark energy models the neutrino mass upper limits tighten instead of broadening, discussing implications for future laboratory determinations of the mass ordering. I will also discuss how neutrino unknowns affect constraints on inflationary models.
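As a rough guide to why cosmological data constrain the mass sum (a standard relation, quoted here as a sketch), the present-day energy density in massive neutrinos scales linearly with Σ m_ν:

```latex
% Present-day neutrino density parameter in terms of the summed masses;
% an upper limit on Omega_nu therefore bounds the mass sum directly.
\[
  \Omega_{\nu} h^{2} \simeq \frac{\sum m_{\nu}}{93.14\ \mathrm{eV}} .
\]
```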