Physics

This year marks the 60th anniversary of the publication of Synthesis of the Elements in Stars, known as B2FH after its authors Burbidge, Burbidge, Fowler, and Hoyle. A combination of new astronomical observations, improved nuclear data, and more realistic astrophysical modeling has revealed the origins of the elements heavier than iron to be more complicated than envisioned by B2FH. The synthesis of elements between iron and tin is particularly nuanced, with many astrophysical sites likely contributing. I will review how our understanding has evolved in recent years, highlight some of the major questions still unresolved 60 years after B2FH, and focus on nuclear data measurements that are important for advancing our understanding of the origins of elements from iron to tin.

There is very little in life about which we are absolutely certain; maybe only death and taxes, as the saying goes. Moreover, the world is replete with things that seem to happen for no discernible reason. Indeed, physicists insist that at its core the world is intrinsically random. Given the ubiquity of uncertainty and randomness, it is not surprising that mathematicians and philosophers have tried, over the centuries, to make sense of both. The key idea, which everyone seems to accept, is that probability is somehow related to uncertainty and randomness. Beyond that, views diverge.
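To make the shared starting point concrete (standard textbook notation, not drawn from the lecture itself): every school of thought uses the same calculus of probability, for example Bayes' rule,

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)},
\]

but the schools part company over what the probability P(H) of a hypothesis H means: a limiting frequency of outcomes in repeated trials for the frequentist, a degree of belief to be updated by evidence E for the Bayesian.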

In this talk, I trace the long, often controversial development of the concept and applications of probability, from the early works of the seventeenth century to the present day. I end with speculations about how probability might be used in the not-too-distant future.

Faculty Host: Manfred Paulini

In his lecture, Bialek, a theoretical physicist interested in the phenomena of life, will discuss how physics has been able to create accurate mathematical descriptions of the physical world, helping us not only to understand what we see but also to predict what will happen in places we have never looked before. He will address questions including: Are there limits to this predictive power, particularly when applied to the complex phenomena of life? And are we missing some deep principles that will bring the living world under the predictive domain of physics? Bialek will also offer reflections on how physicists might approach the complex and diverse phenomena of the living world and develop new theories to help explain the world around us.

Bialek, the John Archibald Wheeler/Battelle Professor in Physics at Princeton and Visiting Presidential Professor of Physics at The Graduate Center of the City University of New York, is known for his work emphasizing the approach of biological systems to the fundamental physical limits on their performance. In recent work, he and his collaborators have shown how the collective states of biological systems, such as the activity in a network of neurons or the flight directions in a flock of birds, can be described using ideas from statistical physics, connecting them in quantitative detail with new experimental data.

Bialek has been a member of the Princeton faculty since 2001. He has received the President’s Award for Distinguished Teaching at Princeton, and recently published a textbook, Biophysics: Searching for Principles. A member of the National Academy of Sciences and a fellow of the American Physical Society, he received the 2013 Swartz Prize for Theoretical and Computational Neuroscience from the Society for Neuroscience.

Reception to follow in Mellon Institute Lobby

2D layered materials are like paper: they can be colored, stitched, stacked, and folded to form integrated devices with atomic thickness. In this talk, I will discuss how different 2D materials can be grown with distinct electrical and optical properties (coloring), how they can be connected laterally to form patterned circuits (stitching), and how their properties can be controlled by the interlayer rotation (twisting). We will then discuss how these atomically thin papers and circuits can be folded to generate active 3D systems.

One of the great triumphs of 19th-century science was the emergence of thermodynamics. This is a subject of great power and generality, setting down the rules for what is possible and, even more crucially, what is not possible: there can be no perpetual motion machines, heat flows from hot bodies to cold bodies, and any effort to convert energy from one form to another always involves a bit of waste. A central, if slightly mysterious, concept in thermodynamics is the entropy, which is introduced first as a bookkeeping device but then becomes fundamental. In the formulation of statistical mechanics, the bridge connecting our microscopic description of atoms and molecules to the macroscopic phenomena of our everyday experience, entropy reappears as a measure of the number of states that are accessible to all the atoms and molecules.

In the mid-twentieth century, entropy makes yet another appearance, first as a quantitative measure of information, and then as a limit on the amount of space that we need to record that information. It is astonishing that the same concept reaches from steam engines to the internet, and from molecules to language. In this lecture I will try to give a sense for these four different notions of entropy and their connections with one another, hoping to convey the unifying power of mathematics.
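As a rough guide to the formulas behind these notions (standard definitions, not taken from the lecture itself), Boltzmann's entropy and Shannon's information entropy are

\[
S \;=\; k_B \ln \Omega, \qquad H \;=\; -\sum_i p_i \log_2 p_i ,
\]

where \Omega counts the microscopic states accessible to the atoms and molecules and H is the average number of bits needed to record which outcome occurred. When the \Omega states are equally likely, H = \log_2 \Omega, so up to the constant factor k_B \ln 2 the thermodynamic and informational entropies are the same quantity.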

As endpoints of cosmic structure formation that emerge in the era of dark energy domination, the population of clusters of galaxies offers insights into cosmology and the gravitational growth of large-scale structure. The composition of clusters (dark matter and baryons in multiple phases, co-evolving within a hierarchical cosmic web of massive halos) is being scrutinized observationally across the electromagnetic spectrum and with increasingly sophisticated numerical simulations. In this presentation, I will outline the phenomenological framework of cluster cosmology, emphasizing multi-wavelength population statistics and support from astrophysical simulations, then discuss some of the challenges associated with early 21st-century reality.

Nearly 400 years ago, Galileo gave us the image of the great Book of Nature, lying open before us. We could read it, he said, only if we understood its language, the language of mathematics. The search for a mathematical description of nature, an activity we now call theoretical physics, has been extraordinarily successful. In a real sense, what we see around us are the consequences of equations that can be written on one sheet of paper. This tremendous success encourages physicists to keep searching for simplicity, even in apparently complex systems. Why do we believe that the world should be described by simple models? Is this just an extrapolation from past successes, liable to fail at any moment? Faced with the evident complexity of the world, is the search for simple mathematical descriptions just a matter of guessing, or are there principles to guide our search? I’ll address these questions with lessons from the history of the subject, then turn to one of the modern frontiers: the search for a physicist’s understanding of the brain and mind.

Discussions of the infamous measurement problem of quantum foundations tend to focus on how the output of a measurement, the pointer position, can be thought of in consistent quantum mechanical terms, while ignoring the equally important issue of what this outcome says about the earlier microscopic situation the apparatus was designed to measure. An experimental physicist is typically much more interested in the path followed by a particle before it triggered his detector than in what happened later, and if quantum mechanics cannot provide a clear explanation, how can one claim that this theory has been confirmed by experiment? The talk will use Wheeler's delayed choice paradox to identify the fundamental conceptual issues underlying this second measurement problem, and then sketch the resolution provided by the consistent histories interpretation, using a modification of Birkhoff and von Neumann's quantum logic.

For over three decades, the giant elliptical galaxy Messier 87 in the Virgo Cluster has held the record for the most massive known black hole in the local universe. New observational data and improved stellar orbit models in the past several years have substantially expanded and revised dynamical measurements of black hole masses at the centers of nearby galaxies.

I will describe recent progress in discovering black holes up to twenty billion solar masses in ongoing surveys of massive elliptical galaxies. I will discuss the implications of this new population of ultra-massive black holes, including its impact on our understanding of the symbiotic relationships between black holes and galaxies and on the gravitational-wave signals from merging supermassive black hole binaries targeted by ongoing pulsar timing array experiments.

In his 2000 Buhl Lecture, Barry Barish, then the director of LIGO, discussed gravitational waves, the ripples in the fabric of space and time whose existence was predicted by Einstein in 1916. At the time, LIGO had recently been constructed under Barish’s leadership and had begun to collect data. On February 11, 2016, it was announced that LIGO’s upgraded detectors had made the first-ever observation of gravitational waves from a pair of merging black holes. Barish returns to CMU for an encore Buhl Lecture in which he will discuss the physics of gravitational waves, the techniques used to detect them, and the implications of the new observations.

Barry Barish is the Linde Professor of Physics, Emeritus, at the California Institute of Technology. He is a leading expert on gravitational waves, having led the Laser Interferometer Gravitational-Wave Observatory (LIGO) project as the principal investigator and director from the beginning of construction in 1994 until 2005. During that period, the LIGO detectors reached design sensitivity and set many significant limits on astrophysical sources. The more sensitive Advanced LIGO proposal was developed and approved while Barish was director, and he continues to play an active leading role in LIGO. His other noteworthy experiments include an experiment at Fermilab using high-energy neutrino collisions to reveal the quark substructure of the nucleon. These experiments were among the first to observe the weak neutral current, a linchpin of the electroweak unification theories of Glashow, Salam, and Weinberg. Barish is also the former director of the Global Design Effort for the International Linear Collider (ILC), the highest-priority future project for particle physics worldwide.

Sponsored by the Carnegie Mellon University Department of Physics.
