Context. It has long been observed that unocculted and occulted stellar heterogeneities can have
an impact on the retrieval of transit parameters from exoplanet transit lightcurves. The case of
occulted stellar heterogeneities has been extensively discussed in the literature (see, e.g.,
the models of Beky et al. 2014 and Tregloan-Reed et al. 2013), and has
served to understand not only anomalies in transiting exoplanet lightcurves, but also the underlying properties of the
stellar spots, including their contrast distribution, sizes, and positions (see, e.g., Morris et al. 2017 and references therein). However, the more common case of unocculted
stellar spots and its impact on the retrieval of physical parameters from transit lightcurves has not been as extensively studied
and modelled. The largest impact is on the transit depth: the apparent dimming of the star as the planet passes in front of it. During periods of
low cool-spot coverage, the star will appear brighter, and thus the observed dimming will be smaller than during periods of larger spot coverage,
when the star is dimmer and the transit depth will therefore be larger. However, as recently shown by Rackham et al. (2017), researchers usually assume a one-to-one correspondence between the level of
activity a star shows in its long-term photometric signatures (that arise as spots go in and out of view by the observer) and the
spot coverage of the star, whereas the situation is much more complex due to the unknown sizes and contrasts of spots, which might include
both dark and bright spots. This assumption is usually used to correct for stellar activity in exoplanet atmosphere studies (see,
e.g., Sing et al., 2010 and references therein),
but the level at which this correction is acceptable has not been studied empirically; it has only been called into question by the simulations performed
by Rackham et al. (2017). In this work, we aim to perform an empirical study to quantify this effect and inform future exoplanet atmospheric studies
on its impact.
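As an illustration of the depth bias described above, here is a minimal sketch assuming a simple two-component photosphere, in which a fraction of the unocculted stellar disk is covered by spots of a given flux contrast (all parameter values below are hypothetical, chosen only for illustration):

```python
def observed_depth(true_depth, spot_fraction, spot_contrast):
    """Transit depth biased by unocculted stellar spots.

    Assumes a simple two-component photosphere: a fraction
    `spot_fraction` of the disk has flux `spot_contrast` times that
    of the immaculate photosphere (< 1 for dark spots, > 1 for
    bright faculae).
    """
    # Dark spots dim the star, so the planet blocks a larger fraction
    # of the remaining light and the transit appears deeper.
    return true_depth / (1.0 - spot_fraction * (1.0 - spot_contrast))


# Hypothetical example: a 1% transit observed against 5% coverage of
# spots at 30% of the photospheric brightness appears ~3.6% deeper.
biased = observed_depth(0.01, 0.05, 0.3)
```

Note that bright faculae (`spot_contrast > 1`) bias the depth in the opposite direction, which is part of why a single photometric activity proxy cannot uniquely predict the correction.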
Specific tasks. Lightcurves of transiting extrasolar planets from the Kepler mission will be modelled using state-of-the-art techniques, together with the rotational modulation signals of their host stars (the former using known models of transiting exoplanet lightcurves, the latter using Gaussian-process regression, both within a Markov Chain Monte Carlo (MCMC) framework in Python). These two pieces of information will allow us to assess how the transit parameters evolve as a function of the star's level of photometric activity, and to quantify how far the correction applied in exoplanet atmospheric studies actually is from this empirical "truth". Additionally, important stellar parameters such as limb darkening and its evolution can be extracted from these transit lightcurves, providing an additional piece of information that will be analyzed in order to understand how the brightness profile of the star changes as a function of time.
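To give a flavor of the second modelling ingredient, here is a minimal Gaussian-process regression sketch on a toy rotational-modulation signal, using numpy only, with a squared-exponential kernel and hand-picked (hypothetical) hyperparameters in place of the full MCMC machinery:

```python
import numpy as np


def sq_exp_kernel(x1, x2, amp, length):
    """Squared-exponential covariance matrix between two sets of times."""
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / length**2)


rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 50)
# Toy stand-in for a rotational-modulation signal plus white noise
y = np.sin(2.0 * np.pi * t / 5.0) + rng.normal(0.0, 0.1, t.size)

# GP posterior mean at the observed times, with noise variance 0.1**2;
# in a real analysis the kernel hyperparameters would be sampled via MCMC
K = sq_exp_kernel(t, t, 1.0, 1.0)
mean = K @ np.linalg.solve(K + 0.1**2 * np.eye(t.size), y)
```

The posterior mean smooths the noisy modulation signal; the same machinery, with a periodic or quasi-periodic kernel, is what is typically used for stellar rotation in practice.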
Context. Transiting exoplanets provide a perfect clock with which we can measure their periods: their transit times. In an idealized system with only one transiting
exoplanet, the transits will always occur exactly at the times predicted by the planetary period. However, if more planets are in the system, they may gravitationally
interact with the transiting planet, perturbing this period slightly and creating what we call "transit timing variations". In this work, we aim to extract these transit-timing variations in
two ways: (1) empirically, from the data of transiting exoplanets and (2) via dynamical models. The idea in this project is to implement both of these in
juliet, a recently published algorithm for analyzing transiting exoplanet lightcurves, in order to extend its capabilities.
In particular, once implemented, we will use data from the Transiting Exoplanet Survey Satellite (TESS) mission in order to test it and find evidence for transit timing variations
in current and future planetary candidates, helping to unveil the possible multi-planet nature of exoplanetary systems.
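The empirical route can be sketched as an observed-minus-calculated (O-C) analysis: fit a linear ephemeris to the mid-transit times and inspect the residuals. A minimal numpy sketch with simulated (hypothetical) times:

```python
import numpy as np

# Hypothetical linear ephemeris: reference mid-transit time t0 and
# period P, both in days
t0, P = 1325.50, 3.21
epochs = np.arange(10)

# Simulated observed mid-transit times: linear ephemeris plus a small
# sinusoidal perturbation standing in for a TTV signal from a companion
observed = t0 + P * epochs + 0.002 * np.sin(2.0 * np.pi * epochs / 7.0)

# Empirical TTVs are the observed-minus-calculated (O-C) residuals
# after re-fitting a straight line (period + reference time)
slope, intercept = np.polyfit(epochs, observed, 1)
o_minus_c = observed - (slope * epochs + intercept)
```

In a real fit the mid-transit times themselves would be free parameters of the lightcurve model, with posterior uncertainties propagated into the O-C diagram.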
Specific tasks. The research will begin with an understanding of how juliet works and how to implement empirical fitting of transit-timing variations. This will then be tested on planets with both known and unknown transit timing variations. Next, we will implement dynamical models within juliet to do this directly, which will in turn allow us to write a coupled model that brings both radial velocities and transits into the mix.
Context. Ground-based photometry is key to discover and refine the properties of transiting exoplanetary systems and other time-evolving phenomena. In the era of big datasets,
automated algorithms for reducing and analyzing these data are fundamental for efficient analyses. In this work, the interested researcher will finish up an
already tested and working pipeline for the Las Cumbres Observatory Global Telescope (LCOGT) network, which makes the data analysis from this automated global telescope network much
more efficient. The idea is to compare the precision attained by the pipeline with known noise limits using already reduced data, and also to restructure the code of the pipeline
so it can be easily ported to other instruments.
Specific tasks. The research will begin with an understanding of how the automated LCOGT pipeline works and what needs to be restructured in order to make it easier for other teams to adopt for their own instruments. In particular, the interested researcher will analyze the scatter of already reduced datasets and compare it with estimated noise levels for the LCOGT network, in order to assess the performance of the pipeline and of the instrument itself.
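The scatter-versus-noise comparison can be sketched as follows, with a simulated (hypothetical) photon-noise-limited time series standing in for a reduced LCOGT lightcurve:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical reduced relative-flux time series: 500 exposures with an
# assumed 800 ppm photon-noise floor and no residual systematics
nexp, photon_ppm = 500, 800.0
flux = 1.0 + rng.normal(0.0, photon_ppm * 1e-6, nexp)

# Point-to-point scatter of the lightcurve, in ppm
scatter_ppm = np.std(flux, ddof=1) * 1e6

# Ratio to the expected noise floor: ~1 means the pipeline reaches the
# photon limit; significantly above 1 flags residual systematics
excess = scatter_ppm / photon_ppm
```

For real data the expected floor would combine photon, scintillation, sky, and read noise rather than the single assumed term used here.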
Context. If an exoplanet is cloudy on one side and not on the other, this should imprint signatures on observed transit lightcurves. In this project,
the interested researcher will use Kepler, HST and TESS data in order to look for those signatures in distant extrasolar worlds using codes already implemented
for this project.
Specific tasks. The interested researcher will have to download and analyze data in order to identify the most precise lightcurves taken by Kepler, HST, and/or TESS to date. For these, transit lightcurve models will have to be used in order to find evidence for asymmetries in the lightcurves.
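One simple diagnostic for such signatures, sketched below on a toy box-shaped transit, is to compare the lightcurve with its time-reverse about mid-transit, so that any ingress/egress asymmetry (here a hypothetical flux offset on the egress half) stands out:

```python
import numpy as np

n = 201
idx = np.arange(n) - n // 2          # integer index, symmetric about mid-transit
in_transit = np.abs(idx) < 50        # toy box-shaped transit
symmetric = np.where(in_transit, 0.99, 1.0)

# Hypothetical cloudy-limb signature: a tiny extra flux on the egress half
cloudy = symmetric + np.where(in_transit & (idx > 0), 2e-4, 0.0)


def asymmetry(flux):
    """Maximum deviation between a lightcurve and its time-reverse."""
    return np.max(np.abs(flux - flux[::-1]))
```

A perfectly symmetric transit gives exactly zero; real lightcurves would of course require a proper asymmetric transit model and a full noise treatment rather than this toy statistic.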
Context. Transmission spectroscopy, the change of the size of a planet during transit as a function of wavelength due to the different constituents and opacity sources in its
atmosphere, has revolutionized our understanding of exoplanet atmospheres, allowing us to retrieve important molecules such as water from wavelength-dependent transit
lightcurves. Its modelling through atmospheric retrievals, where either the ratio between the constituents in the atmospheres or the abundances of the different elements that
make it up are left as free parameters, has in turn been a huge advancement towards the interpretation of these data. Recently, Heng
and Kitzmann (2017) have provided a set of analytical formulae to compute model transmission spectra given a set of elemental abundances and opacities, which we have recently
implemented in Espinoza et al. (2018). However, this retrieval suite currently assumes no chemical or physical consistency, in a so-called
"free" retrieval, which directly fits the abundances of different atoms/molecules to the data. In this project, the idea is to couple this retrieval with a chemical equilibrium and condensation
suite in order to impose chemical constraints that will serve to obtain more precise estimates of important observables in the atmospheres of these distant extrasolar worlds.
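The amplitude of such wavelength-dependent features is set by the atmospheric scale height, a useful back-of-the-envelope check before any retrieval. This is not the Heng & Kitzmann formalism itself, just the standard H = k_B T / (mu m_H g) estimate, evaluated for a hypothetical planet:

```python
# Physical constants (SI): Boltzmann constant and hydrogen-atom mass
k_B, m_H = 1.380649e-23, 1.6735575e-27


def scale_height(T, mu, g):
    """Pressure scale height H = k_B * T / (mu * m_H * g), in meters."""
    return k_B * T / (mu * m_H * g)


# Hypothetical hot Jupiter: T = 1500 K, H2/He-dominated (mu ~ 2.3),
# surface gravity g = 10 m/s^2
H = scale_height(1500.0, 2.3, 10.0)      # ~540 km

# Transit-depth change for one scale height of extra apparent radius,
# for Rp = 1.4 R_Jup around a Sun-sized star: dD ~ 2 Rp H / Rs^2
R_jup, R_sun = 7.1492e7, 6.957e8
dD = 2.0 * (1.4 * R_jup) * H / R_sun**2  # a few hundred ppm
```

The few-hundred-ppm amplitude per scale height is what makes the precision of the retrieval, and hence the chemical priors imposed on it, so important.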
Specific tasks. The first task will be to study known chemical equilibrium codes (e.g., pyCEA) and the retrieval code implemented in Espinoza et al. (2018). With this, the most important parameters that define a given equilibrium setting will have to be identified and then implemented into this retrieval suite. Finally, the algorithms will have to be tested on known systems.
Context. Transiting exoplanets are among the best studied exoplanets. If combined with radial-velocity measurements, their masses and radii can be
extracted, allowing us to search for correlations between their fundamental parameters and aiding our understanding of their formation and
evolution, which in turn allows us to understand their present states in terms of their shapes and composition. This has allowed us to understand, for example,
that giant exoplanets are very metal enriched (Thorngren et al., 2016), which has huge
implications for their compositions (Espinoza et al., 2017) and for the impact that, e.g., stellar
irradiation has on them (see, e.g., Thorngren and Fortney, 2017,
Enoch et al., 2012). The big problem, however, is that the properties of the hundreds of exoplanets
known to date have been obtained with different methods and tools; thus, not only could we be missing important correlations in this dataset, but many
spurious correlations between the parameters might be polluting this sample. In this project, the aim is to perform the first stage of a homogeneous study of transiting
exoplanets using state-of-the-art analysis tools.
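A minimal sketch of the final correlation-search step, on synthetic data (the mass-metals relation and all numbers below are toy stand-ins loosely inspired by the Thorngren et al. 2016 trend, not real measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic homogeneous catalogue: 40 giant planets with masses (M_Jup)
# and a correlated heavy-element mass plus log-normal scatter (toy data)
mass = 10.0 ** rng.uniform(-0.5, 1.0, 40)
metals = 0.05 * mass ** 0.6 * rng.lognormal(0.0, 0.3, 40)

# Power-law relations are most naturally searched for in log space
r = np.corrcoef(np.log10(mass), np.log10(metals))[0, 1]
```

In the real study, simple correlation coefficients would be complemented by the full posterior uncertainties on each parameter coming out of the juliet fits.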
Specific tasks. The main engine for the joint analysis of radial velocities and transits will be juliet (Espinoza et al., 2018; see the code on GitHub). The task will be to collect transit lightcurves and radial velocities for the systems in this homogeneous study, perform a joint analysis with juliet, and use the results to search for (new and known) correlations between the parameters of the systems.
Context. Exoplanet atmosphere studies are difficult to perform mainly because of systematic effects in time-series data, which are hard to model due to our poor knowledge
of their causes. It has been shown by Gibson (2014) that both model averaging (a technique
which averages over our ignorance about the correct model of the systematic effects present in a time-series) and Gaussian-process regression can alleviate the problem and produce
reliable inferences, with the latter method gaining large popularity in the recent literature. It is unclear, however, how Gaussian-process regression and model averaging
compare on real datasets in the regime of few datapoints, as simulations usually add systematic noise close to the model used to account for it. In this
project, we will inject model transit lightcurves into real time-series obtained in the context of the ACCESS survey
(see this for a description of the survey) in order to understand which one of the
two methods is the most reliable in extracting the real underlying injected system parameters.
Specific tasks. A Gaussian-process regression is already implemented alongside MultiNest (Feroz et al., 2008) in order to analyze the lightcurves. This same posterior-sampling technique will be used for model averaging, which is also already implemented. The interested researcher will have to perform injection-and-recovery tests on real data in order to understand how well each method performs, allowing us to understand the real setbacks of each analysis technique and which one is the "best" for the applications of the ACCESS survey.
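The injection-and-recovery idea can be sketched end-to-end with a toy box transit injected into white noise (in the actual tests, real ACCESS residuals would replace the simulated noise and a full transit model would replace the box):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "real" time series: pure white noise at the 1000 ppm level
n = 300
time = np.linspace(0.0, 0.3, n)                  # days
noise = rng.normal(0.0, 0.001, n)

# Inject a hypothetical box-shaped transit of known depth
true_depth = 0.005
in_transit = np.abs(time - 0.15) < 0.04
flux = 1.0 - true_depth * in_transit + noise

# Minimal recovery: out-of-transit minus in-transit mean flux; the
# bias relative to the injected depth is the quantity of interest
recovered = flux[~in_transit].mean() - flux[in_transit].mean()
bias = recovered - true_depth
```

Repeating this over many injected depths and noise realizations, once with a Gaussian-process systematics model and once with model averaging, is what would reveal which method recovers the injected parameters more reliably.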