Theoretical physics and applied mathematics

Introduction | Theoretical work is necessary for the proper interpretation of data, for modelling biological systems, for understanding the physical phenomena exploited in sensing applications, and for engineering optimal detection systems. Fundamental questions that can be addressed with theoretical work include, for instance: what is the information content of images in fluorescence microscopy? How many photons are required to correctly estimate a fluorescence lifetime? In recent years, I have worked to use biophysical imaging techniques as a tool for the systems biology of cancer. The next big theoretical question I wish to address is: how do biochemical networks encode cellular decisions and maintain functional states? To address this question, we will have to develop accurate models of our data and of the biology under study.
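To give a feeling for the photon question, here is a minimal back-of-the-envelope sketch (illustrative numbers, not values from any specific paper). The precision of a lifetime estimate is often summarized by the photon-economy figure of merit F = στ·√N/τ, where N is the number of detected photons and F = 1 is the shot-noise limit for an ideal single-exponential measurement. Inverting this relation gives the photon budget needed for a target precision:

```python
def photons_required(f_value, relative_precision):
    """Photons needed to reach a target relative precision sigma_tau/tau,
    given the photon-economy figure of merit F = sigma_tau * sqrt(N) / tau
    (F = 1 is the shot-noise limit for an ideal single-exponential decay)."""
    return (f_value / relative_precision) ** 2

# Estimating tau to 10% with an ideal system (F = 1) takes ~100 photons;
# a system with F = 2 needs four times as many.
print(photons_required(1.0, 0.1))  # 100.0
print(photons_required(2.0, 0.1))  # 400.0
```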

Recent and ongoing theory-based projects

All our projects require some modelling effort and specialized data analysis. However, a few projects focus specifically on the theoretical aspects of the physics or biology we study. Here is a brief (and incomplete) summary of these projects:

  • Paracrine oncogenesis. We are investigating how cell-to-cell communication plays a role in the earliest steps of oncogenesis. Our first work is publicly available as a preprint [BIORXIV/2018/431478]. We hypothesise that cell-to-cell communication not only plays an important role during the early steps of oncogenesis but is also essential to establishing and maintaining tumour heterogeneity.
  • Machine learning. We have recently assembled a small team working on machine-learning algorithms to build data pipelines that could make our advanced imaging tools accessible to a growing number of laboratories. At the same time, these computational tools will be used to better understand our complex data.
  • Optogenetics. The ability to trigger biochemical reactions with genetically encoded probes is permitting us to perform better and less invasive single-cell biochemical assays. We are investigating theoretical aspects of optogenetics, specifically the activation of CRE recombinase, that might enable us to develop better assays. Our aim is to seed oncogenic mutations in a well-defined spatiotemporal pattern, so as to study the earliest behaviour of mutant cells within a three-dimensional tissue.


Milestones

Over the last few years, we have developed several theoretical models. A detailed description of some of these milestones is available in separate posts (follow the links).

  • 2019 | In our effort to understand the limitations of biochemical imaging techniques, and to discern which barriers are imposed by engineering and which by physics, we often develop theoretical models for FLIM/FRET techniques. This year, we published a series of three papers focused on multi-dimensional techniques:
    • There is often a bit of a divide between those who use time-resolved techniques and those who use intensity-based techniques for FRET measurements. But what is the real benchmark for these techniques? We offer a theoretical comparison that explores the limits of seFRET and FLIM, providing hints on how such systems should be optimized [preprint available at BIORXIV/2019/774919].
    • Fast FLIM… but how fast? In a technology paper about our open-source ELIS system, we explore in depth the theoretical aspects of fast FLIM systems, issues likely to be of fundamental importance in the coming years as fast FLIM systems become commercially available.
  • 2013-2019 | Photon-partitioning theorem. What is the biochemical resolving power of fluorescence microscopy, and how can we boost it? In 2013, we provided the community with a theoretical analysis of this issue, founded on the photon-partitioning theorem, described in this blog post and the related paper. In 2019, we demonstrated, with a new microscope building on earlier developments in spectropolarimetry, that multi-dimensional imaging can indeed enhance the biochemical resolution of microscopes, as discussed in this other blog post and paper.
  • 2010 | Is it just high-resolution volume rendering or super-resolution? When you fit an object of known shape or size to experimental data, you can achieve resolutions that are limited not by diffraction but by photon statistics (a minimal Monte Carlo illustration follows this list). In the blog post “High (super?) resolution volume rendering“, I discuss resolution and volume rendering in relation to a Biophysical Journal paper where a theoretical and computational framework was used to study the volume and shape of red blood cells infected by P. falciparum.
  • 2007-2011 | Model-free analysis of FLIM data. During this period, we published a series of theoretical papers demonstrating how graphical representations of FLIM data can be exploited for simpler visualization and faster, fit-free data analysis. First, we developed a new mathematical framework, Lifetime Moment Analysis (LiMA), providing graphical representations and analytical solutions for the analysis of frequency-domain FLIM data. Around the same time, Andrew Clayton, Quentin Hanley and Peter Verveer independently rediscovered what we now call the ‘phasor’ approach to fluorescence lifetime analysis, originally described in the nineties by Enrico Gratton and later popularized by the seminal paper by Michelle Digman in Gratton’s lab. Since then, most of our data analysis has relied on versions of the phasor transform (a minimal sketch follows this list), most frequently generalized multi-dimensional versions that help us and the community analyse complex datasets in a computationally efficient and intuitive manner.
  • 2007 | Maximization of photon economy and acquisition throughput in FLIM applications. This was my introduction to Fisher information theory and the first paper in a theoretical series that culminated in 2013 with the photon-partitioning theorem and more recent work. The community is often split between the time and frequency domains. Which is better? Which is faster? Once you grind through the maths, you discover they can deliver the very same precision and speed; the differences stem from the specific engineering used to implement those systems, not from hard limits imposed by physics (a numerical sketch of such limits follows this list).
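For the 2010 milestone, here is a minimal Monte Carlo sketch (with illustrative parameters, not values from the paper) of why fitting a known shape yields a precision limited by photon statistics rather than diffraction: the centre of a Gaussian spot of width σ can be localized to roughly σ/√N with N detected photons.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 100.0  # assumed width (nm) of a diffraction-limited Gaussian spot

for n_photons in (100, 1000, 10000):
    # Each photon position is a draw from the known spot profile; the sample
    # mean is the maximum-likelihood estimate of the spot centre.
    estimates = [rng.normal(0.0, sigma, n_photons).mean() for _ in range(2000)]
    print(f"N={n_photons:5d}: precision {np.std(estimates):6.2f} nm "
          f"(theory sigma/sqrt(N) = {sigma / np.sqrt(n_photons):6.2f} nm)")
```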
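For the 2007-2011 milestone, here is a minimal sketch of the first-harmonic phasor transform (the repetition rate and lifetimes below are illustrative assumptions): a time-domain decay maps to the coordinates g = ⟨cos ωt⟩ and s = ⟨sin ωt⟩, and single-exponential decays fall on the universal semicircle.

```python
import numpy as np

def phasor(decay, t, omega):
    """First-harmonic phasor coordinates (g, s) of a fluorescence decay
    sampled at uniformly spaced times t, for angular frequency omega."""
    g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
    return g, s

t = np.linspace(0.0, 50e-9, 4096)   # 50 ns acquisition window (assumed)
omega = 2 * np.pi * 80e6            # 80 MHz laser repetition rate (assumed)
for tau in (1e-9, 2.5e-9, 4e-9):
    g, s = phasor(np.exp(-t / tau), t, omega)
    # Single exponentials satisfy g = 1/(1+(omega*tau)^2) and
    # s = omega*tau/(1+(omega*tau)^2), i.e. the universal semicircle.
    print(f"tau = {tau * 1e9:.1f} ns -> g = {g:.3f}, s = {s:.3f}")
```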
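Finally, for the 2007 milestone, a sketch of how such photon-economy limits can be probed numerically (the lifetime, window and binning are assumptions for illustration). For photons histogrammed over a finite measurement window, the Fisher information of the multinomial bin probabilities yields the Cramér-Rao bound on the lifetime, and hence F = στ·√N/τ; F approaches the ideal value of 1 as the window and the number of bins grow, supporting the point that the limits are set by engineering choices rather than physics.

```python
import numpy as np

def f_value(tau, window, n_bins):
    """Photon-economy figure of merit F = sigma_tau * sqrt(N) / tau for a
    single-exponential lifetime estimated from photons histogrammed over a
    finite window, via the Fisher information of the bin probabilities."""
    edges = np.linspace(0.0, window, n_bins + 1)

    def bin_probs(tau_):
        cdf = 1.0 - np.exp(-edges / tau_)
        return np.diff(cdf) / cdf[-1]  # renormalized to the truncated window

    h = 1e-6 * tau  # step for a central finite-difference derivative
    dp = (bin_probs(tau + h) - bin_probs(tau - h)) / (2.0 * h)
    fisher_per_photon = np.sum(dp ** 2 / bin_probs(tau))
    return 1.0 / (tau * np.sqrt(fisher_per_photon))

tau = 2.5e-9  # assumed lifetime (s)
for window, n_bins in [(12.5e-9, 4), (12.5e-9, 64), (50e-9, 256)]:
    print(f"window = {window * 1e9:4.1f} ns, bins = {n_bins:3d} "
          f"-> F = {f_value(tau, window, n_bins):.3f}")
```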
