A virtual tour of my labs at the MRC Cancer Unit

All photos are 3D pictures; it was fun to try out the technology.

Regrettably, the MRC defunded the MRC Cancer Unit, and the School of Clinical Medicine could no longer support our Department. In the current academic job market, I am experiencing some uncertainty about where I will relocate. This is thus one of my last opportunities to show where we made the discoveries of the last few years (in 2022, hopefully, you will see much more of this work in print).

Biolabs and 3D printing. Most people know me for my work in microscopy. However, more than half of my group is dedicated to cancer cell biology. This is the 3D picture of one of my wet lab bays. Opposite Suzan’s and Anna’s workplaces, we have our rapid prototyping workshop. Some of you might notice an old MakerBot Replicator 2, my first introduction to 3D printing. But the most notable printer is the Form3B by Formlabs in our custom (orange) enclosure, with which we print both biocompatible scaffolds and chambers for our light-sheet microscope. Then there is the Ultimaker, a workhorse for all our needs. The last entry is a CellInk BioX for 3D bioprinting, which we are still integrating into our 3D culture workflow.

Laser lab. Ready for more tech? This is the room that bridges past and future developments. This laser lab was the first lab I could call ‘my lab’, where I started to work on my own in 2010 as an EPSRC LSI fellow. All the new tech I have produced over the last 12 years came from this room, at least initially. One side of the room serves as our electronics and optical workshop, and then we have two optical tables dedicated to prototyping. Certainly a crowded room. The development of our SIM/FLIM system (ATLAS.ONE) was delayed by the pandemic, and we have now disassembled it to get ready for relocation. Before ATLAS.ONE, this room hosted the various iterations of confocal spectropolarimetry I developed over the years. The centrepiece of this room is now an open-top light-sheet microscope (ATLAS.TWO – CRUK funded). This is just a first preview of a system that, hopefully, will be the protagonist of many future papers.

Optogenetics lab. After a major infrastructural refurbishment, I was able to get a second laboratory, which I dedicated to optogenetics. Here, we prep samples in a (blue-light or red-light) darkroom. We also have small incubators to keep cells in light-controlled conditions. Up on the shelves, there is the skeleton of our first OptoFarm, a system to culture cells under tightly controlled light (biochemical) protocols. It is now discontinued, replaced by a much simpler and more flexible system that we will publish soon. And yes, of course, you can also see our workhorse for single-cell fluorescence dynamics, integrated with multiple cameras, photoactivation capabilities, multiple light sources, and microfluidics. This commercial system was bought and then modified with an MRF grant and, despite being very temperamental, it gave us a lot of good data!

Biophotonics lab. I hear you asking… what about FLIM? Of course, we are almost there. Here, we dive into one of the rooms of the imaging facility where I customized a multi-photon/confocal microscope with time-resolved technologies. In this front view, you see the Leica SP5 and the Chameleon Vision 2 (to the right), two instruments that gave me a lot of satisfaction. The blue boxes are a custom-built FLIM system (ELIS) that I built when FLIM was still relatively slow. Now that commercial systems are also super-fast, I have packed ELIS away for good. To be unpacked once I have new laboratories: the latest-generation PicoQuant rapidFLIM.

Let’s go around the table, because this room is full of tech. In this room, I hosted several generations of HDIM systems, some published, others not. The black box on the table is a streamlined and efficient version of HDIM, a time-resolved spectropolarimeter. Coupled with the multi-photon microscope, we detect fluorescence with high efficiency across 16 concurrent spectral channels, 2 polarizations and 64 time-bins. Under the table is the amazing SPC152 by Becker&Hickl – the heart of the system.
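As an aside for readers unfamiliar with this kind of data, here is a minimal sketch (Python, with invented Poisson counts) of the shape of a hyper-dimensional measurement: each pixel carries photon counts binned over spectral channel, polarization and time, and the familiar observables – spectrum, anisotropy channels, decay – are just marginals of that array.

```python
import numpy as np

# Toy stand-in for one pixel of HDIM data: photon counts binned over
# 16 spectral channels x 2 polarizations x 64 time-bins (invented values).
rng = np.random.default_rng(0)
pixel = rng.poisson(lam=2.0, size=(16, 2, 64))

total_photons = pixel.sum()                   # overall intensity
spectrum = pixel.sum(axis=(1, 2))             # 16-channel emission spectrum
polarization = pixel.sum(axis=(0, 2))         # parallel/perpendicular counts
decay = pixel.sum(axis=(0, 1))                # 64-bin fluorescence decay

print(total_photons, spectrum.shape, polarization.shape, decay.shape)
```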

Yes, there are a lot of boxes around, because we need to pack up! The back of the table hosts a pulse-picker we used with SPAD arrays, beam conditioning optics and HDIM Gen 4 (I think!) 🙂 I stopped its development because of COVID, but sooner or later I will resume it. Hopefully, it will be fully automated, including its alignment, and will integrate fast FLIM electronics.

This first virtual tour ends here. The next tour – with a bit of luck – will be from my new labs.

It is yellow, the two proteins must interact!

In fluorescence microscopy, colocalization is the spatial correlation between two different fluorescent labels. Often, we tag two proteins in a cell with distinct fluorescent labels, and we look at whether and where the staining localizes. When there is a “significant overlap” between the two signals, we say that the two molecules “colocalize”, and we might use this observation as possible evidence for a “functional association”. We might argue that measuring colocalization is one of the simplest quantitations we can do in microscopy. Yet, many horror stories surround colocalization measurements. This post is not a review of how to do colocalization, but a brief, casual discussion of a few common controversies that – as often I do – I aim at junior scientists.

This is a slide I often use in presentations to introduce FRET, but it is also useful for understanding colocalization. You can see the average size of a globular protein fused to a fluorescent protein, compared to the typical resolution of diffraction-limited and super-resolving fluorescence microscopy. When the signals from two molecules fall within the same pixel, the two molecules can still be really far apart from each other. However, the spatial correlation of distinct labels can inform us about possible functional associations.

***

“I am imaging GFP, but the image is blue, can you help me?”. Well, this question is not related to colocalization, but it illustrates a fundamental issue. In truth, cell biology is such an inherently multidisciplinary science that – in most cases – a researcher might require tens of different techniques on a weekly basis. It is thus not surprising that many researchers (I dare say most) will be experts in some of the techniques they use, but not all. Microscopy is particularly tricky. To be a true expert, you need to handle a wealth of physical, engineering and mathematical knowledge, alongside experimental techniques that might span chemistry, cell culture and genetic engineering. However, the wonderful commercial systems we have available permit us to get a pretty picture of a cell with just the click of a button. Here is the tricky bit: you want to study a cell, you get a picture of a cell. One is led to confuse the quantity one intends to measure with the information one is actually gathering, and with its representation. This is true for any analytical technique but, as ‘seeing is believing’, imaging might misrepresent scientific truth in very convincing ways. Hence, while I have no doubt that, upon reflection, the non-expert user would have understood why the picture on the screen was ‘blue’, the initial temptation was to believe the picture.

Question what you set out to measure, what the assay you have set up is actually measuring, and what the representation is showing. Trivial? Not really. It is an exercise we explicitly do in my lab when we have difficulties interpreting data.

***

“It is yellow, they colocalize, right?”. Weeeeeeeeellll… maybe, maybe not. Most of you will be familiar with this case. Often researchers acquire two images of the same sample, the pictures of two fluorescent labels; one is then represented in green and the other in red. In an overlay of the red and green channels, pixels that are bright in both colours will appear yellow. I would not say that this approach is inherently flawed, but we can certainly state that it is misused most of the time and, therefore, I try to discourage its use. One issue is that colour-blindness, not as rare as people think, renders this representation impractical for many colleagues (hence my colour highlights!), but even people with perfect vision will see colours with lower contrast than grey-scale representations, and green more than red. Eventually, to ‘see yellow’, it is almost unavoidable to boost the brightness of the two underlying colours to make the colocalization signal visible. This can be done either during the acquisition of the image, often saturating the signal (bad: saturated pixels carry very little and often misleading information), or during post-processing (not necessarily bad, if declared and properly done). Either way, by the time you are doing this, your goal of being quantitative has probably been missed. The truth is that a lot of biological work is non-quantitative, but faux-quantitative representations or statistics are demanded by the broader community even when unnecessary. Let’s consider one example, with one of the stains being tubulin and the other a protein of interest (PoI). Let’s assume the PoI localizes at nicely distinguishable microtubules in a few independent experiments. Once the specificity of the stain is confirmed, the PoI can be considered localized at the microtubules (within the limitations of the assay performed) without the need for statistics or overlays. Unfortunately, it is not very rare to see papers, even after peer review, showing diffuse staining of at least one PoI, perhaps a more localized stain of the second PoI, and a ‘yellow’ signal emerging from an overlay that is considered colocalization, instead of what it is: just noise. Another common issue is localization at vesicles. With diffraction-limited techniques, any cytoplasmic PoI would appear to colocalize with most organelles and structures within the cytoplasm. Sometimes punctate staining might partially overlap with properly marked vesicles, let’s say lysosomes, but not all of them. Then the issue is to prove that, at least, the overlap is not random and, therefore, statistics in the form of correlation coefficients are necessary.
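To make this concrete, here is a minimal sketch (Python, on invented toy data) of the coefficients most commonly used instead of, or alongside, the overlay: Pearson’s correlation coefficient (PCC) and the Manders coefficients M1/M2. The thresholds and images below are assumptions for illustration only; note that saturated pixels would corrupt both measures.

```python
import numpy as np

def pearson_coloc(ch1, ch2):
    """Pearson correlation coefficient (PCC) between two channels."""
    return np.corrcoef(ch1.ravel(), ch2.ravel())[0, 1]

def manders_coefficients(ch1, ch2, t1, t2):
    """Manders M1/M2: fraction of each channel's intensity found in
    pixels where the *other* channel is above threshold."""
    m1 = ch1[ch2 > t2].sum() / ch1.sum()
    m2 = ch2[ch1 > t1].sum() / ch2.sum()
    return m1, m2

# Toy example: two noisy channels sharing some punctate structure.
rng = np.random.default_rng(1)
common = rng.random((256, 256)) > 0.99            # shared 'puncta'
ch1 = common * 1000 + rng.poisson(20, (256, 256)) # label 1 + background
ch2 = common * 800 + rng.poisson(20, (256, 256))  # label 2 + background

print(pearson_coloc(ch1, ch2))
print(manders_coefficients(ch1, ch2, t1=100, t2=100))
```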

***

“The two proteins do not colocalise, two molecules cannot occupy the same volume.” Really!? Well, from a quantum mechanics standpoint… No, do not worry, I am not going there. I received that criticism during peer review in the past and, until recently, I thought it was a one-off case. However, I have since realised that I was not the only person to read that statement. I am really uncertain why a colleague would feel the need to make such an obvious remark, except for that condescending one-third of the community. I should clarify that, to my knowledge, no one implies physical impossibilities with the term colocalization. That statement is perfectly fine in a casual discussion or to teach beginners the basics. Some of us might also enjoy discussing definitions, philosophical aspects of science, and controversial (real or perceived) aspects of techniques, but better at a conference or in front of a beer, rather than during peer review. The issue here is that, while it is reasonable to criticise certain sloppy and not too uncommon colocalization studies, in general colocalization can be informative when properly done.

***

“So, is measuring colocalization useful?” Homework: replace ‘colocalization’ with your preferred technique. Done? Now try to make the same positive effort for colocalization. Every technique is useful when used properly.

You might have noticed I marked some words in my introduction: colocalize, significant overlap and functional association. It is important we understand what we mean by those words. Colocalization means co-occurrence at the same structure – a non-trivial correlation between the localization of two molecules of interest – within the limits defined by the resolution of the instrumentation. The “significant overlap” should really be replaced by “non-trivial correlation”. Non-trivial, as diffuse staining, unspecific staining and saturated images can very easily result in meaningless colocalization of the signals, but not of the molecules of interest. Correlation, as the concept of overlap might be improper in certain assays, for instance in some studies based on super-resolution microscopy. Even after we have done everything properly, we still cannot say that if protein A and protein B colocalize they interact (see slide). However, we can use colocalization to disprove the direct interaction of two proteins (if they are not in the same place, they do not interact), and we can use high-quality colocalization data to suggest a possible functional association, which might not be a direct interaction and should then be proven with additional functional assays.
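One simple way to check that a correlation is non-trivial is to compare the measured coefficient against a null distribution obtained by scrambling one channel, in the spirit of the Costes randomization approach. A minimal sketch, where the block size and permutation count are arbitrary assumptions; on data like the toy images above, a genuine overlap yields an observed PCC far outside the scrambled null.

```python
import numpy as np

def coloc_significance(ch1, ch2, n_perm=200, block=16, seed=0):
    """Crude Costes-style test: compare the observed Pearson coefficient
    with a null distribution obtained by shuffling blocks of one channel.
    Blocks (rather than single pixels) preserve local spatial structure."""
    rng = np.random.default_rng(seed)
    obs = np.corrcoef(ch1.ravel(), ch2.ravel())[0, 1]
    h = (ch1.shape[0] // block) * block
    w = (ch1.shape[1] // block) * block
    nb_h, nb_w = h // block, w // block
    ref = ch1[:h, :w]
    tiles = (ch2[:h, :w].reshape(nb_h, block, nb_w, block)
             .swapaxes(1, 2).reshape(-1, block, block))
    null = []
    for _ in range(n_perm):
        perm = tiles[rng.permutation(len(tiles))]
        scrambled = (perm.reshape(nb_h, nb_w, block, block)
                     .swapaxes(1, 2).reshape(h, w))
        null.append(np.corrcoef(ref.ravel(), scrambled.ravel())[0, 1])
    p = (np.sum(np.array(null) >= obs) + 1) / (n_perm + 1)
    return obs, p

# With the toy channels from the previous sketch:
# obs_pcc, p_value = coloc_significance(ch1, ch2)
```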

Then, my friends, do make good use of colocalization as one of the many tools in your laboratory toolbox, but beware: just because it is simple to acquire two pretty, colourful pictures does not mean it is simple to acquire, analyse and interpret colocalization data; there are many common errors.

 

P.S.: if I cited your question or statement, please do not take it personally. As I have written, not everyone can be an expert in everything, and the discussion between experts and non-experts is very useful; hence these real-life, anonymous examples.

A ‘hyper-dimensional radio’ to listen to the biochemical communications of the cell

Industry, academia and healthcare often rely on fluorescence microscopy to see the fine architecture of materials, including biological ones. Fluorescence microscopy is particularly suited to biomedical studies because it can be gentle with biological materials, permitting investigators to study biology in a non-destructive manner. Chemistry and genetic engineering then provide useful strategies to make samples fluorescent, so that they report on the mechanisms we need to study, aiming to understand how biological systems work in normal conditions, during disease or under therapy.

Thanks to two decades of fast-paced innovation in fluorescence microscopy, we can now see the smallest features of a biological sample, approaching molecular resolution. However, the capability of fluorescence microscopy to observe small changes in the chemical or physical properties of biological samples is not as well-optimised as its capability to peek into small structures. In our recent paper, “Enhancing biochemical resolution by hyper-dimensional imaging microscopy” – now available at the Biophysical Journal – we demonstrate how to recover information that permits us to make better measurements.

We can think of a fluorescence microscope as a radio broadcaster that transmits useful information over different radio channels. When we listen to one individual radio channel, we lose the information transmitted over the other frequencies. If we attempt to listen to several broadcasts at the same time, the scrambled voices will limit our understanding of the several messages that were originally broadcast. Similarly, the lasers we use to make samples shine, and the fluorescence emitted by samples, transmit information spread over the different properties of light: for example, in its colour, in the time when light is emitted (the fluorescence lifetime) and in the plane in which it vibrates (polarisation).

In our recent work, we describe theoretically and experimentally how all this information can be measured separately but simultaneously, enhancing our capability to observe biological processes. By breaking conceptual barriers and showcasing possible technological implementations with hyper-dimensional imaging microscopy, we aim to catalyse advances in several applications, spanning material sciences, industrial applications, basic and applied biomedical research, and improved sensing capabilities for medical diagnostics.
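To make the radio analogy concrete, here is a toy numerical experiment (Python; every probability table below is invented for illustration and has nothing to do with real fluorophores). Two species, A and B, are barely distinguishable in any single property of light, but classifying each photon by its joint tags – colour, arrival time and polarization together – recovers far more information than ‘listening’ to any single channel.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical fluorophores; each photon carries three binary 'tags'.
# All probability tables are assumptions, invented purely for illustration.
tables = {
    "colour":       {"A": np.array([0.60, 0.40]), "B": np.array([0.40, 0.60])},
    "lifetime":     {"A": np.array([0.70, 0.30]), "B": np.array([0.45, 0.55])},
    "polarization": {"A": np.array([0.55, 0.45]), "B": np.array([0.35, 0.65])},
}

def emit(species, n):
    """Draw n photons from one species; each tag drawn independently."""
    return {dim: rng.choice(2, size=n, p=tab[species])
            for dim, tab in tables.items()}

def accuracy(dims, n=20000):
    """Photon-by-photon Bayes classification using only the given tags."""
    correct = 0
    for truth in ("A", "B"):
        photons = emit(truth, n)
        log_a = sum(np.log(tables[d]["A"][photons[d]]) for d in dims)
        log_b = sum(np.log(tables[d]["B"][photons[d]]) for d in dims)
        guess = np.where(log_a >= log_b, "A", "B")
        correct += np.sum(guess == truth)
    return correct / (2 * n)

for dims in (["colour"], ["lifetime"], ["polarization"],
             ["colour", "lifetime", "polarization"]):
    print(dims, accuracy(dims))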

The New & Notable commentary by Prof. Suhling in the Biophysical Journal

Our open-access work in the Biophysical Journal

Snap opinion on deep-learning for super-resolution and denoising

I am personally conflicted on this topic. I have recently started to work on machine learning and deep-learning specifically. Therefore, I am keen to explore the usefulness of these technologies, and I hope they will remove bottlenecks from our assays.

My knowledge of CNNs is rather limited, even more so for SR and denoising applications. My first opinion was not very positive. After all, if you do not trust a fellow scientist guessing objects from noisy or undersampled data, why should you trust a piece of software? That also appeared to be the response of many colleagues.

After the machine learning session at FoM, I partially changed my opinion, and I am posting this brief – very naïve – opinion after a thread of messages by colleagues I read on Twitter. Conceptually, I always thought of machine learning as ‘guessing’ the image, but suddenly I realise that CNNs are perhaps learning a prior or a set of possible priors.

I have mentioned in a previous post the work by Toraldo di Francia on resolving power and information, often cited by Alberto Diaspro in talks. Di Francia, in his paper, states: “The degrees of freedom of an image formed by any real instrument are only a finite number, while those of the object are an infinite number. Several different objects may correspond to the same image. It is shown that in the case of coherent illumination a large class of objects corresponding to a given image can be found very easily. Two-point resolution is impossible unless the observer has a priori an infinite amount of information about the object.”

Are CNNs for image restoration and denoising learning the prior? If so, concerns about possible artefacts might not be put aside, but at least I could handle them a bit better conceptually. The problem would then shift to understanding which priors a network is learning and how robust these sets are to typical variations of biological samples.
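To unpack what ‘a prior’ does in restoration, here is a classical, non-CNN toy example: maximum a posteriori (MAP) denoising with an explicit smoothness prior. In this framing, a CNN would replace the hand-written prior term with one learnt from training data. A minimal sketch with invented data and arbitrary parameters, not a statement about how any specific network works:

```python
import numpy as np

def map_denoise(y, lam=2.0, n_iter=300, step=0.02):
    """MAP denoising with an explicit smoothness prior (Tikhonov, L2 on
    the gradient): argmin_x ||x - y||^2 + lam * ||grad x||^2,
    solved by gradient descent with periodic boundaries."""
    x = y.copy()
    for _ in range(n_iter):
        grad_data = 2 * (x - y)              # gradient of the data term
        lap = (-4 * x                        # discrete Laplacian (prior term)
               + np.roll(x, 1, 0) + np.roll(x, -1, 0)
               + np.roll(x, 1, 1) + np.roll(x, -1, 1))
        x -= step * (grad_data - 2 * lam * lap)
    return x

# Toy use: a smooth 'ground truth' corrupted by Gaussian noise.
rng = np.random.default_rng(3)
xx, yy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
truth = np.sin(6 * xx) * np.cos(4 * yy)
noisy = truth + rng.normal(0, 0.3, truth.shape)
denoised = map_denoise(noisy)

# The prior buys a lower error, but only because the truth really is smooth:
print(np.mean((noisy - truth) ** 2), np.mean((denoised - truth) ** 2))
```

The point of the toy: the restoration is only as trustworthy as the prior matches the sample, which is exactly the question raised above for learnt priors.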

Great talks today at FoM. Eventually, we will need tools to assess the likelihood that an image represents the ground truth, and some simple visual representation that explains what a CNN is doing to the specific image being restored, to ensure good practice. Nothing too different from other techniques, but I feel it is better to deal with these issues earlier rather than later, in order to build confidence in the community.

Related twitter thread: https://twitter.com/RetoPaul/status/1118435878270132225?s=19

The backstage story of a paper. Highs, lows, lessons to learn

For a few months now, the manuscript entitled “Multiplexed biochemical imaging reveals caspase activation patterns underlying single cell fate”, authored by Maximilian W Fries, Kalina T Haas, Suzan Ber, John Saganty, Emma K Richardson, Ashok R Venkitaraman and Alessandro Esposito, has been available as a pre-print at the bioRxiv repository. It has started its journey through the peer-review process, but here I wish to explain to students and young scientists what happened behind the scenes as, I believe, it can be instructive.

The inception of the idea | I am unsure if it will be evident from the manuscript, but this is the culmination of a huge effort that started more than a decade ago. I was about to leave the Cell Biophysics Group led by Prof. Fred Wouters after completing my PhD, on a train from Goettingen to Bonn where my partner used to work, thinking: “What should I do next? … something that, while capitalizing on my training, can make my work distinct from my mentors’ and others’? Where can I have the highest impact?” A moment that stuck in my memory.

I believe I read Santos et al. (2007), “Growth factor-induced MAPK network topology shapes Erk response determining PC-12 cell fate”, in that period – a paper that influenced me significantly. It made me think of cells as if they were computational machines, interpreting various inputs from the extra- and intra-cellular environment to trigger appropriate outputs: cell states or transitions between cell states, i.e. cellular (fate) decisions. Everyone working with microscopy knows that cells treated equally often behave differently and, therefore, I started to formulate ideas around the following questions: “How does a network of biochemical reactions encode cellular decisions? Why do genetically identical cells take different decisions when faced with a similar stimulus?” Basic principles, the science I love the most, but questions worth answering also to obtain mechanistic insights, questions also quite relevant to disease.

As a matter of fact, it is of fundamental importance to understand how cells trigger pathological states, and whether differences in biochemical networks can be used as diagnostic markers for patient stratification or targeted for therapy – concepts that I started to work on only later. Certainly, I thought back then, with my unique blend of physics, engineering, mathematics, molecular and cell biology I can do, in this area, what others might not be able to. Therefore, since 2007, my aim has been to image not just a biochemical reaction, but biochemical networks within intact living cells, while they undertake decisions.

Finding the resources, the initial success | Perhaps other students start less naïvely than me, but soon I would discover that having a good idea (let’s suppose it is a good idea) and having the right skills is only a tiny part of the job. First, aiming to coordinate my work with that of my partner (now wife), I accepted a job offer at the University of Cambridge to work with Prof. Clemens Kaminski and Dr. Virgilio Lew on an exciting but quite unrelated project. While working on the homeostasis of P. falciparum-infected red blood cells, I set up collaborations and wrote an EPSRC fellowship proposal, which was funded. Therefore, in 2009, two years after my first idea, I got the funding to work on biochemical multiplexing. With this fellowship, I was able to refine my expertise in biochemical multiplexing, permitting me to build advanced technologies for fluorescence sensing such as confocal spectro-polarimetry and fast SPAD-based spectral FLIM. This EPSRC fellowship, together with my expertise and vision, and the benefit of having already established my name in the community thanks to the work I had done with, and the support of, Prof. Fred Wouters and Prof. Hans Gerritsen, was an excellent platform that permitted me to make the next jump and accept a senior position at the MRC Cancer Unit.

Finding the resources, the struggle | Rather than focusing just on technology, I then broadened my research into a research program that would require theoretical developments, engineering of new pairs of fluorescent proteins to achieve multiplexing, coding and, of course, biological applications. I recognize that expanding my research before securing the appropriate resources was a significant mistake, or at least a huge risk. Working within Prof. Ashok Venkitaraman’s group, I started to write ambitious EU grants. Some of them would receive excellent feedback (14 out of 15 points, first or second among those not funded…) but fell short of being funded. Hans once told me that “at this level of competition and quality, often it is just noise that decides the final outcome”. Probably true, even funny if you knew we worked together on photon statistics (‘noise’). But great feedback does not replace funds, and thus I wrote an ERC grant.

I did not get ERC funding but, once again, the ERC is very competitive and I was not sufficiently experienced, thus no drama. However, I started to notice one big issue. Physicists would judge my physics as not great physics; biologists would judge my biology as not great biology. Some colleagues would find my objectives impossible to reach. This is what I have since discovered to be the challenge of doing multi-disciplinary research (well, technically it is called trans-disciplinary research, but that is the topic for another post). When your proposal is judged both trivial and impossible, you might have an issue that is not necessarily related only to your science. One referee commented that “A number of groups have being trying to improve the technologies for many years and although some of them have an enormous experience they are not anywhere close to where he intends to be in five years”. Around the same time, a renowned scientist commented on the description of my work “It is impossible”, but then added, in a wonderfully supportive and very appreciated manner, “but if there is someone that could do it, it is Alessandro” – well, if funding proposals could be judged with the human touch that people have when speaking in person, knowing and respecting each other’s work… I’ll cut an even longer story short, but with significantly fewer resources than I was asking for, struggling to increase my funding, and with the financial backing of Prof. Ashok Venkitaraman, we did everything we wanted to do in… five years!

The great technical success (NyxBits and NyxSense) | I wished to tell you a story of great success in a broader sense, but that has still to be written… if it ever will be. I did waste a significant amount of time looking for resources in what I found to be an amazingly inefficient system. However, from the end of my EPSRC fellowship until this year (~6 years), we have done a huge amount of work to realize what was thought not to be possible:

  • Molecular Biology. I wished to develop two platforms, one based on spectrally multiplexed time-resolved anisotropy (open for collaborations here!) and one based on spectral FLIM, to manage the cross-talk between multiple FRET pairs and make biochemical multiplexing possible. With the limited resources I had, and initial help from Bryn Hardwick, Meredith Roberts-Thomson and David Perera in Ashok’s lab, we kick-started the project. The sheer amount of work started to overwhelm me. Occupied with grant writing, training in a new field, engineering, software development and mathematics, I could not push this forward as fast as I wished. Great help then arrived from Max Fries, who spent 6 months with me as a master’s student. Once he left, I was short of resources again and, with the FRET pairs misbehaving and exhibiting aggregation or spurious signals, we abandoned one of the two sensing platforms. Emma Richardson then joined me as a Research Assistant dedicated to cloning and testing FRET pairs, and then Max came back to work with me for another four years as a PhD student. Committed and skilled, he tested tens and tens of FRET pairs. The work was a huge task, yet it amounts to just a couple of paragraphs in the manuscript. We even have better pairs than those we used in this work, all described in the Supporting Information. Indeed, under the pressure of publishing in high-impact journals, I decided (probably another mistake of mine) to progress to applications, settling for what we recently baptized as NyxBits: mTagBFP, sREACh, mAmetrine, msCP576, mKeima and tdNirFP, so as to focus on biological applications. NyxBits and NyxSense? Well, I have explained the choice of names elsewhere.
  • Mathematics and software. There is something I could not really write so explicitly in the manuscript, something appreciated only by experts in the field. There is also something I find impossible to communicate to review panels. As a testimony to this, I report a comment that was once relayed to me, something like: “Why do we need to offer him a career? Once he has built the instruments, we really need one person just clicking a button, no?” (I am sure I remember it much worse than it was. Maybe.) The integration of technologies is so new and challenging that we had to formulate new theoretical frameworks and write all-new software, including how to acquire data, the data format, and the analysis. Also, some aspects of our work are difficult to appreciate. Let me tell you about another small event that would push me in a particular direction. I really enjoy the conference Focus on Microscopy, even when criticized. As I was presenting new ideas, a colleague – respectfully – questioned whether multiplexed imaging could measure several FRET pairs at the same time. This stimulated me to resume studying the Fisher information content of biochemical imaging. What is the biochemical resolution in microscopy? Can we enhance it? After years of thinking about this topic, in 2013 I cracked the problem and published the mathematics in PLOS ONE, where I formulated what I called ‘the photon-partitioning theorem’. Then, with the increasing financial backing of my Director, Kalina Haas joined my growing team. Kalina implemented unmixing algorithms and complex data analysis pipelines. Max and Kalina then became my dream-team to progress the project to the shape you can read today.
  • Technology. I mentioned some earlier technology platforms that were designed for biochemical multiplexing. In my recent, first release of manuscripts on bioRxiv, we also published a full implementation of Hyper-Dimensional Imaging Microscopy (HDIM), with which we backed the photon-partitioning theorem with experimental evidence. We have done much more in that direction but, when we started biological applications, we realized the need for faster FLIM systems. Unable to wait for commercial solutions or to gain the benefits of other prototypes we had developed, I decided to build my own fast multiplexed electronics. This development was fostered by the negative criticism of a referee. During a PNAS submission of our spectral FLIM system, a referee mentioned we could do the same utilizing hybrid PMTs. I disagreed, as achieving 64-channel spectral FLIM with the capability to run at hundreds of millions of photon counts per second is altogether a very different application; however, there is merit in most referees’ criticisms, even the most negative ones. Only then did I realize that PMTs are now very fast and the bottleneck was just the electronics. Therefore, I got in touch with Surface Concept, who supported me wonderfully and sold me one of their multi-hit TDC platforms. After several months of software development, we were then capable of running FLIM measurements with the quality of TCSPC and the speed of FD-FLIM. As usual, I presented this work at FoM, where it was greatly received by colleagues and companies, but we did not publish the imaging platform as we were fully committed to pursuing biological applications.
  • The biology. The bottleneck of our experiments was, and still is, data analysis and, with tens of experiments and thousands of biochemical traces to be painfully and manually curated, we moved ahead very slowly, but working hard. Max, Kalina and myself, mostly, endured years of hard work, the occasional worry when something stopped working, and the excitement of seeing, for the first time, things that others could not see. In this manuscript, we reveal the extent of non-genetic heterogeneity that biochemical networks can exhibit and that eventually results in different cellular decisions. Here, we focused on multiplexing simple biosensors for caspases, as we aimed to de-risk a very ambitious project. We also decided to work with HeLa cells, again for the same reason. Despite the simplicity of the model system under study, we realized how complex and heterogeneous the response of biochemical pathways is, with cross-talk between enzymes, signalling pathways and cellular metabolism. All of this is, for me, fascinating, and it shows that whenever we do ensemble measurements, we really see only the behaviour of the average cell. It is then important to understand that the ‘average cell’, most of the time, does not really exist. If we are lucky, the bulk of the population responds with one phenotype and the measured ‘average cell’ will indeed represent the ‘most frequent cell’. However, in instances where significant subpopulations behave in distinct ways, we would not just miss important information: the model inferred from the ‘average cell’ would simply be the wrong model of a non-existing cell (see the sketch after this list). This is why it would be important to know, for any assay, whether the sample behaves synchronously with a stimulus and homogeneously. In this sense, single-cell biochemistry could not just bring an additional layer of information, but inform us whether the observations we obtain on a given model system with ensemble measurements are reliable.
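The ‘average cell’ problem is easy to demonstrate numerically. A minimal sketch in Python, with entirely hypothetical kinetics: half the cells switch a reporter on with sigmoidal dynamics, half never respond, and the ensemble average describes no cell at all.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)  # time after a stimulus (arbitrary units)

# Hypothetical single-cell responses: 50% of cells activate a reporter
# with fast sigmoidal kinetics; the other 50% do not respond at all.
n_cells = 200
responders = rng.random(n_cells) < 0.5
onset = rng.normal(4, 1, n_cells)            # cell-to-cell variability
traces = np.where(responders[:, None],
                  1 / (1 + np.exp(-(t - onset[:, None]) * 3)),
                  0.0)

average_cell = traces.mean(axis=0)
# The ensemble average rises slowly to ~0.5, a kinetics and amplitude
# that no single cell exhibits: the 'average cell' does not exist.
print(average_cell.max(), traces[responders].max(axis=1).mean())
```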

Enduring the struggle | I hope you did not mind that I spoke so positively about my own work. If you know me, you also know I am not so self-centred. However, I wished to let younger scientists know what there might be between a ‘good idea’ and its realization, passing through frequent failures and some successes. Probably, one of the most precious qualities of a scientist is resilience. We need thick skin to confront the constant failures that lead us to discoveries, the constant struggles in getting resources, and eventually publishing good work in a highly competitive environment. Turning a negative event into something positive is part of this process. Understanding why one experiment did not work enables us to troubleshoot; understanding why an experiment falsified our hypothesis, to build new and better models; understanding why funding was not awarded or a manuscript was not published, to improve our scientific proposals and reporting. Of course, this is easier said than done.

The work we presented on bioRxiv is not the end of the story. The work, wonderfully received at conferences, is still not peer-reviewed. Will colleagues appreciate and understand the vision of our work, its possible impact and the sheer amount of work we had to do? Were we able to communicate properly? And even if we were, we still have a long way in front of us. My dream is to establish a single-cell systems biology of cell fate. A huge amount of work, from maths to biology, from biotechnology to physics, is still needed to be able to understand why cells do what they do, how physiological states are maintained and how pathological states emerge.

[Open hardware] A safe laser by-pass

Well, I remember when I started in this business: a beam stop was made from a recycled block of lead, and reflections were stopped with cardboard boxes 😉 Brown boxes – black cardboard catches fire, of course (tell that to my undergrad self). Not any longer, of course!

About ten years ago, I started the procurement and development of my first two-photon microscope. For the first time, I was directly responsible for laser safety, and I had to take decisions about how to build a system that was safe for a user facility in a biomedical research institute. As I was coupling commercially sourced systems (Leica SP5, Chameleon Vision 2 and Pulse Select) and I was not planning much customization of the excitation path of this instrument (I heavily develop assays and detection), I opted to fully enclose the laser in lens tubes. The resulting system is safe, stable, and no more difficult to align than other enclosures.

I think that enclosures around the complete table might make sense in many instances, particularly when compartmentalized into sub-sections, but this is the system that worked best for me at the time. One solution I wish to share is a bypass for the pulse-picker we had used to develop spectrally resolved FLIM utilizing smart SPAD arrays (detectors that integrate photon-counting electronics).

As I start planning the replacement of this system, I wished to share this design, in case some of you might find it useful. In the image on the left, you can see the Ti:Sapphire at the top, the pulse-picker on the right and the first enclosure by Leica used to steer the beam to their in-coupling optics (bottom right).

In the middle is the laser bypass we utilize to direct the laser through the pulse-picker or around it.

In the image below, you see a close-up photo of the bypass. The black box with the rectangular aluminium cover is the Leica spectral flattener, used to reduce the power of the Chameleon Vision at its peak wavelength. One of the few customizations I needed here was simply a hole in a Thorlabs SM2 lens tube to accommodate this filter. This is screwed into a C4W-CC cube that can host a movable turning mirror with high reproducibility. The alignment of the microscope without the pulse-picker is done with the pair of mirrors provided by Leica. The alignment of the pulse-picker is done with the kinematic mirrors visible on the left (M1 and M2). I placed a light-block behind them, just in case one should come loose, and to block the small amount of light transmitted through them. A kinematic cube is used to host an ultrafast beam sampler by Newport, which directs a small fraction of light to the Thorlabs PIN diode I use to feed the electronics of the pulse-picker. In front of the PIN diode, I have an xy-translating cage element. An empty four-way cube is used to allow the laser beam to pass from top to bottom (bypassed) or from left to right (pulse-picker coupled). The aluminium block tagged as L1 is just a cover for the C4W-CC when empty.

[photo]

At the output of the pulse-picker, you see the mirror image of this bypass (on the right) and the two steering mirrors by Leica (the cylindrical towers). On the far right of the picture are the in-coupling optics by Leica, preceded by two diagnostic ports.

[photo]

Below, you can see a close-up picture of this part of the coupling. Because of the layout, I needed to add one extra mirror (top left) and, aiming to protect the in-coupling optics from accidental damage by users (who sit towards the top of the image), I added a light barrier.

Both diagnostic ports are based on a 4-way kinematic cube from Thorlabs hosting Newport beam samplers. The first port is used to sample the pulses after the pulse-picker and to feed our FLIM electronics. The second has two purposes. The first is coarse alignment of the system: I have two irises in the system that are aligned when the laser is (roughly) aligned to the in-coupling optics by Leica.

I usually remove a cover at the exit of this diagnostic port and use a fluorescent card to verify alignment, but in this picture you see the fiber coupling to a spectrograph we occasionally use to diagnose faults of the laser.

 

[photo]

The alignment is simpler than it seems. First, we start with a microscope that is fully aligned without the pulse-picker, as per normal operations. Then, when we need the pulse-picker, we insert the two turning mirrors (L1 and R1). We do this with the laser off and with the pulse-picker crystal retracted (coarse alignment) or protected by an alignment card (fine alignment). M1 and M2 are then used to align the beam with the crystal. Then we align the PIN diode and proceed with the fine alignment of the pulse-picker cavity. Once this is done, we align the cavity with the microscope utilizing M4 and M5. For coarse alignment, the signals from the two diagnostic ports are very useful until some signal is picked up on the microscope monitor, after which the final fine-tuning of all the optics can proceed.

Be aware: alignment of Class 4 lasers can be dangerous. Therefore, do your own risk assessments and think carefully about the logistics of your system. Now that I am starting to consider the redevelopment of the system, I thought I would share these notes with you, hoping that they could be of some use.