Why Deep Learning and why in Dresden?
The Dresden GPU Center of Excellence (GCOE), the Max Planck Institute of Molecular Cell Biology and Genetics (MPI-CBG), the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), and the Center for Systems Biology Dresden (CSBD) form a unique creative melting pot where performance meets data analytics for the sake of scientific progress. In this atmosphere, people with experience in image analysis in the life and natural sciences work closely with performance experts who accelerate their applications to yield a faster turnaround of scientific results. This environment, together with the observation that methods like Deep Learning have become hallmarks of modern data analysis, motivated us to organize a workshop that brings the fundamentals of Deep Learning to an ever-growing community. This page introduces the people behind this workshop.
Since 2002, the Myers lab has focused on analyzing and extracting information from images obtained by various forms of microscopy. We believe that such data will reveal more about the function of the entities encoded in the genome than any other approach and will eventually become a prevailing paradigm of investigation, much as sequence-based discovery is today. In support of this concept, the group also develops its own customized microscopes.
Gene acts as the local patron of this workshop. His lab is increasingly adopting Deep Learning algorithms and approaches for analyzing data from developing living organisms.
The overarching goal of research in my lab is to push the boundary of what image analysis and machine learning can do for quantifying biological data. Projects we are currently pursuing aim at understanding, e.g., apical constriction during gastrulation in C. elegans, mitotic cluster formation in Drosophila and Tribolium, or morphological tissue changes in the green alga Volvox. The common denominator of such projects is the indisputable necessity to analyze large amounts of light microscopy data without requiring prohibitive amounts of manual data curation and processing – often the single major bottleneck.
Guido Juckeland (HZDR, GCOE)
The Computational Science Group at Helmholtz-Zentrum Dresden-Rossendorf (HZDR) is responsible for helping scientists with end-to-end IT workflows for scientific computing. This ranges from software co-design to developing data management concepts for on-campus experiments. Many of these solutions benefit from many-core processors, and HZDR has established itself as the GPU knowledge hub within the Helmholtz Federation. HZDR operates a considerable number of experimental facilities that generate very large data streams, and even larger amounts of experiment monitoring data. To offer better initial analysis of these data streams, HZDR has adopted deep learning, e.g. for improved detection of impending optics failures in the beam profiles (images) of high-power lasers.
Since 2012, I have been supporting and advising scientists at the MPI-CBG on questions related to code performance, parallelisation on multi-core and many-core architectures as well as GPGPU, effective use of HPC systems, and the analysis of big(ger) data. As a service unit, we see a rising need for expertise in using modern deep learning frameworks and in understanding their potential, both in terms of performance on modern computer architectures and in terms of their scientific yield. To that end, I am organising this workshop.