cryoEDU: An online curriculum and software platform for hands-on learning in single-particle cryo-EM and cryo-ET
In brief:
There are unmet needs in our growing community:
- Curriculum and workshops that teach data processing pipelines for cryoEM/ET
- Standardized and accessible ways to process cryoEM/ET data, and
- Insider perspectives on the analysis of particular cryoEM/ET datasets
So, our cryoEDU platform will provide:
- Self-paced online modular curricula,
- Hands-on data analysis in a cloud desktop environment, and
- Community engagement to facilitate information exchange related to cryoEM/ET data analysis.
Purpose
Single-particle cryogenic electron microscopy (cryoEM) has surged in popularity to become a mainstream structural biology technique. Because rapid plunge-freezing preserves samples in a near-native state at cryogenic temperatures, and because sample preparation is less burdensome than in crystallography, cryoEM makes it possible to solve challenging protein structures, such as membrane proteins and heterogeneous, flexible macromolecular complexes. As cryoEM expands, the neighboring field of cryogenic electron tomography (cryoET) continues to achieve milestones, paving its own way toward becoming a core technology for structural and cell biologists performing in vitro and in vivo characterizations.
Despite the excitement surrounding these techniques, especially after cryoEM won the 2017 Nobel Prize in Chemistry, the barrier to entry is steep, requiring users to understand concepts that range from biochemistry to the physics of electron microscopes to computational image-alignment algorithms. Many resources, like CryoEM101 and EM-learning, provide excellent introductions to the concepts and theories behind cryoEM/ET, but data processing know-how is often transferred locally, passed down within established labs. As a result, there is a significant gap in training materials and resources that help users understand, analyze, interpret, and validate cryoEM data and structure determination.
So if you’ve found yourself wondering why particular data processing decisions were made, and how to make those calls for your own dataset, you’ve come to the right place! CryoEDU will train users in cryoEM/ET data interpretation and analysis using practical examples, including known pitfalls, to provide an interactive data processing experience. Through an integrated online platform, you can access self-paced materials, interactive data analysis, and job submission in a cloud computing environment, alongside engagement with the cryoEM community. Our chapters and modules will cover the fundamentals of data processing, and access to a cloud-based RELION GUI will give you hands-on experience with real data. In a “choose your own adventure” style, you will work through datasets and figure out best practices in data processing. In addition, we will provide expert testimonials and connect you with the growing cryoEM/ET community.
Learning goals and chapters
Go to our Curriculum page for more details on each section and to sign up for the course. Here, we’ll outline the four core steps in data processing that we’ll focus on in each course chapter.
Pre-processing refers to the steps required to reduce raw cryoEM/ET movies to selected particles for 2D/3D analysis. At this step, users subjectively assess the data in several areas, including sample quality and ice thickness, to make sure the data are worth using. For each image in cryoEM/ET, the contrast transfer function (CTF) must be estimated and carried forward through the remaining steps of analysis to correct for image aberrations, unless in-focus data were collected using phase plates. The final pre-processing step is identifying particles for further analysis using manual or automatic particle picking approaches.
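To make the CTF concrete, here is a minimal sketch of the standard weak-phase CTF model as a 1D curve. The microscope parameters below (300 kV, 1.5 µm underfocus, 2.7 mm spherical aberration, 7% amplitude contrast) are hypothetical placeholders for illustration, not values tied to any course dataset.

```python
import numpy as np

# Hypothetical microscope parameters (placeholders, not course values):
wavelength = 0.0197    # electron wavelength at 300 kV, in Angstroms
defocus = 15000.0      # 1.5 um underfocus, in Angstroms
cs = 2.7e7             # spherical aberration, 2.7 mm expressed in Angstroms
amp_contrast = 0.07    # amplitude contrast fraction

def ctf_1d(k):
    """Standard weak-phase CTF vs. spatial frequency k (1/Angstrom)."""
    # Phase error from defocus and spherical aberration
    gamma = np.pi * wavelength * defocus * k**2 \
        - 0.5 * np.pi * cs * wavelength**3 * k**4
    return -(np.sqrt(1 - amp_contrast**2) * np.sin(gamma)
             + amp_contrast * np.cos(gamma))

k = np.linspace(0.0, 0.5, 1000)   # out to Nyquist for a 1 A pixel
ctf = ctf_1d(k)                   # oscillates between -1 and +1
```

The zero crossings of this curve are what CTF-estimation programs fit; information at those frequencies is lost in any single image, which is one reason defocus is varied across a dataset.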
At the completion of modules in Chapter 1, learners will understand 1) how and why motion correction occurs for cryoEM/ET micrographs; 2) how sample quality is judged via visual inspection and quantitative measures such as CTF resolution limits; 3) how to align tilt series using gold fiducials; 4) how to judge particle picking approaches and subsequent extracted particle stacks.
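As a rough illustration of how motion correction works, whole-frame drift between movie frames can be estimated from the peak of an FFT-based cross-correlation. Production software refines this per patch and at sub-pixel precision; this is only a toy sketch on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_shift(ref, frame):
    """Return the integer (dy, dx) to apply to `frame` to realign it with
    `ref`, found at the peak of the FFT-based cross-correlation."""
    xc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))).real
    peak = np.unravel_index(np.argmax(xc), xc.shape)
    # Peaks past the midpoint wrap around to negative shifts
    return tuple(int(p) if p < s // 2 else int(p) - s
                 for p, s in zip(peak, xc.shape))

ref = rng.normal(size=(64, 64))                    # stand-in for a reference frame
frame = np.roll(ref, shift=(3, -2), axis=(0, 1))   # same frame drifted by (3, -2) px
dy, dx = estimate_shift(ref, frame)                # recovers (-3, 2)
```

Averaging frames after applying these shifts restores the contrast that beam-induced motion would otherwise blur away.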
Data curation involves visual inspection of micrographs and picked particles, along with cleaning particle stacks via 2D classification. During data curation, users subjectively determine whether the dataset is amenable to higher-resolution analysis by selecting micrographs and particles for subsequent 3D analysis.
At the completion of modules in Chapter 2, learners will understand 1) features of micrographs and tilt series images amenable for downstream analysis; 2) how to pick particles from micrographs/tomograms; 3) how to identify 2D averages for subsequent 3D analysis.
After identifying particles amenable to 3D analysis, processing pipelines employ a combination of 3D classification, refinement, and aberration correction to identify homogeneous populations of particles within the dataset. During 3D classification, users must choose the number of classes, whether to employ masks, and the signal weighting factor for optimal classification. Once a homogeneous class is selected, 3D refinement is used to obtain higher-resolution structures. Particle polishing and CTF aberration refinement are then used to push the data toward higher resolution. Importantly, changes in particle orientation distribution or particle quality (e.g., including particles from ‘sub-optimal’ 2D classes) can have pathological effects on the 3D reconstruction.
At the completion of modules in Chapter 3, learners will understand 1) approaches for 3D classification in cryoEM/ET; 2) how to identify and apply symmetry to reconstructions; 3) which CTF aberration refinement parameters to use; 4) how to identify preferred orientation datasets in cryoEM; 5) how the missing wedge impacts sub-tomogram averages.
Unlike X-ray crystallography, cryoEM/ET lacks a rigorous statistical test, such as R-free, to determine whether a given 2D average or 3D reconstruction is ‘correct.’ Instead, cryoEM/ET relies on a correlation-based measure, the Fourier Shell Correlation (FSC), which quantifies the correspondence between two images or reconstructions in Fourier space. While the ‘gold-standard’ FSC approach provides valid resolution estimates when used appropriately, there are many ways for users to arrive at incorrect reconstructions through seemingly innocuous mistakes.
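The FSC described above can be sketched in a few lines: correlate the two half-map Fourier transforms shell by shell and normalize by the power in each shell. Shell binning and normalization details vary between packages, so treat this as a minimal illustration, not a reference implementation.

```python
import numpy as np

def fsc(half1, half2, n_shells=16):
    """Fourier Shell Correlation between two reconstructions on the same
    grid: per-shell normalized cross-correlation in Fourier space."""
    f1, f2 = np.fft.fftn(half1), np.fft.fftn(half2)
    grids = np.meshgrid(*[np.fft.fftfreq(s) for s in half1.shape], indexing="ij")
    radius = np.sqrt(sum(g**2 for g in grids))          # cycles per voxel
    shells = np.minimum((radius / 0.5 * n_shells).astype(int), n_shells - 1)
    cross = np.bincount(shells.ravel(), minlength=n_shells,
                        weights=(f1 * np.conj(f2)).real.ravel())
    p1 = np.bincount(shells.ravel(), weights=(np.abs(f1)**2).ravel(),
                     minlength=n_shells)
    p2 = np.bincount(shells.ravel(), weights=(np.abs(f2)**2).ravel(),
                     minlength=n_shells)
    return cross / np.sqrt(p1 * p2)

rng = np.random.default_rng(0)
vol = rng.normal(size=(32, 32, 32))   # toy "half-map" of pure noise
curve = fsc(vol, vol)                 # identical inputs: FSC = 1 in every shell
```

With two genuinely independent half-maps, the curve starts near 1 at low resolution and falls toward 0 where noise dominates; the gold-standard resolution is read off where it crosses the chosen threshold (e.g., 0.143).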
At the completion of modules in Chapter 4, learners will understand 1) how to interpret FSC curves for well-behaved and pathological reconstructions; 2) what sharpening is and why it is applied to 3D reconstructions; 3) how to visually assess the resolution of sharpened 3D reconstructions.
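To illustrate the sharpening mentioned in goal 2, here is a sketch of global B-factor sharpening: Fourier amplitudes are re-weighted by the Debye-Waller factor exp(-B k^2 / 4), where a negative B boosts high-resolution features. The B value and the random-noise test volume below are hypothetical placeholders.

```python
import numpy as np

def sharpen(vol, b_factor, pixel_size=1.0):
    """Re-weight a 3D map's Fourier amplitudes by exp(-B*k^2/4);
    a negative B amplifies high-resolution features (sharpening)."""
    f = np.fft.fftn(vol)
    grids = np.meshgrid(*[np.fft.fftfreq(s, d=pixel_size) for s in vol.shape],
                        indexing="ij")
    k2 = sum(g**2 for g in grids)            # squared spatial frequency, 1/A^2
    return np.fft.ifftn(f * np.exp(-b_factor * k2 / 4.0)).real

rng = np.random.default_rng(1)
vol = rng.normal(size=(32, 32, 32))          # toy map of random noise
sharp = sharpen(vol, b_factor=-80.0)         # -80 A^2 is a hypothetical choice
```

Because the weight at k = 0 is 1, the map's overall mean is untouched while high-frequency detail (and high-frequency noise) is amplified, which is why sharpening is usually paired with a figure-of-merit or FSC-based filter in practice.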