Mastering CT: Advanced Techniques in Practice

2. Automating Workflows for X-ray CT


This is a written summary of a live webinar presented on September 17, 2025. The recording and resources are available on the recording page.

Presented by:

Angela, Director of X-ray Imaging, Rigaku
Ted Huang, CT Applications Scientist, Rigaku (co-presenter)
Katelynn Stuchlik, X-ray Imaging Account Manager, Rigaku

Webinar summary

The presenters of this webinar, the second in the series Mastering CT: Advanced Techniques in Practice, emphasized that the heart of automation in CT lies in repeatability. If scans of the same sample are collected under inconsistent conditions, automated analysis tools can fail or produce misleading results. By ensuring consistency—through standardized mounting, labeling, and scanning parameters—labs can save time, reduce user bias, and achieve higher throughput while maintaining reliable data across projects.

The discussion began with practical aspects of automating the scanning process. Angela highlighted the importance of labeling, noting that when labs work with dozens or even hundreds of samples, embedded identifiers or persistent marks ensure that data can always be traced back to the correct specimen. Sample mounting was another major theme: stability during scanning is critical, and custom 3D-printed holders or simple improvised mounts can minimize movement and make workflows more efficient. Audience members were reminded that orientation matters, with certain geometries—such as mounting batteries vertically—providing better imaging results. Even in labs without automated changers, creative approaches to stacking or arranging samples can approximate automation and reduce hands-on time.

The presenters then moved into the challenge of defining scan parameters. Rather than prescribing a one-size-fits-all method, they explained how sample size, composition, and scientific goals should guide decisions on resolution, X-ray energy, filters, and scan length. Rules of thumb, such as voxel size scaling with the field of view, were paired with practical examples involving rock cores, batteries, and biological samples. They also showed how software tools like NIST’s XCOM or MuCalc can help determine optimal energy ranges. Importantly, the webinar clarified tradeoffs between scan quality and efficiency: higher energy or longer scans can improve signal but may reduce contrast or cause blurring. Participants were encouraged to experiment with subsets of samples to establish scan “recipes” that could then be reused consistently across a full batch.
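
The voxel-size rule of thumb can be made concrete with a little arithmetic: achievable voxel size is roughly the field of view divided by the width of the detector in pixels. A minimal sketch, assuming a hypothetical 2000-pixel-wide detector (substitute your instrument's actual geometry):

def estimate_voxel_size_um(fov_mm: float, detector_pixels: int = 2000) -> float:
    """Rule of thumb: voxel size scales with the field of view.

    Roughly fov / detector_pixels for a detector `detector_pixels` wide.
    The 2000-pixel default is an assumed value for illustration only.
    """
    return fov_mm * 1000.0 / detector_pixels  # convert mm to micrometers

# Example: a 10 mm field of view on the assumed detector gives ~5 um
# voxels, so features much finer than that call for a smaller field of
# view or a higher-magnification follow-up scan.
print(f"{estimate_voxel_size_um(10.0):.1f} um")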

Once data are collected, automation continues into analysis. The presenters stressed that processing and quantifying CT data can be even more time-consuming than scanning itself, making automation essential. By mapping out each step—data import, corrections, filtering, segmentation, measurement, and even simulation—users can capture their workflows in macros or job scripts within software like Dragonfly or VG Studio Max. Examples showed how a simple macro could automate porosity analysis on rock cores or surface determination for mechanical testing samples, turning hours of manual work into repeatable tasks executed in minutes. These programs also allow for customization through Python scripting or visual programming, giving users flexibility to adapt automation to their specific needs.
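
A minimal stand-in for such a macro, written as self-contained Python with NumPy and scikit-image rather than the Dragonfly or VG Studio Max APIs (the Otsu thresholding matches the segmentation approach mentioned in the Q&A below; the synthetic volume and the assumption that pores image darker than the matrix are illustrative):

import numpy as np
from skimage.filters import threshold_otsu

def porosity_fraction(volume: np.ndarray) -> float:
    """Estimate porosity as the fraction of voxels below an Otsu threshold.

    Assumes pores image darker than the solid matrix, which is typical
    for rock cores but should be verified per sample.
    """
    t = threshold_otsu(volume)
    return float(np.mean(volume < t))

# Synthetic volume standing in for a reconstructed scan; a real macro
# would load the reconstruction (e.g., a stack of TIFF slices) instead.
rng = np.random.default_rng(0)
volume = rng.normal(loc=100.0, scale=10.0, size=(64, 64, 64))
volume[:10] -= 60.0  # inject a dark "pore" region for the demo
print(f"Porosity: {porosity_fraction(volume):.1%}")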

The Q&A session underscored how diverse the applications are, with questions ranging from biological samples and dual-energy imaging to submicron resolution challenges. The presenters consistently returned to the same principle: know your sample, define your goals, and then match the scan and analysis strategy to those needs. While automation tools are powerful, they only work as well as the consistency and thoughtfulness built into the setup.

Overall, the webinar offered a clear and practical message: automation in X-ray CT is not about replacing human expertise but about removing variability and speeding up routine tasks. By standardizing labeling, mounting, scanning, and analysis, researchers can focus their energy on interpretation and discovery rather than repetitive setup. For non-experts, the session showed that the first steps toward automation are accessible, and that even modest changes, like consistent labeling or adopting macros, can transform workflows into something far more efficient and reliable.

Key questions answered in the webinar

Why is repeatability so important when automating CT workflows?
Repeatability ensures that scans of the same type of sample produce comparable results. If scan conditions vary—such as exposure time, resolution, or energy—automated routines can fail or produce inconsistent measurements. By standardizing parameters, you save time, improve throughput, reduce user bias, and guarantee data consistency across batches.

What is the best way to label samples for traceability?
Always label samples clearly and, if possible, integrate the label so it appears in the scan. Options include embossing identifiers into tubes, adding 3D-printed labels, or using a silver pen for visibility in scans. Photographs of each sample and detailed metadata are equally important, ensuring that scans remain traceable even months or years later.

How should samples be mounted for automated scanning?
The key is preventing sample movement. Custom holders, often 3D-printed, can keep specimens stable and in the right orientation. For example, batteries scan best when mounted vertically to maintain a uniform X-ray path length. Even without commercial automatic changers, labs can improvise by stacking samples in tubes or containers, taking advantage of the instrument’s vertical travel to simulate automation.

How do I choose scan parameters such as resolution, energy, and scan time?
Start by considering the scientific question you want to answer and the size and composition of your sample. Resolution should be high enough to capture the features of interest, which may require multiple scans at different magnifications. X-ray energy (kV) depends on material density and atomic number; lower energies give better contrast for light elements, while heavier materials require higher energies and sometimes filters. Scan time involves a tradeoff: longer scans improve signal-to-noise ratio but cost more time and may cause blurring if power is too high.
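
The signal-to-noise tradeoff follows a simple law worth keeping in mind: when photon shot noise dominates (Poisson statistics, a standard assumption not stated in the webinar), SNR grows only with the square root of exposure, so doubling scan time buys roughly a 41% improvement. A quick sketch of that arithmetic:

import math

def snr_gain(time_factor: float) -> float:
    """Relative SNR improvement from scaling total exposure time,
    assuming shot-noise-limited (Poisson) statistics: SNR ~ sqrt(counts)."""
    return math.sqrt(time_factor)

for factor in (1, 2, 4):
    print(f"{factor}x scan time -> {snr_gain(factor):.2f}x SNR")
# 1x -> 1.00x, 2x -> 1.41x, 4x -> 2.00x: diminishing returns, which is
# why tuning a recipe on a subset of samples beats defaulting to the
# longest possible scan.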

What tools help identify the optimal X-ray energy?
The NIST XCOM database and MuCalc (an Excel-based tool) are recommended for calculating attenuation and plotting absorption curves. These help you choose energies that maximize contrast between sample components. Filters, such as aluminum for lighter materials or tin for heavier ones, can also fine-tune energy ranges.
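
The underlying calculation is Beer-Lambert attenuation, I/I0 = exp(-(mu/rho) * rho * t). Below is a small sketch comparing the transmission of two phases; the mass attenuation coefficients are placeholders to be replaced with values pulled from XCOM or MuCalc for your actual materials and candidate energies:

import math

def transmission(mu_over_rho_cm2_g: float, density_g_cm3: float,
                 thickness_cm: float) -> float:
    """Beer-Lambert transmission I/I0 = exp(-(mu/rho) * rho * t)."""
    return math.exp(-mu_over_rho_cm2_g * density_g_cm3 * thickness_cm)

# Placeholder coefficients -- look up real values in NIST XCOM for each
# candidate energy; the goal is an energy where the two phases differ
# most while enough signal still penetrates the full sample thickness.
matrix = transmission(mu_over_rho_cm2_g=0.25, density_g_cm3=2.65, thickness_cm=1.0)
fluid = transmission(mu_over_rho_cm2_g=0.20, density_g_cm3=1.00, thickness_cm=1.0)
print(f"Transmission, matrix: {matrix:.2f}; pore fluid: {fluid:.2f}")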

Why automate data analysis, and how do macros help?
Automation speeds up processing, reduces variation between operators, and ensures consistency across sample sets. Most CT analysis software—including Dragonfly and VG Studio Max—lets you record macros of your workflow. These macros can then be replayed or used in batch mode on new data sets, cutting hours of manual work down to minutes while maintaining reproducibility.
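
The batch pattern is the same regardless of package: capture the per-dataset steps once, then replay them over a folder of scans and collect the results. A generic sketch (process_volume and the scans/ folder layout are hypothetical stand-ins for a recorded macro, not any package's actual API):

from pathlib import Path
import csv

def process_volume(path: Path) -> dict:
    """Hypothetical stand-in for a recorded macro: load one dataset,
    replay the saved recipe (corrections, filtering, segmentation),
    and return the measurements of interest."""
    # ... replay the recorded steps here ...
    return {"dataset": path.name, "porosity": None}

# One summary row per dataset keeps the whole batch traceable.
results = [process_volume(p) for p in sorted(Path("scans").glob("*"))]
with open("batch_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["dataset", "porosity"])
    writer.writeheader()
    writer.writerows(results)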

What steps make up a typical automated analysis workflow?
Typical steps include importing data, applying corrections (like intensity normalization), filtering to reduce noise, segmenting phases (often with Otsu binarization or machine learning), and then extracting quantitative measurements such as porosity, grain size, or crack locations. Documenting the workflow step-by-step ensures you don’t overlook essential operations when automating.
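
Those steps chain together naturally in code. A compact sketch of the sequence (normalize, filter, segment, label, measure) using NumPy, SciPy, and scikit-image; the synthetic volume stands in for imported scan data:

import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

rng = np.random.default_rng(1)
volume = rng.normal(100.0, 10.0, size=(64, 64, 64))  # stand-in for imported data
volume[20:30, 20:30, 20:30] += 80.0                  # one bright "grain" for the demo

norm = (volume - volume.min()) / (volume.max() - volume.min())  # intensity normalization
smooth = ndimage.median_filter(norm, size=3)                    # noise filtering
grains = smooth > threshold_otsu(smooth)                        # Otsu segmentation
labels, n = ndimage.label(grains)                               # connected components
sizes = ndimage.sum(grains, labels, index=np.arange(1, n + 1))  # voxels per grain
print(f"{n} grain(s); mean size {np.mean(sizes):.0f} voxels")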

What special considerations apply to biological or very high-resolution samples?
Unstained biological samples with low atomic number elements require lower energies, but if stained with iodine or tungsten, higher energies and filters may be needed. For very high-resolution imaging, sample size relative to the field of view is critical—ideally the sample should be no larger than three times the field of view. In some cases, you may need to cut samples or use techniques like half-scans to achieve the necessary resolution.
