In his 2019 Trends from the Trenches address at the Bio-IT World Congress & Expo in Boston, BioTeam co-founder and senior director of infrastructure Chris Dagdigian shared his pearls of wisdom on the current state of scientific computing, a field where incompetence is now an “existential survival threat” to life science organizations. “We’ve done ‘OK’ entering the data-intensive science era,” he says. “The hard part is managing what we have.”
Dagdigian began by sharing a few general observations, notably that leadership still views scientific computing as a “cost center to be minimized” rather than a core competitive differentiator, and a recruitment and retention tool, where insights and value routinely get extracted from data. The user base is also growing, and it includes both seasoned scientists forced away from familiar, laptop-scale analytical methods and new hires who often arrive with prior high-performance computing (HPC) and cloud expertise. Companies are “pretty bad at training,” he adds, especially when it comes to helping intermediate-level users become experts.
The definition of HPC is also “being stretched in extreme ways,” Dagdigian says, putting it “in danger of becoming a dumping ground for problems that don’t fit on cheap leased laptops.” Meanwhile, software and tooling innovation is happening faster than IT can curate and maintain development and execution environments.