Concurrent Scientific Session (Core Administration): Open Mic - Research Community Action Post FASEB Report on Maximizing the Value of Shared Research Resources
In 2017 the Federation of American Societies for Experimental Biology (FASEB) published a report on maximizing the value of Shared Research Resources.
In its report, FASEB made recommendations on a wide range of topics, including improving the funding and business operations of shared resource laboratories, increasing discoverability and access, ensuring rigor and reproducibility, and keeping technology and scientific expertise up to date to meet evolving research needs.
The report was well received by the shared research resources community.
During this Open Mic session, participants will discuss and share with the ABRF community what their institutions or core facilities are doing in response to the FASEB report's recommendations. These responses might include new policies, strategies, or initiatives, as well as new thinking about, or new approaches to, pre-existing policies, strategies, initiatives, or processes.
Embracing Error: Using a Systems Approach to Improve Workflow and Workplace
Research cores are complex environments in which highly skilled humans interact with sophisticated instrumentation through multistep workflows. Cores often operate under significant staffing and resource constraints, yet they must consistently and efficiently deliver high-quality results to a wide range of customers. Mistakes and errors can be costly for both cores and researchers, and managing error risk is a critical component of core operation. A traditional view of error attributes adverse events primarily to individual misbehavior or frailty: forgetfulness, poor motivation, inattention, or noncompliance with established protocols. Responses to error under this old model typically include developing additional rules, adding automation, or administering disciplinary actions or training. These old-view solutions do not effect systematic improvement and are unlikely to reduce the risk of future errors. A new view of error analysis accepts error as inevitable and as systematically connected to the tools, tasks, and operational environment of the human actors. In a systems approach to error, complex systems are understood to force trade-offs between multiple conflicting goals. Safety improvements are developed at the systems level, both organizational and operational, from an understanding of the mismatch between procedures and expected outcomes, and between practice and actual results.
Cores can create a climate that embraces error by establishing a reporting culture in which mishaps, near-misses, and small incidents are openly discussed without reprobation or blame. Using examples from other high-reliability organizations, such as nuclear power plants, air-traffic control centers, and hospital trauma units, cores can develop an approach to error analysis that avoids “bad apple” explanations, hindsight bias, cherry-picking, and the micromanagement of mistakes. Cores can apply different accident models, collect human-factors data, and develop genuine safety and reliability intelligence that leverages the creativity and ingenuity of humans to build a robust and resilient high-reliability organization.