One of our sponsors at BioData World Congress, Intel, prepared a post-event article with insights on the most burning topics discussed last week.
Last week I was pleased to join a host of partners, customers and industry experts at the leafy Genome Campus in Cambridge, UK, for this year’s BioData World Congress.
As one of the world’s leading events in genomics and big data, the two-day conference set out to explore how healthcare organizations can capitalize on the vast amounts of data available to them in an effort to advance towards precision medicine.
It’s a subject which led much of the discussion at the conference 12 months ago but, this year, conversations had noticeably matured.
Most evident was the continuing debate around data management and interpretation; the need to move beyond legacy infrastructure; and the importance of an industry-wide collaborative approach to improving patient outcomes.
The Data Challenge Continues
There was clear consensus that the challenges around data are not going to disappear. In fact, they are likely to become increasingly complex as processes such as whole-genome sequencing become more of a reality.
Many speakers and panellists accepted that they were now in possession of, or had access to, a vast amount of data, and recognised this as a strength. However, they were also quick to acknowledge that they lacked the ability to process, store and interpret this data effectively, and at speed. This pointed to a bigger issue – how can we use this data to actually deliver better care, improve our understanding of patients’ needs and re-evaluate whether investment is being directed at the right treatments?
If the ability to extract key insights from these increasingly large data sets remains a challenge, efforts to transform the level of care patients receive will stall. And in a context where bottlenecks are already testing care services, technological innovation will need to work harder to transform healthcare provision…