|Name:||Multiscale Systems Biology: Big Data Challenges in Supercomputing Enabling Translational Medicine in Cardiology|
|Time:||Tuesday, June 24, 2014, 12:30 pm - 01:00 pm|
|Location:||CCL - Congress Center Leipzig|
|Breaks:||01:00 pm - 02:15 pm Lunch|
|Speaker:||Matthias Reumann, IBM Research Zurich|
|Abstract:||Data in healthcare will soon be augmented by simulations of biophysical models tailored to each patient. Multi-scale cardiac models have shown potential for clinical impact, but our recent high-resolution heart simulations created a big data challenge: very large data volumes are generated at a high rate. Translated to a clinical setting, this would yield zettabytes of data per year in the USA alone. Four strategies reduce the output: simulating at lower resolution, simulating only a block of tissue (wedge) from the ventricular wall, generating the electrocardiogram (ECG) at run time, and run-time visualization. The first two lower both the data volume and the computational load, although it remains to be demonstrated that they have the same predictive value as high-resolution models. The latter two reduce the data volume dramatically, but the simulation data itself is lost once the simulation ends. In conclusion, routine use of these simulations in a clinical setting might yield a very large volume of data at high speed; reducing the data output to only the minimal number of valid predictive parameters will solve this big data challenge.|
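The zettabyte-per-year scale follows from simple back-of-envelope arithmetic. The sketch below uses purely illustrative figures (the patient count and per-simulation output size are assumptions for this example, not values from the talk):

```python
def yearly_output_bytes(patients_per_year: int, bytes_per_simulation: int) -> int:
    """Total raw simulation output per year under the given assumptions."""
    return patients_per_year * bytes_per_simulation

PB = 10**15  # petabyte
ZB = 10**21  # zettabyte

# Illustrative assumptions: ~1 million simulated patients per year in the USA,
# ~1 PB of raw state output per high-resolution heart simulation.
total = yearly_output_bytes(10**6, 1 * PB)
print(total / ZB)  # on the order of a zettabyte per year
```

Under these assumed figures the annual output lands at the zettabyte scale, which is why the talk argues for reducing each simulation's output to a minimal set of predictive parameters rather than storing the full state.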