|Name:||Adaptive Computing: HPC, Cloud & Big Workflow: The Evolution of Big Data Analytics|
|Time:||Wednesday, June 25, 2014, 11:40 am - 12:00 pm|
|Room:||Exhibition Hall #660, CCL - Congress Center Leipzig|
|Speaker:||Daniel Hardman, Adaptive Computing|
|Abstract:||A surprising two-thirds of HPC sites are now performing big data analysis as part of their HPC workloads, according to IDC. In this new age where big data is transforming the uses of cloud and supercomputing, how can organizations solve big data challenges more rapidly, accurately and cost-effectively?
In this session, Adaptive Computing Chief Architect Daniel Hardman will share how big data analysis needs a Big Workflow approach to process intense simulations more efficiently and deliver valuable insights from massive quantities of data. While current solutions address big data challenges with cloud or HPC alone, Daniel will explain how organizations need to utilize all available resources – including bare metal and virtual machines, technical computing environments (e.g., HPC, Hadoop), cloud (public, private and hybrid) and even agnostic platforms that span multiple environments (e.g., OpenStack) – as a single ecosystem that adapts as workloads demand. Armed with insights from this discussion, attendees will be able to leverage big data to accelerate insights and shorten the time to discovery.|