Keynote: Big Data – Hype or Disruptive Innovation?
Felix Wortmann, University of St. Gallen & Bosch Internet of Things Lab
The big data hype took off about three years ago. A new era of data analytics promises tremendous value on the basis of breakthrough technology, and over 20 years of business intelligence and data warehousing (the established data analytics solutions) are being challenged. But are big data technologies really the end of today's business intelligence stack? Where is the additional business value that goes beyond what existing solutions deliver? Will big data be the next "SOA", or will it have lasting impact as a disruptive innovation?
Keynote: Big Data for Big Science – Data-Intensive Discovery with the Co-Evolution of Hardware & Software Platforms
Stephan Gillich & Girish Juneja, Intel
Under the torrent of news and commentary on the convergence of big data in the enterprise with high performance computing (HPC) in science lies the true potential of data-intensive discovery, made possible by new high-performing and cost-effective platforms. Intel’s long-standing investments in research and development at the intersection of HPC, enterprise computing, and open source software set the context for innovations at the heart of this platform. Learn how Intel is investing in core infrastructure technologies such as Apache™ Hadoop® and Lustre® to bring computation to wherever data lives, not only by driving the technology but also by delivering software with support and services. The two speakers will focus on the exciting co-evolution of hardware and software for big data in science and the enterprise.
Keynote: HPC & Data Intensive Computing: The Road to Convergence
Flavio Villanustre, HPCC Systems & LexisNexis
HPC is a well-established discipline with widespread application across multiple industries and research areas. The roots of Data Intensive Computing (DIC), on the other hand, can be traced back to the late 1990s, when the increasing volume and complexity of structured and unstructured data became a challenge for the established data processing frameworks. While there is some resemblance between the architectural design of the Beowulf clusters widely used for HPC and that of a data-intensive cluster, the specific needs associated with the data sizes and types of processing in each case require different approaches: HPC usually deals with iterative algorithms applying Single Instruction Multiple Data (SIMD) operations to vectorized representations of numbers, whereas DIC generally requires the transformation, redistribution, and manipulation of large volumes of strings and other text/binary representations. Moving numbers around in HPC is usually inexpensive and required by the algorithms, but in DIC preserving data locality is a must for performance reasons. However, the recent increase in the use of data algorithms that rely on complex mathematical frameworks has started to blur the boundaries: lessons learned in the HPC world are beginning to be applied to DIC, driving the convergence of both schools.
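The contrast drawn above can be sketched in a toy example: an HPC-style kernel applies one arithmetic operation uniformly across numeric vectors, while a data-intensive step redistributes string records by key so that related records land on the same node, preserving data locality for subsequent processing. All function names and the partition count here are illustrative, not taken from any particular system.

```python
# Toy contrast between an HPC-style vectorized kernel and a DIC-style
# redistribution step. Names and sizes are illustrative only.

def axpy(a, x, y):
    """HPC-style: the same arithmetic (a*x + y) applied to every element
    of numeric vectors, the pattern SIMD hardware accelerates."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def partition_by_key(records, num_nodes):
    """DIC-style: redistribute (key, value) string records so that all
    records sharing a key end up in the same partition (node), giving
    later steps data locality instead of moving data repeatedly."""
    partitions = [[] for _ in range(num_nodes)]
    for key, value in records:
        partitions[hash(key) % num_nodes].append((key, value))
    return partitions

vec = axpy(2.0, [1.0, 2.0], [10.0, 20.0])   # -> [12.0, 24.0]
parts = partition_by_key(
    [("alice", "r1"), ("bob", "r2"), ("alice", "r3")], 4)
```

The numeric kernel is cheap to shard arbitrarily, which is why HPC tolerates moving numbers; the keyed shuffle is exactly the expensive step DIC frameworks try to perform once and then exploit locally.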
Keynote: HPC Meets Enterprise: SAS High-Performance Analytics in Action – Concepts & Examples of How Big Data Changes the Way of Doing Business
Mark Torr, SAS Institute
Recent technological enhancements are starting to allow businesses to adopt new approaches to utilizing big data and executing big data analytics. SAS has built a high-performance computing platform that enables businesses to leverage massively parallel processing, grid technologies, and more to derive business benefit. By adopting these technologies, SAS has been able to take much of what it offers to the market, move it into memory or distribute it onto a grid, and deliver radically new ways to execute algorithms at extreme speed, at a sensible price point, directly to the business. This presentation will give an overview of the SAS technologies, with a focus on the technology underpinnings, and share some real-world use cases.