High-Performance Computing Ecosystem in Europe
July 15th, 2009
Kimmo Koski
CSC – The Finnish IT Center for Science
Topics
- Terminology and definitions
- Emerging trends
- Stakeholders
- On-going Grid and HPC activities
- Concluding remarks
Terminology and pointers
- HPC: High Performance Computing
- HET (http://www.hpcineuropetaskforce.eu/): High Performance Computing in Europe Taskforce, established in June 2006 with a mandate to draft a strategy for the European HPC ecosystem
- Petaflop/s: a performance figure of 10^15 floating point operations (calculations) per second
- e-IRG (http://www.eirg.eu): e-Infrastructure Reflection Group. e-IRG supports the creation of a framework (political, technological and administrative) for the easy and cost-effective shared use of distributed electronic resources across Europe, particularly for grid computing, storage and networking.
- ESFRI (http://cordis.europa.eu/esfri/): European Strategy Forum on Research Infrastructures. The role of ESFRI is to support a coherent approach to policy-making on research infrastructures in Europe, and to act as an incubator for international negotiations about concrete initiatives. In particular, ESFRI is preparing a European Roadmap for new research infrastructures of pan-European interest.
- RI: Research Infrastructure
Terminology and pointers (cont.)
- PRACE (http://www.prace-project.eu/): Partnership for Advanced Computing in Europe. EU FP7 project for the preparatory phase of building the European petaflop computing centers, based on HET work
- DEISA-2 (https://www.deisa.org/): Distributed European Infrastructure for Supercomputing Applications. DEISA is a consortium of leading national supercomputing centers that currently deploys and operates a persistent, production-quality, distributed supercomputing environment with continental scope.
- EGEE-III (http://www.eu-egee.org/): Enabling Grids for E-sciencE. The project provides researchers in academia and industry with access to a production-level Grid infrastructure, independent of their geographic location.
- EGI_DS (http://www.eu-egi.org/): an effort to establish a sustainable grid infrastructure in Europe
- GÉANT2 (http://www.geant2.net/): the seventh generation of the pan-European research and education network
Computational science infrastructure
Performance Pyramid (figure)
- TIER 0, capability computing: European HPC center(s) (PRACE)
- TIER 1: national/regional centers, Grid collaboration (DEISA-2)
- TIER 2, capacity computing: local centers (EGEE-III)
- e-IRG provides the policy framework across the tiers
Need to remember about petaflop/s…
What do you mean by petaflop/s?
1. Theoretical petaflop/s?
2. LINPACK petaflop/s?
3. Sustained petaflop/s for a single extremely parallel application?
4. Sustained petaflop/s for multiple parallel applications?
Note that between 1 and 4 there might be several years. Petaflop/s hardware needs petaflop/s applications, which are not easy to program, and in many cases not even possible. Do we even know how to scale over 100,000 processors?
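The gap between these four numbers can be made concrete with a back-of-the-envelope sketch. All machine figures below are hypothetical, chosen only for illustration: theoretical peak is a simple product, while Amdahl's law shows why sustaining that rate for a single application on 100,000 processors is so hard.

```python
# Back-of-the-envelope sketch: theoretical peak vs. Amdahl-limited scaling.
# All machine figures below are hypothetical, chosen only for illustration.

def theoretical_peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak in TFLOP/s: cores x clock x FLOPs per cycle."""
    return cores * clock_ghz * flops_per_cycle / 1000.0  # GFLOP/s -> TFLOP/s

def amdahl_speedup(processors: int, serial_fraction: float) -> float:
    """Amdahl's law: best-case speedup with a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# A hypothetical 100,000-core machine at 2.5 GHz, 4 FLOPs/cycle per core:
peak = theoretical_peak_tflops(100_000, 2.5, 4)
print(f"theoretical peak: {peak:.0f} TFLOP/s")  # 1000 TFLOP/s = 1 petaflop/s

# Even 0.1% serial code caps the speedup on 100,000 processors near 990x,
# so the sustained rate for one application lands far below the peak.
print(f"speedup with 0.1% serial: {amdahl_speedup(100_000, 0.001):.0f}x")
```

The years between cases 1 and 4 are, in this picture, the time it takes to push the serial fraction and communication overheads of real applications low enough to approach the hardware figure.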
Emerging trends
Session 50 - High Performance Computing Ecosystem in Europe
Global Virtual Research Communities (figure): each research community stacks its own layers of human interaction, workspace, labs, scientific data, computing/Grid, and network.
Global Virtual Research Communities (figure, cont.): virtual communities keep their own human interaction, workspaces and virtual labs, but share the scientific data, Grid and network layers, yielding economies of scale and efficiency gains.
Data and information explosion: petascale computing produces exascale data.
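A rough illustration of that claim, with assumed figures that are not from the talk: even routine checkpointing on a petascale machine piles up data at near-exascale rates.

```python
# Rough illustration with assumed figures (not from the talk): routine
# checkpointing alone pushes a petascale machine toward exascale data.

memory_bytes = 100e12        # assume 100 TB of aggregate system memory
checkpoints_per_day = 24     # assume one full-memory checkpoint per hour
days_per_year = 365

bytes_per_year = memory_bytes * checkpoints_per_day * days_per_year
print(f"{bytes_per_year / 1e18:.2f} EB of checkpoint data per year")  # 0.88 EB
```

Simulation output, which typically dwarfs checkpoint traffic, only widens the gap between compute scale and data scale.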
HPC is a part of a larger ecosystem
- Disciplines, user communities
- Competence
- Software development
- Data infrastructures and services
- HPC and Grid infrastructures
HPC Ecosystem to support the top
- The upper layers of the pyramid: HPC centers/services, European projects (HPC/Grid, networking, …)
- Activities which enable efficient usage of the upper layers: inclusion of national HPC infrastructures, software development and scalability issues, competence development
- Interoperability between the layers
Stakeholders
Stakeholder categories in PRACE
- Providers of HPC services
- European HPC and grid projects
- Networking infrastructure providers
- Hardware vendors
- Software vendors and the software-developing academic community
- End users and their access through related Research Infrastructures
- Funding bodies on a national and international level
- Policy-setting organisations directly involved in developing the research infrastructure, and political bodies like parliaments responsible for national and international legislation
Policy and strategy work
- HET: HPC in Europe Taskforce, http://www.hpcineuropetaskforce.eu/
- e-IRG: e-Infrastructure Reflection Group, http://www.e-irg.org/
- ESFRI: European Strategy Forum on Research Infrastructures, http://www.cordis.lu/esfri/
- ERA Expert Group on Research Infrastructures
Some focus areas
- Collaboration between research and e-infrastructure providers
- Horizontal ICT services
- Balanced approach: more focus on data, software development and competence development
- Inclusion of different countries, with different contribution levels
- New emerging technologies, innovative computing initiatives
- Global collaboration, for example the exascale computing initiative
- Policy work, resource exchange, sustainable services, etc.
On-going Grid and HPC activities
EU infrastructure projects
- GÉANT
- A number of data infrastructure projects
Supercomputing Drives Science through Simulation
- Environment: weather/climatology, pollution/ozone hole
- Ageing society: medicine, biology
- Energy: plasma physics, fuel cells
- Materials/information technology: spintronics, nanoscience
PRACE Initiative: History and First Steps (HPCEUR and HET, 2004–2008)
- Production of the HPC part of the ESFRI Roadmap
- Creation of a vision, involving 15 European countries
- Signature of the MoU
- Submission of an FP7 project proposal
- Bringing scientists together; creation of the Scientific Case
- Approval of the project; project start
HET: The Scientific Case
- Weather, climatology, earth science: degree of warming, scenarios for our future climate; understanding and predicting ocean properties and variations; weather and flood events
- Astrophysics, elementary particle physics, plasma physics: systems and structures which span a large range of length and time scales; quantum field theories like QCD; ITER
- Material science, chemistry, nanoscience: understanding complex materials, complex chemistry, nanoscience; determination of electronic and transport properties
- Life science: systems biology, chromatin dynamics, large-scale protein dynamics, protein association and aggregation, supramolecular systems, medicine
- Engineering: complex helicopter simulation, biomedical flows, gas turbines and internal combustion engines, forest fires, green aircraft, virtual power plant
First success: HPC in the ESFRI Roadmap
- The European Roadmap for Research Infrastructures is the first comprehensive definition at the European level
- Research Infrastructures are one of the crucial pillars of the European Research Area
- A European HPC service, with impact foreseen in strategic competitiveness, attractiveness for researchers, and support for industrial development

Second success: The PRACE Initiative
- Memorandum of Understanding signed by 15 states in Berlin on April 16, 2007
- New: France, Germany, Spain, The Netherlands and the UK committed funding for a European HPC Research Infrastructure (LoS)
Third success: the PRACE project
- Partnership for Advanced Computing in Europe (PRACE)
- EU project of the European Commission 7th Framework Programme, construction of new infrastructures, preparatory phase (FP7-INFRASTRUCTURES-2007-1)
- Partners: 16 legal entities from 14 European countries
- Budget: 20 Mio €, of which 10 Mio € EU funding
- Duration: January 2008 – December 2009
- Grant no: RI-211528
PRACE Partners
PRACE Work Packages
- WP1 Management
- WP2 Organizational concept
- WP3 Dissemination, outreach and training
- WP4 Distributed computing
- WP5 Deployment of prototype systems
- WP6 Software enabling for prototype systems
- WP7 Petaflop systems for 2009/2010
- WP8 Future petaflop technologies
PRACE Objectives in a Nutshell
- Provide world-class systems for world-class science
- Create a single European entity
- Deploy 3–5 systems of the highest performance level (tier-0)
- Ensure diversity of architectures
- Provide support and training
- PRACE will be created to stay
Representative Benchmark Suite
- Defined a set of application benchmarks to be used in the procurement process for petaflop/s systems
- 12 core applications, plus 8 additional applications
  - Core: NAMD, VASP, QCD, CPMD, GADGET, Code_Saturne, TORB, ECHAM5, NEMO, CP2K, GROMACS, N3D
  - Additional: AVBP, HELIUM, TRIPOLI_4, PEPC, GPAW, ALYA, SIESTA, BSIT
- Each application will be ported to an appropriate subset of prototypes
- Synthetic benchmarks for architecture evaluation: computation, mixed-mode, I/O, bandwidth, OS, communication
- Application and synthetic benchmarks integrated into JuBE (Juelich Benchmark Environment)
Mapping Applications to Architectures
- Identified affinities and priorities, based on the application analysis and expressed in a condensed, qualitative way
- Need for different "general purpose" systems
- There are promising emerging architectures
- Will become more quantitative after benchmark runs on the prototypes (E = estimated)
Installed prototypes
- IBM BlueGene/P (FZJ), 01-2008
- IBM Power6 (SARA), 07-2008
- Cray XT5 (CSC), 11-2008
- IBM Cell/Power (BSC), 12-2008
- NEC SX9, vector part (HLRS), 02-2009
- Intel Nehalem/Xeon (CEA/FZJ), 06-2009
Summary of current prototype status (June 2009)
Web site and the dissemination channels
- The PRACE web presence with news, events, RSS feeds etc.: http://www.prace-project.eu
- Alpha Galileo service, reaching 6500 journalists around the globe: http://www.alphagalileo.org
- BELIEF Digital Library
- HPC magazines
- PRACE partner sites, top 10 HPC users
PRACE Dissemination Package
- PRACE WP3 has created a dissemination package including templates, brochures, flyers, posters, badges, t-shirts and USB keys
- The PRACE logo
- "Heavy Computing 10^15": the PRACE t-shirt
- The PRACE USB key
SC'08 Austin, 2008-11-19, Andreas Schott, DEISA

DEISA Partners
- DEISA: May 1st, 2004 – April 30th, 2008
- DEISA2: May 1st, 2008 – April 30th, 2011
DEISA Partners
- BSC: Barcelona Supercomputing Centre, Spain
- CINECA: Consorzio Interuniversitario per il Calcolo Automatico, Italy
- CSC: Finnish Information Technology Centre for Science, Finland
- EPCC: University of Edinburgh and CCLRC, UK
- ECMWF: European Centre for Medium-Range Weather Forecasts, UK (international)
- FZJ: Research Centre Juelich, Germany
- HLRS: High Performance Computing Centre Stuttgart, Germany
- IDRIS: Institut du Développement et des Ressources en Informatique Scientifique (CNRS), France
- LRZ: Leibniz Rechenzentrum Munich, Germany
- RZG: Rechenzentrum Garching of the Max Planck Society, Germany
- SARA: Dutch National High Performance Computing, Netherlands
- CEA-CCRT: Centre de Calcul Recherche et Technologie, CEA, France
- KTH: Kungliga Tekniska Högskolan, Sweden
- CSCS: Swiss National Supercomputing Centre, Switzerland
- JSCC: Joint Supercomputer Center of the Russian Academy of Sciences, Russia
DEISA 2008: Operating the European HPC Infrastructure
- More than 1 PetaFlop/s aggregated peak performance
- The most powerful European supercomputers for the most challenging projects
- Top-level Europe-wide application enabling
- Grand Challenge projects performed on a regular basis
DEISA Core Infrastructure and Services
- Dedicated high-speed network
- Common AAA: single sign-on, accounting/budgeting
- Global data management: high-performance remote I/O and data sharing with global file systems; high-performance transfers of large data sets
- User operational infrastructure: Distributed Common Production Environment (DCPE), job management service, common user support and help desk
- System operational infrastructure: common monitoring and information systems, common system operation
- Global application support
DEISA dedicated high-speed network (figure): NRENs (FUNET, SURFnet, DFN, UKERNA, GARR, RENATER, RedIris) interconnected via 1 Gb/s GRE tunnels and 10 Gb/s wavelength, routed and switched links.
DEISA Global File System, based on MC-GPFS (figure): connects IBM P6 and BlueGene/P systems (AIX/Linux, LoadLeveler), IBM P5, IBM PPC, NEC SX8 (Super-UX, NQS II), Cray XT4 and XT5 (UNICOS/lc, PBS Pro) and SGI Altix (Linux, PBS Pro), with GridFTP for transfers and Maui/Slurm among the schedulers.
DEISA Software Layers (figure): network and AAA layers (network connectivity, unified AAA); data management layer (WAN shared file system, data staging tools, data transfer tools); job management and monitoring layer (job rerouting, workflow management, co-reservation and co-allocation); presentation layer (common production environment, single monitor system, multiple ways to access the DEISA sites).
Supercomputer hardware performance pyramid vs. application-enabling requirements pyramid (EU, national, local)
- Capability computing will always need expert support for application enabling and optimization
- The more resource-demanding a single problem is, the higher are generally the requirements for application enabling, including enhancing scalability
DEISA Organizational Structure
- WP1 Management
- WP2 Dissemination, External Relations, Training
- WP3 Operations
- WP4 Technologies
- WP5 Applications Enabling
- WP6 User Environment and Support
- WP7 Extreme Computing (DECI) and Benchmark Suite
- WP8 Integrated DEISA Development Environment
- WP9 Enhancing Scalability
Evolution of Supercomputing Resources
- 2004, DEISA project start: ~30 TF aggregated peak performance across the partners' compute resources
- 2008, DEISA2 project start: over 1 PF aggregated peak performance on state-of-the-art supercomputers
  - Cray XT4 and XT5 (Linux); IBM Power5 and Power6 (AIX/Linux); IBM BlueGene/P (Linux front end); IBM PowerPC (Linux, MareNostrum); SGI Altix 4700 (Itanium2 Montecito, Linux); NEC SX8 vector system (Super-UX)
- Systems interconnected with dedicated 10 Gb/s network links provided by GÉANT2 and the NRENs
- A fixed fraction of resources dedicated to DEISA usage
DEISA Extreme Computing Initiative (DECI)
- DECI launched in early 2005 to enhance DEISA's impact on science and technology
- Identification, enabling, deployment and operation of "flagship" applications in selected areas of science and technology
- Complex, demanding, innovative simulations requiring the exceptional capabilities of DEISA
- Multi-national proposals especially encouraged
- Proposals reviewed by national evaluation committees
- Projects chosen on the basis of innovation potential, scientific excellence, relevance criteria and national priorities
- The most powerful HPC architectures in Europe for the most challenging projects; the most appropriate supercomputer architecture selected for each project
- Mitigation of the rapid performance decay of a single national supercomputer within its short lifetime cycle of typically about 5 years, as implied by Moore's law
DEISA Extreme Computing Initiative
- Involvements in projects from the DECI calls of 2005, 2006 and 2007: 157 research institutes and universities from 15 European countries: Austria, Finland, France, Germany, Hungary, Italy, Netherlands, Poland, Portugal, Romania, Russia, Spain, Sweden, Switzerland, UK
- With collaborators from four other continents: North America, South America, Asia, Australia
DEISA Extreme Computing Initiative
Calls for proposals for challenging supercomputing projects from all areas of science:
- DECI call 2005: 51 proposals, 12 European countries involved, co-investigators from the US; 30 million CPU-hours requested; 29 proposals accepted, 12 million CPU-hours awarded (normalized to IBM P4+)
- DECI call 2006: 41 proposals, 12 European countries involved, co-investigators from North and South America and Asia (US, CA, AR, Israel); 28 million CPU-hours requested; 23 proposals accepted, 12 million CPU-hours awarded (normalized to IBM P4+)
- DECI call 2007: 63 proposals, 14 European countries involved, co-investigators from North and South America, Asia and Australia (US, CA, BR, AR, Israel, AUS); 70 million CPU-hours requested; 45 proposals accepted, ~30 million CPU-hours awarded (normalized to IBM P4+)
- DECI call 2008: 66 proposals, 15 European countries involved, co-investigators from North and South America, Asia and Australia; 134 million CPU-hours requested (normalized to IBM P4+); evaluation in progress
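Normalizing CPU-hours to a single reference machine (here, the IBM P4+) lets awards on heterogeneous systems be compared on one scale. A minimal sketch of the idea; the per-core performance figures below are invented placeholders, not DEISA's actual conversion factors:

```python
# Sketch of CPU-hour normalization across heterogeneous systems, in the
# spirit of DECI's "normalized to IBM P4+" figures. The per-core ratings
# here are invented placeholders, not DEISA's actual conversion factors.

REFERENCE_GFLOPS_PER_CORE = 6.0  # hypothetical IBM P4+ per-core baseline

def normalized_cpu_hours(raw_cpu_hours: float, gflops_per_core: float) -> float:
    """Scale raw CPU-hours by per-core performance relative to the baseline."""
    return raw_cpu_hours * gflops_per_core / REFERENCE_GFLOPS_PER_CORE

# 1 million raw hours on a (hypothetical) 9 GFLOP/s-per-core system count
# as 1.5 million baseline hours; on a 3 GFLOP/s-per-core system, only 0.5.
print(normalized_cpu_hours(1_000_000, 9.0))  # 1500000.0
print(normalized_cpu_hours(1_000_000, 3.0))  # 500000.0
```

Real conversion factors would be derived from measured application performance rather than peak ratings, but the bookkeeping is the same.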
DECI Project POLYRES
Cover story of Nature, May 24, 2007: curvy membranes make proteins attractive.
For almost two decades, physicists have been on the track of membrane-mediated interactions. Simulations in DEISA have now revealed that curvy membranes make proteins attractive (Nature 447 (2007), 461-464). In the figure, proteins (red) adhere on a membrane (blue/yellow) and locally bend it; this triggers a growing invagination, shown as a cross-section through an almost complete vesicle.
B. J. Reynwar et al.: Aggregation and vesiculation of membrane proteins by curvature-mediated interactions, Nature 447, 24 May 2007, doi:10.1038/nature05840
Achievements and Scientific Impact
Brochures can be downloaded from http://www.deisa.eu/publications/results
Evolution of User Categories in DEISA (timeline, 2002–2011): from the DEISA EoI and early adopters (Joint Research Activities) at the start of FP6 DEISA, through the DEISA Extreme Computing Initiative and single-project support, to support of virtual communities and EU projects under FP7 DEISA2 and its preparatory phase.
Tier-0 / Tier-1 Centers: are there implications for the services?
- The main difference between T0 and T1 centers is policy and usage models
- T1 centers can evolve to T0 for strategic/political reasons; T0 machines automatically degrade to T1 level by aging
- T0 centers: leadership-class European systems in competition with the leading systems worldwide, cyclically renewed; governance structure to be provided by a European organization (PRACE)
- T1 centers: leading national centers, cyclically renewed, optionally surpassing the performance of older T0 machines; national governance structure
- Services have to be the same in T0/T1, both because the status of the systems changes over time and for user transparency across the different systems (only visible difference: some services could have different flavors for T0 and T1)
Summary
- Evolution of this European infrastructure towards a robust and persistent European HPC ecosystem
- Enhancing the existing services, deploying new services including support for European virtual communities, and cooperating and collaborating with new European initiatives, especially PRACE
- DEISA2 as the vector for the integration of Tier-0 and Tier-1 systems in Europe, providing a lean and reliable turnkey operational solution for a persistent European HPC infrastructure
- Bridging worldwide HPC projects: facilitating the support of international science communities with computational needs traversing existing political boundaries
EGEE Status April 2009
Infrastructure:
- Sites connected to the EGEE infrastructure: 268
- Countries connected to the EGEE infrastructure: 54
- CPUs (cores) available to users 24/7: ~139,000
- Storage capacity available: ~25 PB disk + 38 PB tape MSS
Users:
- Virtual Organisations using the EGEE infrastructure: > 170
- Registered Virtual Organisations: > 112
- Registered users: > 13,000
- People benefiting from the existence of the EGEE infrastructure: ~20,000
- Jobs: > 390k jobs/day
- Application domains making use of the EGEE infrastructure: more than 15

Are we ready for the demand? From testbeds to utility to productive use; from national to European e-infrastructure to global.

Session 50 - High Performance Computing Ecosystem in Europe

  • 1. High-Performance Computing Ecosystemin EuropeJuly 15th, 2009Kimmo KoskiCSC – The Finnish IT Center for Science
  • 2. TopicsTerminology and definitionsEmerging trendsStakeholdersOn-going Grid and HPC activitiesConcluding remarks
  • 3. Terminology and pointersHPC High Performance ComputingHET, http://www.hpcineuropetaskforce.eu/High Performance Computing in Europe Taskforce, established in June 2006 with a mandate to draft a strategy for European HPC ecosystemPetaflop/sPerformance figure 1015 floating point operations (calculations) in seconde-IRG, http://www.eirg.eue-Infrastructure Reflection Group. e-IRG is supporting the creation of a framework (political, technological and administrative) for the easy and cost-effective shared use of distributed electronic resources across Europe - particularly for grid computing, storage and networking.ESFRI, http://cordis.europa.eu/esfri/European Strategy Forum on Research Infrastructures. The role of ESFRI is to support a coherent approach to policy-making on research infrastructures in Europe, and to act as an incubator for international negotiations about concrete initiatives. In particular, ESFRI is preparing a European Roadmap for new research infrastructures of pan-European interest.RIResearch Infrastructure
  • 4. Terminology and pointers (cont.)
    PRACE, http://www.prace-project.eu/: Partnership for Advanced Computing in Europe. EU FP7 project for the preparatory phase of building the European petaflop computing centers, based on HET work.
    DEISA-2, https://www.deisa.org/: Distributed European Infrastructure for Supercomputing Applications. DEISA is a consortium of leading national supercomputing centers that currently deploys and operates a persistent, production-quality, distributed supercomputing environment with continental scope.
    EGEE-III, http://www.eu-egee.org/: Enabling Grids for E-sciencE. The project provides researchers in academia and industry with access to a production-level Grid infrastructure, independent of their geographic location.
    EGI_DS, http://www.eu-egi.org/: an effort to establish a sustainable grid infrastructure in Europe.
    GÉANT2, http://www.geant2.net/: seventh generation of the pan-European research and education network.
  • 6. Performance Pyramid
    TIER 0: European HPC center(s), capability computing (PRACE)
    TIER 1: national/regional centers, Grid collaboration, capacity computing (DEISA-2, EGEE-III)
    TIER 2: local centers
    (Policy framework: e-IRG.)
  • 7. Need to remember about petaflop/s… What do you mean by petaflop/s?
    1. Theoretical petaflop/s?
    2. LINPACK petaflop/s?
    3. Sustained petaflop/s for a single extremely parallel application?
    4. Sustained petaflop/s for multiple parallel applications?
    Note that between 1 and 4 there might be several years. Petaflop/s hardware needs petaflop/s applications, which are not easy to program, or not even possible in many cases. Do we even know how to scale over 100,000 processors?
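The gap between these four numbers can be made concrete. A minimal sketch of how a theoretical peak figure is computed, and how far sustained runs can fall below it; all machine parameters and efficiency fractions below are illustrative assumptions, not figures from the slides:

```python
# Theoretical peak vs. sustained performance: an illustrative calculation.
# All hardware parameters and efficiency fractions are hypothetical examples.

def peak_flops(nodes, sockets_per_node, cores_per_socket,
               clock_hz, flops_per_cycle):
    """Theoretical peak: every core does its maximum FLOPs on every cycle."""
    cores = nodes * sockets_per_node * cores_per_socket
    return cores * clock_hz * flops_per_cycle

# A hypothetical petaflop machine: 12,500 nodes x 2 sockets x 4 cores
# (i.e. 100,000 cores), 2.5 GHz, 4 floating point operations per cycle.
peak = peak_flops(12_500, 2, 4, 2.5e9, 4)
print(f"Theoretical peak: {peak / 1e15:.2f} Pflop/s")

# LINPACK typically reaches a large fraction of peak; real parallel
# applications often sustain far less (the fractions are assumptions).
for name, efficiency in [("LINPACK", 0.75),
                         ("single parallel application", 0.10),
                         ("typical application mix", 0.05)]:
    print(f"{name}: {peak * efficiency / 1e15:.2f} Pflop/s sustained")
```

The example also shows why the slide's "over 100,000 processors" question matters: even reaching a nominal petaflop already presumes that scale.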
  • 10. Global Virtual Research Communities (diagram): each research community (Research Community 1, 2, 3) has its own stack of human interaction, workspace, labs, scientific data, computing/Grid, and network.
  • 11. Global Virtual Research Communities (diagram, cont.): virtual communities keep their own human interaction, workspaces and virtual labs, while scientific data, Grid and network are shared, yielding economies of scale and efficiency gains.
  • 12. Data and information explosion: petascale computing produces exascale data
  • 13. HPC is a part of a larger ecosystem: disciplines and user communities; competence; software development; data infrastructures and services; HPC and Grid infrastructures
  • 14. HPC Ecosystem to support the top
    The upper layers of the pyramid: HPC centers/services; European projects (HPC/Grid, networking, …).
    Activities which enable efficient usage of the upper layers: inclusion of national HPC infrastructures; software development and scalability issues; competence development; interoperability between the layers.
  • 16. Stakeholder categories in PRACE
    Providers of HPC services
    European HPC and grid projects
    Networking infrastructure providers
    Hardware vendors
    Software vendors and the software-developing academic community
    End users and their access through related Research Infrastructures
    Funding bodies on a national and international level
    Policy-setting organisations directly involved in developing the research infrastructure, and political bodies like parliaments responsible for national and international legislation
  • 17. Policy and strategy work
    HET: HPC in Europe Taskforce, http://www.hpcineuropetaskforce.eu/
    e-IRG: e-Infrastructure Reflection Group, http://www.e-irg.org/
    ESFRI: European Strategy Forum on Research Infrastructures, http://www.cordis.lu/esfri/
    ERA Expert Group on Research Infrastructures
  • 18. Some focus areas
    Collaboration between research and e-infrastructure providers
    Horizontal ICT services
    Balanced approach: more focus on data, software development and competence development
    Inclusion of different countries, with different contribution levels
    New emerging technologies, innovative computing initiatives
    Global collaboration, for example the exascale computing initiative
    Policy work, resource exchange, sustainable services etc.
  • 19. On-going Grid and HPC activities
  • 20. EU infrastructure projects: GÉANT and a number of data infrastructure projects
  • 22. Supercomputing Drives Science through Simulation
    Environment: weather/climatology, pollution/ozone hole
    Ageing society: medicine, biology
    Energy: plasma physics, fuel cells
    Materials/information technology: spintronics, nanoscience
  • 23. PRACE Initiative: History and First Steps (HPCEUR and HET, 2004–2008)
    Bringing scientists together; creation of the Scientific Case; production of the HPC part of the ESFRI Roadmap; creation of a vision involving 15 European countries; signature of the MoU; submission of an FP7 project proposal; approval of the project; project start.
  • 24. HET: The Scientific Case
    Weather, climatology, earth science: degree of warming, scenarios for our future climate; understanding and predicting ocean properties and variations; weather and flood events
    Astrophysics, elementary particle physics, plasma physics: systems and structures which span a large range of different length and time scales; quantum field theories like QCD; ITER
    Material science, chemistry, nanoscience: understanding complex materials, complex chemistry, nanoscience; the determination of electronic and transport properties
    Life science: systems biology, chromatin dynamics, large-scale protein dynamics, protein association and aggregation, supramolecular systems, medicine
    Engineering: complex helicopter simulation, biomedical flows, gas turbines and internal combustion engines, forest fires, green aircraft, virtual power plant
  • 25. First success: HPC in the ESFRI Roadmap
    The European Roadmap for Research Infrastructures is the first comprehensive definition at the European level. Research Infrastructures are one of the crucial pillars of the European Research Area.
    A European HPC service, with impact foreseen in strategic competitiveness, attractiveness for researchers, and support for industrial development.
  • 27. Second success: The PRACE Initiative
    Memorandum of Understanding signed by 15 states in Berlin on April 16, 2007.
    New: France, Germany, Spain, The Netherlands and the UK committed funding for a European HPC Research Infrastructure (LoS).
  • 28. Third success: the PRACE project
    Partnership for Advanced Computing in Europe (PRACE): an EU project of the European Commission 7th Framework Programme, construction of new infrastructures, preparatory phase (FP7-INFRASTRUCTURES-2007-1).
    Partners: 16 legal entities from 14 European countries. Budget: 20 million €, of which EU funding is 10 million €. Duration: January 2008 – December 2009. Grant no: RI-211528.
  • 30. PRACE Work Packages
    WP1 Management
    WP2 Organizational concept
    WP3 Dissemination, outreach and training
    WP4 Distributed computing
    WP5 Deployment of prototype systems
    WP6 Software enabling for prototype systems
    WP7 Petaflop systems for 2009/2010
    WP8 Future petaflop technologies
  • 31. PRACE Objectives in a Nutshell
    Provide world-class systems for world-class science
    Create a single European entity
    Deploy 3–5 systems of the highest performance level (tier-0)
    Ensure diversity of architectures
    Provide support and training
    PRACE will be created to stay
  • 32. Representative Benchmark Suite
    Defined a set of application benchmarks to be used in the procurement process for petaflop/s systems: 12 core applications plus 8 additional applications.
    Core: NAMD, VASP, QCD, CPMD, GADGET, Code_Saturne, TORB, ECHAM5, NEMO, CP2K, GROMACS, N3D
    Additional: AVBP, HELIUM, TRIPOLI_4, PEPC, GPAW, ALYA, SIESTA, BSIT
    Each application will be ported to an appropriate subset of prototypes.
    Synthetic benchmarks for architecture evaluation: computation, mixed-mode, I/O, bandwidth, OS, communication.
    Applications and synthetic benchmarks are integrated into JuBE, the Juelich Benchmark Environment.
  • 33. Mapping Applications to Architectures
    Identified affinities and priorities, based on the application analysis, expressed in a condensed, qualitative way.
    Need for different "general purpose" systems; there are promising emerging architectures.
    Will be more quantitative after benchmark runs on the prototypes (E = estimated).
  • 38. Installed prototypes
    IBM BlueGene/P (FZJ), 01-2008
    IBM Power6 (SARA), 07-2008
    Cray XT5 (CSC), 11-2008
    IBM Cell/Power (BSC), 12-2008
    NEC SX9, vector part (HLRS), 02-2009
    Intel Nehalem/Xeon (CEA/FZJ), 06-2009
  • 39. Status June 2009: summary of current prototype status
  • 40. Web site and the dissemination channels
    The PRACE web presence with news, events, RSS feeds etc.: http://www.prace-project.eu
    Alpha Galileo service, reaching 6500 journalists around the globe: http://www.alphagalileo.org
    Belief Digital Library; HPC magazines; PRACE partner sites; top 10 HPC users
  • 41. PRACE Dissemination Package
    PRACE WP3 has created a dissemination package including templates, brochures, flyers, posters, badges, t-shirts and USB keys.
    (Pictured: the PRACE logo; "Heavy Computing 10^15", the PRACE t-shirt; the PRACE USB key.)
  • 43. SC'08 Austin, 2008-11-19, Andreas Schott, DEISA. DEISA: May 1st, 2004 – April 30th, 2008. DEISA2: May 1st, 2008 – April 30th, 2011. DEISA Partners (map).
  • 44. DEISA Partners
    BSC: Barcelona Supercomputing Centre, Spain
    CINECA: Consortio Interuniversitario per il Calcolo Automatico, Italy
    CSC: Finnish Information Technology Centre for Science, Finland
    EPCC: University of Edinburgh and CCLRC, UK
    ECMWF: European Centre for Medium-Range Weather Forecast, UK (int)
    FZJ: Research Centre Juelich, Germany
    HLRS: High Performance Computing Centre Stuttgart, Germany
    IDRIS: Institut du Développement et des Ressources en Informatique Scientifique - CNRS, France
    LRZ: Leibniz Rechenzentrum Munich, Germany
    RZG: Rechenzentrum Garching of the Max Planck Society, Germany
    SARA: Dutch National High Performance Computing, Netherlands
    CEA-CCRT: Centre de Calcul Recherche et Technologie, CEA, France
    KTH: Kungliga Tekniska Högskolan, Sweden
    CSCS: Swiss National Supercomputing Centre, Switzerland
    JSCC: Joint Supercomputer Center of the Russian Academy of Sciences, Russia
  • 45. DEISA 2008: Operating the European HPC Infrastructure
    >1 petaflop/s aggregated peak performance
    Most powerful European supercomputers for the most challenging projects
    Top-level Europe-wide application enabling
    Grand Challenge projects performed on a regular basis
  • 46. DEISA Core Infrastructure and Services
    Dedicated high-speed network
    Common AAA: single sign-on; accounting/budgeting
    Global data management: high-performance remote I/O and data sharing with global file systems; high-performance transfers of large data sets
    User operational infrastructure: Distributed Common Production Environment (DCPE); job management service; common user support and help desk
    System operational infrastructure: common monitoring and information systems; common system operation
    Global application support
  • 47. DEISA dedicated high-speed network (diagram): built on the NRENs FUNET, SURFnet, DFN, UKERNA, GARR, RENATER and RedIris, with link types ranging from a 1 Gb/s GRE tunnel to 10 Gb/s wavelength, routed and switched connections.
  • 48. DEISA Global File System, based on MC-GPFS (diagram): spans IBM P6 (and BlueGene/P), IBM P5, IBM PPC, NEC SX8, Cray XT4/XT5 and SGI Altix systems, running AIX, Linux, Super-UX and UNICOS/lc, with batch systems including LoadLeveler(-MC), NQS II, PBS Pro and Maui/Slurm; GridFTP is used for transfers.
  • 49. DEISA Software Layers (diagram): network and AAA layers (network connectivity, unified AAA); data management layer (WAN shared file system, data staging tools, data transfer tools); job management and monitoring layer (job rerouting, workflow management, co-reservation and co-allocation, single monitor system); presentation layer (common production environment, multiple ways to access) across the DEISA sites.
  • 50. Supercomputer hardware performance pyramid vs. application enabling requirements pyramid (EU / national / local)
    Capability computing will always need expert support for application enabling and optimization.
    The more resource-demanding a single problem is, the higher, generally, are the requirements for application enabling, including enhancing scalability.
  • 51. DEISA Organizational Structure
    WP1 Management
    WP2 Dissemination, External Relations, Training
    WP3 Operations
    WP4 Technologies
    WP5 Applications Enabling
    WP6 User Environment and Support
    WP7 Extreme Computing (DECI) and Benchmark Suite
    WP8 Integrated DEISA Development Environment
    WP9 Enhancing Scalability
  • 52. Evolution of Supercomputing Resources
    2004, at DEISA project start: ~30 TF aggregated peak performance across the partners' compute resources.
    2008, at DEISA2 project start: over 1 PF aggregated peak performance on state-of-the-art supercomputers.
    Systems: Cray XT4 and XT5 (Linux); IBM Power5 and Power6 (AIX/Linux); IBM BlueGene/P (Linux frontend); IBM PowerPC (Linux, MareNostrum); SGI Altix 4700 (Itanium2 Montecito, Linux); NEC SX8 vector system (Super-UX).
    Systems are interconnected with dedicated 10 Gb/s network links provided by GÉANT2 and the NRENs; a fixed fraction of resources is dedicated to DEISA usage.
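As a quick check on these figures, the growth implied by going from ~30 TF (2004) to over 1 PF (2008) can be computed directly; the sketch below is just this arithmetic, using only the two numbers quoted on the slide:

```python
# Implied growth of DEISA partners' aggregate peak performance,
# from ~30 TF at project start (2004) to ~1 PF at DEISA2 start (2008).
import math

start_tf, end_tf, years = 30.0, 1000.0, 4

growth_factor = end_tf / start_tf           # total growth over four years
annual_rate = growth_factor ** (1 / years)  # per-year multiplier

# Time to double aggregate performance at that rate, in months.
doubling_months = 12 * math.log(2) / math.log(annual_rate)

print(f"Total growth: {growth_factor:.0f}x over {years} years")
print(f"Annual growth: {annual_rate:.2f}x per year")
print(f"Doubling time: {doubling_months:.1f} months")
```

At roughly 2.4x per year the aggregate doubles in well under a year, which is consistent with the slide's later remark that single national machines decay quickly relative to Moore's law.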
  • 53. DEISA Extreme Computing Initiative (DECI)
    DECI was launched in early 2005 to enhance DEISA's impact on science and technology.
    Identification, enabling, deployment and operation of "flagship" applications in selected areas of science and technology: complex, demanding, innovative simulations requiring the exceptional capabilities of DEISA. Multi-national proposals are especially encouraged.
    Proposals are reviewed by national evaluation committees; projects are chosen on the basis of innovation potential, scientific excellence, relevance criteria and national priorities.
    The most powerful HPC architectures in Europe for the most challenging projects: the most appropriate supercomputer architecture is selected for each project.
    Mitigation of the rapid performance decay of a single national supercomputer within its short lifetime cycle of typically about 5 years, as implied by Moore's law.
  • 54. DEISA Extreme Computing Initiative: involvements in projects from the DECI calls 2005, 2006 and 2007 comprise 157 research institutes and universities from 15 European countries (Austria, Finland, France, Germany, Hungary, Italy, Netherlands, Poland, Portugal, Romania, Russia, Spain, Sweden, Switzerland, UK), with collaborators from four other continents (North America, South America, Asia, Australia).
  • 55. DEISA Extreme Computing Initiative: calls for proposals for challenging supercomputing projects from all areas of science
    DECI call 2005: 51 proposals, 12 European countries involved, co-investigators from the US; 30 mio cpu-h requested; 29 proposals accepted, 12 mio cpu-h awarded (normalized to IBM P4+)
    DECI call 2006: 41 proposals, 12 European countries involved, co-investigators from North and South America and Asia (US, CA, AR, Israel); 28 mio cpu-h requested; 23 proposals accepted, 12 mio cpu-h awarded (normalized to IBM P4+)
    DECI call 2007: 63 proposals, 14 European countries involved, co-investigators from North and South America, Asia and Australia (US, CA, BR, AR, Israel, AUS); 70 mio cpu-h requested; 45 proposals accepted, ~30 mio cpu-h awarded (normalized to IBM P4+)
    DECI call 2008: 66 proposals, 15 European countries involved, co-investigators from North and South America, Asia and Australia; 134 mio cpu-h requested (normalized to IBM P4+); evaluation in progress
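The oversubscription in these calls can be tabulated directly from the figures above (a small sketch; only the numbers quoted on the slide are used, and the 2008 call is omitted since its evaluation was still in progress):

```python
# DECI calls 2005-2007: proposals vs. acceptances and requested vs.
# awarded CPU hours (figures as quoted on the slide; cpu-h normalized
# to IBM P4+, the 2007 award is approximate).
calls = {
    2005: dict(proposals=51, accepted=29, requested_mio=30, awarded_mio=12),
    2006: dict(proposals=41, accepted=23, requested_mio=28, awarded_mio=12),
    2007: dict(proposals=63, accepted=45, requested_mio=70, awarded_mio=30),
}

for year, c in calls.items():
    acceptance = c["accepted"] / c["proposals"]
    oversubscription = c["requested_mio"] / c["awarded_mio"]
    print(f"{year}: {acceptance:.0%} of proposals accepted, "
          f"{oversubscription:.1f}x more cpu-h requested than awarded")
```

Every call was oversubscribed by more than a factor of two in CPU hours, even though over half of the proposals were accepted, which illustrates the demand pressure on tier-1 resources described elsewhere in the deck.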
  • 56. DECI project POLYRES: cover story of Nature, May 24, 2007
    Curvy membranes make proteins attractive. For almost two decades, physicists have been on the track of membrane-mediated interactions. Simulations in DEISA have now revealed that curvy membranes make proteins attractive. Nature 447 (2007), 461–464.
    (Figures: proteins (red) adhere on a membrane (blue/yellow) and locally bend it; this triggers a growing invagination. Cross-section through an almost complete vesicle.)
    B. J. Reynwar et al.: Aggregation and vesiculation of membrane proteins by curvature-mediated interactions, Nature 447, 461–464 (24 May 2007), doi:10.1038/nature05840
  • 57. Achievements and scientific impact: brochures can be downloaded from http://www.deisa.eu/publications/results
  • 58. Evolution of User Categories in DEISA (timeline, 2002–2011): DEISA EoI; early adopters (Joint Research Activities); DEISA Extreme Computing Initiative; single project support; support of virtual communities and EU projects. Milestones: preparatory phase, start of FP6 DEISA, start of FP7 DEISA2.
  • 59. Tier-0 / Tier-1 Centers: are there implications for the services?
    The main difference between T0 and T1 centers is policy and usage models. T1 centers can evolve to T0 for strategic/political reasons; T0 machines automatically degrade to T1 level by aging.
    T0 centers: leadership-class European systems in competition with the leading systems worldwide, cyclically renewed; governance structure to be provided by a European organization (PRACE).
    T1 centers: leading national centers, cyclically renewed, optionally surpassing the performance of older T0 machines; national governance structure.
    Services have to be the same in T0/T1, both because the status of the systems changes over time and for user transparency across the different systems (some services could have different flavors for T0 and T1).
  • 60. Summary
    Evolution of this European infrastructure towards a robust and persistent European HPC ecosystem.
    Enhancing the existing services, deploying new services including support for European virtual communities, and cooperating and collaborating with new European initiatives, especially PRACE.
    DEISA2 as the vector for the integration of Tier-0 and Tier-1 systems in Europe: to provide a lean and reliable turnkey operational solution for a persistent European HPC infrastructure.
    Bridging worldwide HPC projects: to facilitate the support of international science communities with computational needs traversing existing political boundaries.
  • 61. EGEE Status April 2009. Infrastructure: number of sites connected to the EGEE infrastructure: 268
  • 62. Number of countries connected to the EGEE infrastructure: 54
  • 63. Number of CPUs (cores) available to users 24/7: ~139,000
  • 64. Storage capacity available: ~25 PB disk + 38 PB tape MSS. Users: number of Virtual Organisations using the EGEE infrastructure: >170
  • 65. Number of registered Virtual Organisations: >112
  • 66. Number of registered users: > 13000
  • 67. Number of people benefiting from the existence of the EGEE infrastructure: ~20000
  • 68. Number of jobs: >390k jobs/day
  • 69. Number of application domains making use of the EGEE infrastructure: more than 15. (Diagram: testbeds → utility → productive use, and national → European e-infrastructure → global. Are we ready for the demand?)
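To put the status figures in perspective, the averages they imply can be computed directly (a sketch using only the numbers quoted above; the "fully busy" assumption in the last line is an upper bound, not a measured utilization):

```python
# Averages implied by the April 2009 EGEE status figures.
jobs_per_day = 390_000   # ">390k jobs/day"
cores = 139_000          # "~139,000 CPUs (cores) available 24/7"
sites = 268              # sites connected to the infrastructure

jobs_per_second = jobs_per_day / 86_400
cores_per_site = cores / sites
# Core-hours per job if every core were busy around the clock
# (an assumed upper bound on average job size).
core_hours_per_job = cores * 24 / jobs_per_day

print(f"~{jobs_per_second:.1f} jobs started per second on average")
print(f"~{cores_per_site:.0f} cores per site on average")
print(f"<= ~{core_hours_per_job:.0f} core-hours per job at full utilization")
```

The point of the arithmetic: a grid of this size is dominated by a high rate of comparatively small jobs, which is the capacity-computing profile the tier pyramid earlier in the deck assigns to EGEE.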
  • 71. EGI Objectives (1/3)
    Ensure the long-term sustainability of the European infrastructure.
    Coordinate the integration and interaction between National Grid Infrastructures.
    Operate the European level of the production Grid infrastructure for a wide range of scientific disciplines, linking the National Grid Infrastructures.
    Provide global services and support that complement and/or coordinate national services.
    Collaborate closely with industry, as technology and service providers as well as Grid users, to promote the rapid and successful uptake of Grid technology by European industry.
  • 72. EGI Objectives (2/3)
    Coordinate middleware development and standardization to enhance the infrastructure by soliciting targeted developments from leading EU and national Grid middleware development projects.
    Advise national and European funding agencies in establishing their programmes for future software developments, based on agreed user needs and development standards.
    Integrate, test, validate and package software from leading Grid middleware development projects and make it widely available.
  • 73. EGI Objectives (3/3)
    Provide documentation and training material for the middleware and operations, taking into account developments made by national e-science projects which were aimed at supporting diverse communities.
    Link the European infrastructure with similar infrastructures elsewhere.
    Promote Grid interface standards based on practical experience gained from Grid operations and middleware integration activities, in consultation with relevant standards organizations.
    EGI Vision Paper: http://www.eu-egi.org/vision.pdf
  • 74. Integration and interoperability
    PRACE and EGI target a sustainable infrastructure; DEISA-2 and EGEE-III are project based.
    Sometimes national stakeholders are partners in multiple initiatives.
    Users do not necessarily care where they get the service, as long as they get it.
    Integration of PRACE and DEISA, and a transition from EGEE to EGI, are possible; going further requires creative thinking.
  • 75. New HPC Ecosystem is being built…
  • 76. New market for European HPC
    The ESFRI list contains 44 new research infrastructure projects, 34 of them running a preparatory phase project of 1–4 years and 1–7 MEUR × 2 (petaflop computing: 10 MEUR × 2).
    Successful new research infrastructures start construction in 2009–2011, at 10–1000 MEUR per infrastructure; the first ones are starting to deploy (ESS in Lund etc.).
    Existing research infrastructures are also developing: CERN, EMBL, ESA, ESO, ECMWF, ITER, …
    Results: a growing RI market with considerably rising funding volume, several BEUR for ICT; a need for horizontal activities (computing, data, networks, computational methods and scalability, application development, …); a real danger of building disciplinary silos instead of searching for IT synergy.
  • 77. Some key issues in building the ecosystem
    Sustainability: EGEE and DEISA are projects with an end; PRACE and EGI are targeted to be sustainable, with no definitive end.
    ESFRI and e-IRG: how do the research side and the infrastructure side work together? Two-directional input is requested.
    Requirement for horizontal services: let's not create disciplinary IT silos; synergy is required for cost efficiency and excellence.
    ICT infrastructure is essential for research: the role of computational science is growing.
    Renewal and competence: will Europe run out of competent people? Will training and education programs react fast enough?
  • 78. Requirements of a sustainable HPC ecosystem
    How to guarantee access to the top for selected groups?
    How to ensure there are competent users who can use the high-end resources?
    How to involve all countries who can contribute?
    How to develop competence on home ground?
    How to boost collaboration between research and e-infrastructure providers?
    What are the principles of resource exchange (in-kind)?
    (Pyramid: European centers; national/regional centers, Grid collaboration; universities and local centers.)
  • 80. Some conclusions
    There are far too many acronyms in this field.
    We need to collaborate in providing e-infrastructure: from disciplinary silos to horizontal services; building trust between research and service providers; moving from project-based work to sustainable research infrastructures.
    Balanced approach: focus not only on computing but also on data, software development and competence.
    Driven by user community needs: technology is a tool, not a target.
    The ESFRI list and other development plans will boost the market for ICT services in research.
    Interoperability and integration of initiatives will be seriously discussed.
  • 81. Final words to remember: "The problems are not solved by computers nor by any other e-infrastructure; they are solved by people." Kimmo Koski, today