Commit 519979f6 authored by Alessandro Costantini

typo corrected

parent e2817b6d
Branches: deep
No related tags found
1 merge request: !5 Deep
Pipeline #22124 passed
@@ -15,7 +15,7 @@
\begin{abstract}
DEEP Hybrid DataCloud is an Horizon 2020 project that addresses the need to support intensive computing
- techniques that require specialized HPC hardware, like GPUs or low-latency interconnects, to explore very large datasets.
+ techniques that require specialized HPC hardware, like GPUs or low-latency interconnects, to explore very large data sets.
Launched in November 2017 the H2020 DEEP Hybrid-DataCloud - DEEP-HDC is lasting for 30 months and is
combining the expertise of 10 large European research organisations.
The project proposes to deploy under the common label of “DEEP as a Service” a set of building blocks that
@@ -37,7 +37,7 @@ for a proper design of the learning task at hand, additional tasks result in the
distributed, the orchestration of various infrastructure components at different places. It is obvious that such
learning tasks cannot be managed by a user with domain knowledge in the field of application, only. Therefore,
support by the infrastructure layer must break down the complexity of the task and allow the user to focus on
- what she/he is skilled on, i.e., modelling of the problem, evaluating and interpreting the results of the machine
+ what she/he is skilled on, i.e., modeling of the problem, evaluating and interpreting the results of the machine
learning algorithms.
As a consequence, infrastructure providers have to understand the needs of their user communities and help
them to combine their services in a way that encapsulates technical details the end user does not have to deal with.
@@ -62,7 +62,7 @@ The DEEP-Hybrid-DataCloud project started with the global objective of promoting
computing services by different research communities and areas, and their support by the corresponding
e-Infrastructure providers and open source projects. Other objectives followed by the project are:
\begin{itemize}
- \item Focus on intensive computing techniques for the analysis of very large datasets considering highly demanding use cases.
+ \item Focus on intensive computing techniques for the analysis of very large data sets considering highly demanding use cases.
\item Evolve up to production level, intensive computing services exploiting specialized hardware.
\item Integrate intensive computing services under a hybrid cloud approach.
\item Define a “DEEP as a Service” solution to offer an adequate integration path to developers of final applications.
@@ -71,7 +71,7 @@ e-Infrastructure providers and open source projects. Other objectives followed b
The DEEP-Hybrid-DataCloud project aims to provide a bridge towards a more flexible exploitation of intensive
computing resources by the research community, enabling access to the latest technologies that require also
- last generation hardware and the scalability to be able to explore large datasets. It is structured into six
+ last generation hardware and the scalability to be able to explore large data sets. It is structured into six
different work packages, covering Networking Activities (NA) devoted to the coordination, communication
and community liaison; Service Activities (SA) focused on the provisioning of services and resources for the
execution of the data analysis challenges; and Joint Research Activities (JRAs), dealing with the development
@@ -135,12 +135,12 @@ management, maintenance and support.
In particular, INFN-CNAF is coordinating Task 3.2 - Software quality assurance, release, maintenance and support.
Expressed in terms of Software Quality Assurance and Software Release and Maintenance, CNAF is coordinating
the management of those software products that became officially part of the first DEEP releases, codenamed
- Genesis \cite{deep-genesis}, foreseen for late 2018 and effectivley released in January 2019.
+ Genesis \cite{deep-genesis}, foreseen for late 2018 and effectively released in January 2019.
INFN CNAF is also coordinating the implementation of the continuous software improvement process,
following a DevOps approach, through the definition and realization of an innovative Continuous
Integration (CI) and Delivery (CD) system.
INFN-CNAF is contributing also to Task 3.1 - Pilot testbeds and integration with EOSC platform and their services - by
- providing and maintainig the testbeds dedicated to developers, software integration and software preview.
+ providing and maintaining the testbeds dedicated to developers, software integration and software preview.
In particular, the activities were focused in implementing the services needed to support the software development and release
management and included among others the source code repository, and continuous integration system.
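
To make the role of that Continuous Integration system concrete, the following is a minimal, purely illustrative sketch of a per-component quality gate that such a CI job could execute. The tool names (flake8, pytest) and the script structure are assumptions chosen for illustration, not the project's documented pipeline definition.

import subprocess
import sys

# Each entry is one quality check the CI job runs for a software component.
# The tools below are assumed stand-ins for the project's actual SQA checks.
CHECKS = [
    ["flake8", "."],                  # static style analysis (assumed tool)
    ["pytest", "--maxfail=1", "-q"],  # unit tests (assumed tool)
]

def main() -> int:
    for cmd in CHECKS:
        print("Running:", " ".join(cmd))
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # A failing check blocks the release candidate from promotion.
            print("Quality gate failed at:", " ".join(cmd))
            return result.returncode
    print("All quality checks passed; artifact can be promoted.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
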
@@ -171,7 +171,7 @@ INFN-CNAF contributes to the JRA3 activities and task by supporting integration
composing a set of defined building blocks that will model the user application and deploying these applications as services that
can be offered to final users, as a way to deliver scientific results to a wider scope of stakeholders.
In particular, INFN-CNAF is testing the DEEP-Alien4Cloud \cite{A4C} plugin, elected as the tool able to provide an
- easy to use and intuitive application composition to deliver the DEEPssS solutions.
+ easy to use and intuitive application composition to deliver the DEEaaS solutions.
@@ -188,7 +188,7 @@ The architecture is depicted in Figure~\ref{fig-arch} and the main components ar
The PaaS Orchestrator is the core component of the PaaS layer. It receives high-level deployment requests and coordinates
the deployment process over the IaaS platforms.
- The Identity and Access Management (IAM) Service provides a layer where identities, enrolment, group membership, attributes
+ The Identity and Access Management (IAM) Service provides a layer where identities, enrollment, group membership, attributes
and policies to access distributed resources and services can be managed in an homogeneous and interoperable way.
The Monitoring Service is in charge of collecting monitoring data from the targeted clouds, analysing and transforming them into
@@ -196,7 +196,7 @@ The Monitoring Service is in charge of collecting monitoring data from the targe
The Cloud Provider Ranker (CPR) is a rule-based engine that allows to rank cloud providers in order to help the Orchestrator
to select the best one for the requested deployment. The ranking algorithm can take into account preferences specified by
the user and other information like SLAs and monitoring data.
- The SLA Management (SLAM) Service allows the handshake between users and a site on a given SLA.
+ The SLA Management (SLAM) service allows the handshake between users and a site on a given SLA.
The Managed Service/Application (MSA) Deployment Service is in charge of scheduling, spawning, executing and monitoring
applications and services on a distributed infrastructure; the core of this component consists of an elastic Mesos cluster with
slave nodes dynamically provisioned and distributed on the IaaS sites.
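
To illustrate how these components fit together from a user's perspective, the sketch below shows a client submitting a high-level deployment request to the PaaS Orchestrator, authenticated with an IAM-issued token. The endpoint URL, payload fields and response field are assumptions chosen for illustration, not the documented Orchestrator API.

import requests

# Assumed endpoint and token; both are placeholders for illustration only.
ORCHESTRATOR_URL = "https://paas-orchestrator.example.org/deployments"
IAM_ACCESS_TOKEN = "<OAuth2 access token obtained from the IAM service>"

# Minimal TOSCA-like template describing the topology to deploy (placeholder).
TOSCA_TEMPLATE = """
tosca_definitions_version: tosca_simple_yaml_1_0
topology_template:
  node_templates:
    my_service:
      type: tosca.nodes.Compute
"""

# Submit the deployment request; the Orchestrator is assumed to select the
# target site (e.g. using CPR ranking, SLAs and monitoring data) and to
# drive the actual provisioning on the chosen IaaS.
response = requests.post(
    ORCHESTRATOR_URL,
    headers={"Authorization": f"Bearer {IAM_ACCESS_TOKEN}"},
    json={"template": TOSCA_TEMPLATE, "parameters": {}},
    timeout=30,
)
response.raise_for_status()
print("Deployment submitted, id:", response.json().get("uuid"))  # field name assumed

In this picture the user never interacts with the individual IaaS sites directly; site selection and provisioning are delegated to the PaaS layer described above.
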
@@ -243,11 +243,11 @@ remaining backwards compatible with previous versions.
In the present contribution the DEEP-XDC project and its objectives have been presented and discussed.
These objective, together with the related needs proper of the research communities involved in the project, are
the real driver to develop innovative and reliable open source solutions able to fill up the technology gaps that currently
- prevent effective exploitation of distributed computing and storage resources by many scienti c communities.
+ prevent effective exploitation of distributed computing and storage resources by many scientific communities.
- For the second part of project, the activities carried on at INFN-CNAF will continue to ensure the fulfilment of the
+ For the second part of project, the activities carried on at INFN-CNAF will continue to ensure the fulfillment of the
project objectives.
- In particular, the already available softare solutions will be enriched by advanced functionalities (provided by JRAs)
+ In particular, the already available software solutions will be enriched by advanced functionalities (provided by JRAs)
aimed at addressing the use case requirements provided by NA2.
The implementation and related testing of those new solutions will be performed in the testbeds maintained by SA1.
SA1 will also continue its activities aimed at further validate the software, its robustness and scalability and will follow