1 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Apache Zeppelin + Livy:
Bringing Multi Tenancy
to Interactive Data Analysis
Prabhjyot Singh & Jeff Zhang
April 14, 2016
2 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Web-based notebook
that enables interactive
data analytics.
You can make beautiful
data-driven, interactive
and collaborative
documents with SQL,
Scala and more
What’s Apache Zeppelin?
3 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Interactive Analysis 1.0 (Spark-shell)
4 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Interactive Analysis 2.0 (Zeppelin)
Spark Interpreter
5 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Interactive Analysis 3.0 (Zeppelin + Livy)
Livy Interpreter
6 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Open Source Activity
7 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Quick Stats: Zeppelin
• Incubated by the Apache Foundation, first PR – Mar 2015
• GitHub history dates to Jul 2013, pre-incubation
• 9 committers, 100+ contributors, growing list
• ~800 JIRAs filed
• ~900 PRs via the community
• Zeppelin just got a new friend: “R”
• Zeppelin graduation is on the cards
8 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Architecture & Usage
9 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Zeppelin Architecture
Current Interpreter Support
• HDFS
• PySpark, SparkR, Spark
• Hive
• HBase
• Phoenix
• Shell
• SQL
• …
10 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Zeppelin Features
Collate/Load Data – Collate/load data from existing data sources; load from external CSVs, e.g. Eureka, SmartSense
Visualize – Robust visualization mechanism to visualize data and enable insights
Collaborate – Notebook-based collaboration and export of notebooks; tagging of notebook-generated data to be added soon
11 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Popular Usage Scenarios
Customized Dashboards – Building customized dashboards for Big Data clusters
Security Analytics – Understanding the nature of data coming in from multiple sources and analyzing its effects
Bio-sciences – Medical research companies are interested in using this for their research
12 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Bringing Multi-tenancy to Zeppelin
13 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Multi-Tenancy: Motivation
Objectives
• Supporting workloads of multiple customers
• Supporting multiple LOBs (lines of business) on a single data system
• Supporting fine-grained audits
Requirements
• Inability to provision capacity for multiple user groups
• Inability to audit user actions, as all jobs are run via the ‘zeppelin’ proxy user
• Inability to share state/data with other users
14 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Zeppelin Livy Interaction
[Diagram: security across Zeppelin–Livy–Spark — Zeppelin (Shiro, backed by LDAP) uses the Livy Spark group interpreter to call the Livy APIs over SPNEGO/Kerberos; Livy launches Spark on YARN, secured with Kerberos]
15 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Deep dive on Livy
16 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
What is Livy?
Livy is an open source REST interface for interacting with Spark from anywhere.
[Diagram: Livy Client ↔ HTTP ↔ Livy Server ↔ HTTP (RPC) ↔ Spark Interactive Session (SparkContext) and Spark Batch Session (SparkContext)]
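To make the REST surface concrete, here is a minimal illustrative call — a sketch that assumes a Livy server listening on localhost:8998, the address used in the later examples — which lists the sessions the server currently manages:

# List the interactive and batch sessions currently known to the Livy server.
curl -s http://localhost:8998/sessions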
17 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Why we need Livy with Zeppelin
Reduce the pressure on the client machine
Make job submission/monitoring easy
Customize the job schedule
18 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Interactive Session – Create Session
Request:
curl -X POST --data '{"kind": "spark"}' -H "Content-Type: application/json" localhost:8998/sessions
Response:
{"state":"starting","proxyUser":null,"id":1,"kind":"spark","log":[]}
[Diagram: (1) the Livy Client sends the request to the Livy Server, (2) the server launches the Spark Interactive Session, (3) the session (SparkContext) reports its address back to the server, (4) the server returns the session status to the client]
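Session startup is asynchronous, so a client typically polls the new session until its state changes from starting to idle. A minimal sketch, reusing the endpoint and the session id 1 returned above:

# Poll the session until its "state" field becomes "idle"; it is then ready to accept code.
curl -s localhost:8998/sessions/1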
19 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Interactive Session – Execute Code
Request:
curl localhost:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"sc.parallelize(0 to 100).sum()"}'
Response:
{"id":0,"state":"running","output":null}
[Diagram: (1) the Livy Client sends the request to the Livy Server, (2) the server forwards it to the Spark Interactive Session, (3) the session (SparkContext) executes the code and sends the output back to the server, (4) the server returns the output to the client]
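The output is null because the statement runs asynchronously; the result is fetched with a later status request. A minimal sketch, reusing session 0 from the example above:

# Pull the statements of session 0; once a statement has finished, its "output" field carries the result.
curl -s localhost:8998/sessions/0/statements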
20 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
SparkContext Sharing
[Diagram: Client 1 and Client 2 both attach to Session-1, backed by SparkSession-1 and its SparkContext; Client 3 uses Session-2, backed by SparkSession-2 and its SparkContext; all sessions live in the Livy Server]
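In non-secure mode, sharing is simply a matter of addressing the same session id. A sketch (reusing the statement endpoint above, with hypothetical code): two clients that post to session 1 see each other's definitions because both run in the same SparkContext.

# Client 1 defines an RDD in session 1 ...
curl -s -X POST -H 'Content-Type: application/json' -d '{"code":"val shared = sc.parallelize(1 to 10)"}' localhost:8998/sessions/1/statements
# ... and Client 2, posting to the same session, can use it.
curl -s -X POST -H 'Content-Type: application/json' -d '{"code":"shared.count()"}' localhost:8998/sessions/1/statements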
21 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Livy Security
[Diagram: Client —SPNEGO→ Livy Server (impersonation) —shared secret→ SparkSession]
• Only authorized users can launch a Spark session / submit code
• Each user can access only their own session
• Only the Livy server can submit jobs securely to the Spark session
22 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
SPNEGO
Simple and Protected GSSAPI Negotiation Mechanism (SPNEGO), often pronounced “spen-go”, is a GSSAPI “pseudo mechanism” used by client-server software to negotiate the choice of security technology.
[Diagram: Client (Kerberos TGT) ↔ Livy Server (SPNEGO enabled):
(1) HTTP GET http://site/a.html
(2) Error 401 Unauthorized
(3) HTTP GET request with an Authorization: Negotiate header
(4) HTTP response with the page content]
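curl can drive this negotiation itself. A minimal sketch, assuming the client already holds a Kerberos TGT (obtained with kinit), curl is built with GSS-API support, and hypothetical principal and host names:

# Obtain a TGT, then let curl perform the SPNEGO handshake against the Livy REST API.
kinit alice@EXAMPLE.COM
curl -s --negotiate -u : http://livy-server:8998/sessions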
23 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Impersonation
[Diagram: Alice (Kerberos TGT) and Bob (Kerberos TGT) each authenticate to the Livy Server (super user: livy) over SPNEGO; the server launches a separate Spark Session for each of them, protected by the shared secret]
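With impersonation, the Spark session is launched as the requesting user rather than as the livy super user; the create-session request can also carry an explicit proxyUser, which is the field seen in the earlier create-session response. A sketch with a hypothetical user and host name:

# Create a session that Livy will launch as user "alice" instead of the livy super user.
curl -s --negotiate -u : -X POST -H 'Content-Type: application/json' -d '{"kind":"spark","proxyUser":"alice"}' http://livy-server:8998/sessions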
24 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Shared Secret
1. The Livy Server generates a secret key
2. The Livy Server passes the secret key to the Spark session when launching it
3. The two sides use the secret key to communicate securely with each other
[Diagram: Livy Server ↔ shared secret ↔ Spark Session]
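Conceptually (this is an illustration of the idea, not Livy's actual wire format), a shared secret lets each side verify that a message really came from the other party, for example by attaching an HMAC computed with the secret:

# Both sides hold the same secret; a message is accepted only if its HMAC matches.
SECRET="generated-by-livy-server"   # hypothetical value, passed to the Spark session at launch
echo -n '{"job":"collect"}' | openssl dgst -sha256 -hmac "$SECRET"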
25 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Multi-Tenant Zeppelin: Demo
26 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Q & A
27 © Hortonworks Inc. 2011 – 2016. All Rights Reserved
Thank You


Editor's Notes

1. So this is interactive analysis 1.0, which is spark-shell. Most of you are probably familiar with spark-shell if you have some experience with Spark. Spark-shell provides a nice interactive environment for writing Spark programs, but it lacks many features needed for interactive analysis, such as visualization and code management.
2. This is where Zeppelin comes in. Zeppelin brings many nice features to interactive analysis, such as visualization and collaboration; we can call it interactive analysis 2.0. By default Zeppelin uses the native Spark interpreter, which has some limitations, such as not being able to run Spark in yarn-cluster mode. That means the driver runs on your client machine, which can put heavy pressure on it. Besides that, you cannot share a SparkContext across multiple Zeppelin instances.
3. So now we leverage Livy as Zeppelin's Spark interpreter. With Livy we can run the Spark interpreter in yarn-cluster mode, and we can also share a SparkContext across multiple Zeppelin instances. We call Zeppelin + Livy interactive analysis 3.0.
4. So what is Livy? Livy is an open source REST interface for interacting with Spark from anywhere. Here is the diagram of Livy's overall architecture. There are three layers: on the far left is the Livy client, and in the middle is the Livy server. The Livy client communicates with the Livy server through the REST API, i.e. over HTTP. The client can ask the server to do many things, such as launching a Spark application, submitting a Spark job, polling job status, and even submitting a single piece of Spark code. Livy currently supports two kinds of Spark session: the interactive session and the batch session. Since today's talk is about interactive analysis, we will focus on the interactive session. In Livy 0.1 the communication between the Livy server and the Spark session was HTTP, while in the latest code it has been changed to RPC.
5. Overall, Livy is a central place for launching Spark jobs, and it brings several benefits. First, it reduces the pressure on the client machine: nothing runs on the client except REST API calls. Second, it makes job submission and monitoring easy: without Livy you have to install Spark on your client machine and use spark-submit, whereas with Livy you just call the REST API. Third, you can customize job scheduling: since all job submission goes through the Livy server, the server can do the scheduling (this feature is not implemented yet, but it is possible).
6. Now let's talk about how Livy handles the interactive session, starting with how it creates a session. Before you submit any code, you need to create a session. Here we use curl to invoke the REST API. This is a POST request; we specify the kind as spark (it can also be pyspark or sparkr) and the URL of the REST endpoint. This is the response we get: it contains the state of the session, here starting, and the proxyUser, here null. Now let's see how that request is routed. First the Livy client sends the request to the Livy server, then the Livy server launches the session. After the Spark session is created, it sends its address back to the Livy server so that a connection can be established between the two. Finally, the Livy server returns the session status to the Livy client.
7. Now let's see how Livy executes code. Here is the request we send: it contains the code we want to execute, and we also specify the REST API URL. And here is the response, which contains the statement id, state, and output. Notice that the output is null, because this piece of code does not finish quickly; we can get the output later by issuing another status request. Now let's see how this request is routed. First the Livy client sends the request to the Livy server. The Livy server forwards the request to its Spark session. The Spark session executes the code and sends the output back to the Livy server. Finally, the Livy server returns the output to the Livy client.
8. Now let's talk about SparkContext sharing. Because clients do not own the Spark session, all Spark sessions are launched by the Livy server, and that is what makes SparkContext sharing possible. Here we can see that client-1 and client-2 use the same Spark session (session-1), while client-3 uses its own session (session-2). When a client interacts with the Livy server, it needs to specify the session id, so as long as two clients specify the same session id, they are using the same SparkContext. Of course this is for non-secure mode; it is more complicated in secure mode.
9. Now let's talk about security. There are mainly three security problems we need to solve. First, we need to make sure that only authorized users can launch Spark sessions; we don't want everyone to be able to launch a Spark session through the Livy server. Second, each user should be able to access only their own session. Third, only the Livy server should be able to submit jobs securely to the Spark session. To solve these three problems we use several techniques: SPNEGO, impersonation, and a shared secret; I will talk about them one by one. SPNEGO is used between the Livy client and the Livy server to make sure that only authorized users can launch Spark sessions and submit code. Impersonation makes sure each user can access only their own session: without impersonation, every Spark session is launched as the user who started the Livy server process, but with impersonation, the Spark session is launched as the client's user. The shared secret protects the communication between the Livy server and the Spark session; only those two know the shared secret.
10. First let's talk about SPNEGO. SPNEGO makes sure that only authorized users can launch Spark sessions or submit code to the Livy server. Its full name is Simple and Protected GSSAPI Negotiation Mechanism. It is a GSSAPI "pseudo mechanism" used by client-server software to negotiate the choice of security technology, so it is pluggable with respect to the underlying security technology, but most often it is used with Kerberos. Now let's see how it works. First the client sends a request to the server. The server responds with status code 401, which means unauthorized. The client then sends the request again, this time including the Kerberos service ticket information. Finally the server authorizes the user with the ticket information and responds with the content of the page.
11. The next thing is impersonation. We want to protect each user's session: we don't want user Alice to be able to access user Bob's session, for security reasons. The Livy server process is launched by the super user livy. Without impersonation, every Spark session is launched as user livy, but with impersonation, the Spark session can be launched as the client's user. This is very similar to impersonation in HiveServer2. To enable impersonation, we need to make the corresponding configuration changes in core-site.xml.
12. The last thing we will talk about is the shared secret. Once the Spark session is started, it can accept requests from outside, but we don't want anyone except the Livy server to connect to it. So we use the shared secret to protect the communication between the Livy server and the Spark session; only those two know the shared secret. Now let's see how it works. The Livy Server generates a secret key and passes it to the Spark session when launching it; the two then use the secret key to communicate with each other.