Course Outline
Introduction:
- Apache Spark within the Hadoop Ecosystem
- Overview of Python and Scala
Core Concepts (Theory):
- System Architecture
- Resilient Distributed Datasets (RDD)
- Transformations and Actions
- Stages, Tasks, and Dependencies
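The core idea behind the transformations-and-actions model above is lazy evaluation: transformations only build an execution plan, and nothing runs until an action is invoked. A minimal plain-Python sketch of that model (a toy `ToyRDD` class invented here for illustration, not the Spark API):

```python
# Toy model of Spark's evaluation: transformations build a lazy plan,
# and nothing runs until an action is called. Plain Python, not the Spark API.

class ToyRDD:
    def __init__(self, plan):
        self._plan = plan  # zero-argument function producing a fresh iterator

    # --- transformations: return a new ToyRDD, compute nothing yet ---
    def map(self, f):
        return ToyRDD(lambda: (f(x) for x in self._plan()))

    def filter(self, pred):
        return ToyRDD(lambda: (x for x in self._plan() if pred(x)))

    # --- actions: force evaluation of the whole plan ---
    def collect(self):
        return list(self._plan())

    def count(self):
        return sum(1 for _ in self._plan())

rdd = ToyRDD(lambda: iter(range(10)))
evens_squared = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)  # still lazy
print(evens_squared.collect())  # [0, 4, 16, 36, 64]
print(evens_squared.count())    # 5
```

Because the plan is re-run on every action, calling `count()` after `collect()` recomputes everything; this is exactly the behavior that Spark's caching (covered in the workshop) exists to avoid.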
Hands-on Workshop: Mastering the Basics in the Databricks Environment
- RDD API exercises
  - Basic transformation and action functions
  - PairRDD operations
  - Joining datasets
  - Caching strategies
- DataFrame API exercises
  - SparkSQL
  - DataFrame operations: select, filter, group, sort
  - User Defined Functions (UDF)
- Exploration of the DataSet API
- Streaming capabilities
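To preview the semantics of two workshop topics, PairRDD aggregation and joining datasets, here is a plain-Python sketch of `reduceByKey`-style folding and an inner join on keys. The function names and data are invented for illustration; this is not the Spark API.

```python
# Plain-Python sketch of two PairRDD-style operations: reduceByKey and an
# inner join on keys. Illustrates the semantics only; not the Spark API.
from collections import defaultdict

def reduce_by_key(pairs, func):
    """Group (key, value) pairs by key, then fold each key's values with func."""
    acc = {}
    for k, v in pairs:
        acc[k] = func(acc[k], v) if k in acc else v
    return sorted(acc.items())

def inner_join(left, right):
    """Inner join of two (key, value) collections, like PairRDD.join."""
    by_key = defaultdict(list)
    for k, v in right:
        by_key[k].append(v)
    return sorted((k, (lv, rv)) for k, lv in left for rv in by_key.get(k, []))

words = ["spark", "cloud", "spark", "aws"]
counts = reduce_by_key([(w, 1) for w in words], lambda a, b: a + b)
print(counts)  # [('aws', 1), ('cloud', 1), ('spark', 2)]

sizes = [("glue", 1), ("emr", 2)]
labels = [("glue", "serverless"), ("emr", "cluster")]
print(inner_join(sizes, labels))  # [('emr', (2, 'cluster')), ('glue', (1, 'serverless'))]
```

In real Spark both operations shuffle data across the cluster by key; the workshop examines where those shuffle boundaries fall and what they cost.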
Hands-on Workshop: Deployment in the AWS Environment
- AWS Glue fundamentals
- Differentiating between AWS EMR and AWS Glue
- Sample job implementations in both environments
- Evaluating advantages and limitations
Additional Topics:
- Introduction to Apache Airflow orchestration
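Airflow's central abstraction is a pipeline expressed as a directed acyclic graph of tasks, where each task runs only after its upstream dependencies complete. A toy plain-Python scheduler (invented here to illustrate the idea; not the Airflow API, and with no cycle detection) for a classic extract-transform-load chain:

```python
# Toy illustration of DAG-based orchestration as popularized by Apache Airflow:
# tasks declare upstream dependencies and run in topological order.
# Plain-Python sketch, not the Airflow API (no cycle detection, retries, etc.).

def run_dag(tasks, deps):
    """tasks: {name: callable}; deps: {name: [upstream task names]}."""
    done, order = set(), []

    def visit(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            visit(upstream)      # run every upstream task first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

log = []
tasks = {
    "extract":   lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load":      lambda: log.append("load"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```

In Airflow itself the same dependency chain is declared with operators and the `>>` syntax, and the scheduler adds retries, backfills, and monitoring on top of this ordering idea.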
Requirements
Programming proficiency (preferably in Python and Scala)
Foundational knowledge of SQL
21 Hours
Testimonials (3)
Having hands-on sessions / assignments
Poornima Chenthamarakshan - Intelligent Medical Objects
Course - Apache Spark in the Cloud
1. Right balance between high level concepts and technical details. 2. Andras is very knowledgeable about his teaching. 3. Exercise
Steven Wu - Intelligent Medical Objects
Course - Apache Spark in the Cloud
Get to learn Spark Streaming, Databricks and AWS Redshift