
On-Site
Full-Time
Noida, Uttar Pradesh
India
Skills
Python
Apache Kafka
SQL
Data Warehousing
Hadoop
PySpark
Apache Spark
Big Data
Data Warehouse Architecture
About the Role
Are you a passionate Spark and Scala developer looking for an exciting opportunity to work on cutting-edge big data projects? Look no further! Delhivery is seeking a talented and motivated Spark & Scala Expert to join our dynamic team.
Responsibilities:
Develop and optimize Spark applications to process large-scale data efficiently
Collaborate with cross-functional teams to design and implement data-driven solutions
Troubleshoot and resolve performance issues in Spark jobs
Stay up to date with the latest trends and advancements in Spark and Scala technologies
Requirements:
Proficiency with data pipelines and Kafka, including Kafka Streams and connectors
2+ years of professional experience with Big Data systems, pipelines, and data processing
Strong experience with Apache Spark, Spark Streaming, and Spark SQL
Solid understanding of distributed systems, databases, system design, and big data processing frameworks
Familiarity with Hadoop ecosystem components (HDFS, Hive, HBase) is a plus