
On-Site
Full-Time
Pune, Maharashtra, India
Skills
Java
Extract, Transform, Load (ETL)
Apache Flink
About the Role
Job Title: Java + Flink Developer
Location: Pune
Experience: 8+ years
Notice Period: Immediate to 30 days (Candidates with more than 30 days’ notice will not be considered)
Job Description:
We are seeking a highly skilled Java + Flink ETL Developer with strong experience in big data processing and real-time streaming applications. The ideal candidate should have a deep understanding of Java, Apache Flink, and ETL processes, along with expertise in handling large-scale data pipelines.
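For context, below is a rough sketch of the kind of Flink + Java streaming ETL pipeline this role works on. It is a minimal example using the Flink DataStream API and the Kafka connector only; the topic name, broker address, group id, and the transformation step are illustrative placeholders, not details of the actual project.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class OrderEtlJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Extract: read raw records from a Kafka topic.
        // Broker address, topic, and group id below are placeholders for illustration only.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("orders-raw")
                .setGroupId("order-etl")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> raw =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "orders-source");

        // Transform: trim, drop blank records, and normalise case
        // as a stand-in for real business transformation logic.
        DataStream<String> cleaned = raw
                .map(String::trim)
                .filter(line -> !line.isEmpty())
                .map(String::toUpperCase);

        // Load: print to stdout here; a production job would write to a
        // database, filesystem, or downstream Kafka sink instead.
        cleaned.print();

        env.execute("order-etl");
    }
}
```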
Key Responsibilities:
Design, develop, and maintain high-performance data pipelines using Apache Flink and Java.
Work with real-time data streaming and batch processing frameworks to ensure scalability and reliability.
Develop and optimize ETL workflows for efficient data extraction, transformation, and loading.
Collaborate with data engineers, architects, and business stakeholders to understand requirements and deliver solutions.
Implement best practices for coding, testing, deployment, and monitoring of data applications.
Optimize data processing performance and troubleshoot bottlenecks or failures in the pipeline.
Ensure data quality, integrity, and security throughout the ETL process.
Required Skills & Experience:
8+ years of hands-on experience in Java development with strong problem-solving skills.
Proficiency in Apache Flink and real-time streaming technologies.
Solid understanding of ETL processes, data warehousing, and data transformation techniques.
Experience working with big data technologies such as Kafka, Spark, Hadoop, or similar.
Expertise in SQL, NoSQL databases, and data modeling.
Hands-on experience in cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a plus.
Strong experience in performance tuning and debugging of large-scale applications.
Familiarity with CI/CD pipelines and DevOps practices.
Strong problem-solving and analytical skills.
Excellent verbal and written communication skills to interact with technical and non-technical stakeholders.