
Hybrid
Contract
Hyderabad, Telangana, India
Skills
Java
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Unix
Machine Learning
PyTorch
Hadoop
Distributed Systems
Extract, Transform, Load (ETL)
Apache Spark
Scala
Big Data
Data Governance
About the Role
Job Title: Big Data Developer
Location: Hyderabad
Work mode: Hybrid (2 days work from office)
Experience: 4+ years
Responsibilities:
Develop and maintain scalable Big Data pipelines using Hadoop, Spark, and Scala (see the illustrative sketch after this list)
Work with AWS/GCP for data ingestion, processing, and storage
Write efficient code in Java, Scala, Python, SQL
Optimize data processing, ensure data quality & governance
Collaborate with data scientists/analysts to create actionable insights
Troubleshoot data pipeline issues, contribute to Big Data architecture evolution
Stay up to date with Big Data trends and technologies
Qualifications:
4+ years in Big Data development
Strong proficiency in Hadoop, Spark, Scala, Java, Python, and SQL
Experience deploying data pipelines on AWS/GCP
Solid knowledge of ETL, data modelling, data warehousing
Familiarity with distributed systems and Unix/Linux
Proficient in data governance, security, Agile methodologies
Bonus Skills:
Streaming (Kafka, Kinesis), NoSQL (Cassandra/MongoDB); see the streaming sketch after this list
Familiarity with machine learning (TensorFlow, PyTorch)
AWS/GCP certifications
Open-source contributions