
About the Role
Senior GCP Data Engineer
Immediate joiners, or candidates who can join within 15 days.
Background verification (BGV) is mandatory.
Total Experience: 5-8 years
Location: Pune, Bengaluru, Noida, Bhopal, Chennai, Navi Mumbai, Hyderabad
Employment Type: Full-Time
Mode: Hybrid (3 days per week in office, mandatory)
Salary: 16-18 LPA
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Apache Airflow (via Cloud Composer) on GCP
• Build and manage data flows using Pub/Sub for real-time streaming and BigQuery for data warehousing
• Optimize SQL queries and Python scripts for performance and scalability
• Implement real-time and batch data processing pipelines with a focus on reliability and fault tolerance
• Monitor, debug, and enhance workflows using Airflow DAGs and Composer logs
• Ensure data quality and consistency across pipelines
• Collaborate with cross-functional teams to understand data requirements and deliver robust solutions
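
To give candidates a flavor of the orchestration work described above, here is a minimal sketch of a Cloud Composer (Airflow 2.x) DAG that runs a daily BigQuery rollup. The DAG id, project, dataset, and table names are hypothetical placeholders, not references to an actual codebase.

# Minimal sketch of a daily batch pipeline on Cloud Composer (Airflow 2.x).
# All project/dataset/table names are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "retries": 2,                        # retry failed tasks for fault tolerance
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # one run per day
    catchup=False,
    default_args=default_args,
) as dag:
    # Aggregate yesterday's raw events into a reporting table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT DATE(event_ts) AS day, SUM(amount) AS total
                    FROM `my-project.raw.sales_events`
                    WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                    GROUP BY day
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "reporting",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

Retries with a delay, daily scheduling, and an idempotent WRITE_TRUNCATE destination are the kind of reliability-focused choices this role calls for.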
________________________________________
Must-Have Skills & Experience:
• 2+ years of hands-on experience with Google Cloud Platform (GCP)
• Expertise in BigQuery for data storage, querying, and optimization
• Proficient in Python for scripting and pipeline development
• Strong SQL skills for querying and transforming large datasets
• Experience with Cloud Composer / Apache Airflow for orchestrating workflows
• Familiarity with Pub/Sub for building streaming data pipelines
• Hands-on experience with streaming data processing and related architecture
• Knowledge of CI/CD practices and version control (Git)
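
As a small illustration of the Pub/Sub familiarity listed above, here is a minimal sketch using the google-cloud-pubsub Python client to publish a JSON event and consume it via a streaming pull. It assumes application-default credentials are configured; the project, topic, and subscription names are hypothetical.

# Minimal sketch: publish one JSON event and consume it with a streaming pull.
# Project, topic, and subscription names are hypothetical placeholders.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, "sales-events")

# Pub/Sub payloads must be bytes; result() blocks until the server acknowledges.
event = {"order_id": "123", "amount": 49.5}
message_id = publisher.publish(topic_path, json.dumps(event).encode("utf-8")).result()
print(f"Published message {message_id}")

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, "sales-events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Ack only after successful processing so failed messages are redelivered.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # block until shutdown completes

Acking only after successful processing, combined with Pub/Sub's redelivery of unacknowledged messages, underpins the at-least-once reliability the responsibilities above refer to.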