
Remote
Full-Time
India
About the Role
Job Title: Senior Data Engineer – GCP & ETL
Experience: 6–7 Years
Location: Anywhere (Remote-friendly)
We are looking for a highly skilled Senior Data Engineer with hands-on experience in Google Cloud Platform (GCP) and modern data engineering tools and frameworks. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and data warehouse solutions to support advanced analytics and business intelligence initiatives.
Key Responsibilities
Design and implement robust ETL pipelines using Airflow, Python, and SQL.
Develop scalable data integration solutions on Google Cloud Platform (GCP).
Optimize and manage data warehousing environments for performance and scalability.
Work with Spark 3 for big data processing and transformation.
Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
Use Git for version control and collaboration in a fast-paced agile environment.
Required Skills
Strong hands-on experience with GCP data services (e.g., BigQuery, Dataflow, Cloud Storage).
Proven expertise in Data Warehousing concepts and implementation.
Solid experience with Airflow 2 for workflow orchestration.
Proficient in Python 3 for scripting and automation.
Advanced knowledge of SQL for data manipulation and querying.
Experience with Spark 3 in distributed data processing.
Familiarity with ETL/ELT development and data pipeline architecture.
Proficient with Git and modern software development practices.
For more details, please call +919717644033.