iXceed Solutions

Data Engineer

Noida, Uttar Pradesh, India
Hybrid · Full-Time

Skills

Google Cloud Platform (GCP) · Python · SQL · Data Engineering · Scripting · Extract, Transform, Load (ETL) · Airflow · dbt (Data Build Tool) · Google BigQuery · Data Mesh · Data Products

About the Role

Role: Cloud Data Engineer (GCP)
Experience: 8+ Years
Location: Hyderabad, Bengaluru, Chennai, Pune, Ahmedabad and Noida
Notice Period: Immediate to 15 days max.

Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow
Good to Have: dbt, Data Mesh

Job Description:

Key Responsibilities:
* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
Required Skills & Experience
* 10+ years (Lead) or 7+ years (Dev) of hands-on experience in designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets with performance optimization techniques.
* Deep experience working with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery.
* Working knowledge of Data Mesh principles and distributed data architectures (mandatory).
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.