
On-Site
Full-Time
Gurugram, Haryana
India
Skills
Amazon Web Services (AWS)
SQL
Snowflake
Amazon S3
AWS Glue
About the Role
* Key Responsibilities:
Design, build, and maintain scalable data pipelines using DBT and Airflow.
Develop and optimize SQL queries and data models in Snowflake.
Implement ETL/ELT workflows, ensuring data quality, performance, and reliability.
Work with Python for data processing, automation, and integration tasks.
Handle JSON data structures for data ingestion, transformation, and APIs.
Leverage AWS services (e.g., S3, Lambda, Glue, Redshift) for cloud-based data solutions.
Ensure compliance with data security and privacy regulations such as GLBA, PCI-DSS, GDPR, CCPA, and CPRA by implementing proper data encryption, access controls, and data retention policies.
Collaborate with data analysts, engineers, and business teams to deliver high-quality data products.
* Requirements:
6–12 years of experience in data engineering or related roles.
Strong expertise in SQL, Snowflake, and DBT for data modeling and transformation.
Proficiency in Python and Airflow for workflow automation.
Experience working with AWS cloud services.
Ability to handle JSON data formats and integrate APIs.
Understanding of data governance, security, and compliance frameworks related to financial and personal data regulations (GLBA, PCI-DSS, GDPR, CCPA, CPRA).
Strong problem-solving skills and experience in optimizing data pipelines.
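As an illustration of the JSON-handling responsibility listed above, here is a minimal Python sketch (function and field names are hypothetical, not part of the role's actual codebase) that flattens nested API records into single-level rows, a common staging step before warehouse loading and dbt transformations:

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a single-level dict.

    Nested keys are joined with `sep`; lists are serialized back to JSON
    strings so every value fits in one column when staging semi-structured
    data.
    """
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the prefixed key.
            items.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            # Keep arrays as JSON strings rather than exploding them.
            items[new_key] = json.dumps(value)
        else:
            items[new_key] = value
    return items

# Example API payload
payload = '{"id": 7, "user": {"name": "a", "tags": ["x", "y"]}}'
row = flatten(json.loads(payload))
print(row)  # {'id': 7, 'user_name': 'a', 'user_tags': '["x", "y"]'}
```

Serializing lists instead of exploding them is just one design choice; depending on the pipeline, arrays might instead be unnested into child tables.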