
Crazy Solutions

Senior Data Engineer (Snowflake)

Remote | Full-Time | India

Skills

Python (Programming Language), SQL, Snowflake, ETL (Extract, Transform, Load), Data Modeling, Airflow

About the Role

Senior Data Engineer (Snowflake)
Job Description

Agivant is seeking a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.
Responsibilities:
Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS (a minimal example pipeline is sketched after this list).
Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
Stay current with industry best practices (CI/CD, DevSecFinOps, Scrum) and emerging technologies in data engineering.
Contribute to the development and enhancement of our data warehouse architecture.
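As a rough sketch of the kind of pipeline these responsibilities describe: the DAG below loads raw files from an S3 external stage into Snowflake, transforms them in-warehouse (ELT style), and finishes with a simple row-count quality check. It assumes Airflow with the apache-airflow-providers-snowflake package installed and a configured snowflake_default connection; every stage, table, and DAG name is an illustrative placeholder, not anything specified in this posting.

```python
"""Illustrative only: a minimal Airflow DAG for an S3-to-Snowflake ELT flow.

Assumes the apache-airflow-providers-snowflake package and a configured
"snowflake_default" connection; all stage, table, and DAG names are
hypothetical placeholders.
"""
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="s3_to_snowflake_elt",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Extract/Load: copy raw files from an external S3 stage into a raw table.
    load_raw = SnowflakeOperator(
        task_id="load_raw",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO raw.events
            FROM @raw.s3_events_stage
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
        """,
    )

    # Transform inside the warehouse (the "T" of ELT).
    transform = SnowflakeOperator(
        task_id="transform_daily_events",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE analytics.daily_events AS
            SELECT event_date, COUNT(*) AS event_count
            FROM raw.events
            GROUP BY event_date;
        """,
    )

    # Simple data-quality gate: division by zero fails the task if the
    # target table is empty.
    check_not_empty = SnowflakeOperator(
        task_id="check_not_empty",
        snowflake_conn_id="snowflake_default",
        sql="SELECT 1 / COUNT(*) FROM analytics.daily_events;",
    )

    load_raw >> transform >> check_not_empty
```

In practice the SQL would typically live in version-controlled files and the quality gate might use a dedicated check operator or testing framework; the sketch only shows the overall shape of the flow.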

Requirements (Mandatory):
Bachelor's degree in Computer Science, Engineering, or a related field.
5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
At least 3 years of experience with Snowflake data warehousing technologies.
At least 3 years of experience creating and maintaining Airflow ETL pipelines.
At least 3 years of professional experience with Python for data manipulation and automation.
Working experience with Elasticsearch and its application in data pipelines.
Proficiency in SQL and experience with data modeling techniques.
Strong understanding of cloud-based data storage solutions such as AWS S3.
Experience working with NFS and other file storage systems.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Job Type
Payroll
Categories
Data Engineer (Software and Web Development)
DevOps Engineers (Software and Web Development)
Technical Specialists (Information Design and Documentation)
Database Administrator (Software and Web Development)
Cloud Architects (Software and Web Development)
Must have Skills
Snowflake - 3 Years, Intermediate
ETL (Extract, Transform, Load) - 3 Years, Intermediate
Python - 3 Years, Intermediate
Apache Airflow - 3 Years, Intermediate
Elasticsearch - 1 Year, Beginner

