
Senior Cloud Data Engineer (Snowflake, DBT & ADF)
Bilvantis Technologies
Hyderabad, Telangana, India
On-site
Full-Time
Skills
Data Warehousing
Data Engineering
Azure DevOps Services
Data Pipelines
Snowflake Cloud
Azure Data Factory
Data Build Tool (DBT)
Fivetran ETL Tool
About the Role
Senior Cloud Data Engineer (Snowflake, DBT & ADF) (Experience: 5 to 10 years):
We are looking for a highly self-motivated individual with Cloud Data Engineering experience across Snowflake, DBT & ADF:
At least 5 years of experience in designing and developing Data Pipelines & Assets.
Must have at least 5 years of experience with at least one columnar MPP cloud data warehouse, preferably Snowflake.
At least 4 years of experience with ETL tools such as Azure Data Factory, Fivetran, or DBT.
Experience with Git and Azure DevOps.
Experience with Agile methodologies, Jira, and Confluence.
Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
Working knowledge of Azure architecture and Azure Data Lake. Willingness to contribute to documentation (e.g., mapping documents, defect logs).
Able to generate functional specs for code migration, or to ask the right questions needed to produce them.
Hands-on programmer with a thorough understanding of performance-tuning techniques.
Experience handling large-volume data transformations (on the order of 100 GB monthly).
Able to create solutions and data flows to suit requirements.
Produces timely documentation, e.g., mapping documents, UTRs, and defect/KEDB logs.
Self-starter and quick learner.
Able to understand requirements and probe for clarification.
Tech experience expected:
Primary: Snowflake, DBT & ADF (development & testing)
Secondary: Python, ETL, or any other data processing tool
Nice to have: domain experience in Healthcare.
Should have good oral and written communication skills.
Should be a good team player.
Should be proactive and adaptive.