
Remote · Part-Time · India
Company Description
ThreatXIntel is a cybersecurity startup dedicated to protecting businesses and organizations from cyber threats. The company offers services in cloud security, web and mobile security testing, cloud security assessment, and DevSecOps, providing customized, affordable solutions tailored to each client's specific needs, regardless of business size, to keep digital assets protected.
Role Description
We are seeking a skilled Ab Initio Workflows Developer to support a data integration and ETL automation initiative. The ideal candidate will have strong experience designing, developing, and optimizing complex Ab Initio graphs and workflows in high-volume data environments.
You will be responsible for ensuring smooth data flow across systems, improving performance, and building reusable components for ETL processing.
Key Responsibilities
Design, develop, and deploy Ab Initio graphs and workflows for ETL/data integration.
Work with business and data teams to understand requirements and translate them into robust Ab Initio solutions.
Optimize ETL processes for scalability and performance.
Integrate Ab Initio with various data sources (RDBMS, files, cloud storage, APIs).
Conduct unit testing and participate in end-to-end validation of data pipelines.
Document technical design, process flows, and reusable components.
Collaborate with QA, data engineers, and business analysts throughout the SDLC.
Required Skills & Experience
4+ years of hands-on experience with Ab Initio (including GDE, EME, and Co>Operating System).
Proficient in designing and debugging complex Ab Initio graphs and plans.
Strong knowledge of data warehousing, ETL frameworks, and SQL.
Experience working with large-scale data pipelines and batch processing.
Familiarity with version control, UNIX/Linux shell scripting, and job scheduling tools (AutoSys, Control-M, etc.).
Ability to work independently and deliver high-quality results within deadlines.
Nice to Have
Exposure to Big Data platforms or cloud-based data pipelines.
Knowledge of data governance, data quality, and compliance standards.